May 13 23:55:10.051806 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 13 22:08:35 -00 2025
May 13 23:55:10.051836 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130
May 13 23:55:10.051851 kernel: BIOS-provided physical RAM map:
May 13 23:55:10.051859 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
May 13 23:55:10.051868 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
May 13 23:55:10.051877 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
May 13 23:55:10.051887 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable
May 13 23:55:10.051896 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved
May 13 23:55:10.051905 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
May 13 23:55:10.051914 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
May 13 23:55:10.051926 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 13 23:55:10.051935 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
May 13 23:55:10.051944 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 13 23:55:10.051953 kernel: NX (Execute Disable) protection: active
May 13 23:55:10.051964 kernel: APIC: Static calls initialized
May 13 23:55:10.051978 kernel: SMBIOS 2.8 present.
May 13 23:55:10.051988 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014
May 13 23:55:10.051998 kernel: Hypervisor detected: KVM
May 13 23:55:10.052007 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 13 23:55:10.052017 kernel: kvm-clock: using sched offset of 2399719408 cycles
May 13 23:55:10.052027 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 13 23:55:10.052037 kernel: tsc: Detected 2794.748 MHz processor
May 13 23:55:10.052047 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 13 23:55:10.052124 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 13 23:55:10.052135 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000
May 13 23:55:10.052149 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
May 13 23:55:10.052158 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 13 23:55:10.052168 kernel: Using GB pages for direct mapping
May 13 23:55:10.052178 kernel: ACPI: Early table checksum verification disabled
May 13 23:55:10.052187 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS )
May 13 23:55:10.052197 kernel: ACPI: RSDT 0x000000009CFE2408 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:55:10.052206 kernel: ACPI: FACP 0x000000009CFE21E8 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:55:10.052216 kernel: ACPI: DSDT 0x000000009CFE0040 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:55:10.052226 kernel: ACPI: FACS 0x000000009CFE0000 000040
May 13 23:55:10.052241 kernel: ACPI: APIC 0x000000009CFE22DC 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:55:10.052251 kernel: ACPI: HPET 0x000000009CFE236C 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:55:10.052260 kernel: ACPI: MCFG 0x000000009CFE23A4 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:55:10.052270 kernel: ACPI: WAET 0x000000009CFE23E0 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:55:10.052280 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21e8-0x9cfe22db]
May 13 23:55:10.052291 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21e7]
May 13 23:55:10.052306 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f]
May 13 23:55:10.052319 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22dc-0x9cfe236b]
May 13 23:55:10.052330 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe236c-0x9cfe23a3]
May 13 23:55:10.052340 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23a4-0x9cfe23df]
May 13 23:55:10.052350 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23e0-0x9cfe2407]
May 13 23:55:10.052361 kernel: No NUMA configuration found
May 13 23:55:10.052373 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff]
May 13 23:55:10.052385 kernel: NODE_DATA(0) allocated [mem 0x9cfd6000-0x9cfdbfff]
May 13 23:55:10.052400 kernel: Zone ranges:
May 13 23:55:10.052410 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 13 23:55:10.052420 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff]
May 13 23:55:10.052430 kernel: Normal empty
May 13 23:55:10.052440 kernel: Movable zone start for each node
May 13 23:55:10.052450 kernel: Early memory node ranges
May 13 23:55:10.052460 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
May 13 23:55:10.052471 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff]
May 13 23:55:10.052481 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff]
May 13 23:55:10.052491 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 13 23:55:10.052505 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
May 13 23:55:10.052515 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
May 13 23:55:10.052525 kernel: ACPI: PM-Timer IO Port: 0x608
May 13 23:55:10.052535 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 13 23:55:10.052545 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 13 23:55:10.052556 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 13 23:55:10.052566 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 13 23:55:10.052576 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 13 23:55:10.052587 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 13 23:55:10.052600 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 13 23:55:10.052610 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 13 23:55:10.052620 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
May 13 23:55:10.052631 kernel: TSC deadline timer available
May 13 23:55:10.052641 kernel: smpboot: Allowing 4 CPUs, 0 hotplug CPUs
May 13 23:55:10.052651 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 13 23:55:10.052661 kernel: kvm-guest: KVM setup pv remote TLB flush
May 13 23:55:10.052671 kernel: kvm-guest: setup PV sched yield
May 13 23:55:10.052681 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
May 13 23:55:10.052694 kernel: Booting paravirtualized kernel on KVM
May 13 23:55:10.052705 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 13 23:55:10.052716 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1
May 13 23:55:10.052726 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u524288
May 13 23:55:10.052736 kernel: pcpu-alloc: s197032 r8192 d32344 u524288 alloc=1*2097152
May 13 23:55:10.052746 kernel: pcpu-alloc: [0] 0 1 2 3
May 13 23:55:10.052756 kernel: kvm-guest: PV spinlocks enabled
May 13 23:55:10.052766 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 13 23:55:10.052787 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130
May 13 23:55:10.052803 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 13 23:55:10.052813 kernel: random: crng init done
May 13 23:55:10.052823 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 13 23:55:10.052833 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 13 23:55:10.052843 kernel: Fallback order for Node 0: 0
May 13 23:55:10.052853 kernel: Built 1 zonelists, mobility grouping on. Total pages: 632732
May 13 23:55:10.052863 kernel: Policy zone: DMA32
May 13 23:55:10.052873 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 13 23:55:10.052888 kernel: Memory: 2430492K/2571752K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43604K init, 1468K bss, 141000K reserved, 0K cma-reserved)
May 13 23:55:10.052899 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
May 13 23:55:10.052909 kernel: ftrace: allocating 37993 entries in 149 pages
May 13 23:55:10.052919 kernel: ftrace: allocated 149 pages with 4 groups
May 13 23:55:10.052929 kernel: Dynamic Preempt: voluntary
May 13 23:55:10.052939 kernel: rcu: Preemptible hierarchical RCU implementation.
May 13 23:55:10.052950 kernel: rcu: RCU event tracing is enabled.
May 13 23:55:10.052960 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
May 13 23:55:10.052971 kernel: Trampoline variant of Tasks RCU enabled.
May 13 23:55:10.052985 kernel: Rude variant of Tasks RCU enabled.
May 13 23:55:10.052995 kernel: Tracing variant of Tasks RCU enabled.
May 13 23:55:10.053005 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 13 23:55:10.053015 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
May 13 23:55:10.053025 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16
May 13 23:55:10.053035 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 13 23:55:10.053045 kernel: Console: colour VGA+ 80x25
May 13 23:55:10.053069 kernel: printk: console [ttyS0] enabled
May 13 23:55:10.053079 kernel: ACPI: Core revision 20230628
May 13 23:55:10.053090 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
May 13 23:55:10.053103 kernel: APIC: Switch to symmetric I/O mode setup
May 13 23:55:10.053125 kernel: x2apic enabled
May 13 23:55:10.053135 kernel: APIC: Switched APIC routing to: physical x2apic
May 13 23:55:10.053146 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask()
May 13 23:55:10.053156 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself()
May 13 23:55:10.053166 kernel: kvm-guest: setup PV IPIs
May 13 23:55:10.053187 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 13 23:55:10.053201 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
May 13 23:55:10.053211 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748)
May 13 23:55:10.053222 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 13 23:55:10.053233 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
May 13 23:55:10.053248 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
May 13 23:55:10.053259 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 13 23:55:10.053269 kernel: Spectre V2 : Mitigation: Retpolines
May 13 23:55:10.053280 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 13 23:55:10.053291 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
May 13 23:55:10.053305 kernel: RETBleed: Mitigation: untrained return thunk
May 13 23:55:10.053316 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
May 13 23:55:10.053326 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
May 13 23:55:10.053337 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
May 13 23:55:10.053348 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
May 13 23:55:10.053359 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
May 13 23:55:10.053370 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 13 23:55:10.053381 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 13 23:55:10.053396 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 13 23:55:10.053406 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 13 23:55:10.053417 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
May 13 23:55:10.053427 kernel: Freeing SMP alternatives memory: 32K
May 13 23:55:10.053438 kernel: pid_max: default: 32768 minimum: 301
May 13 23:55:10.053449 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
May 13 23:55:10.053460 kernel: landlock: Up and running.
May 13 23:55:10.053470 kernel: SELinux: Initializing.
May 13 23:55:10.053481 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 23:55:10.053495 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 23:55:10.053506 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0)
May 13 23:55:10.053516 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 23:55:10.053527 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 23:55:10.053538 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
May 13 23:55:10.053548 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
May 13 23:55:10.053559 kernel: ... version: 0
May 13 23:55:10.053569 kernel: ... bit width: 48
May 13 23:55:10.053579 kernel: ... generic registers: 6
May 13 23:55:10.053592 kernel: ... value mask: 0000ffffffffffff
May 13 23:55:10.053603 kernel: ... max period: 00007fffffffffff
May 13 23:55:10.053613 kernel: ... fixed-purpose events: 0
May 13 23:55:10.053623 kernel: ... event mask: 000000000000003f
May 13 23:55:10.053634 kernel: signal: max sigframe size: 1776
May 13 23:55:10.053644 kernel: rcu: Hierarchical SRCU implementation.
May 13 23:55:10.053655 kernel: rcu: Max phase no-delay instances is 400.
May 13 23:55:10.053665 kernel: smp: Bringing up secondary CPUs ...
May 13 23:55:10.053673 kernel: smpboot: x86: Booting SMP configuration:
May 13 23:55:10.053685 kernel: .... node #0, CPUs: #1 #2 #3
May 13 23:55:10.053694 kernel: smp: Brought up 1 node, 4 CPUs
May 13 23:55:10.053704 kernel: smpboot: Max logical packages: 1
May 13 23:55:10.053714 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS)
May 13 23:55:10.053724 kernel: devtmpfs: initialized
May 13 23:55:10.053734 kernel: x86/mm: Memory block size: 128MB
May 13 23:55:10.053744 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 13 23:55:10.053754 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
May 13 23:55:10.053765 kernel: pinctrl core: initialized pinctrl subsystem
May 13 23:55:10.053788 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 13 23:55:10.053800 kernel: audit: initializing netlink subsys (disabled)
May 13 23:55:10.053808 kernel: audit: type=2000 audit(1747180509.343:1): state=initialized audit_enabled=0 res=1
May 13 23:55:10.053816 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 13 23:55:10.053824 kernel: thermal_sys: Registered thermal governor 'user_space'
May 13 23:55:10.053832 kernel: cpuidle: using governor menu
May 13 23:55:10.053840 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 13 23:55:10.053848 kernel: dca service started, version 1.12.1
May 13 23:55:10.053856 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000)
May 13 23:55:10.053865 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved as E820 entry
May 13 23:55:10.053875 kernel: PCI: Using configuration type 1 for base access
May 13 23:55:10.053883 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 13 23:55:10.053891 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 13 23:55:10.053900 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 13 23:55:10.053911 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 13 23:55:10.053921 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 13 23:55:10.053929 kernel: ACPI: Added _OSI(Module Device)
May 13 23:55:10.053937 kernel: ACPI: Added _OSI(Processor Device)
May 13 23:55:10.053948 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 13 23:55:10.053956 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 13 23:55:10.053964 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 13 23:55:10.053972 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
May 13 23:55:10.053979 kernel: ACPI: Interpreter enabled
May 13 23:55:10.053987 kernel: ACPI: PM: (supports S0 S3 S5)
May 13 23:55:10.053995 kernel: ACPI: Using IOAPIC for interrupt routing
May 13 23:55:10.054003 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 13 23:55:10.054011 kernel: PCI: Using E820 reservations for host bridge windows
May 13 23:55:10.054019 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
May 13 23:55:10.054030 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 13 23:55:10.054256 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 13 23:55:10.054410 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
May 13 23:55:10.054630 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
May 13 23:55:10.054663 kernel: PCI host bridge to bus 0000:00
May 13 23:55:10.054885 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 13 23:55:10.055045 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 13 23:55:10.055205 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 13 23:55:10.055353 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window]
May 13 23:55:10.055497 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
May 13 23:55:10.055639 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
May 13 23:55:10.055797 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 13 23:55:10.055992 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000
May 13 23:55:10.056281 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000
May 13 23:55:10.056489 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xfd000000-0xfdffffff pref]
May 13 23:55:10.056702 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xfebd0000-0xfebd0fff]
May 13 23:55:10.056883 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xfebc0000-0xfebcffff pref]
May 13 23:55:10.057041 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 13 23:55:10.057254 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00
May 13 23:55:10.057427 kernel: pci 0000:00:02.0: reg 0x10: [io 0xc0c0-0xc0df]
May 13 23:55:10.057590 kernel: pci 0000:00:02.0: reg 0x14: [mem 0xfebd1000-0xfebd1fff]
May 13 23:55:10.057742 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfe000000-0xfe003fff 64bit pref]
May 13 23:55:10.057932 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000
May 13 23:55:10.058117 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc000-0xc07f]
May 13 23:55:10.058335 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfebd2000-0xfebd2fff]
May 13 23:55:10.058507 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe004000-0xfe007fff 64bit pref]
May 13 23:55:10.058685 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000
May 13 23:55:10.058872 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc0e0-0xc0ff]
May 13 23:55:10.059035 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfebd3000-0xfebd3fff]
May 13 23:55:10.059220 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe008000-0xfe00bfff 64bit pref]
May 13 23:55:10.059412 kernel: pci 0000:00:04.0: reg 0x30: [mem 0xfeb80000-0xfebbffff pref]
May 13 23:55:10.059626 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100
May 13 23:55:10.059800 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
May 13 23:55:10.059981 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601
May 13 23:55:10.060167 kernel: pci 0000:00:1f.2: reg 0x20: [io 0xc100-0xc11f]
May 13 23:55:10.060333 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xfebd4000-0xfebd4fff]
May 13 23:55:10.060510 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500
May 13 23:55:10.060670 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x0700-0x073f]
May 13 23:55:10.060687 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 13 23:55:10.060698 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 13 23:55:10.060715 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 13 23:55:10.060725 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 13 23:55:10.060737 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
May 13 23:55:10.060748 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
May 13 23:55:10.060759 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
May 13 23:55:10.060770 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
May 13 23:55:10.060791 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
May 13 23:55:10.060802 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
May 13 23:55:10.060812 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
May 13 23:55:10.060827 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
May 13 23:55:10.060837 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
May 13 23:55:10.060847 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
May 13 23:55:10.060858 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
May 13 23:55:10.060870 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
May 13 23:55:10.060881 kernel: iommu: Default domain type: Translated
May 13 23:55:10.060892 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 13 23:55:10.060903 kernel: PCI: Using ACPI for IRQ routing
May 13 23:55:10.060915 kernel: PCI: pci_cache_line_size set to 64 bytes
May 13 23:55:10.060930 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
May 13 23:55:10.060942 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff]
May 13 23:55:10.061144 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
May 13 23:55:10.061306 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
May 13 23:55:10.061467 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 13 23:55:10.061483 kernel: vgaarb: loaded
May 13 23:55:10.061495 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
May 13 23:55:10.061506 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
May 13 23:55:10.061517 kernel: clocksource: Switched to clocksource kvm-clock
May 13 23:55:10.061534 kernel: VFS: Disk quotas dquot_6.6.0
May 13 23:55:10.061546 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 13 23:55:10.061557 kernel: pnp: PnP ACPI init
May 13 23:55:10.061734 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved
May 13 23:55:10.061753 kernel: pnp: PnP ACPI: found 6 devices
May 13 23:55:10.061764 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 13 23:55:10.061788 kernel: NET: Registered PF_INET protocol family
May 13 23:55:10.061799 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 13 23:55:10.061815 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 13 23:55:10.061826 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 13 23:55:10.061836 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 13 23:55:10.061847 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 13 23:55:10.061858 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 13 23:55:10.061869 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 23:55:10.061880 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 13 23:55:10.061891 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 13 23:55:10.061905 kernel: NET: Registered PF_XDP protocol family
May 13 23:55:10.062101 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 13 23:55:10.062253 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 13 23:55:10.062397 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 13 23:55:10.062545 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window]
May 13 23:55:10.062691 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window]
May 13 23:55:10.062842 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window]
May 13 23:55:10.062857 kernel: PCI: CLS 0 bytes, default 64
May 13 23:55:10.062869 kernel: Initialise system trusted keyrings
May 13 23:55:10.062885 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 13 23:55:10.062896 kernel: Key type asymmetric registered
May 13 23:55:10.062907 kernel: Asymmetric key parser 'x509' registered
May 13 23:55:10.062918 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
May 13 23:55:10.062929 kernel: io scheduler mq-deadline registered
May 13 23:55:10.062940 kernel: io scheduler kyber registered
May 13 23:55:10.062950 kernel: io scheduler bfq registered
May 13 23:55:10.062961 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 13 23:55:10.062972 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22
May 13 23:55:10.062986 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23
May 13 23:55:10.062998 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20
May 13 23:55:10.063008 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 13 23:55:10.063019 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 13 23:55:10.063030 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 13 23:55:10.063041 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 13 23:55:10.063067 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 13 23:55:10.063078 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 13 23:55:10.063246 kernel: rtc_cmos 00:04: RTC can wake from S4
May 13 23:55:10.063402 kernel: rtc_cmos 00:04: registered as rtc0
May 13 23:55:10.063552 kernel: rtc_cmos 00:04: setting system clock to 2025-05-13T23:55:09 UTC (1747180509)
May 13 23:55:10.063698 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs
May 13 23:55:10.063714 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
May 13 23:55:10.063726 kernel: NET: Registered PF_INET6 protocol family
May 13 23:55:10.063737 kernel: Segment Routing with IPv6
May 13 23:55:10.063748 kernel: In-situ OAM (IOAM) with IPv6
May 13 23:55:10.063759 kernel: NET: Registered PF_PACKET protocol family
May 13 23:55:10.063784 kernel: Key type dns_resolver registered
May 13 23:55:10.063794 kernel: IPI shorthand broadcast: enabled
May 13 23:55:10.063805 kernel: sched_clock: Marking stable (861002475, 181622286)->(1270785771, -228161010)
May 13 23:55:10.063816 kernel: registered taskstats version 1
May 13 23:55:10.063826 kernel: Loading compiled-in X.509 certificates
May 13 23:55:10.063837 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 166efda032ca4d6e9037c569aca9b53585ee6f94'
May 13 23:55:10.063848 kernel: Key type .fscrypt registered
May 13 23:55:10.063859 kernel: Key type fscrypt-provisioning registered
May 13 23:55:10.063870 kernel: ima: No TPM chip found, activating TPM-bypass!
May 13 23:55:10.063883 kernel: ima: Allocated hash algorithm: sha1
May 13 23:55:10.063895 kernel: ima: No architecture policies found
May 13 23:55:10.063905 kernel: clk: Disabling unused clocks
May 13 23:55:10.063916 kernel: Freeing unused kernel image (initmem) memory: 43604K
May 13 23:55:10.063927 kernel: Write protecting the kernel read-only data: 40960k
May 13 23:55:10.063939 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K
May 13 23:55:10.063950 kernel: Run /init as init process
May 13 23:55:10.063961 kernel: with arguments:
May 13 23:55:10.063971 kernel: /init
May 13 23:55:10.063986 kernel: with environment:
May 13 23:55:10.063996 kernel: HOME=/
May 13 23:55:10.064007 kernel: TERM=linux
May 13 23:55:10.064017 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 13 23:55:10.064030 systemd[1]: Successfully made /usr/ read-only.
May 13 23:55:10.064045 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 13 23:55:10.064072 systemd[1]: Detected virtualization kvm.
May 13 23:55:10.064087 systemd[1]: Detected architecture x86-64.
May 13 23:55:10.064099 systemd[1]: Running in initrd.
May 13 23:55:10.064110 systemd[1]: No hostname configured, using default hostname.
May 13 23:55:10.064122 systemd[1]: Hostname set to .
May 13 23:55:10.064133 systemd[1]: Initializing machine ID from VM UUID.
May 13 23:55:10.064145 systemd[1]: Queued start job for default target initrd.target.
May 13 23:55:10.064157 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:55:10.064169 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:55:10.064184 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 13 23:55:10.064211 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 23:55:10.064226 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 13 23:55:10.064239 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 13 23:55:10.064253 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 13 23:55:10.064267 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 13 23:55:10.064279 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:55:10.064291 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 23:55:10.064302 systemd[1]: Reached target paths.target - Path Units.
May 13 23:55:10.064314 systemd[1]: Reached target slices.target - Slice Units.
May 13 23:55:10.064326 systemd[1]: Reached target swap.target - Swaps.
May 13 23:55:10.064337 systemd[1]: Reached target timers.target - Timer Units.
May 13 23:55:10.064349 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 13 23:55:10.064361 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 23:55:10.064376 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 13 23:55:10.064388 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 13 23:55:10.064400 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:55:10.064412 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 23:55:10.064423 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:55:10.064435 systemd[1]: Reached target sockets.target - Socket Units. May 13 23:55:10.064444 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 13 23:55:10.064453 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 23:55:10.064464 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 13 23:55:10.064473 systemd[1]: Starting systemd-fsck-usr.service... May 13 23:55:10.064482 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 23:55:10.064491 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 23:55:10.064500 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:55:10.064509 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 13 23:55:10.064519 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 23:55:10.064531 systemd[1]: Finished systemd-fsck-usr.service. May 13 23:55:10.064540 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 23:55:10.064549 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 23:55:10.064563 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 23:55:10.064602 systemd-journald[194]: Collecting audit messages is disabled. May 13 23:55:10.064625 systemd-journald[194]: Journal started May 13 23:55:10.064647 systemd-journald[194]: Runtime Journal (/run/log/journal/7df71dff370b4143befcdcf13735b05c) is 6M, max 48.3M, 42.3M free. May 13 23:55:10.056665 systemd-modules-load[195]: Inserted module 'overlay' May 13 23:55:10.088705 systemd[1]: Started systemd-journald.service - Journal Service. May 13 23:55:10.087657 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
May 13 23:55:10.091653 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:55:10.112112 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 13 23:55:10.115408 systemd-modules-load[195]: Inserted module 'br_netfilter' May 13 23:55:10.116904 kernel: Bridge firewalling registered May 13 23:55:10.118417 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 23:55:10.121874 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:55:10.127182 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 23:55:10.164787 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:55:10.174735 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 23:55:10.178255 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 23:55:10.180091 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 23:55:10.181534 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:55:10.184641 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 13 23:55:10.215964 dracut-cmdline[230]: dracut-dracut-053 May 13 23:55:10.221200 dracut-cmdline[230]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130 May 13 23:55:10.247355 systemd-resolved[229]: Positive Trust Anchors: May 13 23:55:10.247375 systemd-resolved[229]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 23:55:10.247406 systemd-resolved[229]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 23:55:10.318232 systemd-resolved[229]: Defaulting to hostname 'linux'. May 13 23:55:10.320366 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 23:55:10.320506 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 23:55:10.406089 kernel: SCSI subsystem initialized May 13 23:55:10.417329 kernel: Loading iSCSI transport class v2.0-870. May 13 23:55:10.474090 kernel: iscsi: registered transport (tcp) May 13 23:55:10.605089 kernel: iscsi: registered transport (qla4xxx) May 13 23:55:10.605158 kernel: QLogic iSCSI HBA Driver May 13 23:55:10.654593 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 13 23:55:10.734119 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 13 23:55:10.849090 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
May 13 23:55:10.849157 kernel: device-mapper: uevent: version 1.0.3 May 13 23:55:10.850846 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 13 23:55:10.930107 kernel: raid6: avx2x4 gen() 27517 MB/s May 13 23:55:10.947097 kernel: raid6: avx2x2 gen() 30435 MB/s May 13 23:55:10.984451 kernel: raid6: avx2x1 gen() 24663 MB/s May 13 23:55:10.984509 kernel: raid6: using algorithm avx2x2 gen() 30435 MB/s May 13 23:55:11.002366 kernel: raid6: .... xor() 19413 MB/s, rmw enabled May 13 23:55:11.002430 kernel: raid6: using avx2x2 recovery algorithm May 13 23:55:11.054088 kernel: xor: automatically using best checksumming function avx May 13 23:55:11.211101 kernel: Btrfs loaded, zoned=no, fsverity=no May 13 23:55:11.224992 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 13 23:55:11.236981 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:55:11.264134 systemd-udevd[413]: Using default interface naming scheme 'v255'. May 13 23:55:11.270350 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:55:11.274164 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 13 23:55:11.300663 dracut-pre-trigger[417]: rd.md=0: removing MD RAID activation May 13 23:55:11.337412 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 13 23:55:11.340581 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 23:55:11.429802 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:55:11.465272 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 13 23:55:11.469073 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues May 13 23:55:11.562088 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) May 13 23:55:11.569315 kernel: GPT:Primary header thinks Alt. 
header is not at the end of the disk. May 13 23:55:11.569367 kernel: libata version 3.00 loaded. May 13 23:55:11.569383 kernel: cryptd: max_cpu_qlen set to 1000 May 13 23:55:11.569398 kernel: GPT:9289727 != 19775487 May 13 23:55:11.569412 kernel: GPT:Alternate GPT header not at the end of the disk. May 13 23:55:11.569425 kernel: GPT:9289727 != 19775487 May 13 23:55:11.569438 kernel: GPT: Use GNU Parted to correct GPT errors. May 13 23:55:11.569451 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 23:55:11.578526 kernel: AVX2 version of gcm_enc/dec engaged. May 13 23:55:11.578585 kernel: AES CTR mode by8 optimization enabled May 13 23:55:11.584485 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 13 23:55:11.586230 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 13 23:55:11.665845 kernel: ahci 0000:00:1f.2: version 3.0 May 13 23:55:11.666083 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 May 13 23:55:11.666106 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode May 13 23:55:11.666253 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only May 13 23:55:11.669048 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 23:55:11.738087 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 23:55:11.743898 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 23:55:11.744087 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
May 13 23:55:11.746065 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (458) May 13 23:55:11.748069 kernel: scsi host0: ahci May 13 23:55:11.796299 kernel: BTRFS: device fsid d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (471) May 13 23:55:11.796331 kernel: scsi host1: ahci May 13 23:55:11.796604 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:55:11.800085 kernel: scsi host2: ahci May 13 23:55:11.801152 kernel: scsi host3: ahci May 13 23:55:11.801510 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 13 23:55:11.801570 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:55:11.855431 kernel: scsi host4: ahci May 13 23:55:11.855669 kernel: scsi host5: ahci May 13 23:55:11.855834 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 31 May 13 23:55:11.801683 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:55:11.863613 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 31 May 13 23:55:11.863629 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 31 May 13 23:55:11.863640 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 31 May 13 23:55:11.863651 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 31 May 13 23:55:11.863661 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 31 May 13 23:55:11.863596 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:55:11.867936 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:55:11.871483 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 13 23:55:11.879922 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
May 13 23:55:11.913611 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 13 23:55:11.952440 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:55:11.972170 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 13 23:55:12.015502 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 13 23:55:12.039919 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 13 23:55:12.061608 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 13 23:55:12.067485 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 13 23:55:12.070922 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:55:12.090425 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
May 13 23:55:12.204150 kernel: ata2: SATA link down (SStatus 0 SControl 300) May 13 23:55:12.204257 kernel: ata4: SATA link down (SStatus 0 SControl 300) May 13 23:55:12.204277 kernel: ata5: SATA link down (SStatus 0 SControl 300) May 13 23:55:12.207092 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) May 13 23:55:12.207187 kernel: ata1: SATA link down (SStatus 0 SControl 300) May 13 23:55:12.207200 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 May 13 23:55:12.207777 kernel: ata3.00: applying bridge limits May 13 23:55:12.209080 kernel: ata6: SATA link down (SStatus 0 SControl 300) May 13 23:55:12.210074 kernel: ata3.00: configured for UDMA/100 May 13 23:55:12.212084 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 May 13 23:55:12.259088 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray May 13 23:55:12.259437 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 13 23:55:12.273280 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 May 13 23:55:12.313141 disk-uuid[567]: Primary Header is updated. May 13 23:55:12.313141 disk-uuid[567]: Secondary Entries is updated. May 13 23:55:12.313141 disk-uuid[567]: Secondary Header is updated. May 13 23:55:12.318113 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 23:55:12.324087 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 23:55:13.338087 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 13 23:55:13.338522 disk-uuid[577]: The operation has completed successfully. May 13 23:55:13.373096 systemd[1]: disk-uuid.service: Deactivated successfully. May 13 23:55:13.373222 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 13 23:55:13.404083 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... 
May 13 23:55:13.424212 sh[592]: Success May 13 23:55:13.474083 kernel: device-mapper: verity: sha256 using implementation "sha256-ni" May 13 23:55:13.511255 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 13 23:55:13.517203 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 13 23:55:13.534221 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. May 13 23:55:13.580396 kernel: BTRFS info (device dm-0): first mount of filesystem d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8 May 13 23:55:13.580447 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 13 23:55:13.580458 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 13 23:55:13.581558 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 13 23:55:13.582452 kernel: BTRFS info (device dm-0): using free space tree May 13 23:55:13.587856 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 13 23:55:13.588614 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 13 23:55:13.590248 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 13 23:55:13.598766 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
May 13 23:55:13.637989 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:55:13.638075 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 13 23:55:13.638093 kernel: BTRFS info (device vda6): using free space tree May 13 23:55:13.642069 kernel: BTRFS info (device vda6): auto enabling async discard May 13 23:55:13.676085 kernel: BTRFS info (device vda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:55:13.728667 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 23:55:13.730543 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 23:55:13.779361 systemd-networkd[768]: lo: Link UP May 13 23:55:13.779372 systemd-networkd[768]: lo: Gained carrier May 13 23:55:13.801337 systemd-networkd[768]: Enumeration completed May 13 23:55:13.801581 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 23:55:13.803465 systemd-networkd[768]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:55:13.803478 systemd-networkd[768]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:55:13.804434 systemd-networkd[768]: eth0: Link UP May 13 23:55:13.804439 systemd-networkd[768]: eth0: Gained carrier May 13 23:55:13.804447 systemd-networkd[768]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:55:13.805085 systemd[1]: Reached target network.target - Network. May 13 23:55:13.859099 systemd-networkd[768]: eth0: DHCPv4 address 10.0.0.64/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 23:55:13.923043 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 13 23:55:13.924452 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
May 13 23:55:14.146355 ignition[773]: Ignition 2.20.0 May 13 23:55:14.146367 ignition[773]: Stage: fetch-offline May 13 23:55:14.146424 ignition[773]: no configs at "/usr/lib/ignition/base.d" May 13 23:55:14.146436 ignition[773]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 23:55:14.146565 ignition[773]: parsed url from cmdline: "" May 13 23:55:14.146569 ignition[773]: no config URL provided May 13 23:55:14.146575 ignition[773]: reading system config file "/usr/lib/ignition/user.ign" May 13 23:55:14.146584 ignition[773]: no config at "/usr/lib/ignition/user.ign" May 13 23:55:14.146613 ignition[773]: op(1): [started] loading QEMU firmware config module May 13 23:55:14.146617 ignition[773]: op(1): executing: "modprobe" "qemu_fw_cfg" May 13 23:55:14.180519 ignition[773]: op(1): [finished] loading QEMU firmware config module May 13 23:55:14.219877 ignition[773]: parsing config with SHA512: 50ccfac7aad6dba85693cb56930448232197b24f156d03a09ee30e37c054fe7a0cf461d579aefa95f420c6282403d67bd52ad418f95789a303fb1f41af4f8ce2 May 13 23:55:14.225443 unknown[773]: fetched base config from "system" May 13 23:55:14.225454 unknown[773]: fetched user config from "qemu" May 13 23:55:14.225865 ignition[773]: fetch-offline: fetch-offline passed May 13 23:55:14.225938 ignition[773]: Ignition finished successfully May 13 23:55:14.263400 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 13 23:55:14.265162 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). May 13 23:55:14.266086 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
May 13 23:55:14.325851 ignition[784]: Ignition 2.20.0 May 13 23:55:14.325864 ignition[784]: Stage: kargs May 13 23:55:14.326065 ignition[784]: no configs at "/usr/lib/ignition/base.d" May 13 23:55:14.326081 ignition[784]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 23:55:14.326949 ignition[784]: kargs: kargs passed May 13 23:55:14.331273 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 13 23:55:14.326996 ignition[784]: Ignition finished successfully May 13 23:55:14.333711 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 13 23:55:14.387446 ignition[793]: Ignition 2.20.0 May 13 23:55:14.387465 ignition[793]: Stage: disks May 13 23:55:14.387670 ignition[793]: no configs at "/usr/lib/ignition/base.d" May 13 23:55:14.387686 ignition[793]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 23:55:14.390729 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 13 23:55:14.388483 ignition[793]: disks: disks passed May 13 23:55:14.392330 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 13 23:55:14.388529 ignition[793]: Ignition finished successfully May 13 23:55:14.395085 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 13 23:55:14.396906 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 23:55:14.399324 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:55:14.400584 systemd[1]: Reached target basic.target - Basic System. May 13 23:55:14.403127 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 13 23:55:14.456171 systemd-fsck[803]: ROOT: clean, 14/553520 files, 52654/553472 blocks May 13 23:55:14.769171 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 13 23:55:14.774824 systemd[1]: Mounting sysroot.mount - /sysroot... 
May 13 23:55:15.070172 kernel: EXT4-fs (vda9): mounted filesystem c413e98b-da35-46b1-9852-45706e1b1f52 r/w with ordered data mode. Quota mode: none. May 13 23:55:15.071182 systemd[1]: Mounted sysroot.mount - /sysroot. May 13 23:55:15.072024 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 13 23:55:15.078434 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:55:15.081147 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 13 23:55:15.098918 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 13 23:55:15.098987 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 13 23:55:15.099027 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 13 23:55:15.145299 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 13 23:55:15.148590 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 13 23:55:15.176167 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (811) May 13 23:55:15.233573 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:55:15.233687 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 13 23:55:15.233703 kernel: BTRFS info (device vda6): using free space tree May 13 23:55:15.269758 kernel: BTRFS info (device vda6): auto enabling async discard May 13 23:55:15.276390 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 23:55:15.314223 initrd-setup-root[834]: cut: /sysroot/etc/passwd: No such file or directory May 13 23:55:15.337209 initrd-setup-root[842]: cut: /sysroot/etc/group: No such file or directory May 13 23:55:15.364148 initrd-setup-root[849]: cut: /sysroot/etc/shadow: No such file or directory May 13 23:55:15.377932 initrd-setup-root[856]: cut: /sysroot/etc/gshadow: No such file or directory May 13 23:55:15.565739 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 13 23:55:15.570447 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 13 23:55:15.591788 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 13 23:55:15.606129 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 13 23:55:15.609882 kernel: BTRFS info (device vda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:55:15.656560 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 13 23:55:15.674857 ignition[926]: INFO : Ignition 2.20.0 May 13 23:55:15.674857 ignition[926]: INFO : Stage: mount May 13 23:55:15.677145 ignition[926]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:55:15.677145 ignition[926]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 23:55:15.677145 ignition[926]: INFO : mount: mount passed May 13 23:55:15.677145 ignition[926]: INFO : Ignition finished successfully May 13 23:55:15.683545 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 13 23:55:15.686338 systemd[1]: Starting ignition-files.service - Ignition (files)... May 13 23:55:15.777300 systemd-networkd[768]: eth0: Gained IPv6LL May 13 23:55:16.075603 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
May 13 23:55:16.154904 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (938) May 13 23:55:16.160497 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc May 13 23:55:16.160559 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 13 23:55:16.160595 kernel: BTRFS info (device vda6): using free space tree May 13 23:55:16.206338 kernel: BTRFS info (device vda6): auto enabling async discard May 13 23:55:16.211195 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 13 23:55:16.304081 ignition[955]: INFO : Ignition 2.20.0 May 13 23:55:16.304081 ignition[955]: INFO : Stage: files May 13 23:55:16.304081 ignition[955]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:55:16.304081 ignition[955]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 13 23:55:16.304081 ignition[955]: DEBUG : files: compiled without relabeling support, skipping May 13 23:55:16.316511 ignition[955]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 13 23:55:16.316511 ignition[955]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 13 23:55:16.331795 ignition[955]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 13 23:55:16.338558 ignition[955]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 13 23:55:16.343268 unknown[955]: wrote ssh authorized keys file for user: core May 13 23:55:16.378944 ignition[955]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 13 23:55:16.380972 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 13 23:55:16.380972 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 May 13 23:55:16.473402 
ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 13 23:55:16.681270 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" May 13 23:55:16.681270 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 13 23:55:16.686498 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 May 13 23:55:17.277906 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 13 23:55:17.702194 ignition[955]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" May 13 23:55:17.702194 ignition[955]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 13 23:55:17.751637 ignition[955]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:55:17.754129 ignition[955]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:55:17.754129 ignition[955]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 13 23:55:17.757840 ignition[955]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 13 23:55:17.757840 ignition[955]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 13 23:55:17.761269 ignition[955]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 13 
23:55:17.761269 ignition[955]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 13 23:55:17.765892 ignition[955]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" May 13 23:55:17.833520 ignition[955]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" May 13 23:55:17.837985 ignition[955]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" May 13 23:55:17.839980 ignition[955]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" May 13 23:55:17.839980 ignition[955]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" May 13 23:55:17.900802 ignition[955]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" May 13 23:55:17.902328 ignition[955]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" May 13 23:55:17.904239 ignition[955]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" May 13 23:55:17.905997 ignition[955]: INFO : files: files passed May 13 23:55:17.906793 ignition[955]: INFO : Ignition finished successfully May 13 23:55:17.910258 systemd[1]: Finished ignition-files.service - Ignition (files). May 13 23:55:17.912615 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 13 23:55:17.914611 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 13 23:55:17.936578 systemd[1]: ignition-quench.service: Deactivated successfully. May 13 23:55:17.936723 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
May 13 23:55:17.974918 initrd-setup-root-after-ignition[983]: grep: /sysroot/oem/oem-release: No such file or directory
May 13 23:55:17.973151 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 13 23:55:17.978791 initrd-setup-root-after-ignition[985]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 13 23:55:17.978791 initrd-setup-root-after-ignition[985]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 13 23:55:17.976438 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 13 23:55:17.984163 initrd-setup-root-after-ignition[989]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 13 23:55:17.980040 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 13 23:55:18.028341 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 13 23:55:18.028462 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 13 23:55:18.052951 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 13 23:55:18.055118 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 13 23:55:18.057292 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 13 23:55:18.058406 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 13 23:55:18.091342 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 13 23:55:18.136142 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 13 23:55:18.159574 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 13 23:55:18.161222 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:55:18.163691 systemd[1]: Stopped target timers.target - Timer Units.
May 13 23:55:18.166033 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 13 23:55:18.166239 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 13 23:55:18.168761 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 13 23:55:18.170549 systemd[1]: Stopped target basic.target - Basic System.
May 13 23:55:18.172454 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 13 23:55:18.174588 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 13 23:55:18.207947 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 13 23:55:18.210688 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 13 23:55:18.212925 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 23:55:18.215375 systemd[1]: Stopped target sysinit.target - System Initialization.
May 13 23:55:18.218098 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 13 23:55:18.232128 systemd[1]: Stopped target swap.target - Swaps.
May 13 23:55:18.234563 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 13 23:55:18.234710 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 13 23:55:18.237378 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 13 23:55:18.239487 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:55:18.294443 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 13 23:55:18.294607 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:55:18.297413 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 13 23:55:18.297630 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 13 23:55:18.300830 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 13 23:55:18.300973 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 13 23:55:18.303506 systemd[1]: Stopped target paths.target - Path Units.
May 13 23:55:18.306396 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 13 23:55:18.310128 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:55:18.312295 systemd[1]: Stopped target slices.target - Slice Units.
May 13 23:55:18.314662 systemd[1]: Stopped target sockets.target - Socket Units.
May 13 23:55:18.364362 systemd[1]: iscsid.socket: Deactivated successfully.
May 13 23:55:18.364503 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 13 23:55:18.373081 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 13 23:55:18.373210 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 23:55:18.375037 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 13 23:55:18.375177 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 13 23:55:18.377861 systemd[1]: ignition-files.service: Deactivated successfully.
May 13 23:55:18.377966 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 13 23:55:18.380932 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 13 23:55:18.382936 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 13 23:55:18.383067 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:55:18.421023 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 13 23:55:18.422187 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 13 23:55:18.422363 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:55:18.424592 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 13 23:55:18.435920 ignition[1010]: INFO : Ignition 2.20.0
May 13 23:55:18.435920 ignition[1010]: INFO : Stage: umount
May 13 23:55:18.435920 ignition[1010]: INFO : no configs at "/usr/lib/ignition/base.d"
May 13 23:55:18.435920 ignition[1010]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 13 23:55:18.435920 ignition[1010]: INFO : umount: umount passed
May 13 23:55:18.435920 ignition[1010]: INFO : Ignition finished successfully
May 13 23:55:18.424708 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 23:55:18.432630 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 13 23:55:18.432773 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 13 23:55:18.438179 systemd[1]: ignition-mount.service: Deactivated successfully.
May 13 23:55:18.438303 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 13 23:55:18.489131 systemd[1]: Stopped target network.target - Network.
May 13 23:55:18.490746 systemd[1]: ignition-disks.service: Deactivated successfully.
May 13 23:55:18.490816 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 13 23:55:18.493660 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 13 23:55:18.493712 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 13 23:55:18.496012 systemd[1]: ignition-setup.service: Deactivated successfully.
May 13 23:55:18.496123 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 13 23:55:18.498399 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 13 23:55:18.498466 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 13 23:55:18.501206 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 13 23:55:18.503224 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 13 23:55:18.507103 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 13 23:55:18.507770 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 13 23:55:18.507914 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 13 23:55:18.511956 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 13 23:55:18.512633 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 13 23:55:18.512780 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 13 23:55:18.517142 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 13 23:55:18.517494 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 13 23:55:18.517644 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 13 23:55:18.559389 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 13 23:55:18.559456 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:55:18.561302 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 13 23:55:18.561378 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 13 23:55:18.565217 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 13 23:55:18.566768 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 13 23:55:18.566840 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 13 23:55:18.569628 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 13 23:55:18.569699 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 13 23:55:18.572361 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 13 23:55:18.572423 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 13 23:55:18.575076 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 13 23:55:18.575136 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:55:18.578449 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 23:55:18.616873 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 13 23:55:18.616979 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 13 23:55:18.632476 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 13 23:55:18.632758 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 23:55:18.637419 systemd[1]: network-cleanup.service: Deactivated successfully.
May 13 23:55:18.637565 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 13 23:55:18.703930 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 13 23:55:18.704033 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 13 23:55:18.705919 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 13 23:55:18.705961 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:55:18.707305 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 13 23:55:18.707361 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 13 23:55:18.709779 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 13 23:55:18.709827 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 13 23:55:18.712198 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 13 23:55:18.712262 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:55:18.715918 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 13 23:55:18.717368 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 13 23:55:18.717425 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:55:18.763651 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:55:18.763729 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:55:18.767847 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 13 23:55:18.767930 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 13 23:55:18.781406 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 13 23:55:18.781546 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 13 23:55:18.800966 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 13 23:55:18.804264 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 13 23:55:18.875373 systemd[1]: Switching root.
May 13 23:55:18.945411 systemd-journald[194]: Journal stopped
May 13 23:55:21.834781 systemd-journald[194]: Received SIGTERM from PID 1 (systemd).
May 13 23:55:21.834844 kernel: SELinux: policy capability network_peer_controls=1
May 13 23:55:21.834862 kernel: SELinux: policy capability open_perms=1
May 13 23:55:21.834874 kernel: SELinux: policy capability extended_socket_class=1
May 13 23:55:21.834891 kernel: SELinux: policy capability always_check_network=0
May 13 23:55:21.834903 kernel: SELinux: policy capability cgroup_seclabel=1
May 13 23:55:21.834923 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 13 23:55:21.834935 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 13 23:55:21.834951 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 13 23:55:21.834962 kernel: audit: type=1403 audit(1747180520.420:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 13 23:55:21.834975 systemd[1]: Successfully loaded SELinux policy in 40.771ms.
May 13 23:55:21.834990 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 14.468ms.
May 13 23:55:21.835004 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 13 23:55:21.835025 systemd[1]: Detected virtualization kvm.
May 13 23:55:21.835041 systemd[1]: Detected architecture x86-64.
May 13 23:55:21.835082 systemd[1]: Detected first boot.
May 13 23:55:21.835100 systemd[1]: Initializing machine ID from VM UUID.
May 13 23:55:21.835116 zram_generator::config[1058]: No configuration found.
May 13 23:55:21.835133 kernel: Guest personality initialized and is inactive
May 13 23:55:21.835146 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 13 23:55:21.835158 kernel: Initialized host personality
May 13 23:55:21.835171 kernel: NET: Registered PF_VSOCK protocol family
May 13 23:55:21.835183 systemd[1]: Populated /etc with preset unit settings.
May 13 23:55:21.835197 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 13 23:55:21.835212 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 13 23:55:21.835224 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 13 23:55:21.835236 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 13 23:55:21.835249 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 13 23:55:21.835262 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 13 23:55:21.835278 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 13 23:55:21.835294 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 13 23:55:21.835307 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 13 23:55:21.835327 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 13 23:55:21.835340 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 13 23:55:21.835352 systemd[1]: Created slice user.slice - User and Session Slice.
May 13 23:55:21.835364 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:55:21.835377 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:55:21.835389 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 13 23:55:21.835401 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 13 23:55:21.835422 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 13 23:55:21.835434 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 23:55:21.835449 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 13 23:55:21.835463 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:55:21.835476 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 13 23:55:21.835488 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 13 23:55:21.835500 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 13 23:55:21.835512 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 13 23:55:21.835524 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:55:21.835539 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 23:55:21.835551 systemd[1]: Reached target slices.target - Slice Units.
May 13 23:55:21.835563 systemd[1]: Reached target swap.target - Swaps.
May 13 23:55:21.835578 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 13 23:55:21.835591 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 13 23:55:21.835604 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 13 23:55:21.835618 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:55:21.835633 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 23:55:21.835645 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:55:21.835657 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 13 23:55:21.835671 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 13 23:55:21.835684 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 13 23:55:21.835696 systemd[1]: Mounting media.mount - External Media Directory...
May 13 23:55:21.835709 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:55:21.835721 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 13 23:55:21.835735 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 13 23:55:21.835747 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 13 23:55:21.835760 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 13 23:55:21.835775 systemd[1]: Reached target machines.target - Containers.
May 13 23:55:21.835787 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 13 23:55:21.835799 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:55:21.835812 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 23:55:21.835824 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 13 23:55:21.835836 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:55:21.835851 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 13 23:55:21.835864 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:55:21.835876 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 13 23:55:21.835891 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 23:55:21.835903 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 13 23:55:21.835916 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 13 23:55:21.835928 kernel: fuse: init (API version 7.39)
May 13 23:55:21.835940 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 13 23:55:21.835952 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 13 23:55:21.835964 systemd[1]: Stopped systemd-fsck-usr.service.
May 13 23:55:21.835977 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:55:21.835992 kernel: loop: module loaded
May 13 23:55:21.836006 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 23:55:21.836018 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 23:55:21.836030 kernel: ACPI: bus type drm_connector registered
May 13 23:55:21.836042 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 13 23:55:21.836069 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 13 23:55:21.836082 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 13 23:55:21.836094 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 23:55:21.836109 systemd[1]: verity-setup.service: Deactivated successfully.
May 13 23:55:21.836122 systemd[1]: Stopped verity-setup.service.
May 13 23:55:21.836134 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 13 23:55:21.836167 systemd-journald[1129]: Collecting audit messages is disabled.
May 13 23:55:21.836193 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 13 23:55:21.836206 systemd-journald[1129]: Journal started
May 13 23:55:21.836229 systemd-journald[1129]: Runtime Journal (/run/log/journal/7df71dff370b4143befcdcf13735b05c) is 6M, max 48.3M, 42.3M free.
May 13 23:55:21.198526 systemd[1]: Queued start job for default target multi-user.target.
May 13 23:55:21.214586 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 13 23:55:21.215225 systemd[1]: systemd-journald.service: Deactivated successfully.
May 13 23:55:21.215724 systemd[1]: systemd-journald.service: Consumed 1.855s CPU time.
May 13 23:55:21.870081 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 23:55:21.872015 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 13 23:55:21.873360 systemd[1]: Mounted media.mount - External Media Directory.
May 13 23:55:21.876234 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 13 23:55:21.927367 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 13 23:55:21.928714 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 13 23:55:21.930109 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:55:21.931799 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 13 23:55:21.932028 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 13 23:55:21.933654 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:55:21.933880 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:55:21.935445 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 13 23:55:21.935652 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 13 23:55:21.937329 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:55:21.937546 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:55:21.939188 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 13 23:55:21.939388 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 13 23:55:21.940889 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 23:55:21.941297 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 23:55:21.943154 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 23:55:21.944781 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 13 23:55:21.946577 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 13 23:55:21.948314 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 13 23:55:21.960680 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 13 23:55:21.975438 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 13 23:55:21.978194 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 13 23:55:21.979588 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 13 23:55:21.979625 systemd[1]: Reached target local-fs.target - Local File Systems.
May 13 23:55:21.982328 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 13 23:55:22.034188 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 13 23:55:22.036820 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 13 23:55:22.038232 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:55:22.040092 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 13 23:55:22.045188 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 13 23:55:22.089635 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 23:55:22.091803 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 13 23:55:22.093383 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 13 23:55:22.097196 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 23:55:22.100349 systemd-journald[1129]: Time spent on flushing to /var/log/journal/7df71dff370b4143befcdcf13735b05c is 14.380ms for 962 entries.
May 13 23:55:22.100349 systemd-journald[1129]: System Journal (/var/log/journal/7df71dff370b4143befcdcf13735b05c) is 8M, max 195.6M, 187.6M free.
May 13 23:55:23.083879 systemd-journald[1129]: Received client request to flush runtime journal.
May 13 23:55:23.084207 kernel: loop0: detected capacity change from 0 to 151640
May 13 23:55:23.084253 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 13 23:55:23.084324 kernel: loop1: detected capacity change from 0 to 109808
May 13 23:55:23.084377 kernel: loop2: detected capacity change from 0 to 210664
May 13 23:55:23.084400 kernel: loop3: detected capacity change from 0 to 151640
May 13 23:55:23.084445 kernel: loop4: detected capacity change from 0 to 109808
May 13 23:55:23.084483 kernel: loop5: detected capacity change from 0 to 210664
May 13 23:55:22.100340 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 13 23:55:22.187672 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 13 23:55:22.189277 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:55:22.190723 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 13 23:55:22.192175 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 13 23:55:22.193994 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 13 23:55:22.200836 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 13 23:55:22.266405 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
May 13 23:55:22.346924 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 13 23:55:22.397590 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 23:55:22.399512 udevadm[1186]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
May 13 23:55:22.455365 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 23:55:22.468139 systemd-tmpfiles[1191]: ACLs are not supported, ignoring.
May 13 23:55:22.468157 systemd-tmpfiles[1191]: ACLs are not supported, ignoring.
May 13 23:55:22.477088 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:55:22.499302 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 13 23:55:22.571026 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 13 23:55:22.574205 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 13 23:55:23.065235 (sd-merge)[1197]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
May 13 23:55:23.066009 (sd-merge)[1197]: Merged extensions into '/usr'.
May 13 23:55:23.071169 systemd[1]: Reload requested from client PID 1177 ('systemd-sysext') (unit systemd-sysext.service)...
May 13 23:55:23.071185 systemd[1]: Reloading...
May 13 23:55:23.187097 zram_generator::config[1224]: No configuration found.
May 13 23:55:23.430190 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:55:23.503066 systemd[1]: Reloading finished in 431 ms.
May 13 23:55:23.552291 ldconfig[1172]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 13 23:55:23.558751 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 13 23:55:23.579684 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 13 23:55:23.605172 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 13 23:55:23.607114 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 13 23:55:23.616625 systemd[1]: Starting ensure-sysext.service...
May 13 23:55:23.618987 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 23:55:23.683685 systemd-tmpfiles[1267]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 13 23:55:23.684104 systemd-tmpfiles[1267]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 13 23:55:23.685402 systemd-tmpfiles[1267]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 13 23:55:23.685785 systemd-tmpfiles[1267]: ACLs are not supported, ignoring.
May 13 23:55:23.685890 systemd-tmpfiles[1267]: ACLs are not supported, ignoring.
May 13 23:55:23.691040 systemd-tmpfiles[1267]: Detected autofs mount point /boot during canonicalization of boot.
May 13 23:55:23.691118 systemd-tmpfiles[1267]: Skipping /boot
May 13 23:55:23.691983 systemd[1]: Reload requested from client PID 1266 ('systemctl') (unit ensure-sysext.service)...
May 13 23:55:23.692005 systemd[1]: Reloading...
May 13 23:55:23.720928 systemd-tmpfiles[1267]: Detected autofs mount point /boot during canonicalization of boot.
May 13 23:55:23.720946 systemd-tmpfiles[1267]: Skipping /boot
May 13 23:55:23.774089 zram_generator::config[1297]: No configuration found.
May 13 23:55:23.903218 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:55:23.980298 systemd[1]: Reloading finished in 287 ms.
May 13 23:55:24.014784 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:55:24.027550 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 13 23:55:24.053700 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 13 23:55:24.057382 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 13 23:55:24.068305 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 23:55:24.071665 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 13 23:55:24.102354 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 13 23:55:24.107917 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:55:24.108152 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:55:24.109699 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:55:24.112218 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:55:24.114781 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:55:24.115971 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:55:24.116184 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:55:24.125253 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 13 23:55:24.144487 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:55:24.146405 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:55:24.146699 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:55:24.148873 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
May 13 23:55:24.149145 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:55:24.151298 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:55:24.151563 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:55:24.157521 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:55:24.157737 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:55:24.159270 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:55:24.161947 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:55:24.175691 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:55:24.190455 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:55:24.190599 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:55:24.190738 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:55:24.192727 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:55:24.193102 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:55:24.195580 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:55:24.195845 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:55:24.198179 systemd[1]: Started systemd-userdbd.service - User Database Manager. 
May 13 23:55:24.200822 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:55:24.202183 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:55:24.222010 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 13 23:55:24.229471 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:55:24.229953 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:55:24.234312 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:55:24.239382 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 23:55:24.246675 augenrules[1382]: No rules May 13 23:55:24.253501 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:55:24.278907 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:55:24.280312 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:55:24.280495 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:55:24.280689 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 13 23:55:24.282850 systemd[1]: audit-rules.service: Deactivated successfully. May 13 23:55:24.283116 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 23:55:24.285091 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:55:24.294358 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. 
May 13 23:55:24.313948 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 23:55:24.314259 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 23:55:24.316481 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:55:24.316744 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:55:24.319040 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:55:24.319320 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:55:24.324503 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 13 23:55:24.326648 systemd[1]: Finished ensure-sysext.service. May 13 23:55:24.334823 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:55:24.334899 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:55:24.337381 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 13 23:55:24.363712 systemd-resolved[1339]: Positive Trust Anchors: May 13 23:55:24.363731 systemd-resolved[1339]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 23:55:24.363763 systemd-resolved[1339]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 23:55:24.367875 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 13 23:55:24.368436 systemd-resolved[1339]: Defaulting to hostname 'linux'. May 13 23:55:24.369632 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 23:55:24.370456 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 23:55:24.387306 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 23:55:24.437590 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 13 23:55:24.442401 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:55:24.485133 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 13 23:55:24.491792 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 13 23:55:24.493660 systemd[1]: Reached target time-set.target - System Time Set. May 13 23:55:24.502496 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 13 23:55:24.550027 systemd-udevd[1399]: Using default interface naming scheme 'v255'. 
May 13 23:55:24.569075 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:55:24.585873 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 23:55:24.629984 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 13 23:55:24.663618 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1419) May 13 23:55:24.687217 systemd-networkd[1422]: lo: Link UP May 13 23:55:24.687231 systemd-networkd[1422]: lo: Gained carrier May 13 23:55:24.689936 systemd-networkd[1422]: Enumeration completed May 13 23:55:24.690709 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:55:24.690714 systemd-networkd[1422]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:55:24.691488 systemd-networkd[1422]: eth0: Link UP May 13 23:55:24.691492 systemd-networkd[1422]: eth0: Gained carrier May 13 23:55:24.691505 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:55:24.694485 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 23:55:24.712094 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 May 13 23:55:24.715593 systemd-networkd[1422]: eth0: DHCPv4 address 10.0.0.64/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 23:55:24.717139 systemd-timesyncd[1396]: Network configuration changed, trying to establish connection. May 13 23:55:25.693954 systemd-resolved[1339]: Clock change detected. Flushing caches. May 13 23:55:25.694066 systemd-timesyncd[1396]: Contacted time server 10.0.0.1:123 (10.0.0.1). May 13 23:55:25.694118 systemd-timesyncd[1396]: Initial clock synchronization to Tue 2025-05-13 23:55:25.693898 UTC. 
May 13 23:55:25.695256 kernel: ACPI: button: Power Button [PWRF] May 13 23:55:25.702514 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 13 23:55:25.705261 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt May 13 23:55:25.706286 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) May 13 23:55:25.739060 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD May 13 23:55:25.739027 systemd[1]: Reached target network.target - Network. May 13 23:55:25.743228 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 13 23:55:25.744323 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 13 23:55:25.747578 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 13 23:55:25.750299 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 13 23:55:25.770576 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 13 23:55:25.775826 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 13 23:55:25.813520 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
May 13 23:55:25.865337 kernel: mousedev: PS/2 mouse device common for all mice May 13 23:55:25.879715 kernel: kvm_amd: TSC scaling supported May 13 23:55:25.879764 kernel: kvm_amd: Nested Virtualization enabled May 13 23:55:25.879778 kernel: kvm_amd: Nested Paging enabled May 13 23:55:25.879791 kernel: kvm_amd: LBR virtualization supported May 13 23:55:25.880713 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported May 13 23:55:25.880738 kernel: kvm_amd: Virtual GIF supported May 13 23:55:25.929217 kernel: EDAC MC: Ver: 3.0.0 May 13 23:55:25.973931 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 13 23:55:25.989429 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 13 23:55:26.034797 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:55:26.072420 lvm[1448]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 13 23:55:26.112185 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 13 23:55:26.117139 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 23:55:26.118427 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:55:26.119625 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 13 23:55:26.120899 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 13 23:55:26.122389 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 13 23:55:26.123598 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 13 23:55:26.124832 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
May 13 23:55:26.126054 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 13 23:55:26.126079 systemd[1]: Reached target paths.target - Path Units. May 13 23:55:26.127024 systemd[1]: Reached target timers.target - Timer Units. May 13 23:55:26.128901 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 13 23:55:26.132086 systemd[1]: Starting docker.socket - Docker Socket for the API... May 13 23:55:26.135816 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 13 23:55:26.196250 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 13 23:55:26.197663 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 13 23:55:26.201504 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 13 23:55:26.203076 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 13 23:55:26.205725 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... May 13 23:55:26.207585 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 13 23:55:26.208916 systemd[1]: Reached target sockets.target - Socket Units. May 13 23:55:26.210049 systemd[1]: Reached target basic.target - Basic System. May 13 23:55:26.211185 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 13 23:55:26.211240 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 13 23:55:26.212410 systemd[1]: Starting containerd.service - containerd container runtime... May 13 23:55:26.214723 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 13 23:55:26.216513 lvm[1453]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. 
May 13 23:55:26.258169 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 13 23:55:26.261319 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 13 23:55:26.262680 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 13 23:55:26.264668 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 13 23:55:26.267838 jq[1456]: false May 13 23:55:26.268544 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 13 23:55:26.271415 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 13 23:55:26.275420 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 13 23:55:26.288428 extend-filesystems[1457]: Found loop3 May 13 23:55:26.288428 extend-filesystems[1457]: Found loop4 May 13 23:55:26.288428 extend-filesystems[1457]: Found loop5 May 13 23:55:26.288428 extend-filesystems[1457]: Found sr0 May 13 23:55:26.288428 extend-filesystems[1457]: Found vda May 13 23:55:26.288428 extend-filesystems[1457]: Found vda1 May 13 23:55:26.288428 extend-filesystems[1457]: Found vda2 May 13 23:55:26.288428 extend-filesystems[1457]: Found vda3 May 13 23:55:26.288428 extend-filesystems[1457]: Found usr May 13 23:55:26.288428 extend-filesystems[1457]: Found vda4 May 13 23:55:26.288428 extend-filesystems[1457]: Found vda6 May 13 23:55:26.288348 systemd[1]: Starting systemd-logind.service - User Login Management... 
May 13 23:55:26.300159 extend-filesystems[1457]: Found vda7 May 13 23:55:26.300159 extend-filesystems[1457]: Found vda9 May 13 23:55:26.300159 extend-filesystems[1457]: Checking size of /dev/vda9 May 13 23:55:26.291880 dbus-daemon[1455]: [system] SELinux support is enabled May 13 23:55:26.293341 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 13 23:55:26.305452 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 13 23:55:26.306383 systemd[1]: Starting update-engine.service - Update Engine... May 13 23:55:26.310766 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 13 23:55:26.315269 extend-filesystems[1457]: Resized partition /dev/vda9 May 13 23:55:26.316869 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 13 23:55:26.320909 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 13 23:55:26.322394 extend-filesystems[1478]: resize2fs 1.47.2 (1-Jan-2025) May 13 23:55:26.323660 jq[1475]: true May 13 23:55:26.323810 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 13 23:55:26.324050 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 13 23:55:26.324410 systemd[1]: motdgen.service: Deactivated successfully. May 13 23:55:26.324728 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 13 23:55:26.327684 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 13 23:55:26.327958 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
May 13 23:55:26.340868 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1414) May 13 23:55:26.345368 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 13 23:55:26.345405 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 13 23:55:26.347033 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 13 23:55:26.347064 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 13 23:55:26.372248 jq[1481]: true May 13 23:55:26.388253 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks May 13 23:55:26.395255 update_engine[1474]: I20250513 23:55:26.393930 1474 main.cc:92] Flatcar Update Engine starting May 13 23:55:26.398939 update_engine[1474]: I20250513 23:55:26.396399 1474 update_check_scheduler.cc:74] Next update check in 4m18s May 13 23:55:26.396345 systemd[1]: Started update-engine.service - Update Engine. May 13 23:55:26.398686 (ntainerd)[1490]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 13 23:55:26.400205 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 13 23:55:26.489217 systemd-logind[1468]: Watching system buttons on /dev/input/event1 (Power Button) May 13 23:55:26.489259 systemd-logind[1468]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 13 23:55:26.495281 systemd-logind[1468]: New seat seat0. May 13 23:55:26.504677 systemd[1]: Started systemd-logind.service - User Login Management. 
May 13 23:55:26.541604 tar[1480]: linux-amd64/helm May 13 23:55:26.561229 sshd_keygen[1473]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 13 23:55:26.611821 locksmithd[1499]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 13 23:55:26.630571 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 13 23:55:26.644044 systemd[1]: Starting issuegen.service - Generate /run/issue... May 13 23:55:26.690605 systemd[1]: issuegen.service: Deactivated successfully. May 13 23:55:26.690937 systemd[1]: Finished issuegen.service - Generate /run/issue. May 13 23:55:26.694455 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 13 23:55:26.758256 kernel: EXT4-fs (vda9): resized filesystem to 1864699 May 13 23:55:26.901508 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 13 23:55:26.927895 systemd[1]: Started getty@tty1.service - Getty on tty1. May 13 23:55:26.966134 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 13 23:55:26.967799 systemd[1]: Reached target getty.target - Login Prompts. May 13 23:55:27.120365 systemd-networkd[1422]: eth0: Gained IPv6LL May 13 23:55:27.123895 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 13 23:55:27.125868 systemd[1]: Reached target network-online.target - Network is Online. May 13 23:55:27.165005 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... May 13 23:55:27.216467 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:55:27.262722 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 13 23:55:27.294744 systemd[1]: coreos-metadata.service: Deactivated successfully. May 13 23:55:27.295076 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. 
May 13 23:55:27.297737 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 13 23:55:27.465186 extend-filesystems[1478]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 13 23:55:27.465186 extend-filesystems[1478]: old_desc_blocks = 1, new_desc_blocks = 1 May 13 23:55:27.465186 extend-filesystems[1478]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. May 13 23:55:27.470411 extend-filesystems[1457]: Resized filesystem in /dev/vda9 May 13 23:55:27.470570 systemd[1]: extend-filesystems.service: Deactivated successfully. May 13 23:55:27.471036 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 13 23:55:27.474167 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 13 23:55:27.512287 containerd[1490]: time="2025-05-13T23:55:27Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 13 23:55:27.514232 containerd[1490]: time="2025-05-13T23:55:27.513466530Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 13 23:55:27.526135 containerd[1490]: time="2025-05-13T23:55:27.526079408Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.208µs" May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526291717Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526319769Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526518652Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations 
type=io.containerd.warning.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526533650Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526565721Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526635572Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526646222Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526933901Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526946224Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526956614Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.526964779Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 13 23:55:27.527233 containerd[1490]: time="2025-05-13T23:55:27.527059647Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 13 23:55:27.527615 containerd[1490]: 
time="2025-05-13T23:55:27.527596534Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 23:55:27.527694 containerd[1490]: time="2025-05-13T23:55:27.527676624Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 23:55:27.527740 containerd[1490]: time="2025-05-13T23:55:27.527728241Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 13 23:55:27.527806 containerd[1490]: time="2025-05-13T23:55:27.527793924Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 13 23:55:27.528092 containerd[1490]: time="2025-05-13T23:55:27.528076354Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 13 23:55:27.528222 containerd[1490]: time="2025-05-13T23:55:27.528207460Z" level=info msg="metadata content store policy set" policy=shared May 13 23:55:27.739139 tar[1480]: linux-amd64/LICENSE May 13 23:55:27.739139 tar[1480]: linux-amd64/README.md May 13 23:55:27.805542 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 13 23:55:28.170722 bash[1508]: Updated "/home/core/.ssh/authorized_keys" May 13 23:55:28.172673 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 13 23:55:28.222183 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. 
May 13 23:55:28.269939 containerd[1490]: time="2025-05-13T23:55:28.269858355Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 13 23:55:28.270088 containerd[1490]: time="2025-05-13T23:55:28.269977268Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 13 23:55:28.270088 containerd[1490]: time="2025-05-13T23:55:28.270000892Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 13 23:55:28.270088 containerd[1490]: time="2025-05-13T23:55:28.270017273Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 13 23:55:28.270088 containerd[1490]: time="2025-05-13T23:55:28.270034225Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 13 23:55:28.270088 containerd[1490]: time="2025-05-13T23:55:28.270049313Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 13 23:55:28.270088 containerd[1490]: time="2025-05-13T23:55:28.270064822Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 13 23:55:28.270088 containerd[1490]: time="2025-05-13T23:55:28.270080392Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 13 23:55:28.270316 containerd[1490]: time="2025-05-13T23:55:28.270095239Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 13 23:55:28.270316 containerd[1490]: time="2025-05-13T23:55:28.270109125Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 13 23:55:28.270316 containerd[1490]: time="2025-05-13T23:55:28.270123673Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 13 23:55:28.270316 containerd[1490]: time="2025-05-13T23:55:28.270142749Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 13 23:55:28.270437 containerd[1490]: time="2025-05-13T23:55:28.270404019Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 13 23:55:28.270466 containerd[1490]: time="2025-05-13T23:55:28.270442651Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 13 23:55:28.270466 containerd[1490]: time="2025-05-13T23:55:28.270460805Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 13 23:55:28.270516 containerd[1490]: time="2025-05-13T23:55:28.270476555Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 13 23:55:28.270516 containerd[1490]: time="2025-05-13T23:55:28.270492715Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 13 23:55:28.270516 containerd[1490]: time="2025-05-13T23:55:28.270508174Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 13 23:55:28.270601 containerd[1490]: time="2025-05-13T23:55:28.270532319Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 13 23:55:28.270601 containerd[1490]: time="2025-05-13T23:55:28.270547738Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 13 23:55:28.270601 containerd[1490]: time="2025-05-13T23:55:28.270562967Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 13 23:55:28.270601 containerd[1490]: time="2025-05-13T23:55:28.270577775Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 13 23:55:28.270601 containerd[1490]: time="2025-05-13T23:55:28.270592131Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 13 23:55:28.270802 containerd[1490]: time="2025-05-13T23:55:28.270669757Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 13 23:55:28.270802 containerd[1490]: time="2025-05-13T23:55:28.270687601Z" level=info msg="Start snapshots syncer"
May 13 23:55:28.270802 containerd[1490]: time="2025-05-13T23:55:28.270721334Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 13 23:55:28.271173 containerd[1490]: time="2025-05-13T23:55:28.271022569Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
May 13 23:55:28.271173 containerd[1490]: time="2025-05-13T23:55:28.271090586Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
May 13 23:55:28.271466 containerd[1490]: time="2025-05-13T23:55:28.271216523Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
May 13 23:55:28.271466 containerd[1490]: time="2025-05-13T23:55:28.271351836Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
May 13 23:55:28.271466 containerd[1490]: time="2025-05-13T23:55:28.271396210Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
May 13 23:55:28.271466 containerd[1490]: time="2025-05-13T23:55:28.271412630Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
May 13 23:55:28.271466 containerd[1490]: time="2025-05-13T23:55:28.271427729Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
May 13 23:55:28.271466 containerd[1490]: time="2025-05-13T23:55:28.271465420Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
May 13 23:55:28.271622 containerd[1490]: time="2025-05-13T23:55:28.271480818Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
May 13 23:55:28.271622 containerd[1490]: time="2025-05-13T23:55:28.271494143Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
May 13 23:55:28.271622 containerd[1490]: time="2025-05-13T23:55:28.271526344Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
May 13 23:55:28.271622 containerd[1490]: time="2025-05-13T23:55:28.271571649Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
May 13 23:55:28.271622 containerd[1490]: time="2025-05-13T23:55:28.271583020Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271624989Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271640939Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271650106Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271659904Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271668180Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271677978Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271689440Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271709878Z" level=info msg="runtime interface created"
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271715879Z" level=info msg="created NRI interface"
May 13 23:55:28.271718 containerd[1490]: time="2025-05-13T23:55:28.271724325Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
May 13 23:55:28.271930 containerd[1490]: time="2025-05-13T23:55:28.271735717Z" level=info msg="Connect containerd service"
May 13 23:55:28.271930 containerd[1490]: time="2025-05-13T23:55:28.271759651Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
May 13 23:55:28.272607 containerd[1490]: time="2025-05-13T23:55:28.272575251Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 13 23:55:28.484676 containerd[1490]: time="2025-05-13T23:55:28.484495343Z" level=info msg="Start subscribing containerd event"
May 13 23:55:28.484676 containerd[1490]: time="2025-05-13T23:55:28.484591022Z" level=info msg="Start recovering state"
May 13 23:55:28.484836 containerd[1490]: time="2025-05-13T23:55:28.484739982Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
May 13 23:55:28.484836 containerd[1490]: time="2025-05-13T23:55:28.484747035Z" level=info msg="Start event monitor"
May 13 23:55:28.484836 containerd[1490]: time="2025-05-13T23:55:28.484768034Z" level=info msg="Start cni network conf syncer for default"
May 13 23:55:28.484836 containerd[1490]: time="2025-05-13T23:55:28.484775568Z" level=info msg="Start streaming server"
May 13 23:55:28.484836 containerd[1490]: time="2025-05-13T23:55:28.484788302Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
May 13 23:55:28.484836 containerd[1490]: time="2025-05-13T23:55:28.484797660Z" level=info msg="runtime interface starting up..."
May 13 23:55:28.484836 containerd[1490]: time="2025-05-13T23:55:28.484804142Z" level=info msg="starting plugins..."
May 13 23:55:28.484836 containerd[1490]: time="2025-05-13T23:55:28.484819962Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
May 13 23:55:28.484991 containerd[1490]: time="2025-05-13T23:55:28.484971676Z" level=info msg=serving... address=/run/containerd/containerd.sock
May 13 23:55:28.485101 containerd[1490]: time="2025-05-13T23:55:28.485038522Z" level=info msg="containerd successfully booted in 0.979982s"
May 13 23:55:28.485224 systemd[1]: Started containerd.service - containerd container runtime.
May 13 23:55:29.195804 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:55:29.222319 systemd[1]: Reached target multi-user.target - Multi-User System.
May 13 23:55:29.222732 (kubelet)[1582]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:55:29.224157 systemd[1]: Startup finished in 1.030s (kernel) + 10.668s (initrd) + 7.868s (userspace) = 19.567s.
May 13 23:55:29.852176 kubelet[1582]: E0513 23:55:29.852103 1582 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:55:29.859566 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:55:29.859775 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:55:29.860158 systemd[1]: kubelet.service: Consumed 1.556s CPU time, 243.1M memory peak.
May 13 23:55:36.037611 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 13 23:55:36.039275 systemd[1]: Started sshd@0-10.0.0.64:22-10.0.0.1:60558.service - OpenSSH per-connection server daemon (10.0.0.1:60558).
May 13 23:55:36.113929 sshd[1596]: Accepted publickey for core from 10.0.0.1 port 60558 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:55:36.116547 sshd-session[1596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:36.126701 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
May 13 23:55:36.128065 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
May 13 23:55:36.129769 systemd-logind[1468]: New session 1 of user core.
May 13 23:55:36.150712 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
May 13 23:55:36.153220 systemd[1]: Starting user@500.service - User Manager for UID 500...
May 13 23:55:36.174583 (systemd)[1600]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
May 13 23:55:36.177158 systemd-logind[1468]: New session c1 of user core.
May 13 23:55:36.330406 systemd[1600]: Queued start job for default target default.target.
May 13 23:55:36.340703 systemd[1600]: Created slice app.slice - User Application Slice.
May 13 23:55:36.340736 systemd[1600]: Reached target paths.target - Paths.
May 13 23:55:36.340794 systemd[1600]: Reached target timers.target - Timers.
May 13 23:55:36.342413 systemd[1600]: Starting dbus.socket - D-Bus User Message Bus Socket...
May 13 23:55:36.354406 systemd[1600]: Listening on dbus.socket - D-Bus User Message Bus Socket.
May 13 23:55:36.354564 systemd[1600]: Reached target sockets.target - Sockets.
May 13 23:55:36.354617 systemd[1600]: Reached target basic.target - Basic System.
May 13 23:55:36.354669 systemd[1600]: Reached target default.target - Main User Target.
May 13 23:55:36.354710 systemd[1600]: Startup finished in 169ms.
May 13 23:55:36.355074 systemd[1]: Started user@500.service - User Manager for UID 500.
May 13 23:55:36.356973 systemd[1]: Started session-1.scope - Session 1 of User core.
May 13 23:55:36.420593 systemd[1]: Started sshd@1-10.0.0.64:22-10.0.0.1:60562.service - OpenSSH per-connection server daemon (10.0.0.1:60562).
May 13 23:55:36.471603 sshd[1611]: Accepted publickey for core from 10.0.0.1 port 60562 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:55:36.473368 sshd-session[1611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:36.478325 systemd-logind[1468]: New session 2 of user core.
May 13 23:55:36.488369 systemd[1]: Started session-2.scope - Session 2 of User core.
May 13 23:55:36.543179 sshd[1613]: Connection closed by 10.0.0.1 port 60562
May 13 23:55:36.543601 sshd-session[1611]: pam_unix(sshd:session): session closed for user core
May 13 23:55:36.559143 systemd[1]: sshd@1-10.0.0.64:22-10.0.0.1:60562.service: Deactivated successfully.
May 13 23:55:36.561259 systemd[1]: session-2.scope: Deactivated successfully.
May 13 23:55:36.562863 systemd-logind[1468]: Session 2 logged out. Waiting for processes to exit.
May 13 23:55:36.564369 systemd[1]: Started sshd@2-10.0.0.64:22-10.0.0.1:60570.service - OpenSSH per-connection server daemon (10.0.0.1:60570).
May 13 23:55:36.565263 systemd-logind[1468]: Removed session 2.
May 13 23:55:36.607433 sshd[1618]: Accepted publickey for core from 10.0.0.1 port 60570 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:55:36.608866 sshd-session[1618]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:36.612829 systemd-logind[1468]: New session 3 of user core.
May 13 23:55:36.622427 systemd[1]: Started session-3.scope - Session 3 of User core.
May 13 23:55:36.674086 sshd[1621]: Connection closed by 10.0.0.1 port 60570
May 13 23:55:36.674521 sshd-session[1618]: pam_unix(sshd:session): session closed for user core
May 13 23:55:36.690972 systemd[1]: sshd@2-10.0.0.64:22-10.0.0.1:60570.service: Deactivated successfully.
May 13 23:55:36.692742 systemd[1]: session-3.scope: Deactivated successfully.
May 13 23:55:36.694342 systemd-logind[1468]: Session 3 logged out. Waiting for processes to exit.
May 13 23:55:36.696071 systemd[1]: Started sshd@3-10.0.0.64:22-10.0.0.1:60572.service - OpenSSH per-connection server daemon (10.0.0.1:60572).
May 13 23:55:36.697021 systemd-logind[1468]: Removed session 3.
May 13 23:55:36.743793 sshd[1626]: Accepted publickey for core from 10.0.0.1 port 60572 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:55:36.745576 sshd-session[1626]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:36.750520 systemd-logind[1468]: New session 4 of user core.
May 13 23:55:36.765328 systemd[1]: Started session-4.scope - Session 4 of User core.
May 13 23:55:36.819425 sshd[1629]: Connection closed by 10.0.0.1 port 60572
May 13 23:55:36.819854 sshd-session[1626]: pam_unix(sshd:session): session closed for user core
May 13 23:55:36.832916 systemd[1]: sshd@3-10.0.0.64:22-10.0.0.1:60572.service: Deactivated successfully.
May 13 23:55:36.834795 systemd[1]: session-4.scope: Deactivated successfully.
May 13 23:55:36.836365 systemd-logind[1468]: Session 4 logged out. Waiting for processes to exit.
May 13 23:55:36.837661 systemd[1]: Started sshd@4-10.0.0.64:22-10.0.0.1:60574.service - OpenSSH per-connection server daemon (10.0.0.1:60574).
May 13 23:55:36.838624 systemd-logind[1468]: Removed session 4.
May 13 23:55:36.892484 sshd[1634]: Accepted publickey for core from 10.0.0.1 port 60574 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:55:36.894559 sshd-session[1634]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:36.899418 systemd-logind[1468]: New session 5 of user core.
May 13 23:55:36.909364 systemd[1]: Started session-5.scope - Session 5 of User core.
May 13 23:55:36.969657 sudo[1638]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 13 23:55:36.970070 sudo[1638]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:55:36.989514 sudo[1638]: pam_unix(sudo:session): session closed for user root
May 13 23:55:36.991431 sshd[1637]: Connection closed by 10.0.0.1 port 60574
May 13 23:55:36.991918 sshd-session[1634]: pam_unix(sshd:session): session closed for user core
May 13 23:55:37.005498 systemd[1]: sshd@4-10.0.0.64:22-10.0.0.1:60574.service: Deactivated successfully.
May 13 23:55:37.007821 systemd[1]: session-5.scope: Deactivated successfully.
May 13 23:55:37.009567 systemd-logind[1468]: Session 5 logged out. Waiting for processes to exit.
May 13 23:55:37.011155 systemd[1]: Started sshd@5-10.0.0.64:22-10.0.0.1:60586.service - OpenSSH per-connection server daemon (10.0.0.1:60586).
May 13 23:55:37.012325 systemd-logind[1468]: Removed session 5.
May 13 23:55:37.066672 sshd[1643]: Accepted publickey for core from 10.0.0.1 port 60586 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:55:37.068422 sshd-session[1643]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:37.074226 systemd-logind[1468]: New session 6 of user core.
May 13 23:55:37.083610 systemd[1]: Started session-6.scope - Session 6 of User core.
May 13 23:55:37.140343 sudo[1648]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 13 23:55:37.140784 sudo[1648]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:55:37.145828 sudo[1648]: pam_unix(sudo:session): session closed for user root
May 13 23:55:37.153140 sudo[1647]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 13 23:55:37.153509 sudo[1647]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:55:37.164671 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 13 23:55:37.214635 augenrules[1670]: No rules
May 13 23:55:37.216367 systemd[1]: audit-rules.service: Deactivated successfully.
May 13 23:55:37.216660 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 13 23:55:37.217856 sudo[1647]: pam_unix(sudo:session): session closed for user root
May 13 23:55:37.219474 sshd[1646]: Connection closed by 10.0.0.1 port 60586
May 13 23:55:37.219745 sshd-session[1643]: pam_unix(sshd:session): session closed for user core
May 13 23:55:37.228342 systemd[1]: sshd@5-10.0.0.64:22-10.0.0.1:60586.service: Deactivated successfully.
May 13 23:55:37.230548 systemd[1]: session-6.scope: Deactivated successfully.
May 13 23:55:37.232427 systemd-logind[1468]: Session 6 logged out. Waiting for processes to exit.
May 13 23:55:37.233773 systemd[1]: Started sshd@6-10.0.0.64:22-10.0.0.1:60596.service - OpenSSH per-connection server daemon (10.0.0.1:60596).
May 13 23:55:37.234710 systemd-logind[1468]: Removed session 6.
May 13 23:55:37.280894 sshd[1678]: Accepted publickey for core from 10.0.0.1 port 60596 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:55:37.282464 sshd-session[1678]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:37.287424 systemd-logind[1468]: New session 7 of user core.
May 13 23:55:37.302386 systemd[1]: Started session-7.scope - Session 7 of User core.
May 13 23:55:37.357227 sudo[1682]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 13 23:55:37.357652 sudo[1682]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 13 23:55:38.080106 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 13 23:55:38.093693 (dockerd)[1703]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 13 23:55:38.770728 dockerd[1703]: time="2025-05-13T23:55:38.770649398Z" level=info msg="Starting up"
May 13 23:55:38.772561 dockerd[1703]: time="2025-05-13T23:55:38.772535556Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 13 23:55:39.514280 dockerd[1703]: time="2025-05-13T23:55:39.514224682Z" level=info msg="Loading containers: start."
May 13 23:55:40.024247 kernel: Initializing XFRM netlink socket
May 13 23:55:40.035309 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 13 23:55:40.037099 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:55:40.109938 systemd-networkd[1422]: docker0: Link UP
May 13 23:55:40.270298 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:55:40.274861 (kubelet)[1856]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:55:40.375835 kubelet[1856]: E0513 23:55:40.375725 1856 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:55:40.382570 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:55:40.382789 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:55:40.383242 systemd[1]: kubelet.service: Consumed 232ms CPU time, 98.8M memory peak.
May 13 23:55:41.516365 dockerd[1703]: time="2025-05-13T23:55:41.516299312Z" level=info msg="Loading containers: done."
May 13 23:55:41.531691 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3237346518-merged.mount: Deactivated successfully.
May 13 23:55:41.722690 dockerd[1703]: time="2025-05-13T23:55:41.722609010Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 13 23:55:41.722892 dockerd[1703]: time="2025-05-13T23:55:41.722727322Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1
May 13 23:55:41.722892 dockerd[1703]: time="2025-05-13T23:55:41.722874538Z" level=info msg="Daemon has completed initialization"
May 13 23:55:43.491661 dockerd[1703]: time="2025-05-13T23:55:43.491538546Z" level=info msg="API listen on /run/docker.sock"
May 13 23:55:43.491830 systemd[1]: Started docker.service - Docker Application Container Engine.
May 13 23:55:44.461340 containerd[1490]: time="2025-05-13T23:55:44.461251461Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\""
May 13 23:55:49.967539 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2523799173.mount: Deactivated successfully.
May 13 23:55:50.561992 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 13 23:55:50.563836 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:55:50.773626 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:55:50.790567 (kubelet)[1954]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:55:51.161877 kubelet[1954]: E0513 23:55:51.161810 1954 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:55:51.166419 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:55:51.166621 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:55:51.167053 systemd[1]: kubelet.service: Consumed 288ms CPU time, 97.9M memory peak.
May 13 23:56:01.312077 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 13 23:56:01.313944 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:56:01.486740 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:56:01.490988 (kubelet)[2021]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:56:01.535281 kubelet[2021]: E0513 23:56:01.535217 2021 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:56:01.539521 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:56:01.539722 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:56:01.540064 systemd[1]: kubelet.service: Consumed 209ms CPU time, 94.4M memory peak.
May 13 23:56:02.824224 containerd[1490]: time="2025-05-13T23:56:02.824129040Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:02.923077 containerd[1490]: time="2025-05-13T23:56:02.922964456Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.12: active requests=0, bytes read=32674873"
May 13 23:56:03.036151 containerd[1490]: time="2025-05-13T23:56:03.036082995Z" level=info msg="ImageCreate event name:\"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:03.081904 containerd[1490]: time="2025-05-13T23:56:03.081738183Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:03.082925 containerd[1490]: time="2025-05-13T23:56:03.082885980Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.12\" with image id \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\", size \"32671673\" in 18.621582902s"
May 13 23:56:03.082985 containerd[1490]: time="2025-05-13T23:56:03.082925615Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\" returns image reference \"sha256:e113c59aa22f0650435e2a3ed64aadb01e87f3d2835aa3825fe078cd39699bfb\""
May 13 23:56:03.103144 containerd[1490]: time="2025-05-13T23:56:03.103016641Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\""
May 13 23:56:09.834059 containerd[1490]: time="2025-05-13T23:56:09.833954640Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:09.878101 containerd[1490]: time="2025-05-13T23:56:09.877993663Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.12: active requests=0, bytes read=29617534"
May 13 23:56:09.928524 containerd[1490]: time="2025-05-13T23:56:09.928465433Z" level=info msg="ImageCreate event name:\"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:09.994761 containerd[1490]: time="2025-05-13T23:56:09.994682688Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:09.995650 containerd[1490]: time="2025-05-13T23:56:09.995619844Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.12\" with image id \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\", size \"31105907\" in 6.892557586s"
May 13 23:56:09.995710 containerd[1490]: time="2025-05-13T23:56:09.995654129Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\" returns image reference \"sha256:70742b7b7d90a618a1fa06d89248dbe2c291c19d7f75f4ad60a69d0454dbbac8\""
May 13 23:56:10.025318 containerd[1490]: time="2025-05-13T23:56:10.025281994Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\""
May 13 23:56:11.562038 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
May 13 23:56:11.563743 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:56:11.688519 update_engine[1474]: I20250513 23:56:11.688349 1474 update_attempter.cc:509] Updating boot flags...
May 13 23:56:12.001136 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:56:12.006161 (kubelet)[2063]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:56:12.084649 kubelet[2063]: E0513 23:56:12.084589 2063 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:56:12.089452 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:56:12.089656 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:56:12.090017 systemd[1]: kubelet.service: Consumed 273ms CPU time, 97.6M memory peak.
May 13 23:56:12.412253 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2076)
May 13 23:56:12.614674 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2079)
May 13 23:56:12.650241 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2079)
May 13 23:56:17.132232 containerd[1490]: time="2025-05-13T23:56:17.132114814Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:17.189823 containerd[1490]: time="2025-05-13T23:56:17.189696395Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.12: active requests=0, bytes read=17903682"
May 13 23:56:17.287038 containerd[1490]: time="2025-05-13T23:56:17.286972385Z" level=info msg="ImageCreate event name:\"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:17.408458 containerd[1490]: time="2025-05-13T23:56:17.408309704Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:17.409593 containerd[1490]: time="2025-05-13T23:56:17.409530608Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.12\" with image id \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\", size \"19392073\" in 7.384209871s"
May 13 23:56:17.409593 containerd[1490]: time="2025-05-13T23:56:17.409571656Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\" returns image reference \"sha256:c0b91cfea9f9a1c09fc5d056f3a015e52604fd0d63671ff5bf31e642402ef05d\""
May 13 23:56:17.449780 containerd[1490]: time="2025-05-13T23:56:17.449743976Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\""
May 13 23:56:22.312046 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
May 13 23:56:22.314034 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:56:22.372858 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount392644431.mount: Deactivated successfully.
May 13 23:56:22.750894 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:56:22.762722 (kubelet)[2110]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:56:22.847253 kubelet[2110]: E0513 23:56:22.847174 2110 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:56:22.851455 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:56:22.851656 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:56:22.852013 systemd[1]: kubelet.service: Consumed 228ms CPU time, 96M memory peak.
May 13 23:56:24.677923 containerd[1490]: time="2025-05-13T23:56:24.677851007Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:24.705903 containerd[1490]: time="2025-05-13T23:56:24.705804653Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.12: active requests=0, bytes read=29185817"
May 13 23:56:24.730832 containerd[1490]: time="2025-05-13T23:56:24.730765170Z" level=info msg="ImageCreate event name:\"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:24.763632 containerd[1490]: time="2025-05-13T23:56:24.763560953Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:24.764449 containerd[1490]: time="2025-05-13T23:56:24.764387920Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.12\" with image id \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\", repo tag \"registry.k8s.io/kube-proxy:v1.30.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\", size \"29184836\" in 7.314603969s"
May 13 23:56:24.764449 containerd[1490]: time="2025-05-13T23:56:24.764426514Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\" returns image reference \"sha256:c9356fea5d151501039907c3ba870272461396117eabc74063632616f4e31b2b\""
May 13 23:56:24.784373 containerd[1490]: time="2025-05-13T23:56:24.784317654Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
May 13 23:56:27.707250 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount67069560.mount: Deactivated successfully.
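The "stop pulling" and "Pulled image ... in Ns" lines let you estimate effective registry throughput. A sketch using the kube-proxy figures from the log above (the rate itself is derived, not logged):

```python
# Figures taken from the kube-proxy pull in the log above.
bytes_read = 29_185_817   # "bytes read=29185817"
duration_s = 7.314603969  # "in 7.314603969s"

# Effective pull rate in MiB/s (compressed bytes over the wire).
rate_mib_s = bytes_read / duration_s / (1024 * 1024)
print(f"{rate_mib_s:.2f} MiB/s")
```

The same arithmetic applied to the other pulls in this log shows why coredns (15.08 s for ~18 MB) was the slowest transfer of the batch.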
May 13 23:56:33.062253 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
May 13 23:56:33.064499 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:56:33.240320 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:56:33.245690 (kubelet)[2149]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:56:33.285992 kubelet[2149]: E0513 23:56:33.285867 2149 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:56:33.290169 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:56:33.290395 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:56:33.290797 systemd[1]: kubelet.service: Consumed 200ms CPU time, 98M memory peak.
May 13 23:56:39.758735 containerd[1490]: time="2025-05-13T23:56:39.758650861Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:39.808927 containerd[1490]: time="2025-05-13T23:56:39.808833923Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=18185761"
May 13 23:56:39.852368 containerd[1490]: time="2025-05-13T23:56:39.852298295Z" level=info msg="ImageCreate event name:\"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:39.860946 containerd[1490]: time="2025-05-13T23:56:39.860883753Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:39.862018 containerd[1490]: time="2025-05-13T23:56:39.861975484Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"18182961\" in 15.077606773s"
May 13 23:56:39.862155 containerd[1490]: time="2025-05-13T23:56:39.862018305Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\""
May 13 23:56:39.884638 containerd[1490]: time="2025-05-13T23:56:39.884588495Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\""
May 13 23:56:40.936992 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2455496928.mount: Deactivated successfully.
May 13 23:56:41.255209 containerd[1490]: time="2025-05-13T23:56:41.255011954Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:41.258250 containerd[1490]: time="2025-05-13T23:56:41.258180395Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=322290"
May 13 23:56:41.260055 containerd[1490]: time="2025-05-13T23:56:41.259998600Z" level=info msg="ImageCreate event name:\"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:41.263919 containerd[1490]: time="2025-05-13T23:56:41.263873297Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:41.264736 containerd[1490]: time="2025-05-13T23:56:41.264671647Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"321520\" in 1.380026896s"
May 13 23:56:41.264736 containerd[1490]: time="2025-05-13T23:56:41.264720308Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\""
May 13 23:56:41.285178 containerd[1490]: time="2025-05-13T23:56:41.285122147Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\""
May 13 23:56:42.013092 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1743567828.mount: Deactivated successfully.
May 13 23:56:43.311960 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7.
May 13 23:56:43.314233 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:56:43.494230 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:56:43.503526 (kubelet)[2268]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:56:43.546273 kubelet[2268]: E0513 23:56:43.546148 2268 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:56:43.550854 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:56:43.551033 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:56:43.551516 systemd[1]: kubelet.service: Consumed 204ms CPU time, 97.9M memory peak.
May 13 23:56:45.991374 containerd[1490]: time="2025-05-13T23:56:45.991300515Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:46.089556 containerd[1490]: time="2025-05-13T23:56:46.089442492Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=57238571"
May 13 23:56:46.161179 containerd[1490]: time="2025-05-13T23:56:46.161097806Z" level=info msg="ImageCreate event name:\"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:46.348707 containerd[1490]: time="2025-05-13T23:56:46.348598828Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:56:46.349942 containerd[1490]: time="2025-05-13T23:56:46.349908738Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"57236178\" in 5.064737027s"
May 13 23:56:46.350011 containerd[1490]: time="2025-05-13T23:56:46.349948042Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\""
May 13 23:56:49.239620 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:56:49.239799 systemd[1]: kubelet.service: Consumed 204ms CPU time, 97.9M memory peak.
May 13 23:56:49.242144 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:56:49.269262 systemd[1]: Reload requested from client PID 2376 ('systemctl') (unit session-7.scope)...
May 13 23:56:49.269280 systemd[1]: Reloading...
May 13 23:56:49.360262 zram_generator::config[2419]: No configuration found.
May 13 23:56:50.394606 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:56:50.535440 systemd[1]: Reloading finished in 1265 ms.
May 13 23:56:50.604416 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:56:50.607107 systemd[1]: kubelet.service: Deactivated successfully.
May 13 23:56:50.607477 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:56:50.607523 systemd[1]: kubelet.service: Consumed 158ms CPU time, 83.7M memory peak.
May 13 23:56:50.609171 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:56:50.836595 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:56:50.869795 (kubelet)[2469]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 13 23:56:50.912129 kubelet[2469]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:56:50.912129 kubelet[2469]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
May 13 23:56:50.912129 kubelet[2469]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
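The docker.socket warning during the reload points at line 6 of the unit file referencing the legacy /var/run/ directory. An illustrative fragment of the fix (the actual unit contents on this host are not shown in the log, so this is an assumption about its shape):

```ini
# /usr/lib/systemd/system/docker.socket -- illustrative fragment only
[Socket]
# Was ListenStream=/var/run/docker.sock; systemd rewrites legacy /var/run/
# paths to /run/ at load time and logs the warning seen above.
ListenStream=/run/docker.sock
```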
May 13 23:56:50.912611 kubelet[2469]: I0513 23:56:50.912180 2469 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 13 23:56:51.109373 kubelet[2469]: I0513 23:56:51.109243 2469 server.go:484] "Kubelet version" kubeletVersion="v1.30.1"
May 13 23:56:51.109373 kubelet[2469]: I0513 23:56:51.109274 2469 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 13 23:56:51.109654 kubelet[2469]: I0513 23:56:51.109517 2469 server.go:927] "Client rotation is on, will bootstrap in background"
May 13 23:56:51.162414 kubelet[2469]: I0513 23:56:51.162364 2469 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 13 23:56:51.165308 kubelet[2469]: E0513 23:56:51.165265 2469 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.64:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.182599 kubelet[2469]: I0513 23:56:51.182568 2469 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 13 23:56:51.182857 kubelet[2469]: I0513 23:56:51.182815 2469 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 13 23:56:51.183016 kubelet[2469]: I0513 23:56:51.182848 2469 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null}
May 13 23:56:51.184595 kubelet[2469]: I0513 23:56:51.184564 2469 topology_manager.go:138] "Creating topology manager with none policy"
May 13 23:56:51.184595 kubelet[2469]: I0513 23:56:51.184584 2469 container_manager_linux.go:301] "Creating device plugin manager"
May 13 23:56:51.184741 kubelet[2469]: I0513 23:56:51.184719 2469 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:56:51.192058 kubelet[2469]: I0513 23:56:51.192027 2469 kubelet.go:400] "Attempting to sync node with API server"
May 13 23:56:51.192058 kubelet[2469]: I0513 23:56:51.192049 2469 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests"
May 13 23:56:51.192127 kubelet[2469]: I0513 23:56:51.192094 2469 kubelet.go:312] "Adding apiserver pod source"
May 13 23:56:51.192127 kubelet[2469]: I0513 23:56:51.192122 2469 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 13 23:56:51.192564 kubelet[2469]: W0513 23:56:51.192508 2469 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.64:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.192564 kubelet[2469]: E0513 23:56:51.192553 2469 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.64:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.193640 kubelet[2469]: W0513 23:56:51.193603 2469 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.64:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.193640 kubelet[2469]: E0513 23:56:51.193637 2469 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.64:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.219029 kubelet[2469]: I0513 23:56:51.218988 2469 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
May 13 23:56:51.224695 kubelet[2469]: I0513 23:56:51.224662 2469 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 13 23:56:51.224751 kubelet[2469]: W0513 23:56:51.224730 2469 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
May 13 23:56:51.225506 kubelet[2469]: I0513 23:56:51.225361 2469 server.go:1264] "Started kubelet"
May 13 23:56:51.226681 kubelet[2469]: I0513 23:56:51.226661 2469 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 13 23:56:51.227719 kubelet[2469]: I0513 23:56:51.227677 2469 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
May 13 23:56:51.228976 kubelet[2469]: I0513 23:56:51.228720 2469 server.go:455] "Adding debug handlers to kubelet server"
May 13 23:56:51.229666 kubelet[2469]: I0513 23:56:51.229611 2469 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 13 23:56:51.229888 kubelet[2469]: I0513 23:56:51.229855 2469 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 13 23:56:51.231005 kubelet[2469]: I0513 23:56:51.230857 2469 volume_manager.go:291] "Starting Kubelet Volume Manager"
May 13 23:56:51.231005 kubelet[2469]: I0513 23:56:51.230954 2469 desired_state_of_world_populator.go:149] "Desired state populator starts to run"
May 13 23:56:51.231101 kubelet[2469]: I0513 23:56:51.231011 2469 reconciler.go:26] "Reconciler: start to sync state"
May 13 23:56:51.231474 kubelet[2469]: W0513 23:56:51.231418 2469 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.64:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.231574 kubelet[2469]: E0513 23:56:51.231482 2469 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.64:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.231969 kubelet[2469]: E0513 23:56:51.231926 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.64:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.64:6443: connect: connection refused" interval="200ms"
May 13 23:56:51.232318 kubelet[2469]: E0513 23:56:51.232282 2469 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 13 23:56:51.232466 kubelet[2469]: I0513 23:56:51.232447 2469 factory.go:221] Registration of the systemd container factory successfully
May 13 23:56:51.232561 kubelet[2469]: I0513 23:56:51.232542 2469 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 13 23:56:51.233506 kubelet[2469]: I0513 23:56:51.233486 2469 factory.go:221] Registration of the containerd container factory successfully
May 13 23:56:51.244243 kubelet[2469]: I0513 23:56:51.244166 2469 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 13 23:56:51.245588 kubelet[2469]: I0513 23:56:51.245557 2469 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 13 23:56:51.245588 kubelet[2469]: I0513 23:56:51.245588 2469 status_manager.go:217] "Starting to sync pod status with apiserver"
May 13 23:56:51.245673 kubelet[2469]: I0513 23:56:51.245608 2469 kubelet.go:2337] "Starting kubelet main sync loop"
May 13 23:56:51.245673 kubelet[2469]: E0513 23:56:51.245651 2469 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 13 23:56:51.253414 kubelet[2469]: W0513 23:56:51.253365 2469 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.64:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.253648 kubelet[2469]: E0513 23:56:51.253555 2469 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.64:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:51.261865 kubelet[2469]: E0513 23:56:51.261702 2469 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.64:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.64:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.183f3b824a892fb9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-13 23:56:51.225333689 +0000 UTC m=+0.351424831,LastTimestamp:2025-05-13 23:56:51.225333689 +0000 UTC m=+0.351424831,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
May 13 23:56:51.263841 kubelet[2469]: I0513 23:56:51.263825 2469 cpu_manager.go:214] "Starting CPU manager" policy="none"
May 13 23:56:51.263841 kubelet[2469]: I0513 23:56:51.263836 2469 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
May 13 23:56:51.263924 kubelet[2469]: I0513 23:56:51.263851 2469 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:56:51.332070 kubelet[2469]: I0513 23:56:51.332043 2469 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
May 13 23:56:51.332387 kubelet[2469]: E0513 23:56:51.332355 2469 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.64:6443/api/v1/nodes\": dial tcp 10.0.0.64:6443: connect: connection refused" node="localhost"
May 13 23:56:51.346667 kubelet[2469]: E0513 23:56:51.346623 2469 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
May 13 23:56:51.433317 kubelet[2469]: E0513 23:56:51.433157 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.64:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.64:6443: connect: connection refused" interval="400ms"
May 13 23:56:51.533942 kubelet[2469]: I0513 23:56:51.533886 2469 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
May 13 23:56:51.534273 kubelet[2469]: E0513 23:56:51.534246 2469 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.64:6443/api/v1/nodes\": dial tcp 10.0.0.64:6443: connect: connection refused" node="localhost"
May 13 23:56:51.547345 kubelet[2469]: E0513 23:56:51.547326 2469 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
May 13 23:56:51.728661 kubelet[2469]: I0513 23:56:51.728524 2469 policy_none.go:49] "None policy: Start"
May 13 23:56:51.729417 kubelet[2469]: I0513 23:56:51.729378 2469 memory_manager.go:170] "Starting memorymanager" policy="None"
May 13 23:56:51.729417 kubelet[2469]: I0513 23:56:51.729400 2469 state_mem.go:35] "Initializing new in-memory state store"
May 13 23:56:51.786826 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 13 23:56:51.801568 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
May 13 23:56:51.804972 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
May 13 23:56:51.816651 kubelet[2469]: I0513 23:56:51.816602 2469 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 13 23:56:51.817113 kubelet[2469]: I0513 23:56:51.817059 2469 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 13 23:56:51.817347 kubelet[2469]: I0513 23:56:51.817227 2469 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 13 23:56:51.818556 kubelet[2469]: E0513 23:56:51.818534 2469 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
May 13 23:56:51.834026 kubelet[2469]: E0513 23:56:51.833982 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.64:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.64:6443: connect: connection refused" interval="800ms"
May 13 23:56:51.935862 kubelet[2469]: I0513 23:56:51.935826 2469 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
May 13 23:56:51.936399 kubelet[2469]: E0513 23:56:51.936254 2469 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.64:6443/api/v1/nodes\": dial tcp 10.0.0.64:6443: connect: connection refused" node="localhost"
May 13 23:56:51.948156 kubelet[2469]: I0513 23:56:51.947741 2469 topology_manager.go:215] "Topology Admit Handler" podUID="6bbd9c8c1ad4a336f07164f184ab6e88" podNamespace="kube-system" podName="kube-apiserver-localhost"
May 13 23:56:51.949118 kubelet[2469]: I0513 23:56:51.949084 2469 topology_manager.go:215] "Topology Admit Handler" podUID="b20b39a8540dba87b5883a6f0f602dba" podNamespace="kube-system" podName="kube-controller-manager-localhost"
May 13 23:56:51.950080 kubelet[2469]: I0513 23:56:51.950053 2469 topology_manager.go:215] "Topology Admit Handler" podUID="6ece95f10dbffa04b25ec3439a115512" podNamespace="kube-system" podName="kube-scheduler-localhost"
May 13 23:56:51.957738 systemd[1]: Created slice kubepods-burstable-pod6bbd9c8c1ad4a336f07164f184ab6e88.slice - libcontainer container kubepods-burstable-pod6bbd9c8c1ad4a336f07164f184ab6e88.slice.
May 13 23:56:51.991112 systemd[1]: Created slice kubepods-burstable-podb20b39a8540dba87b5883a6f0f602dba.slice - libcontainer container kubepods-burstable-podb20b39a8540dba87b5883a6f0f602dba.slice.
May 13 23:56:51.996167 systemd[1]: Created slice kubepods-burstable-pod6ece95f10dbffa04b25ec3439a115512.slice - libcontainer container kubepods-burstable-pod6ece95f10dbffa04b25ec3439a115512.slice.
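The "Failed to ensure lease exists, will retry" errors show a doubling retry interval: 200ms, then 400ms, 800ms, and later 1.6s. A sketch of that backoff; the 7s cap is an assumption about the node-lease controller, since this excerpt never reaches it:

```python
def lease_retry_intervals(initial_ms=200, cap_ms=7000, n=7):
    """Doubling backoff matching the intervals logged above
    (200ms -> 400ms -> 800ms -> 1.6s). The cap_ms=7000 ceiling is an
    assumption, not something visible in this log excerpt."""
    out, cur = [], initial_ms
    for _ in range(n):
        out.append(cur)
        cur = min(cur * 2, cap_ms)
    return out

print(lease_retry_intervals())
```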
May 13 23:56:52.035409 kubelet[2469]: I0513 23:56:52.035330 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6bbd9c8c1ad4a336f07164f184ab6e88-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"6bbd9c8c1ad4a336f07164f184ab6e88\") " pod="kube-system/kube-apiserver-localhost"
May 13 23:56:52.035409 kubelet[2469]: I0513 23:56:52.035390 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:56:52.035409 kubelet[2469]: I0513 23:56:52.035433 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:56:52.035694 kubelet[2469]: I0513 23:56:52.035459 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:56:52.035694 kubelet[2469]: I0513 23:56:52.035483 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6ece95f10dbffa04b25ec3439a115512-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"6ece95f10dbffa04b25ec3439a115512\") " pod="kube-system/kube-scheduler-localhost"
May 13 23:56:52.035694 kubelet[2469]: I0513 23:56:52.035507 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6bbd9c8c1ad4a336f07164f184ab6e88-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"6bbd9c8c1ad4a336f07164f184ab6e88\") " pod="kube-system/kube-apiserver-localhost"
May 13 23:56:52.035694 kubelet[2469]: I0513 23:56:52.035530 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:56:52.035694 kubelet[2469]: I0513 23:56:52.035554 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:56:52.035801 kubelet[2469]: I0513 23:56:52.035578 2469 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6bbd9c8c1ad4a336f07164f184ab6e88-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"6bbd9c8c1ad4a336f07164f184ab6e88\") " pod="kube-system/kube-apiserver-localhost"
May 13 23:56:52.262564 kubelet[2469]: W0513 23:56:52.262391 2469 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.64:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:52.262564 kubelet[2469]: E0513 23:56:52.262466 2469 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.64:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:52.289693 kubelet[2469]: E0513 23:56:52.289652 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:56:52.290367 containerd[1490]: time="2025-05-13T23:56:52.290316902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:6bbd9c8c1ad4a336f07164f184ab6e88,Namespace:kube-system,Attempt:0,}"
May 13 23:56:52.294399 kubelet[2469]: E0513 23:56:52.294359 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:56:52.294670 containerd[1490]: time="2025-05-13T23:56:52.294643142Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b20b39a8540dba87b5883a6f0f602dba,Namespace:kube-system,Attempt:0,}"
May 13 23:56:52.298905 kubelet[2469]: E0513 23:56:52.298872 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:56:52.299154 containerd[1490]: time="2025-05-13T23:56:52.299131747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:6ece95f10dbffa04b25ec3439a115512,Namespace:kube-system,Attempt:0,}"
May 13 23:56:52.586334 kubelet[2469]: W0513 23:56:52.586268 2469 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.64:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:52.586446 kubelet[2469]: E0513 23:56:52.586341 2469 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.64:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:52.631807 kubelet[2469]: W0513 23:56:52.631738 2469 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.64:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:52.631807 kubelet[2469]: E0513 23:56:52.631802 2469 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.64:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:52.635084 kubelet[2469]: E0513 23:56:52.635047 2469 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.64:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.64:6443: connect: connection refused" interval="1.6s"
May 13 23:56:52.680216 kubelet[2469]: W0513 23:56:52.680104 2469 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.64:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:52.680216 kubelet[2469]: E0513 23:56:52.680185 2469 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.64:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.64:6443: connect: connection refused
May 13 23:56:52.737948 kubelet[2469]: I0513 23:56:52.737890 2469
kubelet_node_status.go:73] "Attempting to register node" node="localhost" May 13 23:56:52.738251 kubelet[2469]: E0513 23:56:52.738224 2469 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.64:6443/api/v1/nodes\": dial tcp 10.0.0.64:6443: connect: connection refused" node="localhost" May 13 23:56:53.291263 kubelet[2469]: E0513 23:56:53.291221 2469 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.64:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.64:6443: connect: connection refused May 13 23:56:53.348865 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2020701515.mount: Deactivated successfully. May 13 23:56:53.357554 containerd[1490]: time="2025-05-13T23:56:53.357497558Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:56:53.361236 containerd[1490]: time="2025-05-13T23:56:53.361114076Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138" May 13 23:56:53.363093 containerd[1490]: time="2025-05-13T23:56:53.363025022Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:56:53.366843 containerd[1490]: time="2025-05-13T23:56:53.366410737Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:56:53.367543 containerd[1490]: time="2025-05-13T23:56:53.367431543Z" level=info msg="stop pulling image 
registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 13 23:56:53.368923 containerd[1490]: time="2025-05-13T23:56:53.368891293Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:56:53.369862 containerd[1490]: time="2025-05-13T23:56:53.369823692Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 13 23:56:53.374334 containerd[1490]: time="2025-05-13T23:56:53.374278333Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:56:53.375074 containerd[1490]: time="2025-05-13T23:56:53.375038750Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.076451065s" May 13 23:56:53.377314 containerd[1490]: time="2025-05-13T23:56:53.377247316Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.079644377s" May 13 23:56:53.383416 containerd[1490]: time="2025-05-13T23:56:53.383339741Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest 
\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.067520592s" May 13 23:56:53.410502 containerd[1490]: time="2025-05-13T23:56:53.410434865Z" level=info msg="connecting to shim affb00b71c55e948378376225582c18459cd7b888d044d501381b40dd29713bc" address="unix:///run/containerd/s/d0bcbfaa61bdf5534737ad4b6cc1381393cb3b00c18970c9b184ca50f0fb5660" namespace=k8s.io protocol=ttrpc version=3 May 13 23:56:53.421465 containerd[1490]: time="2025-05-13T23:56:53.421319113Z" level=info msg="connecting to shim 682e32756bc33e0e743bb57d41fca5d40d6ab829025b91c4b700ee8986d401db" address="unix:///run/containerd/s/a57bc47b16a8fc22012fe730b7985bfdf58edc1904c8b65e5be4c9514ed5ebd1" namespace=k8s.io protocol=ttrpc version=3 May 13 23:56:53.429809 containerd[1490]: time="2025-05-13T23:56:53.429132288Z" level=info msg="connecting to shim 80b0367e8fbbff268f340d1ca22a22df47d00f42e478242a0d58d3ba89724f02" address="unix:///run/containerd/s/7ede32b11f8ff56c75b08e92887f7cd8c94c4b5bff0230644f508d7c0597533f" namespace=k8s.io protocol=ttrpc version=3 May 13 23:56:53.456470 systemd[1]: Started cri-containerd-affb00b71c55e948378376225582c18459cd7b888d044d501381b40dd29713bc.scope - libcontainer container affb00b71c55e948378376225582c18459cd7b888d044d501381b40dd29713bc. May 13 23:56:53.462787 systemd[1]: Started cri-containerd-682e32756bc33e0e743bb57d41fca5d40d6ab829025b91c4b700ee8986d401db.scope - libcontainer container 682e32756bc33e0e743bb57d41fca5d40d6ab829025b91c4b700ee8986d401db. May 13 23:56:53.465295 systemd[1]: Started cri-containerd-80b0367e8fbbff268f340d1ca22a22df47d00f42e478242a0d58d3ba89724f02.scope - libcontainer container 80b0367e8fbbff268f340d1ca22a22df47d00f42e478242a0d58d3ba89724f02. 
May 13 23:56:53.519425 containerd[1490]: time="2025-05-13T23:56:53.519376255Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:b20b39a8540dba87b5883a6f0f602dba,Namespace:kube-system,Attempt:0,} returns sandbox id \"affb00b71c55e948378376225582c18459cd7b888d044d501381b40dd29713bc\"" May 13 23:56:53.520600 kubelet[2469]: E0513 23:56:53.520491 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:53.523606 containerd[1490]: time="2025-05-13T23:56:53.523577971Z" level=info msg="CreateContainer within sandbox \"affb00b71c55e948378376225582c18459cd7b888d044d501381b40dd29713bc\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 13 23:56:53.548351 containerd[1490]: time="2025-05-13T23:56:53.548215954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:6bbd9c8c1ad4a336f07164f184ab6e88,Namespace:kube-system,Attempt:0,} returns sandbox id \"682e32756bc33e0e743bb57d41fca5d40d6ab829025b91c4b700ee8986d401db\"" May 13 23:56:53.549264 kubelet[2469]: E0513 23:56:53.549231 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:53.551746 containerd[1490]: time="2025-05-13T23:56:53.551714580Z" level=info msg="CreateContainer within sandbox \"682e32756bc33e0e743bb57d41fca5d40d6ab829025b91c4b700ee8986d401db\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 13 23:56:53.577417 containerd[1490]: time="2025-05-13T23:56:53.577364192Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:6ece95f10dbffa04b25ec3439a115512,Namespace:kube-system,Attempt:0,} returns sandbox id \"80b0367e8fbbff268f340d1ca22a22df47d00f42e478242a0d58d3ba89724f02\"" May 13 
23:56:53.578298 kubelet[2469]: E0513 23:56:53.578249 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:53.580006 containerd[1490]: time="2025-05-13T23:56:53.579970694Z" level=info msg="CreateContainer within sandbox \"80b0367e8fbbff268f340d1ca22a22df47d00f42e478242a0d58d3ba89724f02\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 13 23:56:53.830055 containerd[1490]: time="2025-05-13T23:56:53.830002002Z" level=info msg="Container 079326b7851ba01a228c358329e7b4b3e625e81f8290dc7074510404139adfb7: CDI devices from CRI Config.CDIDevices: []" May 13 23:56:53.833933 containerd[1490]: time="2025-05-13T23:56:53.833886483Z" level=info msg="Container 9d8cdbc47cf7d839ec7674ba8345c2a6a95c4e7f04e50f4243be17e9fddb4623: CDI devices from CRI Config.CDIDevices: []" May 13 23:56:53.837389 containerd[1490]: time="2025-05-13T23:56:53.837342990Z" level=info msg="Container 1561087abffc5ac9d3e47320cd12236b2330be178671067a6b807f5ba996041e: CDI devices from CRI Config.CDIDevices: []" May 13 23:56:53.844527 containerd[1490]: time="2025-05-13T23:56:53.844482020Z" level=info msg="CreateContainer within sandbox \"affb00b71c55e948378376225582c18459cd7b888d044d501381b40dd29713bc\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"079326b7851ba01a228c358329e7b4b3e625e81f8290dc7074510404139adfb7\"" May 13 23:56:53.845165 containerd[1490]: time="2025-05-13T23:56:53.845120538Z" level=info msg="StartContainer for \"079326b7851ba01a228c358329e7b4b3e625e81f8290dc7074510404139adfb7\"" May 13 23:56:53.846353 containerd[1490]: time="2025-05-13T23:56:53.846332643Z" level=info msg="connecting to shim 079326b7851ba01a228c358329e7b4b3e625e81f8290dc7074510404139adfb7" address="unix:///run/containerd/s/d0bcbfaa61bdf5534737ad4b6cc1381393cb3b00c18970c9b184ca50f0fb5660" protocol=ttrpc version=3 May 13 23:56:53.850404 
containerd[1490]: time="2025-05-13T23:56:53.850331108Z" level=info msg="CreateContainer within sandbox \"80b0367e8fbbff268f340d1ca22a22df47d00f42e478242a0d58d3ba89724f02\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"1561087abffc5ac9d3e47320cd12236b2330be178671067a6b807f5ba996041e\"" May 13 23:56:53.850860 containerd[1490]: time="2025-05-13T23:56:53.850835434Z" level=info msg="StartContainer for \"1561087abffc5ac9d3e47320cd12236b2330be178671067a6b807f5ba996041e\"" May 13 23:56:53.851799 containerd[1490]: time="2025-05-13T23:56:53.851778163Z" level=info msg="connecting to shim 1561087abffc5ac9d3e47320cd12236b2330be178671067a6b807f5ba996041e" address="unix:///run/containerd/s/7ede32b11f8ff56c75b08e92887f7cd8c94c4b5bff0230644f508d7c0597533f" protocol=ttrpc version=3 May 13 23:56:53.852993 containerd[1490]: time="2025-05-13T23:56:53.852168216Z" level=info msg="CreateContainer within sandbox \"682e32756bc33e0e743bb57d41fca5d40d6ab829025b91c4b700ee8986d401db\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9d8cdbc47cf7d839ec7674ba8345c2a6a95c4e7f04e50f4243be17e9fddb4623\"" May 13 23:56:53.852993 containerd[1490]: time="2025-05-13T23:56:53.852530345Z" level=info msg="StartContainer for \"9d8cdbc47cf7d839ec7674ba8345c2a6a95c4e7f04e50f4243be17e9fddb4623\"" May 13 23:56:53.853931 containerd[1490]: time="2025-05-13T23:56:53.853858198Z" level=info msg="connecting to shim 9d8cdbc47cf7d839ec7674ba8345c2a6a95c4e7f04e50f4243be17e9fddb4623" address="unix:///run/containerd/s/a57bc47b16a8fc22012fe730b7985bfdf58edc1904c8b65e5be4c9514ed5ebd1" protocol=ttrpc version=3 May 13 23:56:53.868386 systemd[1]: Started cri-containerd-079326b7851ba01a228c358329e7b4b3e625e81f8290dc7074510404139adfb7.scope - libcontainer container 079326b7851ba01a228c358329e7b4b3e625e81f8290dc7074510404139adfb7. 
May 13 23:56:53.874766 systemd[1]: Started cri-containerd-1561087abffc5ac9d3e47320cd12236b2330be178671067a6b807f5ba996041e.scope - libcontainer container 1561087abffc5ac9d3e47320cd12236b2330be178671067a6b807f5ba996041e. May 13 23:56:53.876112 systemd[1]: Started cri-containerd-9d8cdbc47cf7d839ec7674ba8345c2a6a95c4e7f04e50f4243be17e9fddb4623.scope - libcontainer container 9d8cdbc47cf7d839ec7674ba8345c2a6a95c4e7f04e50f4243be17e9fddb4623. May 13 23:56:53.944169 containerd[1490]: time="2025-05-13T23:56:53.943970648Z" level=info msg="StartContainer for \"079326b7851ba01a228c358329e7b4b3e625e81f8290dc7074510404139adfb7\" returns successfully" May 13 23:56:53.944169 containerd[1490]: time="2025-05-13T23:56:53.944026703Z" level=info msg="StartContainer for \"1561087abffc5ac9d3e47320cd12236b2330be178671067a6b807f5ba996041e\" returns successfully" May 13 23:56:53.954539 containerd[1490]: time="2025-05-13T23:56:53.954410963Z" level=info msg="StartContainer for \"9d8cdbc47cf7d839ec7674ba8345c2a6a95c4e7f04e50f4243be17e9fddb4623\" returns successfully" May 13 23:56:54.270666 kubelet[2469]: E0513 23:56:54.270507 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:54.274832 kubelet[2469]: E0513 23:56:54.274792 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:54.276358 kubelet[2469]: E0513 23:56:54.276322 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:54.340379 kubelet[2469]: I0513 23:56:54.340308 2469 kubelet_node_status.go:73] "Attempting to register node" node="localhost" May 13 23:56:55.002631 kubelet[2469]: E0513 23:56:55.002569 2469 nodelease.go:49] "Failed to 
get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" May 13 23:56:55.270651 kubelet[2469]: I0513 23:56:55.270523 2469 kubelet_node_status.go:76] "Successfully registered node" node="localhost" May 13 23:56:55.277597 kubelet[2469]: E0513 23:56:55.277558 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:55.277801 kubelet[2469]: E0513 23:56:55.277777 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:56.195085 kubelet[2469]: I0513 23:56:56.195033 2469 apiserver.go:52] "Watching apiserver" May 13 23:56:56.231111 kubelet[2469]: I0513 23:56:56.231071 2469 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 13 23:56:57.535536 kubelet[2469]: E0513 23:56:57.535489 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:58.188292 kubelet[2469]: E0513 23:56:58.188254 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:58.281467 kubelet[2469]: E0513 23:56:58.281439 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:56:58.281936 kubelet[2469]: E0513 23:56:58.281906 2469 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:01.904235 kubelet[2469]: I0513 23:57:01.904033 2469 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=4.90400993 podStartE2EDuration="4.90400993s" podCreationTimestamp="2025-05-13 23:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:57:01.903839881 +0000 UTC m=+11.029931023" watchObservedRunningTime="2025-05-13 23:57:01.90400993 +0000 UTC m=+11.030101082" May 13 23:57:01.904235 kubelet[2469]: I0513 23:57:01.904202 2469 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=4.90418073 podStartE2EDuration="4.90418073s" podCreationTimestamp="2025-05-13 23:56:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:57:01.612457538 +0000 UTC m=+10.738548680" watchObservedRunningTime="2025-05-13 23:57:01.90418073 +0000 UTC m=+11.030271872" May 13 23:57:03.339500 systemd[1]: Reload requested from client PID 2750 ('systemctl') (unit session-7.scope)... May 13 23:57:03.339525 systemd[1]: Reloading... May 13 23:57:03.459315 zram_generator::config[2797]: No configuration found. May 13 23:57:03.599696 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:57:03.738325 systemd[1]: Reloading finished in 398 ms. May 13 23:57:03.768265 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:57:03.787897 systemd[1]: kubelet.service: Deactivated successfully. May 13 23:57:03.788280 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:57:03.788350 systemd[1]: kubelet.service: Consumed 1.029s CPU time, 118.5M memory peak. 
May 13 23:57:03.790681 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:57:03.998825 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:57:04.004126 (kubelet)[2839]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 23:57:04.059496 kubelet[2839]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:57:04.059496 kubelet[2839]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 13 23:57:04.059496 kubelet[2839]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:57:04.060015 kubelet[2839]: I0513 23:57:04.059526 2839 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 23:57:04.068744 kubelet[2839]: I0513 23:57:04.068676 2839 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 13 23:57:04.068744 kubelet[2839]: I0513 23:57:04.068721 2839 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 23:57:04.069287 kubelet[2839]: I0513 23:57:04.069260 2839 server.go:927] "Client rotation is on, will bootstrap in background" May 13 23:57:04.071328 kubelet[2839]: I0513 23:57:04.071295 2839 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
May 13 23:57:04.072582 kubelet[2839]: I0513 23:57:04.072542 2839 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 23:57:04.082742 kubelet[2839]: I0513 23:57:04.082695 2839 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 13 23:57:04.083000 kubelet[2839]: I0513 23:57:04.082943 2839 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 23:57:04.083241 kubelet[2839]: I0513 23:57:04.082981 2839 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManager
ReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 13 23:57:04.083362 kubelet[2839]: I0513 23:57:04.083249 2839 topology_manager.go:138] "Creating topology manager with none policy" May 13 23:57:04.083362 kubelet[2839]: I0513 23:57:04.083264 2839 container_manager_linux.go:301] "Creating device plugin manager" May 13 23:57:04.083362 kubelet[2839]: I0513 23:57:04.083320 2839 state_mem.go:36] "Initialized new in-memory state store" May 13 23:57:04.083475 kubelet[2839]: I0513 23:57:04.083442 2839 kubelet.go:400] "Attempting to sync node with API server" May 13 23:57:04.083475 kubelet[2839]: I0513 23:57:04.083459 2839 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 23:57:04.083541 kubelet[2839]: I0513 23:57:04.083487 2839 kubelet.go:312] "Adding apiserver pod source" May 13 23:57:04.083541 kubelet[2839]: I0513 23:57:04.083514 2839 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 23:57:04.086410 kubelet[2839]: I0513 23:57:04.084187 2839 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 13 23:57:04.086410 kubelet[2839]: I0513 23:57:04.084421 2839 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 23:57:04.086410 kubelet[2839]: I0513 23:57:04.084819 2839 server.go:1264] "Started kubelet" May 13 23:57:04.086410 kubelet[2839]: I0513 23:57:04.084945 2839 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 23:57:04.086410 kubelet[2839]: I0513 23:57:04.085214 2839 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 23:57:04.086410 kubelet[2839]: I0513 23:57:04.085502 2839 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 
23:57:04.086410 kubelet[2839]: I0513 23:57:04.085909 2839 server.go:455] "Adding debug handlers to kubelet server" May 13 23:57:04.094623 kubelet[2839]: I0513 23:57:04.094583 2839 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 23:57:04.094972 kubelet[2839]: E0513 23:57:04.094946 2839 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 23:57:04.095035 kubelet[2839]: I0513 23:57:04.095012 2839 volume_manager.go:291] "Starting Kubelet Volume Manager" May 13 23:57:04.097900 kubelet[2839]: I0513 23:57:04.097752 2839 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 13 23:57:04.097987 kubelet[2839]: I0513 23:57:04.097910 2839 reconciler.go:26] "Reconciler: start to sync state" May 13 23:57:04.098666 kubelet[2839]: I0513 23:57:04.098061 2839 factory.go:221] Registration of the systemd container factory successfully May 13 23:57:04.098666 kubelet[2839]: I0513 23:57:04.098200 2839 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 23:57:04.101132 kubelet[2839]: I0513 23:57:04.101095 2839 factory.go:221] Registration of the containerd container factory successfully May 13 23:57:04.112542 kubelet[2839]: I0513 23:57:04.112251 2839 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 23:57:04.114971 kubelet[2839]: I0513 23:57:04.114374 2839 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6"
May 13 23:57:04.114971 kubelet[2839]: I0513 23:57:04.114434 2839 status_manager.go:217] "Starting to sync pod status with apiserver"
May 13 23:57:04.114971 kubelet[2839]: I0513 23:57:04.114460 2839 kubelet.go:2337] "Starting kubelet main sync loop"
May 13 23:57:04.114971 kubelet[2839]: E0513 23:57:04.114522 2839 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 13 23:57:04.139620 kubelet[2839]: I0513 23:57:04.139583 2839 cpu_manager.go:214] "Starting CPU manager" policy="none"
May 13 23:57:04.139620 kubelet[2839]: I0513 23:57:04.139605 2839 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
May 13 23:57:04.139809 kubelet[2839]: I0513 23:57:04.139639 2839 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:57:04.139882 kubelet[2839]: I0513 23:57:04.139861 2839 state_mem.go:88] "Updated default CPUSet" cpuSet=""
May 13 23:57:04.139909 kubelet[2839]: I0513 23:57:04.139879 2839 state_mem.go:96] "Updated CPUSet assignments" assignments={}
May 13 23:57:04.139909 kubelet[2839]: I0513 23:57:04.139902 2839 policy_none.go:49] "None policy: Start"
May 13 23:57:04.140710 kubelet[2839]: I0513 23:57:04.140685 2839 memory_manager.go:170] "Starting memorymanager" policy="None"
May 13 23:57:04.140760 kubelet[2839]: I0513 23:57:04.140720 2839 state_mem.go:35] "Initializing new in-memory state store"
May 13 23:57:04.140942 kubelet[2839]: I0513 23:57:04.140915 2839 state_mem.go:75] "Updated machine memory state"
May 13 23:57:04.146090 kubelet[2839]: I0513 23:57:04.146052 2839 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 13 23:57:04.146519 kubelet[2839]: I0513 23:57:04.146300 2839 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 13 23:57:04.146519 kubelet[2839]: I0513 23:57:04.146435 2839 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 13 23:57:04.203110 kubelet[2839]: I0513 23:57:04.203059 2839 kubelet_node_status.go:73] "Attempting to register node" node="localhost"
May 13 23:57:04.215299 kubelet[2839]: I0513 23:57:04.215216 2839 topology_manager.go:215] "Topology Admit Handler" podUID="6ece95f10dbffa04b25ec3439a115512" podNamespace="kube-system" podName="kube-scheduler-localhost"
May 13 23:57:04.215507 kubelet[2839]: I0513 23:57:04.215350 2839 topology_manager.go:215] "Topology Admit Handler" podUID="6bbd9c8c1ad4a336f07164f184ab6e88" podNamespace="kube-system" podName="kube-apiserver-localhost"
May 13 23:57:04.215507 kubelet[2839]: I0513 23:57:04.215412 2839 topology_manager.go:215] "Topology Admit Handler" podUID="b20b39a8540dba87b5883a6f0f602dba" podNamespace="kube-system" podName="kube-controller-manager-localhost"
May 13 23:57:04.349169 kubelet[2839]: E0513 23:57:04.348940 2839 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
May 13 23:57:04.349169 kubelet[2839]: E0513 23:57:04.349115 2839 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost"
May 13 23:57:04.351355 kubelet[2839]: I0513 23:57:04.351309 2839 kubelet_node_status.go:112] "Node was previously registered" node="localhost"
May 13 23:57:04.351542 kubelet[2839]: I0513 23:57:04.351403 2839 kubelet_node_status.go:76] "Successfully registered node" node="localhost"
May 13 23:57:04.399169 kubelet[2839]: I0513 23:57:04.399094 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6bbd9c8c1ad4a336f07164f184ab6e88-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"6bbd9c8c1ad4a336f07164f184ab6e88\") " pod="kube-system/kube-apiserver-localhost"
May 13 23:57:04.399169 kubelet[2839]: I0513 23:57:04.399140 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6bbd9c8c1ad4a336f07164f184ab6e88-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"6bbd9c8c1ad4a336f07164f184ab6e88\") " pod="kube-system/kube-apiserver-localhost"
May 13 23:57:04.399169 kubelet[2839]: I0513 23:57:04.399182 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6bbd9c8c1ad4a336f07164f184ab6e88-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"6bbd9c8c1ad4a336f07164f184ab6e88\") " pod="kube-system/kube-apiserver-localhost"
May 13 23:57:04.399169 kubelet[2839]: I0513 23:57:04.399219 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:57:04.399496 kubelet[2839]: I0513 23:57:04.399237 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:57:04.399496 kubelet[2839]: I0513 23:57:04.399254 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:57:04.399496 kubelet[2839]: I0513 23:57:04.399269 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6ece95f10dbffa04b25ec3439a115512-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"6ece95f10dbffa04b25ec3439a115512\") " pod="kube-system/kube-scheduler-localhost"
May 13 23:57:04.399496 kubelet[2839]: I0513 23:57:04.399287 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:57:04.399496 kubelet[2839]: I0513 23:57:04.399306 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b20b39a8540dba87b5883a6f0f602dba-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"b20b39a8540dba87b5883a6f0f602dba\") " pod="kube-system/kube-controller-manager-localhost"
May 13 23:57:04.627721 kubelet[2839]: E0513 23:57:04.627576 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:04.650630 kubelet[2839]: E0513 23:57:04.650589 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:04.650630 kubelet[2839]: E0513 23:57:04.650609 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:05.084293 kubelet[2839]: I0513 23:57:05.084238 2839 apiserver.go:52] "Watching apiserver"
May 13 23:57:05.098166 kubelet[2839]: I0513 23:57:05.098119 2839 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
May 13 23:57:05.151889 kubelet[2839]: E0513 23:57:05.151840 2839 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
May 13 23:57:05.152210 kubelet[2839]: E0513 23:57:05.152167 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:05.155561 kubelet[2839]: E0513 23:57:05.155525 2839 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
May 13 23:57:05.155877 kubelet[2839]: E0513 23:57:05.155857 2839 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-localhost\" already exists" pod="kube-system/kube-controller-manager-localhost"
May 13 23:57:05.156395 kubelet[2839]: E0513 23:57:05.156376 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:05.156619 kubelet[2839]: E0513 23:57:05.156575 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:05.179672 kubelet[2839]: I0513 23:57:05.179494 2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.179472335 podStartE2EDuration="1.179472335s" podCreationTimestamp="2025-05-13 23:57:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:57:05.164443571 +0000 UTC m=+1.152147607" watchObservedRunningTime="2025-05-13 23:57:05.179472335 +0000 UTC m=+1.167176361"
May 13 23:57:06.139337 kubelet[2839]: E0513 23:57:06.139276 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:06.140145 kubelet[2839]: E0513 23:57:06.139388 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:06.140145 kubelet[2839]: E0513 23:57:06.139539 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:07.140846 kubelet[2839]: E0513 23:57:07.140813 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:08.760423 kubelet[2839]: E0513 23:57:08.760311 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:09.143207 kubelet[2839]: E0513 23:57:09.143154 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:10.481558 sudo[1682]: pam_unix(sudo:session): session closed for user root
May 13 23:57:10.483906 sshd[1681]: Connection closed by 10.0.0.1 port 60596
May 13 23:57:10.484854 sshd-session[1678]: pam_unix(sshd:session): session closed for user core
May 13 23:57:10.490420 systemd[1]: sshd@6-10.0.0.64:22-10.0.0.1:60596.service: Deactivated successfully.
May 13 23:57:10.493292 systemd[1]: session-7.scope: Deactivated successfully.
May 13 23:57:10.493570 systemd[1]: session-7.scope: Consumed 5.499s CPU time, 229.3M memory peak.
May 13 23:57:10.496145 systemd-logind[1468]: Session 7 logged out. Waiting for processes to exit.
May 13 23:57:10.497752 systemd-logind[1468]: Removed session 7.
May 13 23:57:10.871616 kubelet[2839]: E0513 23:57:10.871586 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:11.147355 kubelet[2839]: E0513 23:57:11.147176 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:13.109080 kubelet[2839]: E0513 23:57:13.109004 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:13.149714 kubelet[2839]: E0513 23:57:13.149679 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:14.150458 kubelet[2839]: E0513 23:57:14.150402 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:17.627402 kubelet[2839]: I0513 23:57:17.627349 2839 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
May 13 23:57:17.627965 containerd[1490]: time="2025-05-13T23:57:17.627799739Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
May 13 23:57:17.628292 kubelet[2839]: I0513 23:57:17.628045 2839 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
May 13 23:57:17.731241 kubelet[2839]: I0513 23:57:17.729942 2839 topology_manager.go:215] "Topology Admit Handler" podUID="429d2fc0-cfe2-4ad4-8ef9-ff497801f5d9" podNamespace="tigera-operator" podName="tigera-operator-797db67f8-8wvkl"
May 13 23:57:17.738676 systemd[1]: Created slice kubepods-besteffort-pod429d2fc0_cfe2_4ad4_8ef9_ff497801f5d9.slice - libcontainer container kubepods-besteffort-pod429d2fc0_cfe2_4ad4_8ef9_ff497801f5d9.slice.
May 13 23:57:17.780505 kubelet[2839]: I0513 23:57:17.780455 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/429d2fc0-cfe2-4ad4-8ef9-ff497801f5d9-var-lib-calico\") pod \"tigera-operator-797db67f8-8wvkl\" (UID: \"429d2fc0-cfe2-4ad4-8ef9-ff497801f5d9\") " pod="tigera-operator/tigera-operator-797db67f8-8wvkl"
May 13 23:57:17.780505 kubelet[2839]: I0513 23:57:17.780504 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn794\" (UniqueName: \"kubernetes.io/projected/429d2fc0-cfe2-4ad4-8ef9-ff497801f5d9-kube-api-access-pn794\") pod \"tigera-operator-797db67f8-8wvkl\" (UID: \"429d2fc0-cfe2-4ad4-8ef9-ff497801f5d9\") " pod="tigera-operator/tigera-operator-797db67f8-8wvkl"
May 13 23:57:17.853983 kubelet[2839]: I0513 23:57:17.853415 2839 topology_manager.go:215] "Topology Admit Handler" podUID="47199bb4-43a6-4df2-8992-110e68aca1ee" podNamespace="kube-system" podName="kube-proxy-7zrcz"
May 13 23:57:17.864578 systemd[1]: Created slice kubepods-besteffort-pod47199bb4_43a6_4df2_8992_110e68aca1ee.slice - libcontainer container kubepods-besteffort-pod47199bb4_43a6_4df2_8992_110e68aca1ee.slice.
May 13 23:57:17.881122 kubelet[2839]: I0513 23:57:17.880742 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/47199bb4-43a6-4df2-8992-110e68aca1ee-xtables-lock\") pod \"kube-proxy-7zrcz\" (UID: \"47199bb4-43a6-4df2-8992-110e68aca1ee\") " pod="kube-system/kube-proxy-7zrcz"
May 13 23:57:17.881122 kubelet[2839]: I0513 23:57:17.880794 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/47199bb4-43a6-4df2-8992-110e68aca1ee-kube-proxy\") pod \"kube-proxy-7zrcz\" (UID: \"47199bb4-43a6-4df2-8992-110e68aca1ee\") " pod="kube-system/kube-proxy-7zrcz"
May 13 23:57:17.881122 kubelet[2839]: I0513 23:57:17.880940 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjdzh\" (UniqueName: \"kubernetes.io/projected/47199bb4-43a6-4df2-8992-110e68aca1ee-kube-api-access-kjdzh\") pod \"kube-proxy-7zrcz\" (UID: \"47199bb4-43a6-4df2-8992-110e68aca1ee\") " pod="kube-system/kube-proxy-7zrcz"
May 13 23:57:17.881122 kubelet[2839]: I0513 23:57:17.881013 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/47199bb4-43a6-4df2-8992-110e68aca1ee-lib-modules\") pod \"kube-proxy-7zrcz\" (UID: \"47199bb4-43a6-4df2-8992-110e68aca1ee\") " pod="kube-system/kube-proxy-7zrcz"
May 13 23:57:18.057062 containerd[1490]: time="2025-05-13T23:57:18.056999361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-8wvkl,Uid:429d2fc0-cfe2-4ad4-8ef9-ff497801f5d9,Namespace:tigera-operator,Attempt:0,}"
May 13 23:57:18.167085 kubelet[2839]: E0513 23:57:18.166996 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:18.167733 containerd[1490]: time="2025-05-13T23:57:18.167573646Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7zrcz,Uid:47199bb4-43a6-4df2-8992-110e68aca1ee,Namespace:kube-system,Attempt:0,}"
May 13 23:57:18.421667 containerd[1490]: time="2025-05-13T23:57:18.420839894Z" level=info msg="connecting to shim 35e4c45be839cd91944051d4be5796f21d7de0ebd3fab007d1a47758f9b9bc97" address="unix:///run/containerd/s/ec1eb53a6f78b4a2c1a5784b7e77f6695b7678f117627c14e4b1f8cf91e73e64" namespace=k8s.io protocol=ttrpc version=3
May 13 23:57:18.424616 containerd[1490]: time="2025-05-13T23:57:18.424542860Z" level=info msg="connecting to shim c70e5902afd705c8141a52885a9f2cf243985786ce9b70bf8416033e717fcc2f" address="unix:///run/containerd/s/072ddc91448be13692e599ca2ffc67caf141bf461aa06eccc83d6620e7d27b10" namespace=k8s.io protocol=ttrpc version=3
May 13 23:57:18.484521 systemd[1]: Started cri-containerd-35e4c45be839cd91944051d4be5796f21d7de0ebd3fab007d1a47758f9b9bc97.scope - libcontainer container 35e4c45be839cd91944051d4be5796f21d7de0ebd3fab007d1a47758f9b9bc97.
May 13 23:57:18.488693 systemd[1]: Started cri-containerd-c70e5902afd705c8141a52885a9f2cf243985786ce9b70bf8416033e717fcc2f.scope - libcontainer container c70e5902afd705c8141a52885a9f2cf243985786ce9b70bf8416033e717fcc2f.
May 13 23:57:18.534227 containerd[1490]: time="2025-05-13T23:57:18.534021760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7zrcz,Uid:47199bb4-43a6-4df2-8992-110e68aca1ee,Namespace:kube-system,Attempt:0,} returns sandbox id \"c70e5902afd705c8141a52885a9f2cf243985786ce9b70bf8416033e717fcc2f\""
May 13 23:57:18.534897 kubelet[2839]: E0513 23:57:18.534869 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:18.537133 containerd[1490]: time="2025-05-13T23:57:18.537097089Z" level=info msg="CreateContainer within sandbox \"c70e5902afd705c8141a52885a9f2cf243985786ce9b70bf8416033e717fcc2f\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
May 13 23:57:18.551171 containerd[1490]: time="2025-05-13T23:57:18.551106361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-8wvkl,Uid:429d2fc0-cfe2-4ad4-8ef9-ff497801f5d9,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"35e4c45be839cd91944051d4be5796f21d7de0ebd3fab007d1a47758f9b9bc97\""
May 13 23:57:18.552985 containerd[1490]: time="2025-05-13T23:57:18.552957237Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\""
May 13 23:57:18.580430 containerd[1490]: time="2025-05-13T23:57:18.580368863Z" level=info msg="Container 8874fca1b99f932aeed0fe9ef37a2f8be28171fc23855a4c3a0a6c17d95185fb: CDI devices from CRI Config.CDIDevices: []"
May 13 23:57:18.615136 containerd[1490]: time="2025-05-13T23:57:18.615083643Z" level=info msg="CreateContainer within sandbox \"c70e5902afd705c8141a52885a9f2cf243985786ce9b70bf8416033e717fcc2f\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"8874fca1b99f932aeed0fe9ef37a2f8be28171fc23855a4c3a0a6c17d95185fb\""
May 13 23:57:18.615869 containerd[1490]: time="2025-05-13T23:57:18.615817723Z" level=info msg="StartContainer for \"8874fca1b99f932aeed0fe9ef37a2f8be28171fc23855a4c3a0a6c17d95185fb\""
May 13 23:57:18.617604 containerd[1490]: time="2025-05-13T23:57:18.617562667Z" level=info msg="connecting to shim 8874fca1b99f932aeed0fe9ef37a2f8be28171fc23855a4c3a0a6c17d95185fb" address="unix:///run/containerd/s/072ddc91448be13692e599ca2ffc67caf141bf461aa06eccc83d6620e7d27b10" protocol=ttrpc version=3
May 13 23:57:18.646461 systemd[1]: Started cri-containerd-8874fca1b99f932aeed0fe9ef37a2f8be28171fc23855a4c3a0a6c17d95185fb.scope - libcontainer container 8874fca1b99f932aeed0fe9ef37a2f8be28171fc23855a4c3a0a6c17d95185fb.
May 13 23:57:18.700297 containerd[1490]: time="2025-05-13T23:57:18.700156784Z" level=info msg="StartContainer for \"8874fca1b99f932aeed0fe9ef37a2f8be28171fc23855a4c3a0a6c17d95185fb\" returns successfully"
May 13 23:57:19.162951 kubelet[2839]: E0513 23:57:19.162924 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:19.175974 kubelet[2839]: I0513 23:57:19.175910 2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7zrcz" podStartSLOduration=2.175888674 podStartE2EDuration="2.175888674s" podCreationTimestamp="2025-05-13 23:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:57:19.175615169 +0000 UTC m=+15.163319205" watchObservedRunningTime="2025-05-13 23:57:19.175888674 +0000 UTC m=+15.163592700"
May 13 23:57:20.467040 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2241721100.mount: Deactivated successfully.
May 13 23:57:23.928076 containerd[1490]: time="2025-05-13T23:57:23.927954736Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:57:24.008510 containerd[1490]: time="2025-05-13T23:57:24.008412468Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662"
May 13 23:57:24.143986 containerd[1490]: time="2025-05-13T23:57:24.143919016Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:57:24.322687 containerd[1490]: time="2025-05-13T23:57:24.322631924Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:57:24.323502 containerd[1490]: time="2025-05-13T23:57:24.323465060Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 5.770469228s"
May 13 23:57:24.323565 containerd[1490]: time="2025-05-13T23:57:24.323514805Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\""
May 13 23:57:24.436303 containerd[1490]: time="2025-05-13T23:57:24.436238333Z" level=info msg="CreateContainer within sandbox \"35e4c45be839cd91944051d4be5796f21d7de0ebd3fab007d1a47758f9b9bc97\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
May 13 23:57:25.218574 containerd[1490]: time="2025-05-13T23:57:25.218521144Z" level=info msg="Container 5fd163fcec792ef9e3e70f253ffe04f0dfe15fa6052783265b12fbbe50ab1ff1: CDI devices from CRI Config.CDIDevices: []"
May 13 23:57:25.401803 containerd[1490]: time="2025-05-13T23:57:25.401743979Z" level=info msg="CreateContainer within sandbox \"35e4c45be839cd91944051d4be5796f21d7de0ebd3fab007d1a47758f9b9bc97\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"5fd163fcec792ef9e3e70f253ffe04f0dfe15fa6052783265b12fbbe50ab1ff1\""
May 13 23:57:25.402379 containerd[1490]: time="2025-05-13T23:57:25.402352954Z" level=info msg="StartContainer for \"5fd163fcec792ef9e3e70f253ffe04f0dfe15fa6052783265b12fbbe50ab1ff1\""
May 13 23:57:25.403430 containerd[1490]: time="2025-05-13T23:57:25.403389288Z" level=info msg="connecting to shim 5fd163fcec792ef9e3e70f253ffe04f0dfe15fa6052783265b12fbbe50ab1ff1" address="unix:///run/containerd/s/ec1eb53a6f78b4a2c1a5784b7e77f6695b7678f117627c14e4b1f8cf91e73e64" protocol=ttrpc version=3
May 13 23:57:25.431780 systemd[1]: Started cri-containerd-5fd163fcec792ef9e3e70f253ffe04f0dfe15fa6052783265b12fbbe50ab1ff1.scope - libcontainer container 5fd163fcec792ef9e3e70f253ffe04f0dfe15fa6052783265b12fbbe50ab1ff1.
May 13 23:57:25.836774 containerd[1490]: time="2025-05-13T23:57:25.836731718Z" level=info msg="StartContainer for \"5fd163fcec792ef9e3e70f253ffe04f0dfe15fa6052783265b12fbbe50ab1ff1\" returns successfully"
May 13 23:57:26.228736 kubelet[2839]: I0513 23:57:26.227121 2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-797db67f8-8wvkl" podStartSLOduration=3.344942342 podStartE2EDuration="9.227093197s" podCreationTimestamp="2025-05-13 23:57:17 +0000 UTC" firstStartedPulling="2025-05-13 23:57:18.552477325 +0000 UTC m=+14.540181361" lastFinishedPulling="2025-05-13 23:57:24.43462818 +0000 UTC m=+20.422332216" observedRunningTime="2025-05-13 23:57:26.22575677 +0000 UTC m=+22.213460796" watchObservedRunningTime="2025-05-13 23:57:26.227093197 +0000 UTC m=+22.214797233"
May 13 23:57:32.758914 kubelet[2839]: I0513 23:57:32.758853 2839 topology_manager.go:215] "Topology Admit Handler" podUID="e099728d-fefb-4d19-a10a-2eecf9c709e2" podNamespace="calico-system" podName="calico-typha-56fc595cc4-g674g"
May 13 23:57:32.766244 systemd[1]: Created slice kubepods-besteffort-pode099728d_fefb_4d19_a10a_2eecf9c709e2.slice - libcontainer container kubepods-besteffort-pode099728d_fefb_4d19_a10a_2eecf9c709e2.slice.
May 13 23:57:32.911432 kubelet[2839]: I0513 23:57:32.911343 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e099728d-fefb-4d19-a10a-2eecf9c709e2-tigera-ca-bundle\") pod \"calico-typha-56fc595cc4-g674g\" (UID: \"e099728d-fefb-4d19-a10a-2eecf9c709e2\") " pod="calico-system/calico-typha-56fc595cc4-g674g"
May 13 23:57:32.911432 kubelet[2839]: I0513 23:57:32.911402 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e099728d-fefb-4d19-a10a-2eecf9c709e2-typha-certs\") pod \"calico-typha-56fc595cc4-g674g\" (UID: \"e099728d-fefb-4d19-a10a-2eecf9c709e2\") " pod="calico-system/calico-typha-56fc595cc4-g674g"
May 13 23:57:32.911432 kubelet[2839]: I0513 23:57:32.911423 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klnwd\" (UniqueName: \"kubernetes.io/projected/e099728d-fefb-4d19-a10a-2eecf9c709e2-kube-api-access-klnwd\") pod \"calico-typha-56fc595cc4-g674g\" (UID: \"e099728d-fefb-4d19-a10a-2eecf9c709e2\") " pod="calico-system/calico-typha-56fc595cc4-g674g"
May 13 23:57:33.369804 kubelet[2839]: E0513 23:57:33.369711 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:33.370448 containerd[1490]: time="2025-05-13T23:57:33.370385631Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-56fc595cc4-g674g,Uid:e099728d-fefb-4d19-a10a-2eecf9c709e2,Namespace:calico-system,Attempt:0,}"
May 13 23:57:33.551638 kubelet[2839]: I0513 23:57:33.551550 2839 topology_manager.go:215] "Topology Admit Handler" podUID="992b5cf4-c756-4b7f-975a-3502b4e43490" podNamespace="calico-system" podName="calico-node-xttgm"
May 13 23:57:33.564287 systemd[1]: Created slice kubepods-besteffort-pod992b5cf4_c756_4b7f_975a_3502b4e43490.slice - libcontainer container kubepods-besteffort-pod992b5cf4_c756_4b7f_975a_3502b4e43490.slice.
May 13 23:57:33.715335 kubelet[2839]: I0513 23:57:33.714931 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-xtables-lock\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715335 kubelet[2839]: I0513 23:57:33.714982 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-var-run-calico\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715335 kubelet[2839]: I0513 23:57:33.714998 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-flexvol-driver-host\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715335 kubelet[2839]: I0513 23:57:33.715017 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/992b5cf4-c756-4b7f-975a-3502b4e43490-tigera-ca-bundle\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715335 kubelet[2839]: I0513 23:57:33.715033 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-var-lib-calico\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715590 kubelet[2839]: I0513 23:57:33.715047 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/992b5cf4-c756-4b7f-975a-3502b4e43490-node-certs\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715590 kubelet[2839]: I0513 23:57:33.715060 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-cni-net-dir\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715590 kubelet[2839]: I0513 23:57:33.715286 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-cni-log-dir\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715590 kubelet[2839]: I0513 23:57:33.715304 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glvb9\" (UniqueName: \"kubernetes.io/projected/992b5cf4-c756-4b7f-975a-3502b4e43490-kube-api-access-glvb9\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715590 kubelet[2839]: I0513 23:57:33.715321 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-cni-bin-dir\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715770 kubelet[2839]: I0513 23:57:33.715337 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-lib-modules\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.715770 kubelet[2839]: I0513 23:57:33.715389 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/992b5cf4-c756-4b7f-975a-3502b4e43490-policysync\") pod \"calico-node-xttgm\" (UID: \"992b5cf4-c756-4b7f-975a-3502b4e43490\") " pod="calico-system/calico-node-xttgm"
May 13 23:57:33.721416 containerd[1490]: time="2025-05-13T23:57:33.721362450Z" level=info msg="connecting to shim 8d15a472e5cb14fcc016fb10f9fee8b744330a4871e11adda65901e0d733487f" address="unix:///run/containerd/s/7b3c49829b4e8e7507a2c3a5cb24608711ea665bbdf7bb8af33d6acc493b0275" namespace=k8s.io protocol=ttrpc version=3
May 13 23:57:33.741455 systemd[1]: Started cri-containerd-8d15a472e5cb14fcc016fb10f9fee8b744330a4871e11adda65901e0d733487f.scope - libcontainer container 8d15a472e5cb14fcc016fb10f9fee8b744330a4871e11adda65901e0d733487f.
May 13 23:57:33.815969 containerd[1490]: time="2025-05-13T23:57:33.815915729Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-56fc595cc4-g674g,Uid:e099728d-fefb-4d19-a10a-2eecf9c709e2,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d15a472e5cb14fcc016fb10f9fee8b744330a4871e11adda65901e0d733487f\""
May 13 23:57:33.817771 kubelet[2839]: E0513 23:57:33.817001 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:33.818280 containerd[1490]: time="2025-05-13T23:57:33.818254339Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\""
May 13 23:57:33.818545 kubelet[2839]: E0513 23:57:33.818515 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 23:57:33.818545 kubelet[2839]: W0513 23:57:33.818536 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 23:57:33.818545 kubelet[2839]: E0513 23:57:33.818557 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 23:57:33.822896 kubelet[2839]: E0513 23:57:33.822841 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 23:57:33.822969 kubelet[2839]: W0513 23:57:33.822892 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 23:57:33.822969 kubelet[2839]: E0513 23:57:33.822930 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 23:57:33.917276 kubelet[2839]: E0513 23:57:33.917227 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 23:57:33.917276 kubelet[2839]: W0513 23:57:33.917260 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 23:57:33.917276 kubelet[2839]: E0513 23:57:33.917286 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 23:57:34.018936 kubelet[2839]: E0513 23:57:34.018795 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 23:57:34.018936 kubelet[2839]: W0513 23:57:34.018823 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 23:57:34.018936 kubelet[2839]: E0513 23:57:34.018848 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 23:57:34.103507 kubelet[2839]: E0513 23:57:34.103466 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 23:57:34.103507 kubelet[2839]: W0513 23:57:34.103487 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 23:57:34.103507 kubelet[2839]: E0513 23:57:34.103509 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 13 23:57:34.167843 kubelet[2839]: E0513 23:57:34.167805 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:57:34.168258 containerd[1490]: time="2025-05-13T23:57:34.168221518Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xttgm,Uid:992b5cf4-c756-4b7f-975a-3502b4e43490,Namespace:calico-system,Attempt:0,}"
May 13 23:57:34.192703 kubelet[2839]: I0513 23:57:34.192653 2839 topology_manager.go:215] "Topology Admit Handler" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" podNamespace="calico-system" podName="csi-node-driver-fj9x5"
May 13 23:57:34.192946 kubelet[2839]: E0513 23:57:34.192898 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915"
May 13 23:57:34.218763 kubelet[2839]: E0513 23:57:34.218714 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 13 23:57:34.218763 kubelet[2839]: W0513 23:57:34.218736 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 13 23:57:34.218763 kubelet[2839]: E0513 23:57:34.218755 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" May 13 23:57:34.218978 kubelet[2839]: E0513 23:57:34.218967 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.219006 kubelet[2839]: W0513 23:57:34.218987 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.219006 kubelet[2839]: E0513 23:57:34.218997 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.219222 kubelet[2839]: E0513 23:57:34.219184 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.219222 kubelet[2839]: W0513 23:57:34.219216 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.219222 kubelet[2839]: E0513 23:57:34.219225 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.219473 kubelet[2839]: E0513 23:57:34.219440 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.219473 kubelet[2839]: W0513 23:57:34.219453 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.219473 kubelet[2839]: E0513 23:57:34.219460 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.219640 kubelet[2839]: E0513 23:57:34.219617 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.219640 kubelet[2839]: W0513 23:57:34.219636 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.219687 kubelet[2839]: E0513 23:57:34.219643 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.219806 kubelet[2839]: E0513 23:57:34.219792 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.219806 kubelet[2839]: W0513 23:57:34.219802 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.219855 kubelet[2839]: E0513 23:57:34.219810 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.219981 kubelet[2839]: E0513 23:57:34.219964 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.220011 kubelet[2839]: W0513 23:57:34.219985 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.220011 kubelet[2839]: E0513 23:57:34.219995 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.220249 kubelet[2839]: E0513 23:57:34.220229 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.220288 kubelet[2839]: W0513 23:57:34.220252 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.220288 kubelet[2839]: E0513 23:57:34.220279 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.220557 kubelet[2839]: E0513 23:57:34.220543 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.220557 kubelet[2839]: W0513 23:57:34.220556 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.220613 kubelet[2839]: E0513 23:57:34.220567 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.220843 kubelet[2839]: E0513 23:57:34.220828 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.220843 kubelet[2839]: W0513 23:57:34.220841 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.220843 kubelet[2839]: E0513 23:57:34.220852 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.221114 kubelet[2839]: E0513 23:57:34.221086 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.221114 kubelet[2839]: W0513 23:57:34.221098 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.221114 kubelet[2839]: E0513 23:57:34.221108 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.221361 kubelet[2839]: E0513 23:57:34.221347 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.221390 kubelet[2839]: W0513 23:57:34.221360 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.221390 kubelet[2839]: E0513 23:57:34.221372 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.221667 kubelet[2839]: E0513 23:57:34.221641 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.221667 kubelet[2839]: W0513 23:57:34.221651 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.221667 kubelet[2839]: E0513 23:57:34.221659 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.221847 kubelet[2839]: E0513 23:57:34.221831 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.221847 kubelet[2839]: W0513 23:57:34.221843 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.221894 kubelet[2839]: E0513 23:57:34.221851 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.222041 kubelet[2839]: E0513 23:57:34.222024 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.222041 kubelet[2839]: W0513 23:57:34.222036 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.222087 kubelet[2839]: E0513 23:57:34.222045 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.222264 kubelet[2839]: E0513 23:57:34.222253 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.222264 kubelet[2839]: W0513 23:57:34.222262 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.222325 kubelet[2839]: E0513 23:57:34.222270 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.222491 kubelet[2839]: E0513 23:57:34.222475 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.222491 kubelet[2839]: W0513 23:57:34.222485 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.222538 kubelet[2839]: E0513 23:57:34.222493 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.222714 kubelet[2839]: E0513 23:57:34.222695 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.222714 kubelet[2839]: W0513 23:57:34.222708 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.222764 kubelet[2839]: E0513 23:57:34.222717 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.222924 kubelet[2839]: E0513 23:57:34.222913 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.222924 kubelet[2839]: W0513 23:57:34.222922 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.222969 kubelet[2839]: E0513 23:57:34.222929 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.223112 kubelet[2839]: E0513 23:57:34.223102 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.223137 kubelet[2839]: W0513 23:57:34.223111 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.223137 kubelet[2839]: E0513 23:57:34.223118 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.223402 kubelet[2839]: E0513 23:57:34.223392 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.223402 kubelet[2839]: W0513 23:57:34.223401 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.223511 kubelet[2839]: E0513 23:57:34.223408 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.223511 kubelet[2839]: I0513 23:57:34.223435 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/20d8fe80-dcc7-435f-a033-cb9b5eaee915-socket-dir\") pod \"csi-node-driver-fj9x5\" (UID: \"20d8fe80-dcc7-435f-a033-cb9b5eaee915\") " pod="calico-system/csi-node-driver-fj9x5" May 13 23:57:34.223652 kubelet[2839]: E0513 23:57:34.223639 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.223652 kubelet[2839]: W0513 23:57:34.223650 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.223708 kubelet[2839]: E0513 23:57:34.223665 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.223708 kubelet[2839]: I0513 23:57:34.223680 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/20d8fe80-dcc7-435f-a033-cb9b5eaee915-registration-dir\") pod \"csi-node-driver-fj9x5\" (UID: \"20d8fe80-dcc7-435f-a033-cb9b5eaee915\") " pod="calico-system/csi-node-driver-fj9x5" May 13 23:57:34.223887 kubelet[2839]: E0513 23:57:34.223868 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.223887 kubelet[2839]: W0513 23:57:34.223884 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.223935 kubelet[2839]: E0513 23:57:34.223898 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.224105 kubelet[2839]: E0513 23:57:34.224090 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.224105 kubelet[2839]: W0513 23:57:34.224101 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.224160 kubelet[2839]: E0513 23:57:34.224115 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.224366 kubelet[2839]: E0513 23:57:34.224354 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.224403 kubelet[2839]: W0513 23:57:34.224366 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.224403 kubelet[2839]: E0513 23:57:34.224380 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.224403 kubelet[2839]: I0513 23:57:34.224395 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mznb4\" (UniqueName: \"kubernetes.io/projected/20d8fe80-dcc7-435f-a033-cb9b5eaee915-kube-api-access-mznb4\") pod \"csi-node-driver-fj9x5\" (UID: \"20d8fe80-dcc7-435f-a033-cb9b5eaee915\") " pod="calico-system/csi-node-driver-fj9x5" May 13 23:57:34.224614 kubelet[2839]: E0513 23:57:34.224596 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.224614 kubelet[2839]: W0513 23:57:34.224610 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.224676 kubelet[2839]: E0513 23:57:34.224635 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.224825 kubelet[2839]: E0513 23:57:34.224810 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.224825 kubelet[2839]: W0513 23:57:34.224823 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.224867 kubelet[2839]: E0513 23:57:34.224838 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.225063 kubelet[2839]: E0513 23:57:34.225051 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.225097 kubelet[2839]: W0513 23:57:34.225062 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.225097 kubelet[2839]: E0513 23:57:34.225078 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.225097 kubelet[2839]: I0513 23:57:34.225093 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/20d8fe80-dcc7-435f-a033-cb9b5eaee915-varrun\") pod \"csi-node-driver-fj9x5\" (UID: \"20d8fe80-dcc7-435f-a033-cb9b5eaee915\") " pod="calico-system/csi-node-driver-fj9x5" May 13 23:57:34.225305 kubelet[2839]: E0513 23:57:34.225294 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.225305 kubelet[2839]: W0513 23:57:34.225304 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.225364 kubelet[2839]: E0513 23:57:34.225316 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.225364 kubelet[2839]: I0513 23:57:34.225330 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20d8fe80-dcc7-435f-a033-cb9b5eaee915-kubelet-dir\") pod \"csi-node-driver-fj9x5\" (UID: \"20d8fe80-dcc7-435f-a033-cb9b5eaee915\") " pod="calico-system/csi-node-driver-fj9x5" May 13 23:57:34.225558 kubelet[2839]: E0513 23:57:34.225547 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.225586 kubelet[2839]: W0513 23:57:34.225557 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.225586 kubelet[2839]: E0513 23:57:34.225570 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.225812 kubelet[2839]: E0513 23:57:34.225793 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.225812 kubelet[2839]: W0513 23:57:34.225809 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.225861 kubelet[2839]: E0513 23:57:34.225839 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.226039 kubelet[2839]: E0513 23:57:34.226025 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.226039 kubelet[2839]: W0513 23:57:34.226036 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.226090 kubelet[2839]: E0513 23:57:34.226056 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.226283 kubelet[2839]: E0513 23:57:34.226269 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.226283 kubelet[2839]: W0513 23:57:34.226280 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.226327 kubelet[2839]: E0513 23:57:34.226293 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.226485 kubelet[2839]: E0513 23:57:34.226470 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.226485 kubelet[2839]: W0513 23:57:34.226480 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.226551 kubelet[2839]: E0513 23:57:34.226489 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.226692 kubelet[2839]: E0513 23:57:34.226674 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.226692 kubelet[2839]: W0513 23:57:34.226685 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.226692 kubelet[2839]: E0513 23:57:34.226692 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.326229 kubelet[2839]: E0513 23:57:34.326164 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.326229 kubelet[2839]: W0513 23:57:34.326212 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.326229 kubelet[2839]: E0513 23:57:34.326238 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.326549 kubelet[2839]: E0513 23:57:34.326528 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.326549 kubelet[2839]: W0513 23:57:34.326543 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.326659 kubelet[2839]: E0513 23:57:34.326562 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.326945 kubelet[2839]: E0513 23:57:34.326913 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.326945 kubelet[2839]: W0513 23:57:34.326944 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.327043 kubelet[2839]: E0513 23:57:34.326975 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.327287 kubelet[2839]: E0513 23:57:34.327270 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.327287 kubelet[2839]: W0513 23:57:34.327284 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.327372 kubelet[2839]: E0513 23:57:34.327303 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.327610 kubelet[2839]: E0513 23:57:34.327595 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.327610 kubelet[2839]: W0513 23:57:34.327607 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.327727 kubelet[2839]: E0513 23:57:34.327661 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.327880 kubelet[2839]: E0513 23:57:34.327860 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.327880 kubelet[2839]: W0513 23:57:34.327874 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.327965 kubelet[2839]: E0513 23:57:34.327924 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.328130 kubelet[2839]: E0513 23:57:34.328114 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.328130 kubelet[2839]: W0513 23:57:34.328126 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.328224 kubelet[2839]: E0513 23:57:34.328153 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.328379 kubelet[2839]: E0513 23:57:34.328364 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.328379 kubelet[2839]: W0513 23:57:34.328375 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.328456 kubelet[2839]: E0513 23:57:34.328392 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.328634 kubelet[2839]: E0513 23:57:34.328609 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.328634 kubelet[2839]: W0513 23:57:34.328620 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.328732 kubelet[2839]: E0513 23:57:34.328661 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.328884 kubelet[2839]: E0513 23:57:34.328868 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.328884 kubelet[2839]: W0513 23:57:34.328881 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.328956 kubelet[2839]: E0513 23:57:34.328910 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.329133 kubelet[2839]: E0513 23:57:34.329118 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.329133 kubelet[2839]: W0513 23:57:34.329129 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.329235 kubelet[2839]: E0513 23:57:34.329164 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.329389 kubelet[2839]: E0513 23:57:34.329371 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.329389 kubelet[2839]: W0513 23:57:34.329384 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.329457 kubelet[2839]: E0513 23:57:34.329415 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.329607 kubelet[2839]: E0513 23:57:34.329591 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.329607 kubelet[2839]: W0513 23:57:34.329601 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.329694 kubelet[2839]: E0513 23:57:34.329666 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.329813 kubelet[2839]: E0513 23:57:34.329797 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.329813 kubelet[2839]: W0513 23:57:34.329808 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.329890 kubelet[2839]: E0513 23:57:34.329824 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.330156 kubelet[2839]: E0513 23:57:34.330132 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.330156 kubelet[2839]: W0513 23:57:34.330144 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.330257 kubelet[2839]: E0513 23:57:34.330159 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.330432 kubelet[2839]: E0513 23:57:34.330409 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.330432 kubelet[2839]: W0513 23:57:34.330421 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.330507 kubelet[2839]: E0513 23:57:34.330436 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.330671 kubelet[2839]: E0513 23:57:34.330646 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.330671 kubelet[2839]: W0513 23:57:34.330658 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.330754 kubelet[2839]: E0513 23:57:34.330689 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.330884 kubelet[2839]: E0513 23:57:34.330869 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.330884 kubelet[2839]: W0513 23:57:34.330881 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.330959 kubelet[2839]: E0513 23:57:34.330928 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.331158 kubelet[2839]: E0513 23:57:34.331127 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.331158 kubelet[2839]: W0513 23:57:34.331142 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.331276 kubelet[2839]: E0513 23:57:34.331182 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.331428 kubelet[2839]: E0513 23:57:34.331398 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.331428 kubelet[2839]: W0513 23:57:34.331410 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.331495 kubelet[2839]: E0513 23:57:34.331456 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.331612 kubelet[2839]: E0513 23:57:34.331598 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.331612 kubelet[2839]: W0513 23:57:34.331608 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.331693 kubelet[2839]: E0513 23:57:34.331633 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.331952 kubelet[2839]: E0513 23:57:34.331930 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.331952 kubelet[2839]: W0513 23:57:34.331948 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.332023 kubelet[2839]: E0513 23:57:34.331968 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.332316 kubelet[2839]: E0513 23:57:34.332295 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.332316 kubelet[2839]: W0513 23:57:34.332310 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.332408 kubelet[2839]: E0513 23:57:34.332328 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.332581 kubelet[2839]: E0513 23:57:34.332561 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.332581 kubelet[2839]: W0513 23:57:34.332575 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.332676 kubelet[2839]: E0513 23:57:34.332593 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:34.332904 kubelet[2839]: E0513 23:57:34.332886 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.332904 kubelet[2839]: W0513 23:57:34.332900 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.332988 kubelet[2839]: E0513 23:57:34.332911 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.346025 kubelet[2839]: E0513 23:57:34.345988 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:34.346025 kubelet[2839]: W0513 23:57:34.346013 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:34.346184 kubelet[2839]: E0513 23:57:34.346039 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:34.504031 containerd[1490]: time="2025-05-13T23:57:34.503892277Z" level=info msg="connecting to shim ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a" address="unix:///run/containerd/s/f1ec8a2162cacb6ba07d4ea966928e698976d60ca50403f18b25e1156eac6a59" namespace=k8s.io protocol=ttrpc version=3 May 13 23:57:34.538414 systemd[1]: Started cri-containerd-ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a.scope - libcontainer container ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a. 
May 13 23:57:34.578348 containerd[1490]: time="2025-05-13T23:57:34.578067597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-xttgm,Uid:992b5cf4-c756-4b7f-975a-3502b4e43490,Namespace:calico-system,Attempt:0,} returns sandbox id \"ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a\"" May 13 23:57:34.579289 kubelet[2839]: E0513 23:57:34.579262 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:36.115347 kubelet[2839]: E0513 23:57:36.115273 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:36.943715 containerd[1490]: time="2025-05-13T23:57:36.943665253Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:36.970525 containerd[1490]: time="2025-05-13T23:57:36.970392749Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 13 23:57:37.020683 containerd[1490]: time="2025-05-13T23:57:37.020605000Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:37.096777 containerd[1490]: time="2025-05-13T23:57:37.096705681Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:37.097285 containerd[1490]: time="2025-05-13T23:57:37.097250018Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.278942757s" May 13 23:57:37.097351 containerd[1490]: time="2025-05-13T23:57:37.097292719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 13 23:57:37.098448 containerd[1490]: time="2025-05-13T23:57:37.098412381Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 13 23:57:37.110055 containerd[1490]: time="2025-05-13T23:57:37.109432667Z" level=info msg="CreateContainer within sandbox \"8d15a472e5cb14fcc016fb10f9fee8b744330a4871e11adda65901e0d733487f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 23:57:37.346063 containerd[1490]: time="2025-05-13T23:57:37.346006661Z" level=info msg="Container 51f5b9e01d51b5e4c40647257182f1f5042f2c8bbfa285e92c2a4c3380ef7b14: CDI devices from CRI Config.CDIDevices: []" May 13 23:57:38.232421 kubelet[2839]: E0513 23:57:38.232334 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:38.372287 containerd[1490]: time="2025-05-13T23:57:38.372237599Z" level=info msg="CreateContainer within sandbox \"8d15a472e5cb14fcc016fb10f9fee8b744330a4871e11adda65901e0d733487f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"51f5b9e01d51b5e4c40647257182f1f5042f2c8bbfa285e92c2a4c3380ef7b14\"" May 13 23:57:38.372862 containerd[1490]: 
time="2025-05-13T23:57:38.372839405Z" level=info msg="StartContainer for \"51f5b9e01d51b5e4c40647257182f1f5042f2c8bbfa285e92c2a4c3380ef7b14\"" May 13 23:57:38.373929 containerd[1490]: time="2025-05-13T23:57:38.373904553Z" level=info msg="connecting to shim 51f5b9e01d51b5e4c40647257182f1f5042f2c8bbfa285e92c2a4c3380ef7b14" address="unix:///run/containerd/s/7b3c49829b4e8e7507a2c3a5cb24608711ea665bbdf7bb8af33d6acc493b0275" protocol=ttrpc version=3 May 13 23:57:38.398327 systemd[1]: Started cri-containerd-51f5b9e01d51b5e4c40647257182f1f5042f2c8bbfa285e92c2a4c3380ef7b14.scope - libcontainer container 51f5b9e01d51b5e4c40647257182f1f5042f2c8bbfa285e92c2a4c3380ef7b14. May 13 23:57:39.868457 containerd[1490]: time="2025-05-13T23:57:39.868415537Z" level=info msg="StartContainer for \"51f5b9e01d51b5e4c40647257182f1f5042f2c8bbfa285e92c2a4c3380ef7b14\" returns successfully" May 13 23:57:40.115560 kubelet[2839]: E0513 23:57:40.115441 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:40.871218 kubelet[2839]: E0513 23:57:40.871151 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:40.969485 kubelet[2839]: E0513 23:57:40.969435 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.969485 kubelet[2839]: W0513 23:57:40.969464 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.969485 kubelet[2839]: E0513 23:57:40.969487 
2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.969782 kubelet[2839]: E0513 23:57:40.969754 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.969782 kubelet[2839]: W0513 23:57:40.969766 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.969782 kubelet[2839]: E0513 23:57:40.969776 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.970029 kubelet[2839]: E0513 23:57:40.970008 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.970029 kubelet[2839]: W0513 23:57:40.970019 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.970029 kubelet[2839]: E0513 23:57:40.970027 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:40.970265 kubelet[2839]: E0513 23:57:40.970251 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.970265 kubelet[2839]: W0513 23:57:40.970262 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.970329 kubelet[2839]: E0513 23:57:40.970271 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.970505 kubelet[2839]: E0513 23:57:40.970478 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.970505 kubelet[2839]: W0513 23:57:40.970488 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.970505 kubelet[2839]: E0513 23:57:40.970495 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:40.970719 kubelet[2839]: E0513 23:57:40.970702 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.970719 kubelet[2839]: W0513 23:57:40.970710 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.970719 kubelet[2839]: E0513 23:57:40.970718 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.970936 kubelet[2839]: E0513 23:57:40.970920 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.970936 kubelet[2839]: W0513 23:57:40.970929 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.970936 kubelet[2839]: E0513 23:57:40.970937 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:40.971137 kubelet[2839]: E0513 23:57:40.971120 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.971137 kubelet[2839]: W0513 23:57:40.971131 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.971137 kubelet[2839]: E0513 23:57:40.971139 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.971365 kubelet[2839]: E0513 23:57:40.971351 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.971365 kubelet[2839]: W0513 23:57:40.971361 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.971415 kubelet[2839]: E0513 23:57:40.971369 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:40.971565 kubelet[2839]: E0513 23:57:40.971552 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.971565 kubelet[2839]: W0513 23:57:40.971562 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.971628 kubelet[2839]: E0513 23:57:40.971570 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.971774 kubelet[2839]: E0513 23:57:40.971760 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.971774 kubelet[2839]: W0513 23:57:40.971770 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.971821 kubelet[2839]: E0513 23:57:40.971778 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:40.971971 kubelet[2839]: E0513 23:57:40.971958 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.971971 kubelet[2839]: W0513 23:57:40.971968 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.972025 kubelet[2839]: E0513 23:57:40.971979 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.972181 kubelet[2839]: E0513 23:57:40.972169 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.972181 kubelet[2839]: W0513 23:57:40.972179 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.972242 kubelet[2839]: E0513 23:57:40.972202 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:40.972633 kubelet[2839]: E0513 23:57:40.972602 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.972633 kubelet[2839]: W0513 23:57:40.972626 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.972633 kubelet[2839]: E0513 23:57:40.972635 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.972856 kubelet[2839]: E0513 23:57:40.972836 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.972856 kubelet[2839]: W0513 23:57:40.972846 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.972856 kubelet[2839]: E0513 23:57:40.972854 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:40.975390 kubelet[2839]: E0513 23:57:40.975348 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.975390 kubelet[2839]: W0513 23:57:40.975377 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.975523 kubelet[2839]: E0513 23:57:40.975405 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:40.975726 kubelet[2839]: E0513 23:57:40.975699 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:40.975726 kubelet[2839]: W0513 23:57:40.975713 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:40.975785 kubelet[2839]: E0513 23:57:40.975729 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:41.543402 kubelet[2839]: I0513 23:57:41.543056 2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-56fc595cc4-g674g" podStartSLOduration=6.262788789 podStartE2EDuration="9.543036653s" podCreationTimestamp="2025-05-13 23:57:32 +0000 UTC" firstStartedPulling="2025-05-13 23:57:33.817988552 +0000 UTC m=+29.805692578" lastFinishedPulling="2025-05-13 23:57:37.098236416 +0000 UTC m=+33.085940442" observedRunningTime="2025-05-13 23:57:41.54269818 +0000 UTC m=+37.530402216" watchObservedRunningTime="2025-05-13 23:57:41.543036653 +0000 UTC m=+37.530740680" May 13 23:57:41.872100 kubelet[2839]: I0513 23:57:41.872062 2839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:57:41.872785 kubelet[2839]: E0513 23:57:41.872763 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:41.878005 kubelet[2839]: E0513 23:57:41.877982 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:41.878005 kubelet[2839]: W0513 23:57:41.877999 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:41.878087 kubelet[2839]: E0513 23:57:41.878017 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:42.115469 kubelet[2839]: E0513 23:57:42.115407 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:42.874391 kubelet[2839]: E0513 23:57:42.874359 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:42.886940 kubelet[2839]: E0513 23:57:42.886888 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:42.886940 kubelet[2839]: W0513 23:57:42.886914 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:42.886940 kubelet[2839]: E0513 23:57:42.886938 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:42.887229 kubelet[2839]: E0513 23:57:42.887205 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:42.887229 kubelet[2839]: W0513 23:57:42.887218 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:42.887229 kubelet[2839]: E0513 23:57:42.887226 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:42.992079 kubelet[2839]: E0513 23:57:42.992057 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:42.992079 kubelet[2839]: W0513 23:57:42.992068 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:42.992079 kubelet[2839]: E0513 23:57:42.992077 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:43.773676 containerd[1490]: time="2025-05-13T23:57:43.773592219Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:43.782399 containerd[1490]: time="2025-05-13T23:57:43.782274923Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 13 23:57:43.794382 containerd[1490]: time="2025-05-13T23:57:43.794283157Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:43.807486 containerd[1490]: time="2025-05-13T23:57:43.807421228Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:43.808185 containerd[1490]: time="2025-05-13T23:57:43.808133222Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 6.709685253s" May 13 23:57:43.808185 containerd[1490]: time="2025-05-13T23:57:43.808173929Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 13 23:57:43.810532 containerd[1490]: time="2025-05-13T23:57:43.810489731Z" level=info msg="CreateContainer within sandbox \"ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 23:57:43.875833 kubelet[2839]: E0513 23:57:43.875769 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:43.895703 kubelet[2839]: E0513 23:57:43.895662 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.895703 kubelet[2839]: W0513 23:57:43.895690 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.895703 kubelet[2839]: E0513 23:57:43.895716 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:43.896047 kubelet[2839]: E0513 23:57:43.896020 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.896047 kubelet[2839]: W0513 23:57:43.896034 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.896047 kubelet[2839]: E0513 23:57:43.896045 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:43.896315 kubelet[2839]: E0513 23:57:43.896300 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.896315 kubelet[2839]: W0513 23:57:43.896311 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.896425 kubelet[2839]: E0513 23:57:43.896322 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:43.896578 kubelet[2839]: E0513 23:57:43.896558 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.896578 kubelet[2839]: W0513 23:57:43.896573 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.896681 kubelet[2839]: E0513 23:57:43.896583 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:43.896884 kubelet[2839]: E0513 23:57:43.896866 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.896884 kubelet[2839]: W0513 23:57:43.896880 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.896972 kubelet[2839]: E0513 23:57:43.896890 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:43.897120 kubelet[2839]: E0513 23:57:43.897100 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.897120 kubelet[2839]: W0513 23:57:43.897115 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.897226 kubelet[2839]: E0513 23:57:43.897127 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:43.897404 kubelet[2839]: E0513 23:57:43.897386 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.897404 kubelet[2839]: W0513 23:57:43.897400 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.897481 kubelet[2839]: E0513 23:57:43.897411 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:43.897652 kubelet[2839]: E0513 23:57:43.897634 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.897652 kubelet[2839]: W0513 23:57:43.897647 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.897720 kubelet[2839]: E0513 23:57:43.897657 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:57:43.897880 kubelet[2839]: E0513 23:57:43.897864 2839 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:57:43.897880 kubelet[2839]: W0513 23:57:43.897878 2839 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:57:43.897936 kubelet[2839]: E0513 23:57:43.897888 2839 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:57:43.953180 containerd[1490]: time="2025-05-13T23:57:43.953118375Z" level=info msg="Container 9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e: CDI devices from CRI Config.CDIDevices: []" May 13 23:57:44.115429 kubelet[2839]: E0513 23:57:44.115377 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:44.124426 containerd[1490]: time="2025-05-13T23:57:44.124362713Z" level=info msg="CreateContainer within sandbox \"ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e\"" May 13 23:57:44.124856 containerd[1490]: time="2025-05-13T23:57:44.124821415Z" level=info msg="StartContainer for \"9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e\"" May 13 23:57:44.126585 containerd[1490]: time="2025-05-13T23:57:44.126558304Z" level=info msg="connecting to shim 9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e" address="unix:///run/containerd/s/f1ec8a2162cacb6ba07d4ea966928e698976d60ca50403f18b25e1156eac6a59" protocol=ttrpc version=3 May 13 23:57:44.148335 systemd[1]: Started cri-containerd-9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e.scope - libcontainer container 9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e. May 13 23:57:44.212850 systemd[1]: cri-containerd-9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e.scope: Deactivated successfully. 
May 13 23:57:44.215391 containerd[1490]: time="2025-05-13T23:57:44.215350063Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e\" id:\"9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e\" pid:3599 exited_at:{seconds:1747180664 nanos:214716159}" May 13 23:57:44.611987 containerd[1490]: time="2025-05-13T23:57:44.611944216Z" level=info msg="received exit event container_id:\"9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e\" id:\"9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e\" pid:3599 exited_at:{seconds:1747180664 nanos:214716159}" May 13 23:57:44.621637 containerd[1490]: time="2025-05-13T23:57:44.621563194Z" level=info msg="StartContainer for \"9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e\" returns successfully" May 13 23:57:44.636230 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9e46296f468d0d5263d0365802df5cbca14c0134780b3b8bd360498fa2aab31e-rootfs.mount: Deactivated successfully. 
May 13 23:57:44.880209 kubelet[2839]: E0513 23:57:44.880059 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:45.456108 kubelet[2839]: E0513 23:57:45.456023 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:45.882883 kubelet[2839]: E0513 23:57:45.882849 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:45.888842 containerd[1490]: time="2025-05-13T23:57:45.888795484Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 13 23:57:46.200553 systemd[1]: Started sshd@7-10.0.0.64:22-10.0.0.1:42926.service - OpenSSH per-connection server daemon (10.0.0.1:42926). May 13 23:57:46.442588 sshd[3639]: Accepted publickey for core from 10.0.0.1 port 42926 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw May 13 23:57:46.445208 sshd-session[3639]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:57:46.541049 systemd-logind[1468]: New session 8 of user core. May 13 23:57:46.550436 systemd[1]: Started session-8.scope - Session 8 of User core. May 13 23:57:46.733948 sshd[3641]: Connection closed by 10.0.0.1 port 42926 May 13 23:57:46.734276 sshd-session[3639]: pam_unix(sshd:session): session closed for user core May 13 23:57:46.738558 systemd[1]: sshd@7-10.0.0.64:22-10.0.0.1:42926.service: Deactivated successfully. May 13 23:57:46.740758 systemd[1]: session-8.scope: Deactivated successfully. May 13 23:57:46.741485 systemd-logind[1468]: Session 8 logged out. 
Waiting for processes to exit. May 13 23:57:46.742446 systemd-logind[1468]: Removed session 8. May 13 23:57:47.203607 kubelet[2839]: E0513 23:57:47.203520 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:49.115066 kubelet[2839]: E0513 23:57:49.114961 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:51.115526 kubelet[2839]: E0513 23:57:51.115467 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:51.748499 systemd[1]: Started sshd@8-10.0.0.64:22-10.0.0.1:46568.service - OpenSSH per-connection server daemon (10.0.0.1:46568). May 13 23:57:52.545324 sshd[3671]: Accepted publickey for core from 10.0.0.1 port 46568 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw May 13 23:57:52.547058 sshd-session[3671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:57:52.551361 systemd-logind[1468]: New session 9 of user core. May 13 23:57:52.564317 systemd[1]: Started session-9.scope - Session 9 of User core. 
May 13 23:57:52.688916 sshd[3673]: Connection closed by 10.0.0.1 port 46568 May 13 23:57:52.689285 sshd-session[3671]: pam_unix(sshd:session): session closed for user core May 13 23:57:52.693096 systemd[1]: sshd@8-10.0.0.64:22-10.0.0.1:46568.service: Deactivated successfully. May 13 23:57:52.695009 systemd[1]: session-9.scope: Deactivated successfully. May 13 23:57:52.695724 systemd-logind[1468]: Session 9 logged out. Waiting for processes to exit. May 13 23:57:52.696654 systemd-logind[1468]: Removed session 9. May 13 23:57:53.031396 containerd[1490]: time="2025-05-13T23:57:53.031336636Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:53.032675 containerd[1490]: time="2025-05-13T23:57:53.032587609Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 13 23:57:53.033727 containerd[1490]: time="2025-05-13T23:57:53.033682394Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:53.036277 containerd[1490]: time="2025-05-13T23:57:53.036246185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:57:53.036903 containerd[1490]: time="2025-05-13T23:57:53.036863426Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 7.148017616s" May 13 23:57:53.036903 containerd[1490]: time="2025-05-13T23:57:53.036898291Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 13 23:57:53.044996 containerd[1490]: time="2025-05-13T23:57:53.044950078Z" level=info msg="CreateContainer within sandbox \"ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 23:57:53.059040 containerd[1490]: time="2025-05-13T23:57:53.058975981Z" level=info msg="Container 4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee: CDI devices from CRI Config.CDIDevices: []" May 13 23:57:53.073204 containerd[1490]: time="2025-05-13T23:57:53.073130627Z" level=info msg="CreateContainer within sandbox \"ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee\"" May 13 23:57:53.073782 containerd[1490]: time="2025-05-13T23:57:53.073737509Z" level=info msg="StartContainer for \"4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee\"" May 13 23:57:53.075631 containerd[1490]: time="2025-05-13T23:57:53.075583889Z" level=info msg="connecting to shim 4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee" address="unix:///run/containerd/s/f1ec8a2162cacb6ba07d4ea966928e698976d60ca50403f18b25e1156eac6a59" protocol=ttrpc version=3 May 13 23:57:53.100352 systemd[1]: Started cri-containerd-4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee.scope - libcontainer container 4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee. 
May 13 23:57:53.115629 kubelet[2839]: E0513 23:57:53.115560 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:53.187953 containerd[1490]: time="2025-05-13T23:57:53.187906322Z" level=info msg="StartContainer for \"4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee\" returns successfully" May 13 23:57:53.921784 kubelet[2839]: E0513 23:57:53.921725 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:54.647432 systemd[1]: cri-containerd-4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee.scope: Deactivated successfully. May 13 23:57:54.648290 systemd[1]: cri-containerd-4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee.scope: Consumed 609ms CPU time, 161M memory peak, 12K read from disk, 154M written to disk. 
May 13 23:57:54.650746 containerd[1490]: time="2025-05-13T23:57:54.650680513Z" level=info msg="received exit event container_id:\"4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee\" id:\"4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee\" pid:3703 exited_at:{seconds:1747180674 nanos:650466268}" May 13 23:57:54.651217 containerd[1490]: time="2025-05-13T23:57:54.650798517Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee\" id:\"4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee\" pid:3703 exited_at:{seconds:1747180674 nanos:650466268}" May 13 23:57:54.661908 kubelet[2839]: I0513 23:57:54.661878 2839 kubelet_node_status.go:497] "Fast updating node status as it just became ready" May 13 23:57:54.676413 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4401f55a09ea3feaebb7b3398be47f6a89afa673f75ad8e9af8b43129a39dcee-rootfs.mount: Deactivated successfully. 
May 13 23:57:54.694403 kubelet[2839]: I0513 23:57:54.693264 2839 topology_manager.go:215] "Topology Admit Handler" podUID="aff21006-cd99-438a-b534-0869f6aa0b49" podNamespace="calico-apiserver" podName="calico-apiserver-55cbbdb78f-kh882" May 13 23:57:54.700132 kubelet[2839]: I0513 23:57:54.697910 2839 topology_manager.go:215] "Topology Admit Handler" podUID="39453a0e-f73f-4297-8910-21729bd594b9" podNamespace="calico-system" podName="calico-kube-controllers-786464f5d8-csw8h" May 13 23:57:54.700132 kubelet[2839]: I0513 23:57:54.698089 2839 topology_manager.go:215] "Topology Admit Handler" podUID="7351e847-522c-49d3-9106-94703455ced8" podNamespace="calico-apiserver" podName="calico-apiserver-55cbbdb78f-59prw" May 13 23:57:54.700132 kubelet[2839]: I0513 23:57:54.699029 2839 topology_manager.go:215] "Topology Admit Handler" podUID="72666019-206a-4043-9545-c5ffb5aec026" podNamespace="kube-system" podName="coredns-7db6d8ff4d-7ms9x" May 13 23:57:54.700132 kubelet[2839]: I0513 23:57:54.699145 2839 topology_manager.go:215] "Topology Admit Handler" podUID="b4033e25-b2d0-4648-a7e0-b9d3051f72b0" podNamespace="kube-system" podName="coredns-7db6d8ff4d-js6jv" May 13 23:57:54.704848 systemd[1]: Created slice kubepods-besteffort-podaff21006_cd99_438a_b534_0869f6aa0b49.slice - libcontainer container kubepods-besteffort-podaff21006_cd99_438a_b534_0869f6aa0b49.slice. May 13 23:57:54.713821 systemd[1]: Created slice kubepods-besteffort-pod7351e847_522c_49d3_9106_94703455ced8.slice - libcontainer container kubepods-besteffort-pod7351e847_522c_49d3_9106_94703455ced8.slice. May 13 23:57:54.721136 systemd[1]: Created slice kubepods-besteffort-pod39453a0e_f73f_4297_8910_21729bd594b9.slice - libcontainer container kubepods-besteffort-pod39453a0e_f73f_4297_8910_21729bd594b9.slice. May 13 23:57:54.725290 systemd[1]: Created slice kubepods-burstable-podb4033e25_b2d0_4648_a7e0_b9d3051f72b0.slice - libcontainer container kubepods-burstable-podb4033e25_b2d0_4648_a7e0_b9d3051f72b0.slice. 
May 13 23:57:54.730596 systemd[1]: Created slice kubepods-burstable-pod72666019_206a_4043_9545_c5ffb5aec026.slice - libcontainer container kubepods-burstable-pod72666019_206a_4043_9545_c5ffb5aec026.slice. May 13 23:57:54.848354 kubelet[2839]: I0513 23:57:54.848290 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7351e847-522c-49d3-9106-94703455ced8-calico-apiserver-certs\") pod \"calico-apiserver-55cbbdb78f-59prw\" (UID: \"7351e847-522c-49d3-9106-94703455ced8\") " pod="calico-apiserver/calico-apiserver-55cbbdb78f-59prw" May 13 23:57:54.848354 kubelet[2839]: I0513 23:57:54.848344 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72666019-206a-4043-9545-c5ffb5aec026-config-volume\") pod \"coredns-7db6d8ff4d-7ms9x\" (UID: \"72666019-206a-4043-9545-c5ffb5aec026\") " pod="kube-system/coredns-7db6d8ff4d-7ms9x" May 13 23:57:54.848559 kubelet[2839]: I0513 23:57:54.848388 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jgkx\" (UniqueName: \"kubernetes.io/projected/7351e847-522c-49d3-9106-94703455ced8-kube-api-access-5jgkx\") pod \"calico-apiserver-55cbbdb78f-59prw\" (UID: \"7351e847-522c-49d3-9106-94703455ced8\") " pod="calico-apiserver/calico-apiserver-55cbbdb78f-59prw" May 13 23:57:54.848559 kubelet[2839]: I0513 23:57:54.848421 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/aff21006-cd99-438a-b534-0869f6aa0b49-calico-apiserver-certs\") pod \"calico-apiserver-55cbbdb78f-kh882\" (UID: \"aff21006-cd99-438a-b534-0869f6aa0b49\") " pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" May 13 23:57:54.848559 kubelet[2839]: I0513 23:57:54.848448 2839 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39453a0e-f73f-4297-8910-21729bd594b9-tigera-ca-bundle\") pod \"calico-kube-controllers-786464f5d8-csw8h\" (UID: \"39453a0e-f73f-4297-8910-21729bd594b9\") " pod="calico-system/calico-kube-controllers-786464f5d8-csw8h" May 13 23:57:54.848559 kubelet[2839]: I0513 23:57:54.848471 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b4033e25-b2d0-4648-a7e0-b9d3051f72b0-config-volume\") pod \"coredns-7db6d8ff4d-js6jv\" (UID: \"b4033e25-b2d0-4648-a7e0-b9d3051f72b0\") " pod="kube-system/coredns-7db6d8ff4d-js6jv" May 13 23:57:54.848559 kubelet[2839]: I0513 23:57:54.848503 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtkmj\" (UniqueName: \"kubernetes.io/projected/39453a0e-f73f-4297-8910-21729bd594b9-kube-api-access-rtkmj\") pod \"calico-kube-controllers-786464f5d8-csw8h\" (UID: \"39453a0e-f73f-4297-8910-21729bd594b9\") " pod="calico-system/calico-kube-controllers-786464f5d8-csw8h" May 13 23:57:54.848699 kubelet[2839]: I0513 23:57:54.848536 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6898l\" (UniqueName: \"kubernetes.io/projected/72666019-206a-4043-9545-c5ffb5aec026-kube-api-access-6898l\") pod \"coredns-7db6d8ff4d-7ms9x\" (UID: \"72666019-206a-4043-9545-c5ffb5aec026\") " pod="kube-system/coredns-7db6d8ff4d-7ms9x" May 13 23:57:54.848699 kubelet[2839]: I0513 23:57:54.848561 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d77j2\" (UniqueName: \"kubernetes.io/projected/b4033e25-b2d0-4648-a7e0-b9d3051f72b0-kube-api-access-d77j2\") pod \"coredns-7db6d8ff4d-js6jv\" (UID: \"b4033e25-b2d0-4648-a7e0-b9d3051f72b0\") " 
pod="kube-system/coredns-7db6d8ff4d-js6jv" May 13 23:57:54.848699 kubelet[2839]: I0513 23:57:54.848584 2839 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmq4t\" (UniqueName: \"kubernetes.io/projected/aff21006-cd99-438a-b534-0869f6aa0b49-kube-api-access-pmq4t\") pod \"calico-apiserver-55cbbdb78f-kh882\" (UID: \"aff21006-cd99-438a-b534-0869f6aa0b49\") " pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" May 13 23:57:54.928864 kubelet[2839]: E0513 23:57:54.928735 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:54.930061 containerd[1490]: time="2025-05-13T23:57:54.929932645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 23:57:55.009791 containerd[1490]: time="2025-05-13T23:57:55.009732972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-kh882,Uid:aff21006-cd99-438a-b534-0869f6aa0b49,Namespace:calico-apiserver,Attempt:0,}" May 13 23:57:55.018840 containerd[1490]: time="2025-05-13T23:57:55.018777943Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-59prw,Uid:7351e847-522c-49d3-9106-94703455ced8,Namespace:calico-apiserver,Attempt:0,}" May 13 23:57:55.025365 containerd[1490]: time="2025-05-13T23:57:55.025269076Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-786464f5d8-csw8h,Uid:39453a0e-f73f-4297-8910-21729bd594b9,Namespace:calico-system,Attempt:0,}" May 13 23:57:55.028714 kubelet[2839]: E0513 23:57:55.028684 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:55.029834 containerd[1490]: time="2025-05-13T23:57:55.029732845Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-js6jv,Uid:b4033e25-b2d0-4648-a7e0-b9d3051f72b0,Namespace:kube-system,Attempt:0,}" May 13 23:57:55.033881 kubelet[2839]: E0513 23:57:55.033844 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:57:55.036031 containerd[1490]: time="2025-05-13T23:57:55.035971599Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7ms9x,Uid:72666019-206a-4043-9545-c5ffb5aec026,Namespace:kube-system,Attempt:0,}" May 13 23:57:55.124446 systemd[1]: Created slice kubepods-besteffort-pod20d8fe80_dcc7_435f_a033_cb9b5eaee915.slice - libcontainer container kubepods-besteffort-pod20d8fe80_dcc7_435f_a033_cb9b5eaee915.slice. May 13 23:57:55.126026 containerd[1490]: time="2025-05-13T23:57:55.125979084Z" level=error msg="Failed to destroy network for sandbox \"d42ed2df87360a7a6fdec57e18969653399fa6e46782dba152dab10e71a3671c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.128251 containerd[1490]: time="2025-05-13T23:57:55.128101506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fj9x5,Uid:20d8fe80-dcc7-435f-a033-cb9b5eaee915,Namespace:calico-system,Attempt:0,}" May 13 23:57:55.130564 containerd[1490]: time="2025-05-13T23:57:55.130500543Z" level=error msg="Failed to destroy network for sandbox \"7543243292d314f10dc77ee01e247286509f62d1a13caf620646ff17ea7a12b6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.131405 containerd[1490]: time="2025-05-13T23:57:55.131354662Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-59prw,Uid:7351e847-522c-49d3-9106-94703455ced8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d42ed2df87360a7a6fdec57e18969653399fa6e46782dba152dab10e71a3671c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.132186 kubelet[2839]: E0513 23:57:55.132023 2839 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d42ed2df87360a7a6fdec57e18969653399fa6e46782dba152dab10e71a3671c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.132186 kubelet[2839]: E0513 23:57:55.132104 2839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d42ed2df87360a7a6fdec57e18969653399fa6e46782dba152dab10e71a3671c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55cbbdb78f-59prw" May 13 23:57:55.132186 kubelet[2839]: E0513 23:57:55.132127 2839 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d42ed2df87360a7a6fdec57e18969653399fa6e46782dba152dab10e71a3671c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55cbbdb78f-59prw" May 13 23:57:55.134149 containerd[1490]: 
time="2025-05-13T23:57:55.133314035Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-786464f5d8-csw8h,Uid:39453a0e-f73f-4297-8910-21729bd594b9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7543243292d314f10dc77ee01e247286509f62d1a13caf620646ff17ea7a12b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.134275 kubelet[2839]: E0513 23:57:55.133529 2839 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7543243292d314f10dc77ee01e247286509f62d1a13caf620646ff17ea7a12b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.134275 kubelet[2839]: E0513 23:57:55.133568 2839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7543243292d314f10dc77ee01e247286509f62d1a13caf620646ff17ea7a12b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-786464f5d8-csw8h" May 13 23:57:55.134275 kubelet[2839]: E0513 23:57:55.133587 2839 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7543243292d314f10dc77ee01e247286509f62d1a13caf620646ff17ea7a12b6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-786464f5d8-csw8h" May 13 23:57:55.134383 kubelet[2839]: E0513 23:57:55.133620 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-786464f5d8-csw8h_calico-system(39453a0e-f73f-4297-8910-21729bd594b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-786464f5d8-csw8h_calico-system(39453a0e-f73f-4297-8910-21729bd594b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7543243292d314f10dc77ee01e247286509f62d1a13caf620646ff17ea7a12b6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-786464f5d8-csw8h" podUID="39453a0e-f73f-4297-8910-21729bd594b9" May 13 23:57:55.134383 kubelet[2839]: E0513 23:57:55.132364 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-55cbbdb78f-59prw_calico-apiserver(7351e847-522c-49d3-9106-94703455ced8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-55cbbdb78f-59prw_calico-apiserver(7351e847-522c-49d3-9106-94703455ced8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d42ed2df87360a7a6fdec57e18969653399fa6e46782dba152dab10e71a3671c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-55cbbdb78f-59prw" podUID="7351e847-522c-49d3-9106-94703455ced8" May 13 23:57:55.144533 containerd[1490]: time="2025-05-13T23:57:55.144455260Z" level=error msg="Failed to destroy network for sandbox \"ecc52764ccce96349f41d3e1f0827bae37ba207adb4d3a0c94536a2623db4588\"" error="plugin type=\"calico\" failed 
(delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.148629 containerd[1490]: time="2025-05-13T23:57:55.148593303Z" level=error msg="Failed to destroy network for sandbox \"4e2bd46a95240d0bc381c3ad4de9babdddd6f04cea0fd905c4757f0035130c32\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.149203 containerd[1490]: time="2025-05-13T23:57:55.149154206Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-kh882,Uid:aff21006-cd99-438a-b534-0869f6aa0b49,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecc52764ccce96349f41d3e1f0827bae37ba207adb4d3a0c94536a2623db4588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.149696 kubelet[2839]: E0513 23:57:55.149548 2839 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecc52764ccce96349f41d3e1f0827bae37ba207adb4d3a0c94536a2623db4588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.149696 kubelet[2839]: E0513 23:57:55.149624 2839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecc52764ccce96349f41d3e1f0827bae37ba207adb4d3a0c94536a2623db4588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" May 13 23:57:55.149696 kubelet[2839]: E0513 23:57:55.149654 2839 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ecc52764ccce96349f41d3e1f0827bae37ba207adb4d3a0c94536a2623db4588\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" May 13 23:57:55.149836 kubelet[2839]: E0513 23:57:55.149711 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-55cbbdb78f-kh882_calico-apiserver(aff21006-cd99-438a-b534-0869f6aa0b49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-55cbbdb78f-kh882_calico-apiserver(aff21006-cd99-438a-b534-0869f6aa0b49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ecc52764ccce96349f41d3e1f0827bae37ba207adb4d3a0c94536a2623db4588\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" podUID="aff21006-cd99-438a-b534-0869f6aa0b49" May 13 23:57:55.153455 containerd[1490]: time="2025-05-13T23:57:55.153397939Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-js6jv,Uid:b4033e25-b2d0-4648-a7e0-b9d3051f72b0,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e2bd46a95240d0bc381c3ad4de9babdddd6f04cea0fd905c4757f0035130c32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 13 23:57:55.154097 kubelet[2839]: E0513 23:57:55.154037 2839 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e2bd46a95240d0bc381c3ad4de9babdddd6f04cea0fd905c4757f0035130c32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.154097 kubelet[2839]: E0513 23:57:55.154093 2839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e2bd46a95240d0bc381c3ad4de9babdddd6f04cea0fd905c4757f0035130c32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-js6jv" May 13 23:57:55.154319 kubelet[2839]: E0513 23:57:55.154115 2839 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4e2bd46a95240d0bc381c3ad4de9babdddd6f04cea0fd905c4757f0035130c32\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-js6jv" May 13 23:57:55.154319 kubelet[2839]: E0513 23:57:55.154164 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-js6jv_kube-system(b4033e25-b2d0-4648-a7e0-b9d3051f72b0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-js6jv_kube-system(b4033e25-b2d0-4648-a7e0-b9d3051f72b0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4e2bd46a95240d0bc381c3ad4de9babdddd6f04cea0fd905c4757f0035130c32\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-js6jv" podUID="b4033e25-b2d0-4648-a7e0-b9d3051f72b0" May 13 23:57:55.157421 containerd[1490]: time="2025-05-13T23:57:55.157366450Z" level=error msg="Failed to destroy network for sandbox \"a64584c83ed1f913c1b6d92102e92740c414ee05e66c3a5928d526daf64c5a3e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.159908 containerd[1490]: time="2025-05-13T23:57:55.159843194Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7ms9x,Uid:72666019-206a-4043-9545-c5ffb5aec026,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a64584c83ed1f913c1b6d92102e92740c414ee05e66c3a5928d526daf64c5a3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.160307 kubelet[2839]: E0513 23:57:55.160268 2839 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a64584c83ed1f913c1b6d92102e92740c414ee05e66c3a5928d526daf64c5a3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.160445 kubelet[2839]: E0513 23:57:55.160405 2839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a64584c83ed1f913c1b6d92102e92740c414ee05e66c3a5928d526daf64c5a3e\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7ms9x" May 13 23:57:55.160488 kubelet[2839]: E0513 23:57:55.160452 2839 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a64584c83ed1f913c1b6d92102e92740c414ee05e66c3a5928d526daf64c5a3e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-7ms9x" May 13 23:57:55.160574 kubelet[2839]: E0513 23:57:55.160520 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-7ms9x_kube-system(72666019-206a-4043-9545-c5ffb5aec026)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-7ms9x_kube-system(72666019-206a-4043-9545-c5ffb5aec026)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a64584c83ed1f913c1b6d92102e92740c414ee05e66c3a5928d526daf64c5a3e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-7ms9x" podUID="72666019-206a-4043-9545-c5ffb5aec026" May 13 23:57:55.200539 containerd[1490]: time="2025-05-13T23:57:55.200340735Z" level=error msg="Failed to destroy network for sandbox \"810c4566e00931ff4df6f1013f336f03c351f54c0639e79247bdc5479545772a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.202666 containerd[1490]: time="2025-05-13T23:57:55.202600568Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-fj9x5,Uid:20d8fe80-dcc7-435f-a033-cb9b5eaee915,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"810c4566e00931ff4df6f1013f336f03c351f54c0639e79247bdc5479545772a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.202979 kubelet[2839]: E0513 23:57:55.202921 2839 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"810c4566e00931ff4df6f1013f336f03c351f54c0639e79247bdc5479545772a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:57:55.203057 kubelet[2839]: E0513 23:57:55.203000 2839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"810c4566e00931ff4df6f1013f336f03c351f54c0639e79247bdc5479545772a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fj9x5" May 13 23:57:55.203057 kubelet[2839]: E0513 23:57:55.203024 2839 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"810c4566e00931ff4df6f1013f336f03c351f54c0639e79247bdc5479545772a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-fj9x5" May 13 23:57:55.203139 kubelet[2839]: E0513 23:57:55.203083 2839 pod_workers.go:1298] "Error syncing 
pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-fj9x5_calico-system(20d8fe80-dcc7-435f-a033-cb9b5eaee915)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-fj9x5_calico-system(20d8fe80-dcc7-435f-a033-cb9b5eaee915)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"810c4566e00931ff4df6f1013f336f03c351f54c0639e79247bdc5479545772a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-fj9x5" podUID="20d8fe80-dcc7-435f-a033-cb9b5eaee915" May 13 23:57:55.676855 systemd[1]: run-netns-cni\x2d03e874ca\x2d6de7\x2d832e\x2dcf50\x2d82e59ee0bf44.mount: Deactivated successfully. May 13 23:57:55.676991 systemd[1]: run-netns-cni\x2d09d37df7\x2da08f\x2d2f29\x2d3765\x2d2d6f9f2a02b2.mount: Deactivated successfully. May 13 23:57:57.705135 systemd[1]: Started sshd@9-10.0.0.64:22-10.0.0.1:46578.service - OpenSSH per-connection server daemon (10.0.0.1:46578). May 13 23:57:57.768561 sshd[3966]: Accepted publickey for core from 10.0.0.1 port 46578 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw May 13 23:57:57.771093 sshd-session[3966]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:57:57.776853 systemd-logind[1468]: New session 10 of user core. May 13 23:57:57.792495 systemd[1]: Started session-10.scope - Session 10 of User core. May 13 23:57:57.912170 sshd[3968]: Connection closed by 10.0.0.1 port 46578 May 13 23:57:57.912550 sshd-session[3966]: pam_unix(sshd:session): session closed for user core May 13 23:57:57.916872 systemd[1]: sshd@9-10.0.0.64:22-10.0.0.1:46578.service: Deactivated successfully. May 13 23:57:57.918944 systemd[1]: session-10.scope: Deactivated successfully. May 13 23:57:57.919762 systemd-logind[1468]: Session 10 logged out. Waiting for processes to exit. 
May 13 23:57:57.920839 systemd-logind[1468]: Removed session 10. May 13 23:58:02.762695 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3984740546.mount: Deactivated successfully. May 13 23:58:02.926586 systemd[1]: Started sshd@10-10.0.0.64:22-10.0.0.1:39698.service - OpenSSH per-connection server daemon (10.0.0.1:39698). May 13 23:58:02.984377 sshd[3987]: Accepted publickey for core from 10.0.0.1 port 39698 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw May 13 23:58:02.986311 sshd-session[3987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:58:02.991237 systemd-logind[1468]: New session 11 of user core. May 13 23:58:03.001391 systemd[1]: Started session-11.scope - Session 11 of User core. May 13 23:58:05.555806 containerd[1490]: time="2025-05-13T23:58:05.555729705Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:05.557994 containerd[1490]: time="2025-05-13T23:58:05.557890714Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 13 23:58:05.559746 containerd[1490]: time="2025-05-13T23:58:05.559445115Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:05.571024 containerd[1490]: time="2025-05-13T23:58:05.570945474Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:05.612345 containerd[1490]: time="2025-05-13T23:58:05.611421915Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag 
\"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 10.681363081s" May 13 23:58:05.612345 containerd[1490]: time="2025-05-13T23:58:05.611484634Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 13 23:58:05.632177 containerd[1490]: time="2025-05-13T23:58:05.632107011Z" level=info msg="CreateContainer within sandbox \"ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 23:58:05.669975 sshd[3989]: Connection closed by 10.0.0.1 port 39698 May 13 23:58:05.670409 sshd-session[3987]: pam_unix(sshd:session): session closed for user core May 13 23:58:05.676503 systemd[1]: sshd@10-10.0.0.64:22-10.0.0.1:39698.service: Deactivated successfully. May 13 23:58:05.681538 systemd[1]: session-11.scope: Deactivated successfully. May 13 23:58:05.683095 systemd-logind[1468]: Session 11 logged out. Waiting for processes to exit. May 13 23:58:05.684517 systemd-logind[1468]: Removed session 11. May 13 23:58:06.116307 containerd[1490]: time="2025-05-13T23:58:06.116253781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-kh882,Uid:aff21006-cd99-438a-b534-0869f6aa0b49,Namespace:calico-apiserver,Attempt:0,}" May 13 23:58:06.444719 containerd[1490]: time="2025-05-13T23:58:06.444575380Z" level=info msg="Container 2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958: CDI devices from CRI Config.CDIDevices: []" May 13 23:58:06.446696 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2394965492.mount: Deactivated successfully. 
May 13 23:58:06.543934 containerd[1490]: time="2025-05-13T23:58:06.543861313Z" level=error msg="Failed to destroy network for sandbox \"6f04e54ce8e0f8f43ce6842a9333ce5d595b61fa22dbbce9e375fb22db03a25c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:58:06.623347 systemd[1]: run-netns-cni\x2d9eb542cf\x2d2f91\x2dd311\x2d10a2\x2d90ecf2797cc9.mount: Deactivated successfully. May 13 23:58:06.733118 containerd[1490]: time="2025-05-13T23:58:06.732862044Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-kh882,Uid:aff21006-cd99-438a-b534-0869f6aa0b49,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f04e54ce8e0f8f43ce6842a9333ce5d595b61fa22dbbce9e375fb22db03a25c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:58:06.733692 kubelet[2839]: E0513 23:58:06.733282 2839 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f04e54ce8e0f8f43ce6842a9333ce5d595b61fa22dbbce9e375fb22db03a25c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:58:06.733692 kubelet[2839]: E0513 23:58:06.733381 2839 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f04e54ce8e0f8f43ce6842a9333ce5d595b61fa22dbbce9e375fb22db03a25c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" May 13 23:58:06.733692 kubelet[2839]: E0513 23:58:06.733410 2839 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f04e54ce8e0f8f43ce6842a9333ce5d595b61fa22dbbce9e375fb22db03a25c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" May 13 23:58:06.734081 kubelet[2839]: E0513 23:58:06.733466 2839 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-55cbbdb78f-kh882_calico-apiserver(aff21006-cd99-438a-b534-0869f6aa0b49)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-55cbbdb78f-kh882_calico-apiserver(aff21006-cd99-438a-b534-0869f6aa0b49)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f04e54ce8e0f8f43ce6842a9333ce5d595b61fa22dbbce9e375fb22db03a25c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" podUID="aff21006-cd99-438a-b534-0869f6aa0b49" May 13 23:58:06.747388 containerd[1490]: time="2025-05-13T23:58:06.747125376Z" level=info msg="CreateContainer within sandbox \"ce2666a9e34bf422972541fdf7e7c21bd4ffb3d9b36f6a56a800322e2fafc43a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958\"" May 13 23:58:06.750341 containerd[1490]: time="2025-05-13T23:58:06.747990503Z" level=info msg="StartContainer for \"2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958\"" May 13 23:58:06.750341 containerd[1490]: 
time="2025-05-13T23:58:06.749764519Z" level=info msg="connecting to shim 2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958" address="unix:///run/containerd/s/f1ec8a2162cacb6ba07d4ea966928e698976d60ca50403f18b25e1156eac6a59" protocol=ttrpc version=3 May 13 23:58:06.786486 systemd[1]: Started cri-containerd-2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958.scope - libcontainer container 2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958. May 13 23:58:06.952040 containerd[1490]: time="2025-05-13T23:58:06.951999165Z" level=info msg="StartContainer for \"2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958\" returns successfully" May 13 23:58:07.007580 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 13 23:58:07.007748 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 13 23:58:07.959941 kubelet[2839]: E0513 23:58:07.959113 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:08.093139 containerd[1490]: time="2025-05-13T23:58:08.093079646Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958\" id:\"31969838da188b7072715193ab015ad9ba31be90b4863321a6bfdd2d5a71031f\" pid:4107 exit_status:1 exited_at:{seconds:1747180688 nanos:92624464}" May 13 23:58:08.116032 kubelet[2839]: E0513 23:58:08.115993 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:08.116427 containerd[1490]: time="2025-05-13T23:58:08.116393417Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-js6jv,Uid:b4033e25-b2d0-4648-a7e0-b9d3051f72b0,Namespace:kube-system,Attempt:0,}" May 13 23:58:08.961523 
kubelet[2839]: E0513 23:58:08.961488 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:09.021119 containerd[1490]: time="2025-05-13T23:58:09.021071786Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958\" id:\"17357ecd6847e1f9fbcd35db466d4d59f70ed19962d3f7d6427b088ca4a44db4\" pid:4143 exit_status:1 exited_at:{seconds:1747180689 nanos:20747403}" May 13 23:58:09.844535 systemd-networkd[1422]: calif24d984d9ba: Link UP May 13 23:58:09.844749 systemd-networkd[1422]: calif24d984d9ba: Gained carrier May 13 23:58:10.116460 kubelet[2839]: E0513 23:58:10.115815 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:10.117483 containerd[1490]: time="2025-05-13T23:58:10.115883632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-786464f5d8-csw8h,Uid:39453a0e-f73f-4297-8910-21729bd594b9,Namespace:calico-system,Attempt:0,}" May 13 23:58:10.117483 containerd[1490]: time="2025-05-13T23:58:10.116013447Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-59prw,Uid:7351e847-522c-49d3-9106-94703455ced8,Namespace:calico-apiserver,Attempt:0,}" May 13 23:58:10.117483 containerd[1490]: time="2025-05-13T23:58:10.116205140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7ms9x,Uid:72666019-206a-4043-9545-c5ffb5aec026,Namespace:kube-system,Attempt:0,}" May 13 23:58:10.204689 kubelet[2839]: I0513 23:58:10.203969 2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-xttgm" podStartSLOduration=6.16976782 podStartE2EDuration="37.203942779s" podCreationTimestamp="2025-05-13 23:57:33 +0000 UTC" 
firstStartedPulling="2025-05-13 23:57:34.580255118 +0000 UTC m=+30.567959144" lastFinishedPulling="2025-05-13 23:58:05.614430077 +0000 UTC m=+61.602134103" observedRunningTime="2025-05-13 23:58:08.752664652 +0000 UTC m=+64.740368668" watchObservedRunningTime="2025-05-13 23:58:10.203942779 +0000 UTC m=+66.191646805" May 13 23:58:10.207087 containerd[1490]: 2025-05-13 23:58:08.444 [INFO][4119] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 13 23:58:10.207087 containerd[1490]: 2025-05-13 23:58:09.111 [INFO][4119] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0 coredns-7db6d8ff4d- kube-system b4033e25-b2d0-4648-a7e0-b9d3051f72b0 854 0 2025-05-13 23:57:17 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-js6jv eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif24d984d9ba [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Namespace="kube-system" Pod="coredns-7db6d8ff4d-js6jv" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--js6jv-" May 13 23:58:10.207087 containerd[1490]: 2025-05-13 23:58:09.111 [INFO][4119] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Namespace="kube-system" Pod="coredns-7db6d8ff4d-js6jv" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" May 13 23:58:10.207087 containerd[1490]: 2025-05-13 23:58:09.529 [INFO][4160] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" 
HandleID="k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Workload="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.539 [INFO][4160] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" HandleID="k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Workload="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039e420), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-js6jv", "timestamp":"2025-05-13 23:58:09.529788152 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.539 [INFO][4160] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.539 [INFO][4160] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.539 [INFO][4160] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.541 [INFO][4160] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" host="localhost" May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.546 [INFO][4160] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.552 [INFO][4160] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.554 [INFO][4160] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.556 [INFO][4160] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:58:10.207391 containerd[1490]: 2025-05-13 23:58:09.556 [INFO][4160] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" host="localhost" May 13 23:58:10.207623 containerd[1490]: 2025-05-13 23:58:09.557 [INFO][4160] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa May 13 23:58:10.207623 containerd[1490]: 2025-05-13 23:58:09.595 [INFO][4160] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" host="localhost" May 13 23:58:10.207623 containerd[1490]: 2025-05-13 23:58:09.827 [INFO][4160] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" host="localhost" May 13 23:58:10.207623 containerd[1490]: 2025-05-13 23:58:09.827 [INFO][4160] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" host="localhost" May 13 23:58:10.207623 containerd[1490]: 2025-05-13 23:58:09.827 [INFO][4160] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:58:10.207623 containerd[1490]: 2025-05-13 23:58:09.827 [INFO][4160] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" HandleID="k8s-pod-network.2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Workload="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" May 13 23:58:10.207752 containerd[1490]: 2025-05-13 23:58:09.830 [INFO][4119] cni-plugin/k8s.go 386: Populated endpoint ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Namespace="kube-system" Pod="coredns-7db6d8ff4d-js6jv" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"b4033e25-b2d0-4648-a7e0-b9d3051f72b0", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-js6jv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif24d984d9ba", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:10.207811 containerd[1490]: 2025-05-13 23:58:09.830 [INFO][4119] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Namespace="kube-system" Pod="coredns-7db6d8ff4d-js6jv" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" May 13 23:58:10.207811 containerd[1490]: 2025-05-13 23:58:09.830 [INFO][4119] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif24d984d9ba ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Namespace="kube-system" Pod="coredns-7db6d8ff4d-js6jv" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" May 13 23:58:10.207811 containerd[1490]: 2025-05-13 23:58:09.844 [INFO][4119] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Namespace="kube-system" Pod="coredns-7db6d8ff4d-js6jv" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" May 13 
23:58:10.207886 containerd[1490]: 2025-05-13 23:58:09.844 [INFO][4119] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Namespace="kube-system" Pod="coredns-7db6d8ff4d-js6jv" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"b4033e25-b2d0-4648-a7e0-b9d3051f72b0", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa", Pod:"coredns-7db6d8ff4d-js6jv", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif24d984d9ba", MAC:"da:25:73:52:7d:1c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:10.207886 containerd[1490]: 2025-05-13 23:58:10.203 [INFO][4119] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" Namespace="kube-system" Pod="coredns-7db6d8ff4d-js6jv" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--js6jv-eth0" May 13 23:58:10.524225 kernel: bpftool[4300]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 13 23:58:10.687894 systemd[1]: Started sshd@11-10.0.0.64:22-10.0.0.1:46230.service - OpenSSH per-connection server daemon (10.0.0.1:46230). May 13 23:58:10.793282 sshd[4322]: Accepted publickey for core from 10.0.0.1 port 46230 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw May 13 23:58:10.795572 sshd-session[4322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:58:10.801319 systemd-logind[1468]: New session 12 of user core. May 13 23:58:10.808474 systemd[1]: Started session-12.scope - Session 12 of User core. May 13 23:58:10.817985 systemd-networkd[1422]: vxlan.calico: Link UP May 13 23:58:10.817993 systemd-networkd[1422]: vxlan.calico: Gained carrier May 13 23:58:11.115812 containerd[1490]: time="2025-05-13T23:58:11.115774337Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fj9x5,Uid:20d8fe80-dcc7-435f-a033-cb9b5eaee915,Namespace:calico-system,Attempt:0,}" May 13 23:58:11.123611 sshd[4356]: Connection closed by 10.0.0.1 port 46230 May 13 23:58:11.123975 sshd-session[4322]: pam_unix(sshd:session): session closed for user core May 13 23:58:11.134640 systemd[1]: sshd@11-10.0.0.64:22-10.0.0.1:46230.service: Deactivated successfully. May 13 23:58:11.137753 systemd[1]: session-12.scope: Deactivated successfully. 
May 13 23:58:11.142244 systemd-logind[1468]: Session 12 logged out. Waiting for processes to exit. May 13 23:58:11.146566 systemd[1]: Started sshd@12-10.0.0.64:22-10.0.0.1:46234.service - OpenSSH per-connection server daemon (10.0.0.1:46234). May 13 23:58:11.149347 systemd-logind[1468]: Removed session 12. May 13 23:58:11.160522 systemd-networkd[1422]: cali4509cf80b06: Link UP May 13 23:58:11.161357 systemd-networkd[1422]: cali4509cf80b06: Gained carrier May 13 23:58:11.199699 sshd[4453]: Accepted publickey for core from 10.0.0.1 port 46234 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw May 13 23:58:11.204867 sshd-session[4453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:58:11.212380 systemd-logind[1468]: New session 13 of user core. May 13 23:58:11.217375 systemd[1]: Started session-13.scope - Session 13 of User core. May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:10.871 [INFO][4326] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0 coredns-7db6d8ff4d- kube-system 72666019-206a-4043-9545-c5ffb5aec026 858 0 2025-05-13 23:57:17 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-7ms9x eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4509cf80b06 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7ms9x" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--7ms9x-" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:10.875 [INFO][4326] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" 
Namespace="kube-system" Pod="coredns-7db6d8ff4d-7ms9x" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:10.927 [INFO][4401] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" HandleID="k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Workload="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:10.938 [INFO][4401] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" HandleID="k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Workload="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b46a0), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-7ms9x", "timestamp":"2025-05-13 23:58:10.927874211 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:10.939 [INFO][4401] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:10.939 [INFO][4401] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:10.939 [INFO][4401] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.001 [INFO][4401] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.005 [INFO][4401] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.009 [INFO][4401] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.011 [INFO][4401] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.013 [INFO][4401] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.013 [INFO][4401] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.019 [INFO][4401] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29 May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.115 [INFO][4401] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.140 [INFO][4401] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 
handle="k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.143 [INFO][4401] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" host="localhost" May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.147 [INFO][4401] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:58:11.315485 containerd[1490]: 2025-05-13 23:58:11.147 [INFO][4401] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" HandleID="k8s-pod-network.a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Workload="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" May 13 23:58:11.316843 containerd[1490]: 2025-05-13 23:58:11.153 [INFO][4326] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7ms9x" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"72666019-206a-4043-9545-c5ffb5aec026", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-7ms9x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4509cf80b06", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:11.316843 containerd[1490]: 2025-05-13 23:58:11.154 [INFO][4326] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7ms9x" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" May 13 23:58:11.316843 containerd[1490]: 2025-05-13 23:58:11.154 [INFO][4326] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4509cf80b06 ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7ms9x" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" May 13 23:58:11.316843 containerd[1490]: 2025-05-13 23:58:11.163 [INFO][4326] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7ms9x" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" May 13 
23:58:11.316843 containerd[1490]: 2025-05-13 23:58:11.163 [INFO][4326] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7ms9x" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"72666019-206a-4043-9545-c5ffb5aec026", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29", Pod:"coredns-7db6d8ff4d-7ms9x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4509cf80b06", MAC:"7a:ee:3b:39:c6:a6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, 
v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:11.316843 containerd[1490]: 2025-05-13 23:58:11.309 [INFO][4326] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" Namespace="kube-system" Pod="coredns-7db6d8ff4d-7ms9x" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--7ms9x-eth0" May 13 23:58:11.384411 systemd-networkd[1422]: cali825a25937b6: Link UP May 13 23:58:11.385367 systemd-networkd[1422]: cali825a25937b6: Gained carrier May 13 23:58:11.408347 systemd-networkd[1422]: calif24d984d9ba: Gained IPv6LL May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:10.924 [INFO][4367] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0 calico-apiserver-55cbbdb78f- calico-apiserver 7351e847-522c-49d3-9106-94703455ced8 859 0 2025-05-13 23:57:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:55cbbdb78f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-55cbbdb78f-59prw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali825a25937b6 [] []}} ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-59prw" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:10.924 [INFO][4367] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Namespace="calico-apiserver" 
Pod="calico-apiserver-55cbbdb78f-59prw" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.066 [INFO][4429] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" HandleID="k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Workload="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.117 [INFO][4429] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" HandleID="k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Workload="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003756c0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-55cbbdb78f-59prw", "timestamp":"2025-05-13 23:58:11.066827842 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.117 [INFO][4429] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.148 [INFO][4429] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.148 [INFO][4429] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.150 [INFO][4429] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.155 [INFO][4429] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.161 [INFO][4429] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.168 [INFO][4429] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.171 [INFO][4429] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.171 [INFO][4429] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.308 [INFO][4429] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867 May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.334 [INFO][4429] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.378 [INFO][4429] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 
handle="k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.378 [INFO][4429] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" host="localhost" May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.378 [INFO][4429] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:58:11.644276 containerd[1490]: 2025-05-13 23:58:11.378 [INFO][4429] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" HandleID="k8s-pod-network.0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Workload="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" May 13 23:58:11.645093 containerd[1490]: 2025-05-13 23:58:11.382 [INFO][4367] cni-plugin/k8s.go 386: Populated endpoint ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-59prw" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0", GenerateName:"calico-apiserver-55cbbdb78f-", Namespace:"calico-apiserver", SelfLink:"", UID:"7351e847-522c-49d3-9106-94703455ced8", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55cbbdb78f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-55cbbdb78f-59prw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali825a25937b6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:11.645093 containerd[1490]: 2025-05-13 23:58:11.382 [INFO][4367] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-59prw" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" May 13 23:58:11.645093 containerd[1490]: 2025-05-13 23:58:11.382 [INFO][4367] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali825a25937b6 ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-59prw" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" May 13 23:58:11.645093 containerd[1490]: 2025-05-13 23:58:11.385 [INFO][4367] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-59prw" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" May 13 23:58:11.645093 containerd[1490]: 2025-05-13 23:58:11.385 [INFO][4367] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-59prw" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0", GenerateName:"calico-apiserver-55cbbdb78f-", Namespace:"calico-apiserver", SelfLink:"", UID:"7351e847-522c-49d3-9106-94703455ced8", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55cbbdb78f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867", Pod:"calico-apiserver-55cbbdb78f-59prw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali825a25937b6", MAC:"76:55:d3:e2:96:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:11.645093 containerd[1490]: 2025-05-13 23:58:11.640 [INFO][4367] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" 
Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-59prw" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--59prw-eth0" May 13 23:58:11.709085 sshd[4486]: Connection closed by 10.0.0.1 port 46234 May 13 23:58:11.709650 sshd-session[4453]: pam_unix(sshd:session): session closed for user core May 13 23:58:11.722455 systemd[1]: sshd@12-10.0.0.64:22-10.0.0.1:46234.service: Deactivated successfully. May 13 23:58:11.724857 systemd[1]: session-13.scope: Deactivated successfully. May 13 23:58:11.727789 systemd-logind[1468]: Session 13 logged out. Waiting for processes to exit. May 13 23:58:11.730585 systemd[1]: Started sshd@13-10.0.0.64:22-10.0.0.1:46246.service - OpenSSH per-connection server daemon (10.0.0.1:46246). May 13 23:58:11.732556 systemd-logind[1468]: Removed session 13. May 13 23:58:11.783061 sshd[4549]: Accepted publickey for core from 10.0.0.1 port 46246 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw May 13 23:58:11.784849 sshd-session[4549]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:58:11.789053 systemd-logind[1468]: New session 14 of user core. May 13 23:58:11.794318 systemd[1]: Started session-14.scope - Session 14 of User core. May 13 23:58:11.953462 systemd-networkd[1422]: cali2991946752e: Link UP May 13 23:58:11.954324 sshd[4552]: Connection closed by 10.0.0.1 port 46246 May 13 23:58:11.953751 systemd-networkd[1422]: cali2991946752e: Gained carrier May 13 23:58:11.954533 sshd-session[4549]: pam_unix(sshd:session): session closed for user core May 13 23:58:11.960746 systemd[1]: sshd@13-10.0.0.64:22-10.0.0.1:46246.service: Deactivated successfully. May 13 23:58:11.963361 systemd[1]: session-14.scope: Deactivated successfully. May 13 23:58:11.965853 systemd-logind[1468]: Session 14 logged out. Waiting for processes to exit. May 13 23:58:11.967521 systemd-logind[1468]: Removed session 14. 
May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.116 [INFO][4411] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0 calico-kube-controllers-786464f5d8- calico-system 39453a0e-f73f-4297-8910-21729bd594b9 852 0 2025-05-13 23:57:34 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:786464f5d8 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-786464f5d8-csw8h eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali2991946752e [] []}} ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Namespace="calico-system" Pod="calico-kube-controllers-786464f5d8-csw8h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.116 [INFO][4411] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Namespace="calico-system" Pod="calico-kube-controllers-786464f5d8-csw8h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.205 [INFO][4457] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" HandleID="k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Workload="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.314 [INFO][4457] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" HandleID="k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Workload="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039ab20), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-786464f5d8-csw8h", "timestamp":"2025-05-13 23:58:11.205276241 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.314 [INFO][4457] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.379 [INFO][4457] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.379 [INFO][4457] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.382 [INFO][4457] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.388 [INFO][4457] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.639 [INFO][4457] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.643 [INFO][4457] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.646 [INFO][4457] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.646 [INFO][4457] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.649 [INFO][4457] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627 May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.728 [INFO][4457] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.945 [INFO][4457] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 
handle="k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.945 [INFO][4457] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" host="localhost" May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.945 [INFO][4457] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:58:12.313947 containerd[1490]: 2025-05-13 23:58:11.945 [INFO][4457] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" HandleID="k8s-pod-network.8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Workload="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" May 13 23:58:12.314584 containerd[1490]: 2025-05-13 23:58:11.949 [INFO][4411] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Namespace="calico-system" Pod="calico-kube-controllers-786464f5d8-csw8h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0", GenerateName:"calico-kube-controllers-786464f5d8-", Namespace:"calico-system", SelfLink:"", UID:"39453a0e-f73f-4297-8910-21729bd594b9", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"786464f5d8", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-786464f5d8-csw8h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2991946752e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:12.314584 containerd[1490]: 2025-05-13 23:58:11.949 [INFO][4411] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Namespace="calico-system" Pod="calico-kube-controllers-786464f5d8-csw8h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" May 13 23:58:12.314584 containerd[1490]: 2025-05-13 23:58:11.949 [INFO][4411] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2991946752e ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Namespace="calico-system" Pod="calico-kube-controllers-786464f5d8-csw8h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" May 13 23:58:12.314584 containerd[1490]: 2025-05-13 23:58:11.953 [INFO][4411] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Namespace="calico-system" Pod="calico-kube-controllers-786464f5d8-csw8h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" May 13 23:58:12.314584 containerd[1490]: 2025-05-13 23:58:11.955 [INFO][4411] 
cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Namespace="calico-system" Pod="calico-kube-controllers-786464f5d8-csw8h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0", GenerateName:"calico-kube-controllers-786464f5d8-", Namespace:"calico-system", SelfLink:"", UID:"39453a0e-f73f-4297-8910-21729bd594b9", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"786464f5d8", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627", Pod:"calico-kube-controllers-786464f5d8-csw8h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali2991946752e", MAC:"fa:2e:8e:4e:ec:61", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:12.314584 containerd[1490]: 2025-05-13 23:58:12.310 [INFO][4411] cni-plugin/k8s.go 500: Wrote updated 
endpoint to datastore ContainerID="8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" Namespace="calico-system" Pod="calico-kube-controllers-786464f5d8-csw8h" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--786464f5d8--csw8h-eth0" May 13 23:58:12.496923 systemd-networkd[1422]: cali4509cf80b06: Gained IPv6LL May 13 23:58:12.592543 containerd[1490]: time="2025-05-13T23:58:12.591887413Z" level=info msg="connecting to shim 2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa" address="unix:///run/containerd/s/281455a1602d0cb949bd20ee47f6cd68c408d5c78ed073817c205722a0948a6e" namespace=k8s.io protocol=ttrpc version=3 May 13 23:58:12.672419 systemd[1]: Started cri-containerd-2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa.scope - libcontainer container 2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa. May 13 23:58:12.685673 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 23:58:12.738825 containerd[1490]: time="2025-05-13T23:58:12.736664114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-js6jv,Uid:b4033e25-b2d0-4648-a7e0-b9d3051f72b0,Namespace:kube-system,Attempt:0,} returns sandbox id \"2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa\"" May 13 23:58:12.738963 kubelet[2839]: E0513 23:58:12.738864 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:12.744506 containerd[1490]: time="2025-05-13T23:58:12.744458257Z" level=info msg="CreateContainer within sandbox \"2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:58:12.751954 containerd[1490]: time="2025-05-13T23:58:12.751337682Z" level=info msg="connecting to shim 
0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867" address="unix:///run/containerd/s/548df4024d042dbbc199ae44bf2963e6cee4ba439e56eb6ad42de7e4de46cdaa" namespace=k8s.io protocol=ttrpc version=3 May 13 23:58:12.754039 systemd-networkd[1422]: vxlan.calico: Gained IPv6LL May 13 23:58:12.755864 systemd-networkd[1422]: cali5b1c73fb3f6: Link UP May 13 23:58:12.756083 systemd-networkd[1422]: cali5b1c73fb3f6: Gained carrier May 13 23:58:12.776536 containerd[1490]: time="2025-05-13T23:58:12.776479347Z" level=info msg="connecting to shim a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29" address="unix:///run/containerd/s/a2fdde0ececec834f19c607b675c2539749c19daffa3111c58524e866e2f317a" namespace=k8s.io protocol=ttrpc version=3 May 13 23:58:12.786310 containerd[1490]: time="2025-05-13T23:58:12.783094031Z" level=info msg="connecting to shim 8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627" address="unix:///run/containerd/s/fa07834af3b8e62616f56d57f27b8a6816f74324072b7308f6addf936f063fe0" namespace=k8s.io protocol=ttrpc version=3 May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:11.642 [INFO][4513] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--fj9x5-eth0 csi-node-driver- calico-system 20d8fe80-dcc7-435f-a033-cb9b5eaee915 690 0 2025-05-13 23:57:33 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b7b4b9d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-fj9x5 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali5b1c73fb3f6 [] []}} ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Namespace="calico-system" Pod="csi-node-driver-fj9x5" 
WorkloadEndpoint="localhost-k8s-csi--node--driver--fj9x5-" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:11.642 [INFO][4513] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Namespace="calico-system" Pod="csi-node-driver-fj9x5" WorkloadEndpoint="localhost-k8s-csi--node--driver--fj9x5-eth0" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:11.689 [INFO][4538] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" HandleID="k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Workload="localhost-k8s-csi--node--driver--fj9x5-eth0" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:11.730 [INFO][4538] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" HandleID="k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Workload="localhost-k8s-csi--node--driver--fj9x5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000050040), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-fj9x5", "timestamp":"2025-05-13 23:58:11.689279678 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:11.730 [INFO][4538] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:11.945 [INFO][4538] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:11.945 [INFO][4538] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:11.948 [INFO][4538] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.370 [INFO][4538] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.699 [INFO][4538] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.711 [INFO][4538] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.719 [INFO][4538] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.720 [INFO][4538] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.725 [INFO][4538] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.733 [INFO][4538] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.742 [INFO][4538] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.742 [INFO][4538] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" host="localhost" May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.742 [INFO][4538] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:58:12.795441 containerd[1490]: 2025-05-13 23:58:12.742 [INFO][4538] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" HandleID="k8s-pod-network.f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Workload="localhost-k8s-csi--node--driver--fj9x5-eth0" May 13 23:58:12.796092 containerd[1490]: 2025-05-13 23:58:12.748 [INFO][4513] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Namespace="calico-system" Pod="csi-node-driver-fj9x5" WorkloadEndpoint="localhost-k8s-csi--node--driver--fj9x5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fj9x5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"20d8fe80-dcc7-435f-a033-cb9b5eaee915", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-fj9x5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5b1c73fb3f6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:12.796092 containerd[1490]: 2025-05-13 23:58:12.749 [INFO][4513] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Namespace="calico-system" Pod="csi-node-driver-fj9x5" WorkloadEndpoint="localhost-k8s-csi--node--driver--fj9x5-eth0" May 13 23:58:12.796092 containerd[1490]: 2025-05-13 23:58:12.749 [INFO][4513] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5b1c73fb3f6 ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Namespace="calico-system" Pod="csi-node-driver-fj9x5" WorkloadEndpoint="localhost-k8s-csi--node--driver--fj9x5-eth0" May 13 23:58:12.796092 containerd[1490]: 2025-05-13 23:58:12.753 [INFO][4513] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Namespace="calico-system" Pod="csi-node-driver-fj9x5" WorkloadEndpoint="localhost-k8s-csi--node--driver--fj9x5-eth0" May 13 23:58:12.796092 containerd[1490]: 2025-05-13 23:58:12.756 [INFO][4513] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Namespace="calico-system" 
Pod="csi-node-driver-fj9x5" WorkloadEndpoint="localhost-k8s-csi--node--driver--fj9x5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--fj9x5-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"20d8fe80-dcc7-435f-a033-cb9b5eaee915", ResourceVersion:"690", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b", Pod:"csi-node-driver-fj9x5", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali5b1c73fb3f6", MAC:"de:73:7b:c0:4c:47", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:12.796092 containerd[1490]: 2025-05-13 23:58:12.783 [INFO][4513] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" Namespace="calico-system" Pod="csi-node-driver-fj9x5" WorkloadEndpoint="localhost-k8s-csi--node--driver--fj9x5-eth0" May 13 23:58:12.812881 containerd[1490]: 
time="2025-05-13T23:58:12.812833106Z" level=info msg="Container 07d319714b3b9ad4d6cf266e0623d1cf5278f735a4b12417b6bb79c9ac13c527: CDI devices from CRI Config.CDIDevices: []" May 13 23:58:12.817724 systemd-networkd[1422]: cali825a25937b6: Gained IPv6LL May 13 23:58:12.819449 systemd[1]: Started cri-containerd-8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627.scope - libcontainer container 8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627. May 13 23:58:12.836849 systemd[1]: Started cri-containerd-0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867.scope - libcontainer container 0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867. May 13 23:58:12.841801 systemd[1]: Started cri-containerd-a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29.scope - libcontainer container a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29. May 13 23:58:12.856009 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 23:58:12.856023 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 23:58:12.862666 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 23:58:12.889091 containerd[1490]: time="2025-05-13T23:58:12.889039608Z" level=info msg="CreateContainer within sandbox \"2803fd96e2322525b49931181c744ff8ca9a0c4a26ee83910de3e33cf06b2caa\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"07d319714b3b9ad4d6cf266e0623d1cf5278f735a4b12417b6bb79c9ac13c527\"" May 13 23:58:12.894279 containerd[1490]: time="2025-05-13T23:58:12.894232144Z" level=info msg="StartContainer for \"07d319714b3b9ad4d6cf266e0623d1cf5278f735a4b12417b6bb79c9ac13c527\"" May 13 23:58:12.900968 containerd[1490]: time="2025-05-13T23:58:12.900910478Z" level=info msg="connecting to shim 
07d319714b3b9ad4d6cf266e0623d1cf5278f735a4b12417b6bb79c9ac13c527" address="unix:///run/containerd/s/281455a1602d0cb949bd20ee47f6cd68c408d5c78ed073817c205722a0948a6e" protocol=ttrpc version=3 May 13 23:58:12.928552 systemd[1]: Started cri-containerd-07d319714b3b9ad4d6cf266e0623d1cf5278f735a4b12417b6bb79c9ac13c527.scope - libcontainer container 07d319714b3b9ad4d6cf266e0623d1cf5278f735a4b12417b6bb79c9ac13c527. May 13 23:58:12.960013 containerd[1490]: time="2025-05-13T23:58:12.959929401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-59prw,Uid:7351e847-522c-49d3-9106-94703455ced8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867\"" May 13 23:58:12.961593 containerd[1490]: time="2025-05-13T23:58:12.961545687Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 23:58:13.009464 systemd-networkd[1422]: cali2991946752e: Gained IPv6LL May 13 23:58:13.223131 containerd[1490]: time="2025-05-13T23:58:13.222941635Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-7ms9x,Uid:72666019-206a-4043-9545-c5ffb5aec026,Namespace:kube-system,Attempt:0,} returns sandbox id \"a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29\"" May 13 23:58:13.224230 kubelet[2839]: E0513 23:58:13.224083 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:13.226396 containerd[1490]: time="2025-05-13T23:58:13.226276809Z" level=info msg="StartContainer for \"07d319714b3b9ad4d6cf266e0623d1cf5278f735a4b12417b6bb79c9ac13c527\" returns successfully" May 13 23:58:13.228053 containerd[1490]: time="2025-05-13T23:58:13.228016968Z" level=info msg="CreateContainer within sandbox \"a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29\" for container 
&ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:58:13.228594 containerd[1490]: time="2025-05-13T23:58:13.228528955Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-786464f5d8-csw8h,Uid:39453a0e-f73f-4297-8910-21729bd594b9,Namespace:calico-system,Attempt:0,} returns sandbox id \"8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627\"" May 13 23:58:13.671946 containerd[1490]: time="2025-05-13T23:58:13.671796206Z" level=info msg="connecting to shim f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b" address="unix:///run/containerd/s/98e49582a52c96abffea135922a9168d94bd55a83e2066d9ddaa9d133aa99dc4" namespace=k8s.io protocol=ttrpc version=3 May 13 23:58:13.707780 containerd[1490]: time="2025-05-13T23:58:13.707702622Z" level=info msg="Container bebea67a3a6efe263d8617e13b57f4731900d8850dfab97719cbee0120d7f729: CDI devices from CRI Config.CDIDevices: []" May 13 23:58:13.709513 systemd[1]: Started cri-containerd-f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b.scope - libcontainer container f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b. 
May 13 23:58:13.724338 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 13 23:58:13.841355 systemd-networkd[1422]: cali5b1c73fb3f6: Gained IPv6LL May 13 23:58:13.876395 containerd[1490]: time="2025-05-13T23:58:13.876348627Z" level=info msg="CreateContainer within sandbox \"a5d57e06c3c75a512860d40e193bf5e0471f984ca8200315fe661d132d381d29\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bebea67a3a6efe263d8617e13b57f4731900d8850dfab97719cbee0120d7f729\"" May 13 23:58:13.876879 containerd[1490]: time="2025-05-13T23:58:13.876860124Z" level=info msg="StartContainer for \"bebea67a3a6efe263d8617e13b57f4731900d8850dfab97719cbee0120d7f729\"" May 13 23:58:13.877998 containerd[1490]: time="2025-05-13T23:58:13.877719949Z" level=info msg="connecting to shim bebea67a3a6efe263d8617e13b57f4731900d8850dfab97719cbee0120d7f729" address="unix:///run/containerd/s/a2fdde0ececec834f19c607b675c2539749c19daffa3111c58524e866e2f317a" protocol=ttrpc version=3 May 13 23:58:13.887174 containerd[1490]: time="2025-05-13T23:58:13.887066084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-fj9x5,Uid:20d8fe80-dcc7-435f-a033-cb9b5eaee915,Namespace:calico-system,Attempt:0,} returns sandbox id \"f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b\"" May 13 23:58:13.908696 systemd[1]: Started cri-containerd-bebea67a3a6efe263d8617e13b57f4731900d8850dfab97719cbee0120d7f729.scope - libcontainer container bebea67a3a6efe263d8617e13b57f4731900d8850dfab97719cbee0120d7f729. 
May 13 23:58:14.137644 containerd[1490]: time="2025-05-13T23:58:14.137506337Z" level=info msg="StartContainer for \"bebea67a3a6efe263d8617e13b57f4731900d8850dfab97719cbee0120d7f729\" returns successfully" May 13 23:58:14.145948 kubelet[2839]: E0513 23:58:14.145907 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:14.152182 kubelet[2839]: E0513 23:58:14.150362 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:14.668543 kubelet[2839]: I0513 23:58:14.666264 2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-7ms9x" podStartSLOduration=57.666228465 podStartE2EDuration="57.666228465s" podCreationTimestamp="2025-05-13 23:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:58:14.665815464 +0000 UTC m=+70.653519520" watchObservedRunningTime="2025-05-13 23:58:14.666228465 +0000 UTC m=+70.653932501" May 13 23:58:14.835664 kubelet[2839]: I0513 23:58:14.834418 2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-js6jv" podStartSLOduration=57.834384648 podStartE2EDuration="57.834384648s" podCreationTimestamp="2025-05-13 23:57:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:58:14.820827267 +0000 UTC m=+70.808531303" watchObservedRunningTime="2025-05-13 23:58:14.834384648 +0000 UTC m=+70.822088684" May 13 23:58:15.157350 kubelet[2839]: E0513 23:58:15.157185 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied 
nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:15.157930 kubelet[2839]: E0513 23:58:15.157763 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:16.162458 kubelet[2839]: E0513 23:58:16.162239 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:16.175552 kubelet[2839]: E0513 23:58:16.175407 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:16.973842 systemd[1]: Started sshd@14-10.0.0.64:22-10.0.0.1:46252.service - OpenSSH per-connection server daemon (10.0.0.1:46252). May 13 23:58:17.080337 sshd[4914]: Accepted publickey for core from 10.0.0.1 port 46252 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw May 13 23:58:17.081953 sshd-session[4914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:58:17.131372 systemd-logind[1468]: New session 15 of user core. May 13 23:58:17.148265 systemd[1]: Started session-15.scope - Session 15 of User core. May 13 23:58:17.341766 sshd[4916]: Connection closed by 10.0.0.1 port 46252 May 13 23:58:17.343418 sshd-session[4914]: pam_unix(sshd:session): session closed for user core May 13 23:58:17.347443 systemd[1]: sshd@14-10.0.0.64:22-10.0.0.1:46252.service: Deactivated successfully. May 13 23:58:17.348559 systemd-logind[1468]: Session 15 logged out. Waiting for processes to exit. May 13 23:58:17.350904 systemd[1]: session-15.scope: Deactivated successfully. May 13 23:58:17.353563 systemd-logind[1468]: Removed session 15. 
May 13 23:58:17.511887 containerd[1490]: time="2025-05-13T23:58:17.511832986Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:17.516722 containerd[1490]: time="2025-05-13T23:58:17.516674833Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 13 23:58:17.518088 containerd[1490]: time="2025-05-13T23:58:17.518039932Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:17.520447 containerd[1490]: time="2025-05-13T23:58:17.520413777Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:17.520957 containerd[1490]: time="2025-05-13T23:58:17.520919222Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 4.559314393s" May 13 23:58:17.520957 containerd[1490]: time="2025-05-13T23:58:17.520951984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 13 23:58:17.521980 containerd[1490]: time="2025-05-13T23:58:17.521839431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 13 23:58:17.523014 containerd[1490]: time="2025-05-13T23:58:17.522988662Z" level=info msg="CreateContainer within sandbox 
\"0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 23:58:17.545753 containerd[1490]: time="2025-05-13T23:58:17.545708305Z" level=info msg="Container c62c3225f047be6d19c525e640a8d4fe6f297b623f7d2055e72bfa5bf9cdd27a: CDI devices from CRI Config.CDIDevices: []" May 13 23:58:17.555744 containerd[1490]: time="2025-05-13T23:58:17.555706905Z" level=info msg="CreateContainer within sandbox \"0e970f10f875213dbc94d3098bc3c2d0f5735148204b8af5b9510f2495ddf867\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c62c3225f047be6d19c525e640a8d4fe6f297b623f7d2055e72bfa5bf9cdd27a\"" May 13 23:58:17.556172 containerd[1490]: time="2025-05-13T23:58:17.556149822Z" level=info msg="StartContainer for \"c62c3225f047be6d19c525e640a8d4fe6f297b623f7d2055e72bfa5bf9cdd27a\"" May 13 23:58:17.557380 containerd[1490]: time="2025-05-13T23:58:17.557337176Z" level=info msg="connecting to shim c62c3225f047be6d19c525e640a8d4fe6f297b623f7d2055e72bfa5bf9cdd27a" address="unix:///run/containerd/s/548df4024d042dbbc199ae44bf2963e6cee4ba439e56eb6ad42de7e4de46cdaa" protocol=ttrpc version=3 May 13 23:58:17.595492 systemd[1]: Started cri-containerd-c62c3225f047be6d19c525e640a8d4fe6f297b623f7d2055e72bfa5bf9cdd27a.scope - libcontainer container c62c3225f047be6d19c525e640a8d4fe6f297b623f7d2055e72bfa5bf9cdd27a. 
May 13 23:58:18.532858 containerd[1490]: time="2025-05-13T23:58:18.532810255Z" level=info msg="StartContainer for \"c62c3225f047be6d19c525e640a8d4fe6f297b623f7d2055e72bfa5bf9cdd27a\" returns successfully" May 13 23:58:20.115986 kubelet[2839]: E0513 23:58:20.115947 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:20.539174 kubelet[2839]: I0513 23:58:20.539049 2839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:58:21.115493 kubelet[2839]: E0513 23:58:21.115433 2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" May 13 23:58:21.115656 containerd[1490]: time="2025-05-13T23:58:21.115481598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-kh882,Uid:aff21006-cd99-438a-b534-0869f6aa0b49,Namespace:calico-apiserver,Attempt:0,}" May 13 23:58:21.353470 systemd-networkd[1422]: calida5133444d3: Link UP May 13 23:58:21.353702 systemd-networkd[1422]: calida5133444d3: Gained carrier May 13 23:58:21.514850 kubelet[2839]: I0513 23:58:21.513799 2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-55cbbdb78f-59prw" podStartSLOduration=45.953353583 podStartE2EDuration="50.513774638s" podCreationTimestamp="2025-05-13 23:57:31 +0000 UTC" firstStartedPulling="2025-05-13 23:58:12.961263223 +0000 UTC m=+68.948967249" lastFinishedPulling="2025-05-13 23:58:17.521684278 +0000 UTC m=+73.509388304" observedRunningTime="2025-05-13 23:58:19.562516986 +0000 UTC m=+75.550221012" watchObservedRunningTime="2025-05-13 23:58:21.513774638 +0000 UTC m=+77.501478664" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.203 [INFO][4976] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0 calico-apiserver-55cbbdb78f- calico-apiserver aff21006-cd99-438a-b534-0869f6aa0b49 849 0 2025-05-13 23:57:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:55cbbdb78f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-55cbbdb78f-kh882 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calida5133444d3 [] []}} ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-kh882" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.204 [INFO][4976] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-kh882" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.231 [INFO][4992] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" HandleID="k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Workload="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.254 [INFO][4992] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" HandleID="k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Workload="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003089f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-55cbbdb78f-kh882", "timestamp":"2025-05-13 23:58:21.231680349 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.254 [INFO][4992] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.255 [INFO][4992] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.255 [INFO][4992] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.256 [INFO][4992] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.261 [INFO][4992] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.264 [INFO][4992] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.266 [INFO][4992] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.269 [INFO][4992] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.269 [INFO][4992] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 
handle="k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.271 [INFO][4992] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.277 [INFO][4992] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.347 [INFO][4992] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.347 [INFO][4992] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" host="localhost" May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.347 [INFO][4992] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:58:21.519611 containerd[1490]: 2025-05-13 23:58:21.347 [INFO][4992] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" HandleID="k8s-pod-network.541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Workload="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" May 13 23:58:21.520802 containerd[1490]: 2025-05-13 23:58:21.350 [INFO][4976] cni-plugin/k8s.go 386: Populated endpoint ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-kh882" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0", GenerateName:"calico-apiserver-55cbbdb78f-", Namespace:"calico-apiserver", SelfLink:"", UID:"aff21006-cd99-438a-b534-0869f6aa0b49", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55cbbdb78f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-55cbbdb78f-kh882", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calida5133444d3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:21.520802 containerd[1490]: 2025-05-13 23:58:21.350 [INFO][4976] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-kh882" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" May 13 23:58:21.520802 containerd[1490]: 2025-05-13 23:58:21.350 [INFO][4976] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calida5133444d3 ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-kh882" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" May 13 23:58:21.520802 containerd[1490]: 2025-05-13 23:58:21.354 [INFO][4976] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-kh882" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" May 13 23:58:21.520802 containerd[1490]: 2025-05-13 23:58:21.355 [INFO][4976] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-kh882" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0", GenerateName:"calico-apiserver-55cbbdb78f-", Namespace:"calico-apiserver", SelfLink:"", 
UID:"aff21006-cd99-438a-b534-0869f6aa0b49", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 57, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"55cbbdb78f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb", Pod:"calico-apiserver-55cbbdb78f-kh882", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calida5133444d3", MAC:"7a:c0:10:92:13:27", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:58:21.520802 containerd[1490]: 2025-05-13 23:58:21.515 [INFO][4976] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" Namespace="calico-apiserver" Pod="calico-apiserver-55cbbdb78f-kh882" WorkloadEndpoint="localhost-k8s-calico--apiserver--55cbbdb78f--kh882-eth0" May 13 23:58:21.741075 containerd[1490]: time="2025-05-13T23:58:21.741000263Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:21.747346 containerd[1490]: time="2025-05-13T23:58:21.747185404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: 
active requests=0, bytes read=34789138" May 13 23:58:21.754256 containerd[1490]: time="2025-05-13T23:58:21.753599948Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:21.766516 containerd[1490]: time="2025-05-13T23:58:21.766318969Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:58:21.767009 containerd[1490]: time="2025-05-13T23:58:21.766916958Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 4.245047962s" May 13 23:58:21.767009 containerd[1490]: time="2025-05-13T23:58:21.766975569Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 13 23:58:21.769186 containerd[1490]: time="2025-05-13T23:58:21.768720855Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 13 23:58:21.785534 containerd[1490]: time="2025-05-13T23:58:21.785485407Z" level=info msg="CreateContainer within sandbox \"8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 13 23:58:21.795379 containerd[1490]: time="2025-05-13T23:58:21.795319348Z" level=info msg="connecting to shim 541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb" 
address="unix:///run/containerd/s/8e8b69fca7121fb4684d51edf48c3044d6d72dc894a03dc4fe541477a64c50d3" namespace=k8s.io protocol=ttrpc version=3 May 13 23:58:21.801213 containerd[1490]: time="2025-05-13T23:58:21.801155281Z" level=info msg="Container 572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00: CDI devices from CRI Config.CDIDevices: []" May 13 23:58:21.813379 containerd[1490]: time="2025-05-13T23:58:21.813319804Z" level=info msg="CreateContainer within sandbox \"8ed8a1b4f49038853de89d11ca83f7ac42c99a8ce24c1b29e01ee015f7524627\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00\"" May 13 23:58:21.814296 containerd[1490]: time="2025-05-13T23:58:21.814256674Z" level=info msg="StartContainer for \"572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00\"" May 13 23:58:21.815435 containerd[1490]: time="2025-05-13T23:58:21.815389763Z" level=info msg="connecting to shim 572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00" address="unix:///run/containerd/s/fa07834af3b8e62616f56d57f27b8a6816f74324072b7308f6addf936f063fe0" protocol=ttrpc version=3 May 13 23:58:21.832817 systemd[1]: Started cri-containerd-541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb.scope - libcontainer container 541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb. May 13 23:58:21.839211 systemd[1]: Started cri-containerd-572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00.scope - libcontainer container 572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00. 
May 13 23:58:21.854507 systemd-resolved[1339]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 13 23:58:21.891140 containerd[1490]: time="2025-05-13T23:58:21.891037190Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-55cbbdb78f-kh882,Uid:aff21006-cd99-438a-b534-0869f6aa0b49,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb\""
May 13 23:58:21.908605 containerd[1490]: time="2025-05-13T23:58:21.908551687Z" level=info msg="CreateContainer within sandbox \"541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 13 23:58:22.015880 containerd[1490]: time="2025-05-13T23:58:22.015819660Z" level=info msg="StartContainer for \"572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00\" returns successfully"
May 13 23:58:22.023328 containerd[1490]: time="2025-05-13T23:58:22.022865445Z" level=info msg="Container 203822a495794ef122ca294344016c5ee10e0b6955c56589184dfe022a4a6717: CDI devices from CRI Config.CDIDevices: []"
May 13 23:58:22.033125 containerd[1490]: time="2025-05-13T23:58:22.033057933Z" level=info msg="CreateContainer within sandbox \"541011d298cc021c0f196d675049c648e5c0b5549d9779ac73178a7b2d2b20eb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"203822a495794ef122ca294344016c5ee10e0b6955c56589184dfe022a4a6717\""
May 13 23:58:22.033906 containerd[1490]: time="2025-05-13T23:58:22.033824481Z" level=info msg="StartContainer for \"203822a495794ef122ca294344016c5ee10e0b6955c56589184dfe022a4a6717\""
May 13 23:58:22.035842 containerd[1490]: time="2025-05-13T23:58:22.035801735Z" level=info msg="connecting to shim 203822a495794ef122ca294344016c5ee10e0b6955c56589184dfe022a4a6717" address="unix:///run/containerd/s/8e8b69fca7121fb4684d51edf48c3044d6d72dc894a03dc4fe541477a64c50d3" protocol=ttrpc version=3
May 13 23:58:22.066935 systemd[1]: Started cri-containerd-203822a495794ef122ca294344016c5ee10e0b6955c56589184dfe022a4a6717.scope - libcontainer container 203822a495794ef122ca294344016c5ee10e0b6955c56589184dfe022a4a6717.
May 13 23:58:22.123623 containerd[1490]: time="2025-05-13T23:58:22.123565099Z" level=info msg="StartContainer for \"203822a495794ef122ca294344016c5ee10e0b6955c56589184dfe022a4a6717\" returns successfully"
May 13 23:58:22.359455 systemd[1]: Started sshd@15-10.0.0.64:22-10.0.0.1:36616.service - OpenSSH per-connection server daemon (10.0.0.1:36616).
May 13 23:58:22.436841 sshd[5144]: Accepted publickey for core from 10.0.0.1 port 36616 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:22.439432 sshd-session[5144]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:22.445295 systemd-logind[1468]: New session 16 of user core.
May 13 23:58:22.449447 systemd[1]: Started session-16.scope - Session 16 of User core.
May 13 23:58:22.621680 kubelet[2839]: I0513 23:58:22.620234    2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-55cbbdb78f-kh882" podStartSLOduration=51.620173776 podStartE2EDuration="51.620173776s" podCreationTimestamp="2025-05-13 23:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:58:22.573683422 +0000 UTC m=+78.561387458" watchObservedRunningTime="2025-05-13 23:58:22.620173776 +0000 UTC m=+78.607877802"
May 13 23:58:22.623793 kubelet[2839]: I0513 23:58:22.620458    2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-786464f5d8-csw8h" podStartSLOduration=40.081429422 podStartE2EDuration="48.62045119s" podCreationTimestamp="2025-05-13 23:57:34 +0000 UTC" firstStartedPulling="2025-05-13 23:58:13.22959442 +0000 UTC m=+69.217298446" lastFinishedPulling="2025-05-13 23:58:21.768616178 +0000 UTC m=+77.756320214" observedRunningTime="2025-05-13 23:58:22.619769533 +0000 UTC m=+78.607473569" watchObservedRunningTime="2025-05-13 23:58:22.62045119 +0000 UTC m=+78.608155216"
May 13 23:58:22.628771 sshd[5146]: Connection closed by 10.0.0.1 port 36616
May 13 23:58:22.632007 sshd-session[5144]: pam_unix(sshd:session): session closed for user core
May 13 23:58:22.637851 systemd[1]: sshd@15-10.0.0.64:22-10.0.0.1:36616.service: Deactivated successfully.
May 13 23:58:22.641320 systemd[1]: session-16.scope: Deactivated successfully.
May 13 23:58:22.642043 containerd[1490]: time="2025-05-13T23:58:22.641998590Z" level=info msg="TaskExit event in podsandbox handler container_id:\"572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00\" id:\"2f222473a5ba4724aaef40113e295627c6d22947fe60ac1bf79eb68853fd6528\" pid:5168 exited_at:{seconds:1747180702 nanos:641598004}"
May 13 23:58:22.642602 systemd-logind[1468]: Session 16 logged out. Waiting for processes to exit.
May 13 23:58:22.643951 systemd-logind[1468]: Removed session 16.
May 13 23:58:22.865410 systemd-networkd[1422]: calida5133444d3: Gained IPv6LL
May 13 23:58:25.067434 containerd[1490]: time="2025-05-13T23:58:25.067393746Z" level=info msg="TaskExit event in podsandbox handler container_id:\"572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00\" id:\"11ed7119fa3d3a2c997f3f5aae9b8eb164d1010f917da963496971f7ff66beba\" pid:5200 exited_at:{seconds:1747180705 nanos:67103618}"
May 13 23:58:27.640968 systemd[1]: Started sshd@16-10.0.0.64:22-10.0.0.1:36618.service - OpenSSH per-connection server daemon (10.0.0.1:36618).
May 13 23:58:27.692953 sshd[5213]: Accepted publickey for core from 10.0.0.1 port 36618 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:27.694773 sshd-session[5213]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:27.699258 systemd-logind[1468]: New session 17 of user core.
May 13 23:58:27.713358 systemd[1]: Started session-17.scope - Session 17 of User core.
May 13 23:58:28.117603 containerd[1490]: time="2025-05-13T23:58:28.117292272Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:58:28.193694 containerd[1490]: time="2025-05-13T23:58:28.193605438Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898"
May 13 23:58:28.397384 containerd[1490]: time="2025-05-13T23:58:28.397186437Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:58:28.506956 containerd[1490]: time="2025-05-13T23:58:28.506899437Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:58:28.507568 containerd[1490]: time="2025-05-13T23:58:28.507541068Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 6.738790135s"
May 13 23:58:28.507631 containerd[1490]: time="2025-05-13T23:58:28.507571225Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\""
May 13 23:58:28.509532 containerd[1490]: time="2025-05-13T23:58:28.509496368Z" level=info msg="CreateContainer within sandbox \"f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
May 13 23:58:28.646227 sshd[5215]: Connection closed by 10.0.0.1 port 36618
May 13 23:58:28.646568 sshd-session[5213]: pam_unix(sshd:session): session closed for user core
May 13 23:58:28.654048 systemd[1]: sshd@16-10.0.0.64:22-10.0.0.1:36618.service: Deactivated successfully.
May 13 23:58:28.656407 systemd[1]: session-17.scope: Deactivated successfully.
May 13 23:58:28.657410 systemd-logind[1468]: Session 17 logged out. Waiting for processes to exit.
May 13 23:58:28.658464 systemd-logind[1468]: Removed session 17.
May 13 23:58:28.762914 containerd[1490]: time="2025-05-13T23:58:28.762855718Z" level=info msg="Container bcdfac9d1cb0e15f8dccc9e5b133b8f06983f87bc1dfec3716ed1359ed2b8aaa: CDI devices from CRI Config.CDIDevices: []"
May 13 23:58:28.948989 containerd[1490]: time="2025-05-13T23:58:28.948640688Z" level=info msg="CreateContainer within sandbox \"f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"bcdfac9d1cb0e15f8dccc9e5b133b8f06983f87bc1dfec3716ed1359ed2b8aaa\""
May 13 23:58:28.949493 containerd[1490]: time="2025-05-13T23:58:28.949444264Z" level=info msg="StartContainer for \"bcdfac9d1cb0e15f8dccc9e5b133b8f06983f87bc1dfec3716ed1359ed2b8aaa\""
May 13 23:58:28.951067 containerd[1490]: time="2025-05-13T23:58:28.951031640Z" level=info msg="connecting to shim bcdfac9d1cb0e15f8dccc9e5b133b8f06983f87bc1dfec3716ed1359ed2b8aaa" address="unix:///run/containerd/s/98e49582a52c96abffea135922a9168d94bd55a83e2066d9ddaa9d133aa99dc4" protocol=ttrpc version=3
May 13 23:58:28.976407 systemd[1]: Started cri-containerd-bcdfac9d1cb0e15f8dccc9e5b133b8f06983f87bc1dfec3716ed1359ed2b8aaa.scope - libcontainer container bcdfac9d1cb0e15f8dccc9e5b133b8f06983f87bc1dfec3716ed1359ed2b8aaa.
May 13 23:58:29.645992 containerd[1490]: time="2025-05-13T23:58:29.645945323Z" level=info msg="StartContainer for \"bcdfac9d1cb0e15f8dccc9e5b133b8f06983f87bc1dfec3716ed1359ed2b8aaa\" returns successfully"
May 13 23:58:29.646884 containerd[1490]: time="2025-05-13T23:58:29.646862144Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\""
May 13 23:58:33.661467 systemd[1]: Started sshd@17-10.0.0.64:22-10.0.0.1:34674.service - OpenSSH per-connection server daemon (10.0.0.1:34674).
May 13 23:58:33.712957 containerd[1490]: time="2025-05-13T23:58:33.712889633Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:58:33.715528 containerd[1490]: time="2025-05-13T23:58:33.713765346Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773"
May 13 23:58:33.716543 containerd[1490]: time="2025-05-13T23:58:33.716459439Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:58:33.719149 containerd[1490]: time="2025-05-13T23:58:33.719069684Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:58:33.719888 containerd[1490]: time="2025-05-13T23:58:33.719850127Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 4.072956394s"
May 13 23:58:33.719956 containerd[1490]: time="2025-05-13T23:58:33.719897375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\""
May 13 23:58:33.722634 containerd[1490]: time="2025-05-13T23:58:33.722571552Z" level=info msg="CreateContainer within sandbox \"f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 13 23:58:33.734101 containerd[1490]: time="2025-05-13T23:58:33.732843134Z" level=info msg="Container 5c9f11eaa5484ee6558627ad732baaea7431351b39790fd359743d3836247752: CDI devices from CRI Config.CDIDevices: []"
May 13 23:58:33.794563 sshd[5272]: Accepted publickey for core from 10.0.0.1 port 34674 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:33.796823 sshd-session[5272]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:33.801671 systemd-logind[1468]: New session 18 of user core.
May 13 23:58:33.809383 systemd[1]: Started session-18.scope - Session 18 of User core.
May 13 23:58:33.901239 containerd[1490]: time="2025-05-13T23:58:33.901179094Z" level=info msg="CreateContainer within sandbox \"f08ac277c4e53bbc9be385a14f591df1d66673d4ed35fddd32ef9a8ac95e4e3b\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5c9f11eaa5484ee6558627ad732baaea7431351b39790fd359743d3836247752\""
May 13 23:58:33.901851 containerd[1490]: time="2025-05-13T23:58:33.901826406Z" level=info msg="StartContainer for \"5c9f11eaa5484ee6558627ad732baaea7431351b39790fd359743d3836247752\""
May 13 23:58:33.905298 containerd[1490]: time="2025-05-13T23:58:33.905246820Z" level=info msg="connecting to shim 5c9f11eaa5484ee6558627ad732baaea7431351b39790fd359743d3836247752" address="unix:///run/containerd/s/98e49582a52c96abffea135922a9168d94bd55a83e2066d9ddaa9d133aa99dc4" protocol=ttrpc version=3
May 13 23:58:33.933607 systemd[1]: Started cri-containerd-5c9f11eaa5484ee6558627ad732baaea7431351b39790fd359743d3836247752.scope - libcontainer container 5c9f11eaa5484ee6558627ad732baaea7431351b39790fd359743d3836247752.
May 13 23:58:33.947648 sshd[5275]: Connection closed by 10.0.0.1 port 34674
May 13 23:58:33.948356 sshd-session[5272]: pam_unix(sshd:session): session closed for user core
May 13 23:58:33.954670 systemd[1]: sshd@17-10.0.0.64:22-10.0.0.1:34674.service: Deactivated successfully.
May 13 23:58:33.957492 systemd[1]: session-18.scope: Deactivated successfully.
May 13 23:58:33.958936 systemd-logind[1468]: Session 18 logged out. Waiting for processes to exit.
May 13 23:58:33.960281 systemd-logind[1468]: Removed session 18.
May 13 23:58:33.988325 containerd[1490]: time="2025-05-13T23:58:33.988277865Z" level=info msg="StartContainer for \"5c9f11eaa5484ee6558627ad732baaea7431351b39790fd359743d3836247752\" returns successfully"
May 13 23:58:34.251357 containerd[1490]: time="2025-05-13T23:58:34.251170572Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958\" id:\"236551069918240d0a90382beb5e23efb190391243c6c4519440c2315c3876b8\" pid:5330 exited_at:{seconds:1747180714 nanos:250661792}"
May 13 23:58:34.254361 kubelet[2839]: E0513 23:58:34.254325    2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:58:34.266367 kubelet[2839]: I0513 23:58:34.266322    2839 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
May 13 23:58:34.266367 kubelet[2839]: I0513 23:58:34.266365    2839 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
May 13 23:58:34.672353 kubelet[2839]: I0513 23:58:34.672241    2839 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-fj9x5" podStartSLOduration=41.841373324 podStartE2EDuration="1m1.672219011s" podCreationTimestamp="2025-05-13 23:57:33 +0000 UTC" firstStartedPulling="2025-05-13 23:58:13.889889891 +0000 UTC m=+69.877593917" lastFinishedPulling="2025-05-13 23:58:33.720735568 +0000 UTC m=+89.708439604" observedRunningTime="2025-05-13 23:58:34.672106659 +0000 UTC m=+90.659810705" watchObservedRunningTime="2025-05-13 23:58:34.672219011 +0000 UTC m=+90.659923047"
May 13 23:58:35.115510 kubelet[2839]: E0513 23:58:35.115439    2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:58:36.115576 kubelet[2839]: E0513 23:58:36.115531    2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:58:38.964865 systemd[1]: Started sshd@18-10.0.0.64:22-10.0.0.1:49170.service - OpenSSH per-connection server daemon (10.0.0.1:49170).
May 13 23:58:39.017784 sshd[5345]: Accepted publickey for core from 10.0.0.1 port 49170 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:39.019632 sshd-session[5345]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:39.024420 systemd-logind[1468]: New session 19 of user core.
May 13 23:58:39.034433 systemd[1]: Started session-19.scope - Session 19 of User core.
May 13 23:58:39.150864 sshd[5347]: Connection closed by 10.0.0.1 port 49170
May 13 23:58:39.151617 sshd-session[5345]: pam_unix(sshd:session): session closed for user core
May 13 23:58:39.157690 systemd[1]: sshd@18-10.0.0.64:22-10.0.0.1:49170.service: Deactivated successfully.
May 13 23:58:39.160457 systemd[1]: session-19.scope: Deactivated successfully.
May 13 23:58:39.161504 systemd-logind[1468]: Session 19 logged out. Waiting for processes to exit.
May 13 23:58:39.163218 systemd-logind[1468]: Removed session 19.
May 13 23:58:44.171532 systemd[1]: Started sshd@19-10.0.0.64:22-10.0.0.1:49184.service - OpenSSH per-connection server daemon (10.0.0.1:49184).
May 13 23:58:44.221360 sshd[5362]: Accepted publickey for core from 10.0.0.1 port 49184 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:44.223405 sshd-session[5362]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:44.229012 systemd-logind[1468]: New session 20 of user core.
May 13 23:58:44.239457 systemd[1]: Started session-20.scope - Session 20 of User core.
May 13 23:58:44.359096 sshd[5364]: Connection closed by 10.0.0.1 port 49184
May 13 23:58:44.359518 sshd-session[5362]: pam_unix(sshd:session): session closed for user core
May 13 23:58:44.373173 systemd[1]: sshd@19-10.0.0.64:22-10.0.0.1:49184.service: Deactivated successfully.
May 13 23:58:44.375082 systemd[1]: session-20.scope: Deactivated successfully.
May 13 23:58:44.376789 systemd-logind[1468]: Session 20 logged out. Waiting for processes to exit.
May 13 23:58:44.377954 systemd[1]: Started sshd@20-10.0.0.64:22-10.0.0.1:49190.service - OpenSSH per-connection server daemon (10.0.0.1:49190).
May 13 23:58:44.378850 systemd-logind[1468]: Removed session 20.
May 13 23:58:44.423791 sshd[5376]: Accepted publickey for core from 10.0.0.1 port 49190 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:44.425574 sshd-session[5376]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:44.430140 systemd-logind[1468]: New session 21 of user core.
May 13 23:58:44.434326 systemd[1]: Started session-21.scope - Session 21 of User core.
May 13 23:58:44.929893 sshd[5379]: Connection closed by 10.0.0.1 port 49190
May 13 23:58:44.930432 sshd-session[5376]: pam_unix(sshd:session): session closed for user core
May 13 23:58:44.941046 systemd[1]: sshd@20-10.0.0.64:22-10.0.0.1:49190.service: Deactivated successfully.
May 13 23:58:44.943942 systemd[1]: session-21.scope: Deactivated successfully.
May 13 23:58:44.946519 systemd-logind[1468]: Session 21 logged out. Waiting for processes to exit.
May 13 23:58:44.950486 systemd[1]: Started sshd@21-10.0.0.64:22-10.0.0.1:49192.service - OpenSSH per-connection server daemon (10.0.0.1:49192).
May 13 23:58:44.952159 systemd-logind[1468]: Removed session 21.
May 13 23:58:45.006001 sshd[5390]: Accepted publickey for core from 10.0.0.1 port 49192 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:45.007563 sshd-session[5390]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:45.012475 systemd-logind[1468]: New session 22 of user core.
May 13 23:58:45.027418 systemd[1]: Started session-22.scope - Session 22 of User core.
May 13 23:58:47.272525 sshd[5393]: Connection closed by 10.0.0.1 port 49192
May 13 23:58:47.272992 sshd-session[5390]: pam_unix(sshd:session): session closed for user core
May 13 23:58:47.286491 systemd[1]: sshd@21-10.0.0.64:22-10.0.0.1:49192.service: Deactivated successfully.
May 13 23:58:47.289055 systemd[1]: session-22.scope: Deactivated successfully.
May 13 23:58:47.289374 systemd[1]: session-22.scope: Consumed 643ms CPU time, 69.9M memory peak.
May 13 23:58:47.291410 systemd-logind[1468]: Session 22 logged out. Waiting for processes to exit.
May 13 23:58:47.293790 systemd[1]: Started sshd@22-10.0.0.64:22-10.0.0.1:49200.service - OpenSSH per-connection server daemon (10.0.0.1:49200).
May 13 23:58:47.295275 systemd-logind[1468]: Removed session 22.
May 13 23:58:47.348946 sshd[5420]: Accepted publickey for core from 10.0.0.1 port 49200 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:47.351097 sshd-session[5420]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:47.357482 systemd-logind[1468]: New session 23 of user core.
May 13 23:58:47.366551 systemd[1]: Started session-23.scope - Session 23 of User core.
May 13 23:58:47.630291 sshd[5423]: Connection closed by 10.0.0.1 port 49200
May 13 23:58:47.630083 sshd-session[5420]: pam_unix(sshd:session): session closed for user core
May 13 23:58:47.646716 systemd[1]: sshd@22-10.0.0.64:22-10.0.0.1:49200.service: Deactivated successfully.
May 13 23:58:47.652822 systemd[1]: session-23.scope: Deactivated successfully.
May 13 23:58:47.661794 systemd-logind[1468]: Session 23 logged out. Waiting for processes to exit.
May 13 23:58:47.663585 systemd[1]: Started sshd@23-10.0.0.64:22-10.0.0.1:49204.service - OpenSSH per-connection server daemon (10.0.0.1:49204).
May 13 23:58:47.665371 systemd-logind[1468]: Removed session 23.
May 13 23:58:47.721241 sshd[5434]: Accepted publickey for core from 10.0.0.1 port 49204 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:47.723433 sshd-session[5434]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:47.728443 systemd-logind[1468]: New session 24 of user core.
May 13 23:58:47.740509 systemd[1]: Started session-24.scope - Session 24 of User core.
May 13 23:58:47.883561 sshd[5437]: Connection closed by 10.0.0.1 port 49204
May 13 23:58:47.883862 sshd-session[5434]: pam_unix(sshd:session): session closed for user core
May 13 23:58:47.888977 systemd[1]: sshd@23-10.0.0.64:22-10.0.0.1:49204.service: Deactivated successfully.
May 13 23:58:47.891899 systemd[1]: session-24.scope: Deactivated successfully.
May 13 23:58:47.892686 systemd-logind[1468]: Session 24 logged out. Waiting for processes to exit.
May 13 23:58:47.893613 systemd-logind[1468]: Removed session 24.
May 13 23:58:50.747301 kubelet[2839]: I0513 23:58:50.747235    2839 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 13 23:58:52.901991 systemd[1]: Started sshd@24-10.0.0.64:22-10.0.0.1:41686.service - OpenSSH per-connection server daemon (10.0.0.1:41686).
May 13 23:58:52.953818 sshd[5461]: Accepted publickey for core from 10.0.0.1 port 41686 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:52.955712 sshd-session[5461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:52.960557 systemd-logind[1468]: New session 25 of user core.
May 13 23:58:52.967339 systemd[1]: Started session-25.scope - Session 25 of User core.
May 13 23:58:53.091867 sshd[5463]: Connection closed by 10.0.0.1 port 41686
May 13 23:58:53.092264 sshd-session[5461]: pam_unix(sshd:session): session closed for user core
May 13 23:58:53.096319 systemd[1]: sshd@24-10.0.0.64:22-10.0.0.1:41686.service: Deactivated successfully.
May 13 23:58:53.098789 systemd[1]: session-25.scope: Deactivated successfully.
May 13 23:58:53.099618 systemd-logind[1468]: Session 25 logged out. Waiting for processes to exit.
May 13 23:58:53.100676 systemd-logind[1468]: Removed session 25.
May 13 23:58:53.116099 kubelet[2839]: E0513 23:58:53.116054    2839 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 13 23:58:55.064518 containerd[1490]: time="2025-05-13T23:58:55.064462858Z" level=info msg="TaskExit event in podsandbox handler container_id:\"572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00\" id:\"ed8e68de372af6df26a92b1797301ab9a9378075835f70661d148703cebe1d1b\" pid:5488 exited_at:{seconds:1747180735 nanos:64182810}"
May 13 23:58:58.105957 systemd[1]: Started sshd@25-10.0.0.64:22-10.0.0.1:39700.service - OpenSSH per-connection server daemon (10.0.0.1:39700).
May 13 23:58:58.154120 sshd[5502]: Accepted publickey for core from 10.0.0.1 port 39700 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:58:58.155562 sshd-session[5502]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:58:58.160462 systemd-logind[1468]: New session 26 of user core.
May 13 23:58:58.170410 systemd[1]: Started session-26.scope - Session 26 of User core.
May 13 23:58:58.339440 sshd[5504]: Connection closed by 10.0.0.1 port 39700
May 13 23:58:58.339763 sshd-session[5502]: pam_unix(sshd:session): session closed for user core
May 13 23:58:58.343695 systemd[1]: sshd@25-10.0.0.64:22-10.0.0.1:39700.service: Deactivated successfully.
May 13 23:58:58.346626 systemd[1]: session-26.scope: Deactivated successfully.
May 13 23:58:58.348912 systemd-logind[1468]: Session 26 logged out. Waiting for processes to exit.
May 13 23:58:58.349935 systemd-logind[1468]: Removed session 26.
May 13 23:59:03.354628 systemd[1]: Started sshd@26-10.0.0.64:22-10.0.0.1:39714.service - OpenSSH per-connection server daemon (10.0.0.1:39714).
May 13 23:59:03.399714 sshd[5519]: Accepted publickey for core from 10.0.0.1 port 39714 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:59:03.401304 sshd-session[5519]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:03.405995 systemd-logind[1468]: New session 27 of user core.
May 13 23:59:03.411461 systemd[1]: Started session-27.scope - Session 27 of User core.
May 13 23:59:03.611672 sshd[5523]: Connection closed by 10.0.0.1 port 39714
May 13 23:59:03.613781 sshd-session[5519]: pam_unix(sshd:session): session closed for user core
May 13 23:59:03.617531 systemd[1]: sshd@26-10.0.0.64:22-10.0.0.1:39714.service: Deactivated successfully.
May 13 23:59:03.619490 systemd[1]: session-27.scope: Deactivated successfully.
May 13 23:59:03.620392 systemd-logind[1468]: Session 27 logged out. Waiting for processes to exit.
May 13 23:59:03.622153 systemd-logind[1468]: Removed session 27.
May 13 23:59:04.235106 containerd[1490]: time="2025-05-13T23:59:04.235060396Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2619bd3a960ea246c0909420ce7b56e324dd526c094ca248813ab28248a75958\" id:\"371a6f6032cb9cb325d5a4415a7bc07df7dfe08c4cc1dbce3fa0f4e7248d2613\" pid:5548 exited_at:{seconds:1747180744 nanos:234670702}"
May 13 23:59:08.627253 systemd[1]: Started sshd@27-10.0.0.64:22-10.0.0.1:47486.service - OpenSSH per-connection server daemon (10.0.0.1:47486).
May 13 23:59:08.686652 sshd[5561]: Accepted publickey for core from 10.0.0.1 port 47486 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:59:08.688432 sshd-session[5561]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:08.693554 systemd-logind[1468]: New session 28 of user core.
May 13 23:59:08.700350 systemd[1]: Started session-28.scope - Session 28 of User core.
May 13 23:59:08.829547 sshd[5564]: Connection closed by 10.0.0.1 port 47486
May 13 23:59:08.830504 sshd-session[5561]: pam_unix(sshd:session): session closed for user core
May 13 23:59:08.834917 systemd[1]: sshd@27-10.0.0.64:22-10.0.0.1:47486.service: Deactivated successfully.
May 13 23:59:08.838499 systemd[1]: session-28.scope: Deactivated successfully.
May 13 23:59:08.839782 systemd-logind[1468]: Session 28 logged out. Waiting for processes to exit.
May 13 23:59:08.840952 systemd-logind[1468]: Removed session 28.
May 13 23:59:13.842432 systemd[1]: Started sshd@28-10.0.0.64:22-10.0.0.1:47500.service - OpenSSH per-connection server daemon (10.0.0.1:47500).
May 13 23:59:13.888306 sshd[5578]: Accepted publickey for core from 10.0.0.1 port 47500 ssh2: RSA SHA256:7f2XacyFcvGxEsM5obZzQpmkhMs9Q6mfAUEaqBEC3Xw
May 13 23:59:13.890614 sshd-session[5578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:59:13.897130 systemd-logind[1468]: New session 29 of user core.
May 13 23:59:13.904569 systemd[1]: Started session-29.scope - Session 29 of User core.
May 13 23:59:14.034346 sshd[5580]: Connection closed by 10.0.0.1 port 47500
May 13 23:59:14.034709 sshd-session[5578]: pam_unix(sshd:session): session closed for user core
May 13 23:59:14.039287 systemd[1]: sshd@28-10.0.0.64:22-10.0.0.1:47500.service: Deactivated successfully.
May 13 23:59:14.042002 systemd[1]: session-29.scope: Deactivated successfully.
May 13 23:59:14.042794 systemd-logind[1468]: Session 29 logged out. Waiting for processes to exit.
May 13 23:59:14.044464 systemd-logind[1468]: Removed session 29.
May 13 23:59:14.162495 containerd[1490]: time="2025-05-13T23:59:14.162011907Z" level=info msg="TaskExit event in podsandbox handler container_id:\"572f38d5c35475f6f617220ef2aa940cc436bafb9bdcadc1efde8d832cd8ff00\" id:\"abf2513f4ef532e728381571b71f3dc9649bf540b4213371a43c8f0a3631395a\" pid:5605 exited_at:{seconds:1747180754 nanos:161558912}"