May 15 12:47:58.795519 kernel: Linux version 6.12.20-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Thu May 15 10:42:41 -00 2025 May 15 12:47:58.795540 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c May 15 12:47:58.795548 kernel: BIOS-provided physical RAM map: May 15 12:47:58.795553 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable May 15 12:47:58.795558 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved May 15 12:47:58.795568 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved May 15 12:47:58.795614 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable May 15 12:47:58.795619 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved May 15 12:47:58.795624 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved May 15 12:47:58.795629 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved May 15 12:47:58.795634 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved May 15 12:47:58.795638 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved May 15 12:47:58.795643 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved May 15 12:47:58.795648 kernel: NX (Execute Disable) protection: active May 15 12:47:58.795655 kernel: APIC: Static calls initialized May 15 12:47:58.795661 kernel: SMBIOS 3.0.0 present. 
May 15 12:47:58.795666 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017 May 15 12:47:58.795671 kernel: DMI: Memory slots populated: 1/1 May 15 12:47:58.795676 kernel: Hypervisor detected: KVM May 15 12:47:58.795681 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 May 15 12:47:58.795686 kernel: kvm-clock: using sched offset of 4111136240 cycles May 15 12:47:58.795692 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns May 15 12:47:58.795699 kernel: tsc: Detected 2445.406 MHz processor May 15 12:47:58.795705 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 15 12:47:58.795710 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 15 12:47:58.795716 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000 May 15 12:47:58.795721 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs May 15 12:47:58.795726 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 15 12:47:58.795732 kernel: Using GB pages for direct mapping May 15 12:47:58.795737 kernel: ACPI: Early table checksum verification disabled May 15 12:47:58.795742 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS ) May 15 12:47:58.795749 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 15 12:47:58.795754 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 15 12:47:58.795759 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 15 12:47:58.795765 kernel: ACPI: FACS 0x000000007CFE0000 000040 May 15 12:47:58.795770 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 15 12:47:58.795775 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 15 12:47:58.795781 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) May 15 12:47:58.795786 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 15 12:47:58.795792 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576] May 15 12:47:58.795800 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482] May 15 12:47:58.795806 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f] May 15 12:47:58.795811 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6] May 15 12:47:58.795818 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e] May 15 12:47:58.795823 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a] May 15 12:47:58.795830 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692] May 15 12:47:58.795835 kernel: No NUMA configuration found May 15 12:47:58.795841 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff] May 15 12:47:58.795847 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff] May 15 12:47:58.795852 kernel: Zone ranges: May 15 12:47:58.795858 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 15 12:47:58.795863 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff] May 15 12:47:58.795869 kernel: Normal empty May 15 12:47:58.795874 kernel: Device empty May 15 12:47:58.795880 kernel: Movable zone start for each node May 15 12:47:58.795888 kernel: Early memory node ranges May 15 12:47:58.795898 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 15 12:47:58.795908 kernel: node 0: [mem 
0x0000000000100000-0x000000007cfdbfff] May 15 12:47:58.795918 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff] May 15 12:47:58.795929 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 15 12:47:58.795938 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 15 12:47:58.795946 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges May 15 12:47:58.795955 kernel: ACPI: PM-Timer IO Port: 0x608 May 15 12:47:58.796254 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 15 12:47:58.796265 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 15 12:47:58.796271 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 15 12:47:58.796276 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 15 12:47:58.796282 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 15 12:47:58.796288 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 15 12:47:58.796293 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 15 12:47:58.796299 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 15 12:47:58.796305 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 May 15 12:47:58.796311 kernel: CPU topo: Max. logical packages: 1 May 15 12:47:58.796318 kernel: CPU topo: Max. logical dies: 1 May 15 12:47:58.796323 kernel: CPU topo: Max. dies per package: 1 May 15 12:47:58.796329 kernel: CPU topo: Max. threads per core: 1 May 15 12:47:58.796334 kernel: CPU topo: Num. cores per package: 2 May 15 12:47:58.796340 kernel: CPU topo: Num. threads per package: 2 May 15 12:47:58.796345 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs May 15 12:47:58.796351 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 15 12:47:58.796357 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices May 15 12:47:58.796362 kernel: Booting paravirtualized kernel on KVM May 15 12:47:58.796368 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 15 12:47:58.796375 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 15 12:47:58.796381 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576 May 15 12:47:58.796386 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152 May 15 12:47:58.796392 kernel: pcpu-alloc: [0] 0 1 May 15 12:47:58.796398 kernel: kvm-guest: PV spinlocks disabled, no host support May 15 12:47:58.796404 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c May 15 12:47:58.796411 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 15 12:47:58.796416 kernel: random: crng init done May 15 12:47:58.796423 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 15 12:47:58.796429 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear) May 15 12:47:58.796434 kernel: Fallback order for Node 0: 0 May 15 12:47:58.796440 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 511866 May 15 12:47:58.796445 kernel: Policy zone: DMA32 May 15 12:47:58.796451 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 15 12:47:58.796457 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 15 12:47:58.796463 kernel: ftrace: allocating 40065 entries in 157 pages May 15 12:47:58.796469 kernel: ftrace: allocated 157 pages with 5 groups May 15 12:47:58.796475 kernel: Dynamic Preempt: voluntary May 15 12:47:58.796481 kernel: rcu: Preemptible hierarchical RCU implementation. May 15 12:47:58.796487 kernel: rcu: RCU event tracing is enabled. May 15 12:47:58.796493 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 15 12:47:58.796499 kernel: Trampoline variant of Tasks RCU enabled. May 15 12:47:58.796505 kernel: Rude variant of Tasks RCU enabled. May 15 12:47:58.796511 kernel: Tracing variant of Tasks RCU enabled. May 15 12:47:58.796516 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 15 12:47:58.796522 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 15 12:47:58.796529 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 15 12:47:58.796534 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 15 12:47:58.796540 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 15 12:47:58.796546 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 May 15 12:47:58.796551 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 15 12:47:58.796557 kernel: Console: colour VGA+ 80x25 May 15 12:47:58.796563 kernel: printk: legacy console [tty0] enabled May 15 12:47:58.796568 kernel: printk: legacy console [ttyS0] enabled May 15 12:47:58.798547 kernel: ACPI: Core revision 20240827 May 15 12:47:58.798564 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns May 15 12:47:58.798592 kernel: APIC: Switch to symmetric I/O mode setup May 15 12:47:58.798600 kernel: x2apic enabled May 15 12:47:58.798613 kernel: APIC: Switched APIC routing to: physical x2apic May 15 12:47:58.798625 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 15 12:47:58.798635 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns May 15 12:47:58.798645 kernel: Calibrating delay loop (skipped) preset value.. 
4890.81 BogoMIPS (lpj=2445406) May 15 12:47:58.798655 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated May 15 12:47:58.798666 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 May 15 12:47:58.798674 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 May 15 12:47:58.798680 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 15 12:47:58.798686 kernel: Spectre V2 : Mitigation: Retpolines May 15 12:47:58.798692 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch May 15 12:47:58.798698 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT May 15 12:47:58.798704 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls May 15 12:47:58.798710 kernel: RETBleed: Mitigation: untrained return thunk May 15 12:47:58.798717 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier May 15 12:47:58.798723 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl May 15 12:47:58.798729 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' May 15 12:47:58.798735 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' May 15 12:47:58.798741 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' May 15 12:47:58.798747 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 May 15 12:47:58.798753 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. May 15 12:47:58.798759 kernel: Freeing SMP alternatives memory: 32K May 15 12:47:58.798765 kernel: pid_max: default: 32768 minimum: 301 May 15 12:47:58.798772 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 15 12:47:58.798778 kernel: landlock: Up and running. May 15 12:47:58.798784 kernel: SELinux: Initializing. May 15 12:47:58.798790 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) May 15 12:47:58.798796 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear) May 15 12:47:58.798802 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0) May 15 12:47:58.798808 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. May 15 12:47:58.798814 kernel: ... version: 0 May 15 12:47:58.798820 kernel: ... bit width: 48 May 15 12:47:58.798826 kernel: ... generic registers: 6 May 15 12:47:58.798832 kernel: ... value mask: 0000ffffffffffff May 15 12:47:58.798838 kernel: ... max period: 00007fffffffffff May 15 12:47:58.798844 kernel: ... fixed-purpose events: 0 May 15 12:47:58.798850 kernel: ... event mask: 000000000000003f May 15 12:47:58.798856 kernel: signal: max sigframe size: 1776 May 15 12:47:58.798862 kernel: rcu: Hierarchical SRCU implementation. May 15 12:47:58.798868 kernel: rcu: Max phase no-delay instances is 400. May 15 12:47:58.798875 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 15 12:47:58.798881 kernel: smp: Bringing up secondary CPUs ... May 15 12:47:58.798888 kernel: smpboot: x86: Booting SMP configuration: May 15 12:47:58.798894 kernel: .... 
node #0, CPUs: #1 May 15 12:47:58.798900 kernel: smp: Brought up 1 node, 2 CPUs May 15 12:47:58.798905 kernel: smpboot: Total of 2 processors activated (9781.62 BogoMIPS) May 15 12:47:58.798912 kernel: Memory: 1917780K/2047464K available (14336K kernel code, 2438K rwdata, 9944K rodata, 54416K init, 2544K bss, 125140K reserved, 0K cma-reserved) May 15 12:47:58.798918 kernel: devtmpfs: initialized May 15 12:47:58.798924 kernel: x86/mm: Memory block size: 128MB May 15 12:47:58.798930 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 15 12:47:58.798937 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 15 12:47:58.798943 kernel: pinctrl core: initialized pinctrl subsystem May 15 12:47:58.798949 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 15 12:47:58.798955 kernel: audit: initializing netlink subsys (disabled) May 15 12:47:58.798961 kernel: audit: type=2000 audit(1747313276.502:1): state=initialized audit_enabled=0 res=1 May 15 12:47:58.798967 kernel: thermal_sys: Registered thermal governor 'step_wise' May 15 12:47:58.798974 kernel: thermal_sys: Registered thermal governor 'user_space' May 15 12:47:58.798979 kernel: cpuidle: using governor menu May 15 12:47:58.798985 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 15 12:47:58.798993 kernel: dca service started, version 1.12.1 May 15 12:47:58.798999 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] May 15 12:47:58.799005 kernel: PCI: Using configuration type 1 for base access May 15 12:47:58.799011 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. May 15 12:47:58.799017 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 15 12:47:58.799023 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page May 15 12:47:58.799029 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 15 12:47:58.799035 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 15 12:47:58.799041 kernel: ACPI: Added _OSI(Module Device) May 15 12:47:58.799048 kernel: ACPI: Added _OSI(Processor Device) May 15 12:47:58.799054 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 15 12:47:58.799060 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 15 12:47:58.799066 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 15 12:47:58.799072 kernel: ACPI: Interpreter enabled May 15 12:47:58.799078 kernel: ACPI: PM: (supports S0 S5) May 15 12:47:58.799084 kernel: ACPI: Using IOAPIC for interrupt routing May 15 12:47:58.799090 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 15 12:47:58.799096 kernel: PCI: Using E820 reservations for host bridge windows May 15 12:47:58.799103 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F May 15 12:47:58.799109 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 15 12:47:58.799220 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 15 12:47:58.799286 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] May 15 12:47:58.799386 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] May 15 12:47:58.799398 kernel: PCI host bridge to bus 0000:00 May 15 12:47:58.799462 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 15 12:47:58.799519 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff 
window] May 15 12:47:58.799570 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 15 12:47:58.799675 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window] May 15 12:47:58.799726 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 15 12:47:58.799775 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window] May 15 12:47:58.799824 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 15 12:47:58.799898 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint May 15 12:47:58.799976 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint May 15 12:47:58.800039 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref] May 15 12:47:58.800097 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref] May 15 12:47:58.800153 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff] May 15 12:47:58.800210 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref] May 15 12:47:58.800267 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 15 12:47:58.800332 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.800394 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff] May 15 12:47:58.800496 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 15 12:47:58.800560 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff] May 15 12:47:58.801677 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref] May 15 12:47:58.801755 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.801817 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff] May 15 12:47:58.801880 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 15 12:47:58.801937 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff] May 15 12:47:58.801993 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] May 15 12:47:58.802057 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.802115 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff] May 15 12:47:58.802172 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 15 12:47:58.802228 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff] May 15 12:47:58.802372 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] May 15 12:47:58.802440 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.802499 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff] May 15 12:47:58.802555 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 15 12:47:58.805082 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff] May 15 12:47:58.805150 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] May 15 12:47:58.805218 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.805310 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff] May 15 12:47:58.805429 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] May 15 12:47:58.805493 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff] May 15 12:47:58.805551 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] May 15 12:47:58.805696 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.805760 kernel: pci 
0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff] May 15 12:47:58.805817 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 15 12:47:58.805874 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff] May 15 12:47:58.805955 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] May 15 12:47:58.806105 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.806169 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff] May 15 12:47:58.806225 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 15 12:47:58.806304 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff] May 15 12:47:58.806428 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] May 15 12:47:58.806510 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.806570 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff] May 15 12:47:58.806698 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 15 12:47:58.806762 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff] May 15 12:47:58.806819 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] May 15 12:47:58.806884 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port May 15 12:47:58.806942 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff] May 15 12:47:58.807004 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 15 12:47:58.807093 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff] May 15 12:47:58.807182 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] May 15 12:47:58.807250 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint May 15 12:47:58.807366 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO May 15 12:47:58.807476 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint May 15 12:47:58.807539 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f] May 15 12:47:58.809664 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff] May 15 12:47:58.809745 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint May 15 12:47:58.809806 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] May 15 12:47:58.809875 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint May 15 12:47:58.809937 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff] May 15 12:47:58.809996 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref] May 15 12:47:58.810061 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref] May 15 12:47:58.810119 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 15 12:47:58.810184 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint May 15 12:47:58.810245 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit] May 15 12:47:58.810303 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 15 12:47:58.810370 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint May 15 12:47:58.810429 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff] May 15 12:47:58.810599 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref] May 15 12:47:58.810817 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 15 12:47:58.811056 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint May 15 12:47:58.811181 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref] May 
15 12:47:58.811240 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 15 12:47:58.811822 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint May 15 12:47:58.811897 kernel: pci 0000:05:00.0: BAR 1 [mem 0xfe000000-0xfe000fff] May 15 12:47:58.811958 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref] May 15 12:47:58.812018 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] May 15 12:47:58.812086 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint May 15 12:47:58.812147 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff] May 15 12:47:58.812206 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref] May 15 12:47:58.812264 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 15 12:47:58.812276 kernel: acpiphp: Slot [0] registered May 15 12:47:58.812341 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint May 15 12:47:58.812403 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff] May 15 12:47:58.812461 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref] May 15 12:47:58.812519 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref] May 15 12:47:58.812616 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 15 12:47:58.812628 kernel: acpiphp: Slot [0-2] registered May 15 12:47:58.812689 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 15 12:47:58.812701 kernel: acpiphp: Slot [0-3] registered May 15 12:47:58.812757 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 15 12:47:58.812766 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 15 12:47:58.812772 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 15 12:47:58.812779 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 15 12:47:58.812785 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 15 12:47:58.812791 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 May 15 12:47:58.812796 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 May 15 12:47:58.812805 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 May 15 12:47:58.812811 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 May 15 12:47:58.812816 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 May 15 12:47:58.812822 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 May 15 12:47:58.812828 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 May 15 12:47:58.812840 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 May 15 12:47:58.812846 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 May 15 12:47:58.812852 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 May 15 12:47:58.812858 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 May 15 12:47:58.812865 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 May 15 12:47:58.812872 kernel: iommu: Default domain type: Translated May 15 12:47:58.812878 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 15 12:47:58.812884 kernel: PCI: Using ACPI for IRQ routing May 15 12:47:58.812890 kernel: PCI: pci_cache_line_size set to 64 bytes May 15 12:47:58.812896 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 15 12:47:58.812902 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff] May 15 12:47:58.812961 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device May 15 12:47:58.813019 kernel: pci 0000:00:01.0: vgaarb: bridge control possible May 15 12:47:58.813079 kernel: pci 0000:00:01.0: vgaarb: VGA 
device added: decodes=io+mem,owns=io+mem,locks=none May 15 12:47:58.813088 kernel: vgaarb: loaded May 15 12:47:58.813094 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 May 15 12:47:58.813100 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter May 15 12:47:58.813106 kernel: clocksource: Switched to clocksource kvm-clock May 15 12:47:58.813112 kernel: VFS: Disk quotas dquot_6.6.0 May 15 12:47:58.813119 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 15 12:47:58.813125 kernel: pnp: PnP ACPI init May 15 12:47:58.813191 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved May 15 12:47:58.813204 kernel: pnp: PnP ACPI: found 5 devices May 15 12:47:58.813210 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 15 12:47:58.813216 kernel: NET: Registered PF_INET protocol family May 15 12:47:58.813223 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear) May 15 12:47:58.813229 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear) May 15 12:47:58.813235 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 15 12:47:58.813241 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear) May 15 12:47:58.813247 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear) May 15 12:47:58.813255 kernel: TCP: Hash tables configured (established 16384 bind 16384) May 15 12:47:58.813261 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear) May 15 12:47:58.813267 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear) May 15 12:47:58.813273 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 15 12:47:58.813279 kernel: NET: Registered PF_XDP protocol family May 15 12:47:58.813337 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 May 15 12:47:58.813439 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 May 15 12:47:58.813514 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 May 15 12:47:58.813617 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned May 15 12:47:58.813698 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned May 15 12:47:58.813759 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned May 15 12:47:58.813817 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 15 12:47:58.813874 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff] May 15 12:47:58.813931 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref] May 15 12:47:58.814601 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 15 12:47:58.814713 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff] May 15 12:47:58.814781 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref] May 15 12:47:58.814847 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 15 12:47:58.814906 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff] May 15 12:47:58.814963 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref] May 15 12:47:58.815020 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 15 12:47:58.815079 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff] May 15 12:47:58.815140 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref] May 15 12:47:58.815205 kernel: pci 0000:00:02.4: PCI bridge 
to [bus 05] May 15 12:47:58.815265 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff] May 15 12:47:58.815322 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref] May 15 12:47:58.815379 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 15 12:47:58.815436 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff] May 15 12:47:58.815495 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref] May 15 12:47:58.815552 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 15 12:47:58.815664 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff] May 15 12:47:58.815726 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff] May 15 12:47:58.815787 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref] May 15 12:47:58.815845 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 15 12:47:58.815902 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff] May 15 12:47:58.815958 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff] May 15 12:47:58.816015 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref] May 15 12:47:58.816071 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 15 12:47:58.816130 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff] May 15 12:47:58.816186 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff] May 15 12:47:58.816243 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref] May 15 12:47:58.816297 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 15 12:47:58.816348 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 15 12:47:58.816398 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 15 12:47:58.816448 kernel: pci_bus 0000:00: resource 7 [mem 0x7d000000-0xafffffff window] May 15 12:47:58.816497 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] May 15 12:47:58.816546 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] May 15 12:47:58.816641 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff] May 15 12:47:58.816699 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref] May 15 12:47:58.816759 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff] May 15 12:47:58.816817 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] May 15 12:47:58.816899 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff] May 15 12:47:58.816981 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] May 15 12:47:58.817081 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff] May 15 12:47:58.817446 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] May 15 12:47:58.817515 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff] May 15 12:47:58.817630 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] May 15 12:47:58.817720 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff] May 15 12:47:58.817779 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] May 15 12:47:58.817845 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] May 15 12:47:58.817900 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff] May 15 12:47:58.817952 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] May 15 12:47:58.818049 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] May 15 12:47:58.818119 kernel: pci_bus 
0000:08: resource 1 [mem 0xfda00000-0xfdbfffff] May 15 12:47:58.818173 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] May 15 12:47:58.818232 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] May 15 12:47:58.818290 kernel: pci_bus 0000:09: resource 1 [mem 0xfd800000-0xfd9fffff] May 15 12:47:58.818341 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] May 15 12:47:58.818350 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 May 15 12:47:58.818357 kernel: PCI: CLS 0 bytes, default 64 May 15 12:47:58.818364 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x233fc4eb620, max_idle_ns: 440795316590 ns May 15 12:47:58.818370 kernel: Initialise system trusted keyrings May 15 12:47:58.818377 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 May 15 12:47:58.818383 kernel: Key type asymmetric registered May 15 12:47:58.818392 kernel: Asymmetric key parser 'x509' registered May 15 12:47:58.818398 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 15 12:47:58.818404 kernel: io scheduler mq-deadline registered May 15 12:47:58.818410 kernel: io scheduler kyber registered May 15 12:47:58.818417 kernel: io scheduler bfq registered May 15 12:47:58.818476 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 May 15 12:47:58.818535 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 May 15 12:47:58.818656 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 May 15 12:47:58.818756 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 May 15 12:47:58.818823 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 May 15 12:47:58.818881 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 May 15 12:47:58.818939 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 May 15 12:47:58.818996 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 May 15 12:47:58.819053 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 May 15 12:47:58.821119 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 May 15 12:47:58.821196 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 May 15 12:47:58.821682 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 May 15 12:47:58.821765 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 30 May 15 12:47:58.821882 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 May 15 12:47:58.822146 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 May 15 12:47:58.822265 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 May 15 12:47:58.822292 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 May 15 12:47:58.822386 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 May 15 12:47:58.822472 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 May 15 12:47:58.822483 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 15 12:47:58.822490 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 May 15 12:47:58.822496 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 15 12:47:58.822503 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 15 12:47:58.822509 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 15 12:47:58.822515 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 15 12:47:58.822526 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 15 12:47:58.822691 kernel: rtc_cmos 00:03: RTC can wake from S4 May 15 12:47:58.822783 kernel: rtc_cmos 00:03: registered as rtc0 May 15 12:47:58.822844 kernel: rtc_cmos 00:03: setting system clock to 
2025-05-15T12:47:58 UTC (1747313278) May 15 12:47:58.822896 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs May 15 12:47:58.822909 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 15 12:47:58.822916 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 May 15 12:47:58.822923 kernel: NET: Registered PF_INET6 protocol family May 15 12:47:58.822930 kernel: Segment Routing with IPv6 May 15 12:47:58.822938 kernel: In-situ OAM (IOAM) with IPv6 May 15 12:47:58.822944 kernel: NET: Registered PF_PACKET protocol family May 15 12:47:58.822950 kernel: Key type dns_resolver registered May 15 12:47:58.822957 kernel: IPI shorthand broadcast: enabled May 15 12:47:58.822964 kernel: sched_clock: Marking stable (2906038301, 144420032)->(3058800413, -8342080) May 15 12:47:58.822970 kernel: registered taskstats version 1 May 15 12:47:58.822976 kernel: Loading compiled-in X.509 certificates May 15 12:47:58.822983 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.20-flatcar: 05e05785144663be6df1db78301487421c4773b6' May 15 12:47:58.822989 kernel: Demotion targets for Node 0: null May 15 12:47:58.822997 kernel: Key type .fscrypt registered May 15 12:47:58.823003 kernel: Key type fscrypt-provisioning registered May 15 12:47:58.823009 kernel: ima: No TPM chip found, activating TPM-bypass! May 15 12:47:58.823016 kernel: ima: Allocated hash algorithm: sha1 May 15 12:47:58.823022 kernel: ima: No architecture policies found May 15 12:47:58.823028 kernel: clk: Disabling unused clocks May 15 12:47:58.823034 kernel: Warning: unable to open an initial console. May 15 12:47:58.823041 kernel: Freeing unused kernel image (initmem) memory: 54416K May 15 12:47:58.823049 kernel: Write protecting the kernel read-only data: 24576k May 15 12:47:58.823055 kernel: Freeing unused kernel image (rodata/data gap) memory: 296K May 15 12:47:58.823062 kernel: Run /init as init process May 15 12:47:58.823068 kernel: with arguments: May 15 12:47:58.823074 kernel: /init May 15 12:47:58.823081 kernel: with environment: May 15 12:47:58.823087 kernel: HOME=/ May 15 12:47:58.823093 kernel: TERM=linux May 15 12:47:58.823099 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 15 12:47:58.823106 systemd[1]: Successfully made /usr/ read-only. May 15 12:47:58.823118 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 15 12:47:58.823125 systemd[1]: Detected virtualization kvm. May 15 12:47:58.823132 systemd[1]: Detected architecture x86-64. May 15 12:47:58.823138 systemd[1]: Running in initrd. May 15 12:47:58.823145 systemd[1]: No hostname configured, using default hostname. May 15 12:47:58.823152 systemd[1]: Hostname set to <localhost>. May 15 12:47:58.823160 systemd[1]: Initializing machine ID from VM UUID. May 15 12:47:58.823167 systemd[1]: Queued start job for default target initrd.target. May 15 12:47:58.823173 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 12:47:58.823180 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
May 15 12:47:58.823188 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 15 12:47:58.823195 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 15 12:47:58.823201 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 15 12:47:58.823209 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 15 12:47:58.823218 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 15 12:47:58.823225 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 15 12:47:58.823232 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 12:47:58.823238 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 15 12:47:58.823245 systemd[1]: Reached target paths.target - Path Units. May 15 12:47:58.823252 systemd[1]: Reached target slices.target - Slice Units. May 15 12:47:58.823259 systemd[1]: Reached target swap.target - Swaps. May 15 12:47:58.823265 systemd[1]: Reached target timers.target - Timer Units. May 15 12:47:58.823273 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 15 12:47:58.823280 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 15 12:47:58.823287 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 15 12:47:58.823294 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 15 12:47:58.823301 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 15 12:47:58.823308 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 15 12:47:58.823314 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 15 12:47:58.823321 systemd[1]: Reached target sockets.target - Socket Units. May 15 12:47:58.823329 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 15 12:47:58.823336 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 15 12:47:58.823343 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 15 12:47:58.823350 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 15 12:47:58.823356 systemd[1]: Starting systemd-fsck-usr.service... May 15 12:47:58.823363 systemd[1]: Starting systemd-journald.service - Journal Service... May 15 12:47:58.823371 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 15 12:47:58.823378 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:47:58.823385 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 15 12:47:58.823393 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 15 12:47:58.823417 systemd-journald[217]: Collecting audit messages is disabled. May 15 12:47:58.823437 systemd[1]: Finished systemd-fsck-usr.service. May 15 12:47:58.823445 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 15 12:47:58.823452 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. 
Update your scripts to load br_netfilter if you need this. May 15 12:47:58.823459 kernel: Bridge firewalling registered May 15 12:47:58.823466 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 15 12:47:58.823473 systemd-journald[217]: Journal started May 15 12:47:58.823491 systemd-journald[217]: Runtime Journal (/run/log/journal/039d6f22dcf94777906c21c65685288e) is 4.8M, max 38.6M, 33.7M free. May 15 12:47:58.787930 systemd-modules-load[218]: Inserted module 'overlay' May 15 12:47:58.860620 systemd[1]: Started systemd-journald.service - Journal Service. May 15 12:47:58.818652 systemd-modules-load[218]: Inserted module 'br_netfilter' May 15 12:47:58.861187 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:47:58.862029 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 15 12:47:58.864432 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 15 12:47:58.867692 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 15 12:47:58.881207 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 15 12:47:58.884035 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 15 12:47:58.889322 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 15 12:47:58.895530 systemd-tmpfiles[235]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 15 12:47:58.896936 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 12:47:58.900024 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 12:47:58.902254 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 15 12:47:58.903420 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 12:47:58.905201 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 15 12:47:58.919701 dracut-cmdline[255]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=48287e633374b880fa618bd42bee102ae77c50831859c6cedd6ca9e1aec3dd5c May 15 12:47:58.932764 systemd-resolved[254]: Positive Trust Anchors: May 15 12:47:58.932773 systemd-resolved[254]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 15 12:47:58.932797 systemd-resolved[254]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 15 12:47:58.935854 systemd-resolved[254]: Defaulting to hostname 'linux'. 
May 15 12:47:58.938566 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 15 12:47:58.939265 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 15 12:47:58.987654 kernel: SCSI subsystem initialized May 15 12:47:58.994616 kernel: Loading iSCSI transport class v2.0-870. May 15 12:47:59.003620 kernel: iscsi: registered transport (tcp) May 15 12:47:59.021673 kernel: iscsi: registered transport (qla4xxx) May 15 12:47:59.021731 kernel: QLogic iSCSI HBA Driver May 15 12:47:59.037093 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 15 12:47:59.058648 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 15 12:47:59.061229 systemd[1]: Reached target network-pre.target - Preparation for Network. May 15 12:47:59.097673 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 15 12:47:59.099356 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 15 12:47:59.142607 kernel: raid6: avx2x4 gen() 34097 MB/s May 15 12:47:59.159621 kernel: raid6: avx2x2 gen() 32230 MB/s May 15 12:47:59.176743 kernel: raid6: avx2x1 gen() 22385 MB/s May 15 12:47:59.176817 kernel: raid6: using algorithm avx2x4 gen() 34097 MB/s May 15 12:47:59.194807 kernel: raid6: .... xor() 4213 MB/s, rmw enabled May 15 12:47:59.194882 kernel: raid6: using avx2x2 recovery algorithm May 15 12:47:59.211620 kernel: xor: automatically using best checksumming function avx May 15 12:47:59.346616 kernel: Btrfs loaded, zoned=no, fsverity=no May 15 12:47:59.352604 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 15 12:47:59.354872 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 12:47:59.383137 systemd-udevd[464]: Using default interface naming scheme 'v255'. May 15 12:47:59.387706 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 12:47:59.390276 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 15 12:47:59.412974 dracut-pre-trigger[471]: rd.md=0: removing MD RAID activation May 15 12:47:59.431739 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 15 12:47:59.433146 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 15 12:47:59.469192 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 15 12:47:59.472418 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 15 12:47:59.541842 kernel: cryptd: max_cpu_qlen set to 1000 May 15 12:47:59.541903 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues May 15 12:47:59.551471 kernel: scsi host0: Virtio SCSI HBA May 15 12:47:59.551625 kernel: ACPI: bus type USB registered May 15 12:47:59.554368 kernel: usbcore: registered new interface driver usbfs May 15 12:47:59.554401 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 May 15 12:47:59.567740 kernel: AES CTR mode by8 optimization enabled May 15 12:47:59.569590 kernel: usbcore: registered new interface driver hub May 15 12:47:59.571606 kernel: usbcore: registered new device driver usb May 15 12:47:59.572980 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 12:47:59.573721 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. 
May 15 12:47:59.575484 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:47:59.590991 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:47:59.611809 kernel: sd 0:0:0:0: Power-on or device reset occurred May 15 12:47:59.627943 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) May 15 12:47:59.628044 kernel: sd 0:0:0:0: [sda] Write Protect is off May 15 12:47:59.628126 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08 May 15 12:47:59.628200 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA May 15 12:47:59.628270 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 15 12:47:59.628279 kernel: GPT:17805311 != 80003071 May 15 12:47:59.628286 kernel: GPT:Alternate GPT header not at the end of the disk. May 15 12:47:59.628293 kernel: GPT:17805311 != 80003071 May 15 12:47:59.628300 kernel: GPT: Use GNU Parted to correct GPT errors. May 15 12:47:59.628307 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 15 12:47:59.628317 kernel: sd 0:0:0:0: [sda] Attached SCSI disk May 15 12:47:59.640596 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 15 12:47:59.652591 kernel: libata version 3.00 loaded. May 15 12:47:59.662871 kernel: ahci 0000:00:1f.2: version 3.0 May 15 12:47:59.682233 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 May 15 12:47:59.682263 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode May 15 12:47:59.682367 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) May 15 12:47:59.682445 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only May 15 12:47:59.682523 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 15 12:47:59.682949 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 May 15 12:47:59.683033 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 May 15 12:47:59.683107 kernel: scsi host1: ahci May 15 12:47:59.683187 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 15 12:47:59.683262 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 May 15 12:47:59.683333 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed May 15 12:47:59.683407 kernel: hub 1-0:1.0: USB hub found May 15 12:47:59.683504 kernel: hub 1-0:1.0: 4 ports detected May 15 12:47:59.683616 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
May 15 12:47:59.683746 kernel: hub 2-0:1.0: USB hub found May 15 12:47:59.683838 kernel: hub 2-0:1.0: 4 ports detected May 15 12:47:59.683918 kernel: scsi host2: ahci May 15 12:47:59.683997 kernel: scsi host3: ahci May 15 12:47:59.684073 kernel: scsi host4: ahci May 15 12:47:59.684147 kernel: scsi host5: ahci May 15 12:47:59.684219 kernel: scsi host6: ahci May 15 12:47:59.684287 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 48 lpm-pol 0 May 15 12:47:59.684296 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 48 lpm-pol 0 May 15 12:47:59.684304 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 48 lpm-pol 0 May 15 12:47:59.684314 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 48 lpm-pol 0 May 15 12:47:59.684322 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 48 lpm-pol 0 May 15 12:47:59.684329 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 48 lpm-pol 0 May 15 12:47:59.698556 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. May 15 12:47:59.737031 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:47:59.757621 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. May 15 12:47:59.758149 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. May 15 12:47:59.765773 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. May 15 12:47:59.773322 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 15 12:47:59.775053 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 15 12:47:59.799468 disk-uuid[626]: Primary Header is updated. May 15 12:47:59.799468 disk-uuid[626]: Secondary Entries is updated. May 15 12:47:59.799468 disk-uuid[626]: Secondary Header is updated. 
May 15 12:47:59.810607 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 15 12:47:59.907610 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd May 15 12:47:59.998606 kernel: ata5: SATA link down (SStatus 0 SControl 300) May 15 12:47:59.998693 kernel: ata2: SATA link down (SStatus 0 SControl 300) May 15 12:47:59.998708 kernel: ata3: SATA link down (SStatus 0 SControl 300) May 15 12:47:59.998721 kernel: ata4: SATA link down (SStatus 0 SControl 300) May 15 12:47:59.999636 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) May 15 12:48:00.002595 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 May 15 12:48:00.002629 kernel: ata1.00: applying bridge limits May 15 12:48:00.005837 kernel: ata6: SATA link down (SStatus 0 SControl 300) May 15 12:48:00.006617 kernel: ata1.00: configured for UDMA/100 May 15 12:48:00.007901 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 May 15 12:48:00.048614 kernel: hid: raw HID events driver (C) Jiri Kosina May 15 12:48:00.054088 kernel: usbcore: registered new interface driver usbhid May 15 12:48:00.054152 kernel: usbhid: USB HID core driver May 15 12:48:00.057609 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray May 15 12:48:00.066046 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 15 12:48:00.066059 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input4 May 15 12:48:00.066067 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 May 15 12:48:00.066168 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0 May 15 12:48:00.378886 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 15 12:48:00.382017 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 15 12:48:00.383123 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 12:48:00.385412 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 15 12:48:00.388725 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 15 12:48:00.410966 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 15 12:48:00.827960 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 15 12:48:00.831633 disk-uuid[627]: The operation has completed successfully. May 15 12:48:00.885450 systemd[1]: disk-uuid.service: Deactivated successfully. May 15 12:48:00.885529 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 15 12:48:00.900240 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 15 12:48:00.917673 sh[660]: Success May 15 12:48:00.931764 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 15 12:48:00.931799 kernel: device-mapper: uevent: version 1.0.3 May 15 12:48:00.931809 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 15 12:48:00.940614 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 15 12:48:00.977832 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 15 12:48:00.981235 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 15 12:48:00.995430 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
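verity-setup.service above finishes with /dev/mapper/usr backed by a dm-verity device using sha256 ("verity: sha256 using shash"). The sketch below shows how such a mapping can be opened by hand with veritysetup; the device paths and root hash are placeholders, since on a real Flatcar system they come from the partition table and the kernel command line rather than from a script like this.

```python
# Sketch: open a dm-verity mapping named "usr", the device that appears as
# /dev/mapper/usr in the log. DATA_DEV, HASH_DEV and ROOT_HASH are placeholders;
# this only illustrates the veritysetup call, it is not verity-setup.service.
import subprocess

DATA_DEV = "/dev/sdXN"                     # placeholder data partition
HASH_DEV = "/dev/sdXM"                     # placeholder hash-tree location
ROOT_HASH = "<hash from veritysetup format>"  # placeholder root hash (hex)

def open_verity(name: str) -> None:
    # veritysetup open <data_device> <name> <hash_device> <root_hash>
    subprocess.run(
        ["veritysetup", "open", DATA_DEV, name, HASH_DEV, ROOT_HASH],
        check=True,
    )
    # The integrity-checked, read-only device then appears as /dev/mapper/<name>.

if __name__ == "__main__":
    open_verity("usr")
```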
May 15 12:48:01.007071 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 15 12:48:01.007106 kernel: BTRFS: device fsid 2d504097-db49-4d66-a0d5-eeb665b21004 devid 1 transid 41 /dev/mapper/usr (254:0) scanned by mount (672) May 15 12:48:01.007661 kernel: BTRFS info (device dm-0): first mount of filesystem 2d504097-db49-4d66-a0d5-eeb665b21004 May 15 12:48:01.011425 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 15 12:48:01.011449 kernel: BTRFS info (device dm-0): using free-space-tree May 15 12:48:01.020113 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 15 12:48:01.020919 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 15 12:48:01.021735 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 15 12:48:01.022367 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 15 12:48:01.024416 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 15 12:48:01.042591 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (707) May 15 12:48:01.044643 kernel: BTRFS info (device sda6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:48:01.047152 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 15 12:48:01.047171 kernel: BTRFS info (device sda6): using free-space-tree May 15 12:48:01.063590 kernel: BTRFS info (device sda6): last unmount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:48:01.063950 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 15 12:48:01.065454 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 15 12:48:01.082447 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 15 12:48:01.085664 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 15 12:48:01.115501 systemd-networkd[841]: lo: Link UP May 15 12:48:01.115510 systemd-networkd[841]: lo: Gained carrier May 15 12:48:01.120658 systemd-networkd[841]: Enumeration completed May 15 12:48:01.121237 systemd[1]: Started systemd-networkd.service - Network Configuration. May 15 12:48:01.121786 systemd[1]: Reached target network.target - Network. May 15 12:48:01.125229 systemd-networkd[841]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:48:01.125235 systemd-networkd[841]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 12:48:01.125708 systemd-networkd[841]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:48:01.125711 systemd-networkd[841]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 12:48:01.125932 systemd-networkd[841]: eth0: Link UP May 15 12:48:01.125935 systemd-networkd[841]: eth0: Gained carrier May 15 12:48:01.125941 systemd-networkd[841]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
May 15 12:48:01.129311 systemd-networkd[841]: eth1: Link UP May 15 12:48:01.129314 systemd-networkd[841]: eth1: Gained carrier May 15 12:48:01.129321 systemd-networkd[841]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:48:01.154089 ignition[810]: Ignition 2.21.0 May 15 12:48:01.154101 ignition[810]: Stage: fetch-offline May 15 12:48:01.154125 ignition[810]: no configs at "/usr/lib/ignition/base.d" May 15 12:48:01.154131 ignition[810]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 15 12:48:01.155489 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 15 12:48:01.154187 ignition[810]: parsed url from cmdline: "" May 15 12:48:01.154189 ignition[810]: no config URL provided May 15 12:48:01.154192 ignition[810]: reading system config file "/usr/lib/ignition/user.ign" May 15 12:48:01.157671 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 15 12:48:01.154197 ignition[810]: no config at "/usr/lib/ignition/user.ign" May 15 12:48:01.154201 ignition[810]: failed to fetch config: resource requires networking May 15 12:48:01.154470 ignition[810]: Ignition finished successfully May 15 12:48:01.166622 systemd-networkd[841]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 15 12:48:01.177497 ignition[850]: Ignition 2.21.0 May 15 12:48:01.177508 ignition[850]: Stage: fetch May 15 12:48:01.177637 ignition[850]: no configs at "/usr/lib/ignition/base.d" May 15 12:48:01.177644 ignition[850]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 15 12:48:01.177695 ignition[850]: parsed url from cmdline: "" May 15 12:48:01.177697 ignition[850]: no config URL provided May 15 12:48:01.177700 ignition[850]: reading system config file "/usr/lib/ignition/user.ign" May 15 12:48:01.177705 ignition[850]: no config at "/usr/lib/ignition/user.ign" May 15 12:48:01.177726 ignition[850]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 May 15 12:48:01.177819 ignition[850]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable May 15 12:48:01.184613 systemd-networkd[841]: eth0: DHCPv4 address 157.180.34.115/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 15 12:48:01.378359 ignition[850]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 May 15 12:48:01.384218 ignition[850]: GET result: OK May 15 12:48:01.384286 ignition[850]: parsing config with SHA512: f830c2758caf1f0ff4bd4446c1fdb271876aa64569f0e59eecd0e0d41687b1897e6e6e499c5172a8f81946bc9463ac4778a9bf8e73bcdc78a0e92987281084bc May 15 12:48:01.387771 unknown[850]: fetched base config from "system" May 15 12:48:01.387781 unknown[850]: fetched base config from "system" May 15 12:48:01.388032 ignition[850]: fetch: fetch complete May 15 12:48:01.387785 unknown[850]: fetched user config from "hetzner" May 15 12:48:01.388036 ignition[850]: fetch: fetch passed May 15 12:48:01.390141 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 15 12:48:01.388071 ignition[850]: Ignition finished successfully May 15 12:48:01.392000 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
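The fetch stage above retries GET http://169.254.169.254/hetzner/v1/userdata after the first attempt fails with "network is unreachable", then logs the SHA512 of the config it parsed. The following is a small sketch of that fetch-with-retry-and-hash pattern using only the Python standard library; it mirrors the behaviour visible in the log, not Ignition's actual Go implementation.

```python
# Sketch: fetch Hetzner userdata the way the log's fetch stage does -- retry until
# the metadata service is reachable, then report the SHA-512 of the payload.
# Mirrors the observable behaviour in the log; it is not Ignition's own code.
import hashlib
import time
import urllib.error
import urllib.request

USERDATA_URL = "http://169.254.169.254/hetzner/v1/userdata"  # endpoint from the log

def fetch_userdata(retries: int = 5, delay: float = 2.0) -> bytes:
    for attempt in range(1, retries + 1):
        try:
            with urllib.request.urlopen(USERDATA_URL, timeout=10) as resp:
                return resp.read()
        except (urllib.error.URLError, OSError) as err:
            # e.g. "connect: network is unreachable" before DHCP has finished
            print(f"GET {USERDATA_URL}: attempt #{attempt} failed: {err}")
            time.sleep(delay)
    raise RuntimeError("metadata service never became reachable")

if __name__ == "__main__":
    data = fetch_userdata()
    print("GET result: OK")
    print("parsing config with SHA512:", hashlib.sha512(data).hexdigest())
```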
May 15 12:48:01.423372 ignition[859]: Ignition 2.21.0 May 15 12:48:01.423386 ignition[859]: Stage: kargs May 15 12:48:01.423503 ignition[859]: no configs at "/usr/lib/ignition/base.d" May 15 12:48:01.423511 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 15 12:48:01.425853 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 15 12:48:01.424644 ignition[859]: kargs: kargs passed May 15 12:48:01.424692 ignition[859]: Ignition finished successfully May 15 12:48:01.428254 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 15 12:48:01.444999 ignition[866]: Ignition 2.21.0 May 15 12:48:01.445013 ignition[866]: Stage: disks May 15 12:48:01.445124 ignition[866]: no configs at "/usr/lib/ignition/base.d" May 15 12:48:01.446758 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 15 12:48:01.445132 ignition[866]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 15 12:48:01.448082 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 15 12:48:01.445764 ignition[866]: disks: disks passed May 15 12:48:01.449007 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 15 12:48:01.445796 ignition[866]: Ignition finished successfully May 15 12:48:01.450241 systemd[1]: Reached target local-fs.target - Local File Systems. May 15 12:48:01.451464 systemd[1]: Reached target sysinit.target - System Initialization. May 15 12:48:01.452465 systemd[1]: Reached target basic.target - Basic System. May 15 12:48:01.454378 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 15 12:48:01.485259 systemd-fsck[875]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks May 15 12:48:01.487803 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 15 12:48:01.490566 systemd[1]: Mounting sysroot.mount - /sysroot... May 15 12:48:01.589596 kernel: EXT4-fs (sda9): mounted filesystem f7dea4bd-2644-4592-b85b-330f322c4d2b r/w with ordered data mode. Quota mode: none. May 15 12:48:01.590706 systemd[1]: Mounted sysroot.mount - /sysroot. May 15 12:48:01.591420 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 15 12:48:01.593059 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 15 12:48:01.595626 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 15 12:48:01.597669 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 15 12:48:01.598812 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 15 12:48:01.601257 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 15 12:48:01.603153 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 15 12:48:01.604658 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 15 12:48:01.616103 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (883) May 15 12:48:01.616130 kernel: BTRFS info (device sda6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:48:01.618451 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 15 12:48:01.619617 kernel: BTRFS info (device sda6): using free-space-tree May 15 12:48:01.629196 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 15 12:48:01.647419 initrd-setup-root[910]: cut: /sysroot/etc/passwd: No such file or directory May 15 12:48:01.650251 coreos-metadata[885]: May 15 12:48:01.650 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 May 15 12:48:01.651470 coreos-metadata[885]: May 15 12:48:01.651 INFO Fetch successful May 15 12:48:01.651470 coreos-metadata[885]: May 15 12:48:01.651 INFO wrote hostname ci-4334-0-0-a-250489a463 to /sysroot/etc/hostname May 15 12:48:01.652567 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 15 12:48:01.655174 initrd-setup-root[917]: cut: /sysroot/etc/group: No such file or directory May 15 12:48:01.657043 initrd-setup-root[925]: cut: /sysroot/etc/shadow: No such file or directory May 15 12:48:01.659941 initrd-setup-root[932]: cut: /sysroot/etc/gshadow: No such file or directory May 15 12:48:01.716602 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 15 12:48:01.718671 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 15 12:48:01.720067 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 15 12:48:01.740591 kernel: BTRFS info (device sda6): last unmount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:48:01.752909 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 15 12:48:01.759049 ignition[1001]: INFO : Ignition 2.21.0 May 15 12:48:01.759049 ignition[1001]: INFO : Stage: mount May 15 12:48:01.760838 ignition[1001]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 12:48:01.760838 ignition[1001]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 15 12:48:01.760838 ignition[1001]: INFO : mount: mount passed May 15 12:48:01.760838 ignition[1001]: INFO : Ignition finished successfully May 15 12:48:01.762152 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 15 12:48:01.764608 systemd[1]: Starting ignition-files.service - Ignition (files)... May 15 12:48:02.005019 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 15 12:48:02.006686 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 15 12:48:02.038621 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 (8:6) scanned by mount (1013) May 15 12:48:02.041691 kernel: BTRFS info (device sda6): first mount of filesystem afd0c70c-d15e-448c-8325-f96e3c3ed3a5 May 15 12:48:02.041738 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm May 15 12:48:02.044213 kernel: BTRFS info (device sda6): using free-space-tree May 15 12:48:02.049611 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
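flatcar-metadata-hostname.service above fetches http://169.254.169.254/hetzner/v1/metadata/hostname and writes the result to /sysroot/etc/hostname. The short sketch below performs the same two steps, with the URL and sysroot path taken from the log; it illustrates the behaviour shown there and is not the agent's own code.

```python
# Sketch: fetch the instance hostname from the Hetzner metadata service and
# write it into the target root, as flatcar-metadata-hostname does in the log.
import urllib.request

HOSTNAME_URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"  # from the log
SYSROOT = "/sysroot"                                                  # from the log

def write_hostname() -> str:
    with urllib.request.urlopen(HOSTNAME_URL, timeout=10) as resp:
        hostname = resp.read().decode().strip()
    with open(f"{SYSROOT}/etc/hostname", "w") as f:
        f.write(hostname + "\n")
    print(f"wrote hostname {hostname} to {SYSROOT}/etc/hostname")
    return hostname

if __name__ == "__main__":
    write_hostname()
```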
May 15 12:48:02.077656 ignition[1029]: INFO : Ignition 2.21.0 May 15 12:48:02.077656 ignition[1029]: INFO : Stage: files May 15 12:48:02.079107 ignition[1029]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 12:48:02.079107 ignition[1029]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 15 12:48:02.079107 ignition[1029]: DEBUG : files: compiled without relabeling support, skipping May 15 12:48:02.081651 ignition[1029]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 15 12:48:02.081651 ignition[1029]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 15 12:48:02.083324 ignition[1029]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 15 12:48:02.083324 ignition[1029]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 15 12:48:02.083324 ignition[1029]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 15 12:48:02.082772 unknown[1029]: wrote ssh authorized keys file for user: core May 15 12:48:02.086417 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" May 15 12:48:02.086417 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 May 15 12:48:02.406738 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 15 12:48:02.449007 systemd-networkd[841]: eth0: Gained IPv6LL May 15 12:48:02.768819 systemd-networkd[841]: eth1: Gained IPv6LL May 15 12:48:07.156676 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 15 12:48:07.159897 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 15 12:48:07.175345 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 15 12:48:07.175345 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" May 15 12:48:07.175345 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" May 15 12:48:07.175345 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" May 15 12:48:07.175345 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1 May 15 12:48:07.838782 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 15 12:48:08.098026 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" May 15 12:48:08.099277 ignition[1029]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 15 12:48:08.100234 ignition[1029]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 15 12:48:08.101834 ignition[1029]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 15 12:48:08.101834 ignition[1029]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 15 12:48:08.101834 ignition[1029]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 15 12:48:08.106553 ignition[1029]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 15 12:48:08.106553 ignition[1029]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 15 12:48:08.106553 ignition[1029]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 15 12:48:08.106553 ignition[1029]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" May 15 12:48:08.106553 ignition[1029]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" May 15 12:48:08.106553 ignition[1029]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" May 15 12:48:08.106553 ignition[1029]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" May 15 12:48:08.106553 ignition[1029]: INFO : files: files passed May 15 12:48:08.106553 ignition[1029]: INFO : Ignition finished successfully May 15 12:48:08.103569 systemd[1]: Finished ignition-files.service - Ignition (files). May 15 12:48:08.107663 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 15 12:48:08.110859 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 15 12:48:08.118759 systemd[1]: ignition-quench.service: Deactivated successfully. May 15 12:48:08.118860 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
May 15 12:48:08.125335 initrd-setup-root-after-ignition[1060]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 15 12:48:08.125335 initrd-setup-root-after-ignition[1060]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 15 12:48:08.126887 initrd-setup-root-after-ignition[1064]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 15 12:48:08.127188 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 15 12:48:08.129085 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 15 12:48:08.130280 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 15 12:48:08.176972 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 15 12:48:08.177088 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 15 12:48:08.178434 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 15 12:48:08.179406 systemd[1]: Reached target initrd.target - Initrd Default Target. May 15 12:48:08.180534 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 15 12:48:08.181281 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 15 12:48:08.208865 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 15 12:48:08.210561 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 15 12:48:08.230368 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 15 12:48:08.231808 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 12:48:08.232437 systemd[1]: Stopped target timers.target - Timer Units. May 15 12:48:08.233528 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 15 12:48:08.233651 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 15 12:48:08.234812 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 15 12:48:08.235599 systemd[1]: Stopped target basic.target - Basic System. May 15 12:48:08.236674 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 15 12:48:08.237939 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 15 12:48:08.239374 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 15 12:48:08.241036 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. May 15 12:48:08.242785 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 15 12:48:08.244377 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 15 12:48:08.246133 systemd[1]: Stopped target sysinit.target - System Initialization. May 15 12:48:08.247539 systemd[1]: Stopped target local-fs.target - Local File Systems. May 15 12:48:08.248903 systemd[1]: Stopped target swap.target - Swaps. May 15 12:48:08.250409 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 15 12:48:08.250560 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 15 12:48:08.252495 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 15 12:48:08.253626 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 12:48:08.254989 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. 
May 15 12:48:08.255126 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 12:48:08.256739 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 15 12:48:08.256919 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 15 12:48:08.259082 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 15 12:48:08.259276 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 15 12:48:08.260964 systemd[1]: ignition-files.service: Deactivated successfully. May 15 12:48:08.261143 systemd[1]: Stopped ignition-files.service - Ignition (files). May 15 12:48:08.262756 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 15 12:48:08.262929 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 15 12:48:08.265012 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 15 12:48:08.275696 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 15 12:48:08.276600 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 15 12:48:08.277227 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 15 12:48:08.279333 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 15 12:48:08.279424 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 15 12:48:08.283321 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 15 12:48:08.285780 ignition[1084]: INFO : Ignition 2.21.0 May 15 12:48:08.285780 ignition[1084]: INFO : Stage: umount May 15 12:48:08.286749 ignition[1084]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 12:48:08.286749 ignition[1084]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 15 12:48:08.286749 ignition[1084]: INFO : umount: umount passed May 15 12:48:08.286749 ignition[1084]: INFO : Ignition finished successfully May 15 12:48:08.287282 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 15 12:48:08.288660 systemd[1]: ignition-mount.service: Deactivated successfully. May 15 12:48:08.288723 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 15 12:48:08.292020 systemd[1]: ignition-disks.service: Deactivated successfully. May 15 12:48:08.292056 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 15 12:48:08.293493 systemd[1]: ignition-kargs.service: Deactivated successfully. May 15 12:48:08.293527 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 15 12:48:08.295650 systemd[1]: ignition-fetch.service: Deactivated successfully. May 15 12:48:08.295681 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 15 12:48:08.296264 systemd[1]: Stopped target network.target - Network. May 15 12:48:08.296759 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 15 12:48:08.296791 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 15 12:48:08.297232 systemd[1]: Stopped target paths.target - Path Units. May 15 12:48:08.299587 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 15 12:48:08.303631 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 12:48:08.304116 systemd[1]: Stopped target slices.target - Slice Units. May 15 12:48:08.305190 systemd[1]: Stopped target sockets.target - Socket Units. 
May 15 12:48:08.306037 systemd[1]: iscsid.socket: Deactivated successfully. May 15 12:48:08.306064 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 15 12:48:08.306872 systemd[1]: iscsiuio.socket: Deactivated successfully. May 15 12:48:08.306895 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 15 12:48:08.307708 systemd[1]: ignition-setup.service: Deactivated successfully. May 15 12:48:08.307742 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 15 12:48:08.308598 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 15 12:48:08.308630 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 15 12:48:08.309539 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 15 12:48:08.310374 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 15 12:48:08.312012 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 15 12:48:08.312436 systemd[1]: sysroot-boot.service: Deactivated successfully. May 15 12:48:08.312510 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 15 12:48:08.313315 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 15 12:48:08.313375 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 15 12:48:08.317939 systemd[1]: systemd-resolved.service: Deactivated successfully. May 15 12:48:08.318018 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 15 12:48:08.320421 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 15 12:48:08.320734 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 15 12:48:08.320785 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 12:48:08.322714 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 15 12:48:08.324635 systemd[1]: systemd-networkd.service: Deactivated successfully. May 15 12:48:08.324714 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 15 12:48:08.326241 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 15 12:48:08.326349 systemd[1]: Stopped target network-pre.target - Preparation for Network. May 15 12:48:08.327391 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 15 12:48:08.327415 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 15 12:48:08.328867 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 15 12:48:08.329889 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 15 12:48:08.329926 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 15 12:48:08.331296 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 15 12:48:08.331329 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 15 12:48:08.332650 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 15 12:48:08.332684 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 15 12:48:08.333415 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 12:48:08.335163 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 15 12:48:08.337904 systemd[1]: systemd-udevd.service: Deactivated successfully. 
May 15 12:48:08.338014 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 12:48:08.340122 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 15 12:48:08.340165 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 15 12:48:08.340646 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 15 12:48:08.340683 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 15 12:48:08.341661 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 15 12:48:08.341695 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 15 12:48:08.342615 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 15 12:48:08.342648 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 15 12:48:08.343648 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 15 12:48:08.343681 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 12:48:08.346655 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 15 12:48:08.347390 systemd[1]: systemd-network-generator.service: Deactivated successfully. May 15 12:48:08.347428 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. May 15 12:48:08.348636 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 15 12:48:08.348671 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 12:48:08.349644 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 15 12:48:08.349675 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 15 12:48:08.350690 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 15 12:48:08.350721 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 15 12:48:08.351413 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 12:48:08.351443 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:48:08.353073 systemd[1]: network-cleanup.service: Deactivated successfully. May 15 12:48:08.353130 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 15 12:48:08.357416 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 15 12:48:08.357485 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 15 12:48:08.358932 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 15 12:48:08.360231 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 15 12:48:08.371347 systemd[1]: Switching root. May 15 12:48:08.404902 systemd-journald[217]: Journal stopped May 15 12:48:09.220086 systemd-journald[217]: Received SIGTERM from PID 1 (systemd). 
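At this point the initramfs journal instance (PID 217) stops and the system switches root; every entry captured so far has the same shape: a timestamp, a source with an optional PID, and a message. The sketch below parses lines in exactly this format, which is handy for pulling timings or per-unit messages out of a capture like this one; the regex is an assumption derived from the lines visible here and may need adjusting for other journald output.

```python
# Sketch: parse journal/console lines of the form seen throughout this capture,
#   "May 15 12:48:08.404902 systemd-journald[217]: Journal stopped"
# into (timestamp, source, pid, message). The pattern is derived from this log
# and is not guaranteed to cover every journald output variant.
import re

LINE_RE = re.compile(
    r"^(?P<month>\w{3}) (?P<day>\d{1,2}) (?P<time>\d{2}:\d{2}:\d{2}\.\d+) "
    r"(?P<source>[\w./-]+)(\[(?P<pid>\d+)\])?: (?P<msg>.*)$"
)

def parse(line: str):
    m = LINE_RE.match(line)
    if not m:
        return None
    return (f"{m['month']} {m['day']} {m['time']}", m["source"], m["pid"], m["msg"])

sample = "May 15 12:48:08.404902 systemd-journald[217]: Journal stopped"
print(parse(sample))
# ('May 15 12:48:08.404902', 'systemd-journald', '217', 'Journal stopped')
```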
May 15 12:48:09.220136 kernel: SELinux: policy capability network_peer_controls=1 May 15 12:48:09.220147 kernel: SELinux: policy capability open_perms=1 May 15 12:48:09.220158 kernel: SELinux: policy capability extended_socket_class=1 May 15 12:48:09.220167 kernel: SELinux: policy capability always_check_network=0 May 15 12:48:09.220176 kernel: SELinux: policy capability cgroup_seclabel=1 May 15 12:48:09.220184 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 15 12:48:09.220192 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 15 12:48:09.220199 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 15 12:48:09.220207 kernel: SELinux: policy capability userspace_initial_context=0 May 15 12:48:09.220215 kernel: audit: type=1403 audit(1747313288.550:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 15 12:48:09.220225 systemd[1]: Successfully loaded SELinux policy in 46.960ms. May 15 12:48:09.220243 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 15.027ms. May 15 12:48:09.220254 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 15 12:48:09.220262 systemd[1]: Detected virtualization kvm. May 15 12:48:09.220270 systemd[1]: Detected architecture x86-64. May 15 12:48:09.220279 systemd[1]: Detected first boot. May 15 12:48:09.220287 systemd[1]: Hostname set to . May 15 12:48:09.220296 systemd[1]: Initializing machine ID from VM UUID. May 15 12:48:09.220305 zram_generator::config[1127]: No configuration found. May 15 12:48:09.220316 kernel: Guest personality initialized and is inactive May 15 12:48:09.220324 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 15 12:48:09.220331 kernel: Initialized host personality May 15 12:48:09.220339 kernel: NET: Registered PF_VSOCK protocol family May 15 12:48:09.220347 systemd[1]: Populated /etc with preset unit settings. May 15 12:48:09.220356 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 15 12:48:09.220364 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 15 12:48:09.220372 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 15 12:48:09.220381 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 15 12:48:09.220390 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 15 12:48:09.220398 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 15 12:48:09.220407 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 15 12:48:09.220414 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 15 12:48:09.220426 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 15 12:48:09.220434 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 15 12:48:09.220458 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 15 12:48:09.220467 systemd[1]: Created slice user.slice - User and Session Slice. May 15 12:48:09.220476 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
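"Initializing machine ID from VM UUID" above means that, on first boot under KVM, systemd derives /etc/machine-id from the hypervisor-provided UUID instead of generating a random one. The sketch below only shows the observable relationship, assuming the usual DMI path /sys/class/dmi/id/product_uuid; the exact derivation is systemd's and reading that file normally requires root.

```python
# Sketch: illustrate "Initializing machine ID from VM UUID" -- the SMBIOS/DMI
# product UUID exposed by the hypervisor, lowercased and with dashes removed,
# has the same 32-hex-digit shape as /etc/machine-id. The DMI path is an
# assumption about the usual location; illustrative only.
DMI_UUID = "/sys/class/dmi/id/product_uuid"   # assumed path for the VM UUID
MACHINE_ID = "/etc/machine-id"

def read(path: str) -> str:
    with open(path) as f:
        return f.read().strip()

if __name__ == "__main__":
    vm_uuid = read(DMI_UUID)
    print("VM UUID          :", vm_uuid)
    print("derived 32-hex id:", vm_uuid.lower().replace("-", ""))
    print("/etc/machine-id  :", read(MACHINE_ID))
```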
May 15 12:48:09.220487 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 12:48:09.220497 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 15 12:48:09.220507 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 15 12:48:09.220517 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 15 12:48:09.220526 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 15 12:48:09.220534 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 15 12:48:09.220542 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 12:48:09.220551 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 15 12:48:09.220559 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 15 12:48:09.220567 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 15 12:48:09.220588 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 15 12:48:09.220599 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 15 12:48:09.221606 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 12:48:09.221617 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 15 12:48:09.221626 systemd[1]: Reached target slices.target - Slice Units. May 15 12:48:09.221635 systemd[1]: Reached target swap.target - Swaps. May 15 12:48:09.221643 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 15 12:48:09.221652 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 15 12:48:09.221661 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 15 12:48:09.221669 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 15 12:48:09.221678 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 15 12:48:09.221688 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 15 12:48:09.221696 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 15 12:48:09.221706 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 15 12:48:09.221714 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 15 12:48:09.221722 systemd[1]: Mounting media.mount - External Media Directory... May 15 12:48:09.221731 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:48:09.221739 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 15 12:48:09.221747 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 15 12:48:09.221757 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 15 12:48:09.221769 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 15 12:48:09.221778 systemd[1]: Reached target machines.target - Containers. May 15 12:48:09.221786 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... 
May 15 12:48:09.221795 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 12:48:09.221804 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 15 12:48:09.221812 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 15 12:48:09.221820 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 15 12:48:09.221828 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 15 12:48:09.221838 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 15 12:48:09.221846 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 15 12:48:09.221855 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 15 12:48:09.221863 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 15 12:48:09.221871 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 15 12:48:09.221879 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 15 12:48:09.221889 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 15 12:48:09.221897 systemd[1]: Stopped systemd-fsck-usr.service. May 15 12:48:09.221907 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 12:48:09.221915 systemd[1]: Starting systemd-journald.service - Journal Service... May 15 12:48:09.221924 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 15 12:48:09.221932 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 15 12:48:09.221940 kernel: fuse: init (API version 7.41) May 15 12:48:09.221950 kernel: loop: module loaded May 15 12:48:09.221958 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 15 12:48:09.221966 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 15 12:48:09.221976 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 15 12:48:09.221985 systemd[1]: verity-setup.service: Deactivated successfully. May 15 12:48:09.221994 systemd[1]: Stopped verity-setup.service. May 15 12:48:09.222004 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:48:09.222030 systemd-journald[1211]: Collecting audit messages is disabled. May 15 12:48:09.222050 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 15 12:48:09.222059 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 15 12:48:09.222069 systemd-journald[1211]: Journal started May 15 12:48:09.222090 systemd-journald[1211]: Runtime Journal (/run/log/journal/039d6f22dcf94777906c21c65685288e) is 4.8M, max 38.6M, 33.7M free. May 15 12:48:08.980302 systemd[1]: Queued start job for default target multi-user.target. May 15 12:48:08.991470 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. May 15 12:48:08.991821 systemd[1]: systemd-journald.service: Deactivated successfully. 
May 15 12:48:09.226418 systemd[1]: Started systemd-journald.service - Journal Service. May 15 12:48:09.226031 systemd[1]: Mounted media.mount - External Media Directory. May 15 12:48:09.227629 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 15 12:48:09.228149 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 15 12:48:09.228719 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 15 12:48:09.229309 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 15 12:48:09.230323 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 15 12:48:09.232376 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 15 12:48:09.232508 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 15 12:48:09.233550 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 15 12:48:09.233706 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 15 12:48:09.234962 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 15 12:48:09.235078 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 15 12:48:09.236099 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 15 12:48:09.236253 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 15 12:48:09.237673 systemd[1]: modprobe@loop.service: Deactivated successfully. May 15 12:48:09.237786 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 15 12:48:09.238810 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 15 12:48:09.239814 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 15 12:48:09.240797 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 15 12:48:09.246589 kernel: ACPI: bus type drm_connector registered May 15 12:48:09.244953 systemd[1]: modprobe@drm.service: Deactivated successfully. May 15 12:48:09.245081 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 15 12:48:09.252347 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 15 12:48:09.254122 systemd[1]: Reached target network-pre.target - Preparation for Network. May 15 12:48:09.256647 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 15 12:48:09.259689 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 15 12:48:09.260213 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 15 12:48:09.260284 systemd[1]: Reached target local-fs.target - Local File Systems. May 15 12:48:09.261496 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 15 12:48:09.265808 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 15 12:48:09.266856 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 12:48:09.268029 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 15 12:48:09.269676 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... 
May 15 12:48:09.270674 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 15 12:48:09.272665 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 15 12:48:09.273157 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 15 12:48:09.273838 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 15 12:48:09.278669 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 15 12:48:09.287699 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 15 12:48:09.289346 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 15 12:48:09.291939 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 15 12:48:09.299950 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 15 12:48:09.300556 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 15 12:48:09.303836 kernel: loop0: detected capacity change from 0 to 218376 May 15 12:48:09.306193 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 15 12:48:09.308505 systemd-journald[1211]: Time spent on flushing to /var/log/journal/039d6f22dcf94777906c21c65685288e is 44.861ms for 1165 entries. May 15 12:48:09.308505 systemd-journald[1211]: System Journal (/var/log/journal/039d6f22dcf94777906c21c65685288e) is 8M, max 584.8M, 576.8M free. May 15 12:48:09.373537 systemd-journald[1211]: Received client request to flush runtime journal. May 15 12:48:09.375195 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 15 12:48:09.329731 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 15 12:48:09.350396 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 15 12:48:09.363715 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. May 15 12:48:09.363724 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. May 15 12:48:09.369722 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 15 12:48:09.372855 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 15 12:48:09.377419 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 15 12:48:09.381783 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 15 12:48:09.388595 kernel: loop1: detected capacity change from 0 to 113872 May 15 12:48:09.409281 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 15 12:48:09.413425 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 15 12:48:09.414994 kernel: loop2: detected capacity change from 0 to 8 May 15 12:48:09.430125 kernel: loop3: detected capacity change from 0 to 146240 May 15 12:48:09.443691 systemd-tmpfiles[1274]: ACLs are not supported, ignoring. May 15 12:48:09.444471 systemd-tmpfiles[1274]: ACLs are not supported, ignoring. May 15 12:48:09.451500 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. 
May 15 12:48:09.469609 kernel: loop4: detected capacity change from 0 to 218376 May 15 12:48:09.491606 kernel: loop5: detected capacity change from 0 to 113872 May 15 12:48:09.505735 kernel: loop6: detected capacity change from 0 to 8 May 15 12:48:09.509633 kernel: loop7: detected capacity change from 0 to 146240 May 15 12:48:09.526467 (sd-merge)[1279]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. May 15 12:48:09.526851 (sd-merge)[1279]: Merged extensions into '/usr'. May 15 12:48:09.530716 systemd[1]: Reload requested from client PID 1252 ('systemd-sysext') (unit systemd-sysext.service)... May 15 12:48:09.530938 systemd[1]: Reloading... May 15 12:48:09.596609 zram_generator::config[1307]: No configuration found. May 15 12:48:09.690237 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:48:09.771602 ldconfig[1247]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 15 12:48:09.771107 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 15 12:48:09.771264 systemd[1]: Reloading finished in 239 ms. May 15 12:48:09.785633 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 15 12:48:09.786389 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 15 12:48:09.794668 systemd[1]: Starting ensure-sysext.service... May 15 12:48:09.797412 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 15 12:48:09.815326 systemd[1]: Reload requested from client PID 1348 ('systemctl') (unit ensure-sysext.service)... May 15 12:48:09.815458 systemd[1]: Reloading... May 15 12:48:09.816228 systemd-tmpfiles[1349]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 15 12:48:09.816391 systemd-tmpfiles[1349]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 15 12:48:09.816789 systemd-tmpfiles[1349]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 15 12:48:09.817005 systemd-tmpfiles[1349]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 15 12:48:09.817670 systemd-tmpfiles[1349]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 15 12:48:09.817913 systemd-tmpfiles[1349]: ACLs are not supported, ignoring. May 15 12:48:09.818003 systemd-tmpfiles[1349]: ACLs are not supported, ignoring. May 15 12:48:09.822152 systemd-tmpfiles[1349]: Detected autofs mount point /boot during canonicalization of boot. May 15 12:48:09.822159 systemd-tmpfiles[1349]: Skipping /boot May 15 12:48:09.834790 systemd-tmpfiles[1349]: Detected autofs mount point /boot during canonicalization of boot. May 15 12:48:09.835016 systemd-tmpfiles[1349]: Skipping /boot May 15 12:48:09.862601 zram_generator::config[1372]: No configuration found. May 15 12:48:09.943296 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 12:48:10.008826 systemd[1]: Reloading finished in 192 ms. May 15 12:48:10.028465 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. 
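sd-merge above reports merging the 'containerd-flatcar', 'docker-flatcar', 'kubernetes' and 'oem-hetzner' extensions into /usr, and the subsequent reload is requested by systemd-sysext. The sketch below lists candidate sysext images on a host; the search directories are an assumption based on the /etc/extensions path used earlier in this log plus the other locations systemd-sysext is documented to scan, and `systemd-sysext status` remains the authoritative view.

```python
# Sketch: list candidate sysext images like the ones sd-merge reports merging
# into /usr. The directory list is an assumption (see lead-in); use
# `systemd-sysext status` for the authoritative state.
import os

SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

def list_extensions():
    found = []
    for d in SEARCH_DIRS:
        if not os.path.isdir(d):
            continue
        for name in sorted(os.listdir(d)):
            full = os.path.join(d, name)
            # sysext images are either raw disk images or plain directory trees
            if name.endswith(".raw") or os.path.isdir(full):
                found.append((d, name))
    return found

if __name__ == "__main__":
    for directory, image in list_extensions():
        print(f"{directory}/{image}")
```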
May 15 12:48:10.032545 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 12:48:10.037007 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 15 12:48:10.040071 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 15 12:48:10.041655 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 15 12:48:10.045123 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 15 12:48:10.047371 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 12:48:10.052708 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 15 12:48:10.061657 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:48:10.062205 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 12:48:10.063945 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 15 12:48:10.067754 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 15 12:48:10.070676 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 15 12:48:10.071668 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 12:48:10.071754 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 12:48:10.073937 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 15 12:48:10.074366 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:48:10.081805 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:48:10.082250 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 12:48:10.085095 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 15 12:48:10.086705 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 12:48:10.086825 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 12:48:10.086968 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 12:48:10.091415 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 15 12:48:10.097779 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 15 12:48:10.099463 systemd[1]: Finished ensure-sysext.service. May 15 12:48:10.100609 systemd-udevd[1426]: Using default interface naming scheme 'v255'. May 15 12:48:10.100885 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. 
May 15 12:48:10.103648 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 15 12:48:10.104344 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 15 12:48:10.116272 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 15 12:48:10.121693 systemd[1]: modprobe@drm.service: Deactivated successfully. May 15 12:48:10.121816 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 15 12:48:10.123333 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 15 12:48:10.124104 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 15 12:48:10.125485 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 15 12:48:10.132378 augenrules[1459]: No rules May 15 12:48:10.130667 systemd[1]: audit-rules.service: Deactivated successfully. May 15 12:48:10.131771 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 15 12:48:10.132772 systemd[1]: modprobe@loop.service: Deactivated successfully. May 15 12:48:10.132887 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 15 12:48:10.134762 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 15 12:48:10.136353 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 15 12:48:10.143060 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 12:48:10.146261 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 15 12:48:10.154310 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 15 12:48:10.163832 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 15 12:48:10.164543 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 15 12:48:10.226183 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 15 12:48:10.257593 kernel: mousedev: PS/2 mouse device common for all mice May 15 12:48:10.280932 systemd-networkd[1472]: lo: Link UP May 15 12:48:10.280940 systemd-networkd[1472]: lo: Gained carrier May 15 12:48:10.281911 systemd-networkd[1472]: Enumeration completed May 15 12:48:10.281989 systemd[1]: Started systemd-networkd.service - Network Configuration. May 15 12:48:10.284703 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 15 12:48:10.287406 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 15 12:48:10.303525 systemd-resolved[1425]: Positive Trust Anchors: May 15 12:48:10.303540 systemd-resolved[1425]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 15 12:48:10.303563 systemd-resolved[1425]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 15 12:48:10.309711 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 15 12:48:10.310685 systemd[1]: Reached target time-set.target - System Time Set. May 15 12:48:10.312259 systemd-resolved[1425]: Using system hostname 'ci-4334-0-0-a-250489a463'. May 15 12:48:10.313457 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 15 12:48:10.316154 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 15 12:48:10.317516 systemd[1]: Reached target network.target - Network. May 15 12:48:10.317929 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 15 12:48:10.318410 systemd[1]: Reached target sysinit.target - System Initialization. May 15 12:48:10.320695 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 15 12:48:10.321167 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 15 12:48:10.321643 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 15 12:48:10.322168 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 15 12:48:10.322668 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 15 12:48:10.323127 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 15 12:48:10.329701 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 15 12:48:10.329728 systemd[1]: Reached target paths.target - Path Units. May 15 12:48:10.330127 systemd[1]: Reached target timers.target - Timer Units. May 15 12:48:10.331900 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 15 12:48:10.333171 systemd[1]: Starting docker.socket - Docker Socket for the API... May 15 12:48:10.340859 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 15 12:48:10.347058 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 15 12:48:10.347847 systemd-networkd[1472]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:48:10.347857 systemd-networkd[1472]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 12:48:10.348710 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 15 12:48:10.350190 systemd-networkd[1472]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
May 15 12:48:10.350196 systemd-networkd[1472]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 12:48:10.351348 systemd-networkd[1472]: eth1: Link UP May 15 12:48:10.351538 systemd-networkd[1472]: eth1: Gained carrier May 15 12:48:10.351549 systemd-networkd[1472]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:48:10.355097 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 15 12:48:10.356194 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 15 12:48:10.356795 systemd-networkd[1472]: eth0: Link UP May 15 12:48:10.357542 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 15 12:48:10.358172 systemd-networkd[1472]: eth0: Gained carrier May 15 12:48:10.358185 systemd-networkd[1472]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 12:48:10.360465 systemd[1]: Reached target sockets.target - Socket Units. May 15 12:48:10.363818 systemd[1]: Reached target basic.target - Basic System. May 15 12:48:10.364432 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 15 12:48:10.364455 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 15 12:48:10.366178 systemd[1]: Starting containerd.service - containerd container runtime... May 15 12:48:10.368667 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 15 12:48:10.373834 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 15 12:48:10.376746 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 15 12:48:10.382632 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 15 12:48:10.385632 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input5 May 15 12:48:10.385703 systemd-networkd[1472]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 15 12:48:10.386392 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 15 12:48:10.387022 systemd-timesyncd[1455]: Network configuration changed, trying to establish connection. May 15 12:48:10.387784 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 15 12:48:10.389857 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 15 12:48:10.392378 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 15 12:48:10.396716 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 15 12:48:10.402709 kernel: ACPI: button: Power Button [PWRF] May 15 12:48:10.402746 jq[1525]: false May 15 12:48:10.401708 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 15 12:48:10.403523 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 15 12:48:10.404518 oslogin_cache_refresh[1530]: Refreshing passwd entry cache May 15 12:48:10.404815 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Refreshing passwd entry cache May 15 12:48:10.408839 systemd[1]: Starting systemd-logind.service - User Login Management... 
May 15 12:48:10.408874 oslogin_cache_refresh[1530]: Failure getting users, quitting May 15 12:48:10.413830 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Failure getting users, quitting May 15 12:48:10.413830 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 15 12:48:10.413830 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Refreshing group entry cache May 15 12:48:10.413830 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Failure getting groups, quitting May 15 12:48:10.413830 google_oslogin_nss_cache[1530]: oslogin_cache_refresh[1530]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 15 12:48:10.409938 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 15 12:48:10.408886 oslogin_cache_refresh[1530]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 15 12:48:10.410243 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 15 12:48:10.408913 oslogin_cache_refresh[1530]: Refreshing group entry cache May 15 12:48:10.412970 systemd[1]: Starting update-engine.service - Update Engine... May 15 12:48:10.409300 oslogin_cache_refresh[1530]: Failure getting groups, quitting May 15 12:48:10.409306 oslogin_cache_refresh[1530]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. May 15 12:48:10.417693 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 15 12:48:10.419727 systemd-networkd[1472]: eth0: DHCPv4 address 157.180.34.115/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 15 12:48:10.423667 systemd-timesyncd[1455]: Network configuration changed, trying to establish connection. May 15 12:48:10.434591 update_engine[1537]: I20250515 12:48:10.432035 1537 main.cc:92] Flatcar Update Engine starting May 15 12:48:10.435699 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 15 12:48:10.437650 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 15 12:48:10.437798 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 15 12:48:10.437994 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 15 12:48:10.438115 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 15 12:48:10.463081 systemd[1]: motdgen.service: Deactivated successfully. May 15 12:48:10.468498 jq[1538]: true May 15 12:48:10.469025 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 15 12:48:10.469708 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 15 12:48:10.469834 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
May 15 12:48:10.479441 extend-filesystems[1528]: Found loop4 May 15 12:48:10.481325 extend-filesystems[1528]: Found loop5 May 15 12:48:10.481325 extend-filesystems[1528]: Found loop6 May 15 12:48:10.481325 extend-filesystems[1528]: Found loop7 May 15 12:48:10.481325 extend-filesystems[1528]: Found sda May 15 12:48:10.481325 extend-filesystems[1528]: Found sda1 May 15 12:48:10.481325 extend-filesystems[1528]: Found sda2 May 15 12:48:10.481325 extend-filesystems[1528]: Found sda3 May 15 12:48:10.481325 extend-filesystems[1528]: Found usr May 15 12:48:10.481325 extend-filesystems[1528]: Found sda4 May 15 12:48:10.481325 extend-filesystems[1528]: Found sda6 May 15 12:48:10.481325 extend-filesystems[1528]: Found sda7 May 15 12:48:10.481325 extend-filesystems[1528]: Found sda9 May 15 12:48:10.481325 extend-filesystems[1528]: Checking size of /dev/sda9 May 15 12:48:10.562918 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0 May 15 12:48:10.562943 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console May 15 12:48:10.563089 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks May 15 12:48:10.563104 kernel: Console: switching to colour dummy device 80x25 May 15 12:48:10.563113 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 15 12:48:10.563123 kernel: [drm] features: -context_init May 15 12:48:10.563132 kernel: [drm] number of scanouts: 1 May 15 12:48:10.563143 kernel: [drm] number of cap sets: 0 May 15 12:48:10.563154 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt May 15 12:48:10.563260 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD May 15 12:48:10.587738 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 May 15 12:48:10.587761 extend-filesystems[1528]: Resized partition /dev/sda9 May 15 12:48:10.587870 coreos-metadata[1521]: May 15 12:48:10.526 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 May 15 12:48:10.587870 coreos-metadata[1521]: May 15 12:48:10.529 INFO Fetch successful May 15 12:48:10.587870 coreos-metadata[1521]: May 15 12:48:10.531 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 May 15 12:48:10.587870 coreos-metadata[1521]: May 15 12:48:10.532 INFO Fetch successful May 15 12:48:10.484029 (ntainerd)[1551]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 15 12:48:10.557168 dbus-daemon[1522]: [system] SELinux support is enabled May 15 12:48:10.589379 tar[1540]: linux-amd64/LICENSE May 15 12:48:10.589379 tar[1540]: linux-amd64/helm May 15 12:48:10.589520 update_engine[1537]: I20250515 12:48:10.567219 1537 update_check_scheduler.cc:74] Next update check in 2m17s May 15 12:48:10.589545 extend-filesystems[1569]: resize2fs 1.47.2 (1-Jan-2025) May 15 12:48:10.506142 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. May 15 12:48:10.511691 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. May 15 12:48:10.557973 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 15 12:48:10.570151 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 15 12:48:10.570175 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
May 15 12:48:10.570642 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 15 12:48:10.570656 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 15 12:48:10.572324 systemd[1]: Started update-engine.service - Update Engine. May 15 12:48:10.575903 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 15 12:48:10.596675 jq[1558]: true May 15 12:48:10.632025 systemd-logind[1536]: New seat seat0. May 15 12:48:10.633593 kernel: EXT4-fs (sda9): resized filesystem to 9393147 May 15 12:48:10.636395 systemd[1]: Started systemd-logind.service - User Login Management. May 15 12:48:10.641598 extend-filesystems[1569]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required May 15 12:48:10.641598 extend-filesystems[1569]: old_desc_blocks = 1, new_desc_blocks = 5 May 15 12:48:10.641598 extend-filesystems[1569]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. May 15 12:48:10.642134 extend-filesystems[1528]: Resized filesystem in /dev/sda9 May 15 12:48:10.642134 extend-filesystems[1528]: Found sr0 May 15 12:48:10.641818 systemd[1]: extend-filesystems.service: Deactivated successfully. May 15 12:48:10.641974 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 15 12:48:10.662068 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 15 12:48:10.670159 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 15 12:48:10.671392 bash[1604]: Updated "/home/core/.ssh/authorized_keys" May 15 12:48:10.676158 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 15 12:48:10.682016 systemd[1]: Starting sshkeys.service... May 15 12:48:10.711871 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 15 12:48:10.712136 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 15 12:48:10.723002 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 15 12:48:10.724444 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 15 12:48:10.727695 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
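Editor's note: the extend-filesystems entries above record an online ext4 resize of /dev/sda9 from 1617920 to 9393147 blocks at a 4 KiB block size. A quick back-of-the-envelope check (plain Python, numbers taken directly from the log) shows the root filesystem growing from roughly 6.2 GiB to roughly 35.8 GiB:

# Block counts reported by resize2fs in the journal above; ext4 uses 4 KiB blocks here.
OLD_BLOCKS = 1_617_920
NEW_BLOCKS = 9_393_147
BLOCK_SIZE = 4096  # bytes per block

def gib(blocks: int) -> float:
    """Convert an ext4 block count to GiB."""
    return blocks * BLOCK_SIZE / 2**30

print(f"before resize: {gib(OLD_BLOCKS):.1f} GiB")   # ~6.2 GiB
print(f"after resize:  {gib(NEW_BLOCKS):.1f} GiB")   # ~35.8 GiB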
May 15 12:48:10.787743 coreos-metadata[1618]: May 15 12:48:10.787 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 May 15 12:48:10.788775 coreos-metadata[1618]: May 15 12:48:10.788 INFO Fetch successful May 15 12:48:10.796225 unknown[1618]: wrote ssh authorized keys file for user: core May 15 12:48:10.812132 containerd[1551]: time="2025-05-15T12:48:10Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 15 12:48:10.814258 containerd[1551]: time="2025-05-15T12:48:10.814230637Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 May 15 12:48:10.832282 containerd[1551]: time="2025-05-15T12:48:10.832050010Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.257µs" May 15 12:48:10.832282 containerd[1551]: time="2025-05-15T12:48:10.832074275Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 15 12:48:10.832282 containerd[1551]: time="2025-05-15T12:48:10.832091717Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 15 12:48:10.832282 containerd[1551]: time="2025-05-15T12:48:10.832191855Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 15 12:48:10.832282 containerd[1551]: time="2025-05-15T12:48:10.832205150Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 15 12:48:10.832282 containerd[1551]: time="2025-05-15T12:48:10.832223415Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 15 12:48:10.832282 containerd[1551]: time="2025-05-15T12:48:10.832273509Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 15 12:48:10.832282 containerd[1551]: time="2025-05-15T12:48:10.832284529Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 15 12:48:10.832442 containerd[1551]: time="2025-05-15T12:48:10.832425394Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 15 12:48:10.832442 containerd[1551]: time="2025-05-15T12:48:10.832439530Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 15 12:48:10.832475 containerd[1551]: time="2025-05-15T12:48:10.832448807Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 15 12:48:10.832475 containerd[1551]: time="2025-05-15T12:48:10.832455210Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 15 12:48:10.832532 containerd[1551]: time="2025-05-15T12:48:10.832513318Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 15 12:48:10.836090 containerd[1551]: time="2025-05-15T12:48:10.835070153Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 15 12:48:10.836090 containerd[1551]: time="2025-05-15T12:48:10.835125766Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 15 12:48:10.836090 containerd[1551]: time="2025-05-15T12:48:10.835135695Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 15 12:48:10.836090 containerd[1551]: time="2025-05-15T12:48:10.835171312Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 15 12:48:10.836090 containerd[1551]: time="2025-05-15T12:48:10.835615024Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 15 12:48:10.836090 containerd[1551]: time="2025-05-15T12:48:10.835692640Z" level=info msg="metadata content store policy set" policy=shared May 15 12:48:10.837786 update-ssh-keys[1625]: Updated "/home/core/.ssh/authorized_keys" May 15 12:48:10.838859 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.841930797Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.841976482Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.841989487Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.841999546Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842009495Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842016979Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842055100Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842066661Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842075048Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842082271Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842088753Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842098120Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842173602Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 15 12:48:10.842278 containerd[1551]: time="2025-05-15T12:48:10.842190043Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842200603Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842212204Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842220841Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842229858Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842238984Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842246579Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842256247Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842264983Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842272317Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842318394Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842329495Z" level=info msg="Start snapshots syncer" May 15 12:48:10.842493 containerd[1551]: time="2025-05-15T12:48:10.842358509Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 15 12:48:10.842945 containerd[1551]: time="2025-05-15T12:48:10.842710829Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 15 12:48:10.842945 containerd[1551]: time="2025-05-15T12:48:10.842758308Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.842832758Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.842947463Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.842966549Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.842975125Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.842982559Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.842992979Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.843001435Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.843009309Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.843029657Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 15 12:48:10.843044 containerd[1551]: 
time="2025-05-15T12:48:10.843037853Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 15 12:48:10.843044 containerd[1551]: time="2025-05-15T12:48:10.843046298Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843082216Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843093487Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843099859Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843106611Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843112022Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843118133Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843125767Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843177244Z" level=info msg="runtime interface created" May 15 12:48:10.843185 containerd[1551]: time="2025-05-15T12:48:10.843182023Z" level=info msg="created NRI interface" May 15 12:48:10.843304 containerd[1551]: time="2025-05-15T12:48:10.843188595Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 15 12:48:10.843304 containerd[1551]: time="2025-05-15T12:48:10.843196740Z" level=info msg="Connect containerd service" May 15 12:48:10.843304 containerd[1551]: time="2025-05-15T12:48:10.843215656Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 15 12:48:10.846630 kernel: EDAC MC: Ver: 3.0.0 May 15 12:48:10.846236 systemd[1]: Finished sshkeys.service. May 15 12:48:10.846728 containerd[1551]: time="2025-05-15T12:48:10.843897655Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 15 12:48:10.881892 sshd_keygen[1554]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 15 12:48:10.891998 locksmithd[1572]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 15 12:48:10.899144 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 15 12:48:10.901729 systemd[1]: Starting issuegen.service - Generate /run/issue... May 15 12:48:10.927946 systemd[1]: issuegen.service: Deactivated successfully. May 15 12:48:10.928458 systemd[1]: Finished issuegen.service - Generate /run/issue. May 15 12:48:10.933661 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
May 15 12:48:10.961904 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 15 12:48:10.966804 systemd[1]: Started getty@tty1.service - Getty on tty1. May 15 12:48:10.968331 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 15 12:48:10.968759 systemd[1]: Reached target getty.target - Login Prompts. May 15 12:48:10.980205 containerd[1551]: time="2025-05-15T12:48:10.980180389Z" level=info msg="Start subscribing containerd event" May 15 12:48:10.980404 containerd[1551]: time="2025-05-15T12:48:10.980369834Z" level=info msg="Start recovering state" May 15 12:48:10.981075 containerd[1551]: time="2025-05-15T12:48:10.981061090Z" level=info msg="Start event monitor" May 15 12:48:10.981379 containerd[1551]: time="2025-05-15T12:48:10.981342728Z" level=info msg="Start cni network conf syncer for default" May 15 12:48:10.981848 containerd[1551]: time="2025-05-15T12:48:10.981556029Z" level=info msg="Start streaming server" May 15 12:48:10.981848 containerd[1551]: time="2025-05-15T12:48:10.980986721Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 15 12:48:10.981848 containerd[1551]: time="2025-05-15T12:48:10.981812249Z" level=info msg=serving... address=/run/containerd/containerd.sock May 15 12:48:10.982914 containerd[1551]: time="2025-05-15T12:48:10.982159851Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 15 12:48:10.982914 containerd[1551]: time="2025-05-15T12:48:10.982174508Z" level=info msg="runtime interface starting up..." May 15 12:48:10.982914 containerd[1551]: time="2025-05-15T12:48:10.982181822Z" level=info msg="starting plugins..." May 15 12:48:10.982914 containerd[1551]: time="2025-05-15T12:48:10.982194796Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 15 12:48:10.982317 systemd[1]: Started containerd.service - containerd container runtime. May 15 12:48:10.984690 containerd[1551]: time="2025-05-15T12:48:10.984231555Z" level=info msg="containerd successfully booted in 0.174970s" May 15 12:48:10.995560 systemd-logind[1536]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 15 12:48:10.999944 systemd-logind[1536]: Watching system buttons on /dev/input/event3 (Power Button) May 15 12:48:11.011863 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:48:11.065156 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 12:48:11.065631 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:48:11.069855 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 12:48:11.127894 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 12:48:11.320285 tar[1540]: linux-amd64/README.md May 15 12:48:11.333958 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 15 12:48:11.472834 systemd-networkd[1472]: eth1: Gained IPv6LL May 15 12:48:11.473946 systemd-timesyncd[1455]: Network configuration changed, trying to establish connection. May 15 12:48:11.476904 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 15 12:48:11.477829 systemd[1]: Reached target network-online.target - Network is Online. May 15 12:48:11.481045 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:48:11.483877 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
May 15 12:48:11.515430 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 15 12:48:12.113755 systemd-networkd[1472]: eth0: Gained IPv6LL May 15 12:48:12.114273 systemd-timesyncd[1455]: Network configuration changed, trying to establish connection. May 15 12:48:12.380851 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:48:12.381719 systemd[1]: Reached target multi-user.target - Multi-User System. May 15 12:48:12.383148 systemd[1]: Startup finished in 2.969s (kernel) + 9.928s (initrd) + 3.878s (userspace) = 16.776s. May 15 12:48:12.387844 (kubelet)[1703]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:48:12.872805 kubelet[1703]: E0515 12:48:12.872692 1703 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:48:12.874997 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:48:12.875159 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:48:12.875434 systemd[1]: kubelet.service: Consumed 854ms CPU time, 251.9M memory peak. May 15 12:48:23.126338 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 15 12:48:23.128932 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:48:23.249132 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:48:23.257791 (kubelet)[1722]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:48:23.299244 kubelet[1722]: E0515 12:48:23.299166 1722 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:48:23.304452 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:48:23.304609 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:48:23.305062 systemd[1]: kubelet.service: Consumed 128ms CPU time, 104.1M memory peak. May 15 12:48:33.555180 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 15 12:48:33.556980 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:48:33.652347 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:48:33.665833 (kubelet)[1737]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:48:33.698496 kubelet[1737]: E0515 12:48:33.698424 1737 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:48:33.700427 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:48:33.700539 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:48:33.700949 systemd[1]: kubelet.service: Consumed 111ms CPU time, 102.1M memory peak. May 15 12:48:43.379373 systemd-timesyncd[1455]: Contacted time server 217.144.138.234:123 (2.flatcar.pool.ntp.org). May 15 12:48:43.379580 systemd-resolved[1425]: Clock change detected. Flushing caches. May 15 12:48:43.380015 systemd-timesyncd[1455]: Initial clock synchronization to Thu 2025-05-15 12:48:43.379210 UTC. May 15 12:48:44.703948 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 15 12:48:44.708095 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:48:44.860357 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:48:44.880309 (kubelet)[1752]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:48:44.916923 kubelet[1752]: E0515 12:48:44.916829 1752 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:48:44.920109 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:48:44.920246 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:48:44.920541 systemd[1]: kubelet.service: Consumed 138ms CPU time, 101.7M memory peak. May 15 12:48:54.953823 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 15 12:48:54.955286 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:48:55.074984 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:48:55.083393 (kubelet)[1768]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:48:55.118080 kubelet[1768]: E0515 12:48:55.118025 1768 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:48:55.120354 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:48:55.120476 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:48:55.120743 systemd[1]: kubelet.service: Consumed 115ms CPU time, 103.8M memory peak. 
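Editor's note: from this point on the journal shows kubelet in a restart loop; every attempt exits with status 1 because /var/lib/kubelet/config.yaml does not exist yet. That file is normally written later (e.g. by kubeadm when the node joins a cluster), so the loop is expected on an unprovisioned node rather than a fault. A minimal sketch of how these cycles could be summarized from a saved copy of this journal; the filename journal.txt is an assumption, and the patterns only match the message formats visible above:

import re

# Hypothetical path to a saved copy of this journal output, one entry per line (journalctl style).
JOURNAL_FILE = "journal.txt"

restart_re = re.compile(r"kubelet\.service: Scheduled restart job, restart counter is at (\d+)")
fail_re = re.compile(r"failed to load Kubelet config file /var/lib/kubelet/config\.yaml")

counters = []
failures = 0
with open(JOURNAL_FILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        m = restart_re.search(line)
        if m:
            counters.append(int(m.group(1)))
        if fail_re.search(line):
            failures += 1

print(f"kubelet restart attempts seen: {len(counters)} (highest counter: {max(counters, default=0)})")
print(f"config.yaml load failures:     {failures}")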
May 15 12:48:56.492847 update_engine[1537]: I20250515 12:48:56.492699 1537 update_attempter.cc:509] Updating boot flags... May 15 12:49:05.203755 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. May 15 12:49:05.205695 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:49:05.323288 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:49:05.333107 (kubelet)[1803]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:49:05.360652 kubelet[1803]: E0515 12:49:05.360610 1803 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:49:05.361926 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:49:05.362065 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:49:05.362494 systemd[1]: kubelet.service: Consumed 109ms CPU time, 103.7M memory peak. May 15 12:49:15.454700 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. May 15 12:49:15.458284 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:49:15.598520 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:49:15.603142 (kubelet)[1818]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:49:15.630218 kubelet[1818]: E0515 12:49:15.630171 1818 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:49:15.632701 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:49:15.632821 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:49:15.633080 systemd[1]: kubelet.service: Consumed 132ms CPU time, 103.7M memory peak. May 15 12:49:25.703641 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. May 15 12:49:25.705586 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:49:25.809339 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:49:25.820139 (kubelet)[1833]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:49:25.862355 kubelet[1833]: E0515 12:49:25.862286 1833 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:49:25.865150 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:49:25.865283 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:49:25.865595 systemd[1]: kubelet.service: Consumed 120ms CPU time, 104.3M memory peak. 
May 15 12:49:35.954224 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. May 15 12:49:35.956344 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:49:36.082799 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:49:36.095129 (kubelet)[1848]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:49:36.124880 kubelet[1848]: E0515 12:49:36.124827 1848 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:49:36.127015 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:49:36.127254 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:49:36.127513 systemd[1]: kubelet.service: Consumed 109ms CPU time, 103.7M memory peak. May 15 12:49:46.203777 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. May 15 12:49:46.205449 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:49:46.304232 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:49:46.317257 (kubelet)[1863]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:49:46.351421 kubelet[1863]: E0515 12:49:46.351365 1863 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:49:46.353543 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:49:46.353662 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:49:46.353937 systemd[1]: kubelet.service: Consumed 117ms CPU time, 102M memory peak. May 15 12:49:47.594700 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 15 12:49:47.596079 systemd[1]: Started sshd@0-157.180.34.115:22-85.209.134.43:39946.service - OpenSSH per-connection server daemon (85.209.134.43:39946). May 15 12:49:48.165923 sshd[1872]: Invalid user pranav from 85.209.134.43 port 39946 May 15 12:49:48.265514 sshd[1872]: Received disconnect from 85.209.134.43 port 39946:11: Bye Bye [preauth] May 15 12:49:48.265514 sshd[1872]: Disconnected from invalid user pranav 85.209.134.43 port 39946 [preauth] May 15 12:49:48.267541 systemd[1]: sshd@0-157.180.34.115:22-85.209.134.43:39946.service: Deactivated successfully. May 15 12:49:56.453651 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. May 15 12:49:56.455070 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:49:56.581399 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:49:56.592148 (kubelet)[1884]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:49:56.627569 kubelet[1884]: E0515 12:49:56.627509 1884 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:49:56.630330 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:49:56.630482 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:49:56.630973 systemd[1]: kubelet.service: Consumed 131ms CPU time, 101.7M memory peak. May 15 12:50:06.704340 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. May 15 12:50:06.707069 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:50:06.847506 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:50:06.850005 (kubelet)[1900]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:50:06.879649 kubelet[1900]: E0515 12:50:06.879603 1900 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:50:06.881461 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:50:06.881572 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:50:06.881809 systemd[1]: kubelet.service: Consumed 134ms CPU time, 102.8M memory peak. May 15 12:50:16.954196 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. May 15 12:50:16.957118 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:50:17.096348 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:50:17.107324 (kubelet)[1916]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:50:17.162787 kubelet[1916]: E0515 12:50:17.162702 1916 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:50:17.165682 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:50:17.165863 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:50:17.166323 systemd[1]: kubelet.service: Consumed 151ms CPU time, 102.1M memory peak. May 15 12:50:27.204371 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13. May 15 12:50:27.207292 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:50:27.324369 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:50:27.330168 (kubelet)[1931]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:50:27.361462 kubelet[1931]: E0515 12:50:27.361395 1931 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:50:27.362783 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:50:27.362974 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:50:27.363309 systemd[1]: kubelet.service: Consumed 114ms CPU time, 103.5M memory peak. May 15 12:50:28.482139 update_engine[1537]: I20250515 12:50:28.482056 1537 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 15 12:50:28.482139 update_engine[1537]: I20250515 12:50:28.482110 1537 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 15 12:50:28.482537 update_engine[1537]: I20250515 12:50:28.482273 1537 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 15 12:50:28.482654 update_engine[1537]: I20250515 12:50:28.482627 1537 omaha_request_params.cc:62] Current group set to developer May 15 12:50:28.483120 update_engine[1537]: I20250515 12:50:28.482749 1537 update_attempter.cc:499] Already updated boot flags. Skipping. May 15 12:50:28.483120 update_engine[1537]: I20250515 12:50:28.482764 1537 update_attempter.cc:643] Scheduling an action processor start. May 15 12:50:28.483120 update_engine[1537]: I20250515 12:50:28.482781 1537 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 15 12:50:28.483120 update_engine[1537]: I20250515 12:50:28.482816 1537 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 15 12:50:28.483120 update_engine[1537]: I20250515 12:50:28.482914 1537 omaha_request_action.cc:271] Posting an Omaha request to disabled May 15 12:50:28.483120 update_engine[1537]: I20250515 12:50:28.482925 1537 omaha_request_action.cc:272] Request: May 15 12:50:28.483120 update_engine[1537]: May 15 12:50:28.483120 update_engine[1537]: May 15 12:50:28.483120 update_engine[1537]: May 15 12:50:28.483120 update_engine[1537]: May 15 12:50:28.483120 update_engine[1537]: May 15 12:50:28.483120 update_engine[1537]: May 15 12:50:28.483120 update_engine[1537]: May 15 12:50:28.483120 update_engine[1537]: May 15 12:50:28.483120 update_engine[1537]: I20250515 12:50:28.482930 1537 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:50:28.483382 locksmithd[1572]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 15 12:50:28.484074 update_engine[1537]: I20250515 12:50:28.484045 1537 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:50:28.484763 update_engine[1537]: I20250515 12:50:28.484720 1537 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 15 12:50:28.486771 update_engine[1537]: E20250515 12:50:28.486730 1537 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:50:28.486852 update_engine[1537]: I20250515 12:50:28.486814 1537 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 15 12:50:37.453795 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 14. May 15 12:50:37.455340 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:50:37.558926 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:50:37.565146 (kubelet)[1946]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:50:37.601829 kubelet[1946]: E0515 12:50:37.601774 1946 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:50:37.603811 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:50:37.603962 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:50:37.604215 systemd[1]: kubelet.service: Consumed 118ms CPU time, 103.6M memory peak. May 15 12:50:38.458064 update_engine[1537]: I20250515 12:50:38.457934 1537 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:50:38.458379 update_engine[1537]: I20250515 12:50:38.458204 1537 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:50:38.458434 update_engine[1537]: I20250515 12:50:38.458416 1537 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 15 12:50:38.458810 update_engine[1537]: E20250515 12:50:38.458783 1537 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:50:38.458866 update_engine[1537]: I20250515 12:50:38.458845 1537 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 15 12:50:47.703648 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 15. May 15 12:50:47.705024 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:50:47.802704 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:50:47.812136 (kubelet)[1961]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:50:47.841606 kubelet[1961]: E0515 12:50:47.841551 1961 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:50:47.843473 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:50:47.843596 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:50:47.843839 systemd[1]: kubelet.service: Consumed 99ms CPU time, 101M memory peak. 
May 15 12:50:48.458157 update_engine[1537]: I20250515 12:50:48.458009 1537 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:50:48.458686 update_engine[1537]: I20250515 12:50:48.458371 1537 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:50:48.458839 update_engine[1537]: I20250515 12:50:48.458717 1537 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 15 12:50:48.459180 update_engine[1537]: E20250515 12:50:48.459123 1537 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:50:48.459247 update_engine[1537]: I20250515 12:50:48.459194 1537 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 15 12:50:57.953751 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 16. May 15 12:50:57.955210 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:50:58.055655 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:50:58.067100 (kubelet)[1977]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:50:58.093633 kubelet[1977]: E0515 12:50:58.093572 1977 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:50:58.094575 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:50:58.094702 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:50:58.095029 systemd[1]: kubelet.service: Consumed 106ms CPU time, 103.7M memory peak. May 15 12:50:58.458048 update_engine[1537]: I20250515 12:50:58.457962 1537 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:50:58.458450 update_engine[1537]: I20250515 12:50:58.458210 1537 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:50:58.458450 update_engine[1537]: I20250515 12:50:58.458427 1537 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 15 12:50:58.458841 update_engine[1537]: E20250515 12:50:58.458806 1537 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:50:58.458924 update_engine[1537]: I20250515 12:50:58.458847 1537 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 15 12:50:58.458924 update_engine[1537]: I20250515 12:50:58.458854 1537 omaha_request_action.cc:617] Omaha request response: May 15 12:50:58.458988 update_engine[1537]: E20250515 12:50:58.458940 1537 omaha_request_action.cc:636] Omaha request network transfer failed. May 15 12:50:58.458988 update_engine[1537]: I20250515 12:50:58.458958 1537 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 15 12:50:58.458988 update_engine[1537]: I20250515 12:50:58.458962 1537 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 15 12:50:58.458988 update_engine[1537]: I20250515 12:50:58.458966 1537 update_attempter.cc:306] Processing Done. May 15 12:50:58.458988 update_engine[1537]: E20250515 12:50:58.458978 1537 update_attempter.cc:619] Update failed. 
May 15 12:50:58.458988 update_engine[1537]: I20250515 12:50:58.458983 1537 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 15 12:50:58.458988 update_engine[1537]: I20250515 12:50:58.458986 1537 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 15 12:50:58.458988 update_engine[1537]: I20250515 12:50:58.458990 1537 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. May 15 12:50:58.459223 update_engine[1537]: I20250515 12:50:58.459059 1537 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 15 12:50:58.459223 update_engine[1537]: I20250515 12:50:58.459076 1537 omaha_request_action.cc:271] Posting an Omaha request to disabled May 15 12:50:58.459223 update_engine[1537]: I20250515 12:50:58.459080 1537 omaha_request_action.cc:272] Request: May 15 12:50:58.459223 update_engine[1537]: May 15 12:50:58.459223 update_engine[1537]: May 15 12:50:58.459223 update_engine[1537]: May 15 12:50:58.459223 update_engine[1537]: May 15 12:50:58.459223 update_engine[1537]: May 15 12:50:58.459223 update_engine[1537]: May 15 12:50:58.459223 update_engine[1537]: I20250515 12:50:58.459085 1537 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 15 12:50:58.459223 update_engine[1537]: I20250515 12:50:58.459208 1537 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 15 12:50:58.459470 update_engine[1537]: I20250515 12:50:58.459340 1537 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 15 12:50:58.459734 update_engine[1537]: E20250515 12:50:58.459637 1537 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 15 12:50:58.459734 update_engine[1537]: I20250515 12:50:58.459671 1537 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 15 12:50:58.459734 update_engine[1537]: I20250515 12:50:58.459676 1537 omaha_request_action.cc:617] Omaha request response: May 15 12:50:58.459734 update_engine[1537]: I20250515 12:50:58.459681 1537 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 15 12:50:58.459734 update_engine[1537]: I20250515 12:50:58.459684 1537 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 15 12:50:58.459734 update_engine[1537]: I20250515 12:50:58.459687 1537 update_attempter.cc:306] Processing Done. May 15 12:50:58.459734 update_engine[1537]: I20250515 12:50:58.459692 1537 update_attempter.cc:310] Error event sent. May 15 12:50:58.459734 update_engine[1537]: I20250515 12:50:58.459702 1537 update_check_scheduler.cc:74] Next update check in 45m24s May 15 12:50:58.459986 locksmithd[1572]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 15 12:50:58.460344 locksmithd[1572]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 15 12:51:08.203644 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 17. May 15 12:51:08.205026 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:51:08.330859 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
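The update_engine exchange above is a separate, self-contained failure: the Omaha request is posted to the literal host "disabled", libcurl cannot resolve that name, three retries fail, the attempt is recorded as kActionCodeOmahaErrorInHTTPResponse (error code 2000, payload error 37), an error event is sent, and the next check is deferred by 45m24s. The hostname "disabled" suggests the update server on this node was deliberately set to that string to switch automatic updates off (on Flatcar that is commonly configured via the SERVER= entry in /etc/flatcar/update.conf; treat that path as an assumption, the log only shows the resulting hostname). A sketch that pulls the locksmithd status transitions out of journal text, which is the quickest way to see the CHECKING_FOR_UPDATE -> REPORTING_ERROR_EVENT -> IDLE sequence:

#!/usr/bin/env python3
"""Trace update_engine state from locksmithd lines in journal text (sketch).
Grounded in the log above: locksmithd prints key=value status pairs
whenever update_engine changes state. Reads journal text on stdin."""
import re
import sys

STATUS = re.compile(r'locksmithd\[\d+\]: ((?:\w+=\S+ ?)+)')

def main():
    text = sys.stdin.read()
    for m in STATUS.finditer(text):
        fields = dict(kv.split("=", 1) for kv in m.group(1).split())
        print(fields.get("CurrentOperation", "?").strip('"'),
              "progress:", fields.get("Progress", "?"))

if __name__ == "__main__":
    main()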
May 15 12:51:08.333597 (kubelet)[1992]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:51:08.361871 kubelet[1992]: E0515 12:51:08.361813 1992 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:51:08.363068 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:51:08.363226 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:51:08.363696 systemd[1]: kubelet.service: Consumed 119ms CPU time, 103.7M memory peak. May 15 12:51:18.453982 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 18. May 15 12:51:18.455751 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:51:18.557179 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:51:18.560045 (kubelet)[2007]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:51:18.592658 kubelet[2007]: E0515 12:51:18.592598 2007 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:51:18.595135 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:51:18.595269 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:51:18.595525 systemd[1]: kubelet.service: Consumed 115ms CPU time, 102M memory peak. May 15 12:51:28.703624 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 19. May 15 12:51:28.705082 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:51:28.811920 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:51:28.818142 (kubelet)[2023]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:51:28.850727 kubelet[2023]: E0515 12:51:28.850670 2023 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:51:28.852587 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:51:28.852744 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:51:28.853122 systemd[1]: kubelet.service: Consumed 112ms CPU time, 103.4M memory peak. May 15 12:51:38.953779 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 20. May 15 12:51:38.955454 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:51:39.044631 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
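Separately from the fatal config error, every attempt logs "(kubelet)[…]: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS". That warning only means the unit's drop-in references those variables but no environment file defines them, so they expand to empty strings; it is harmless next to the missing config.yaml. With kubeadm's stock drop-in the variables usually come from /var/lib/kubelet/kubeadm-flags.env plus a distribution-specific extra-args file, but those candidate paths are an assumption here and should be checked against the EnvironmentFile= lines actually installed on this node. A small sketch for that check:

#!/usr/bin/env python3
"""Report which kubelet environment files define the referenced variables.
Sketch under assumptions: the candidate paths below are the ones kubeadm's
stock systemd drop-in usually points at; adjust them to the EnvironmentFile=
entries present on this node."""
import os

CANDIDATES = [
    "/var/lib/kubelet/kubeadm-flags.env",   # typically written by kubeadm init/join
    "/etc/default/kubelet",                 # Debian-style extra args (assumption)
    "/etc/sysconfig/kubelet",               # RPM-style extra args (assumption)
]
WANTED = {"KUBELET_KUBEADM_ARGS", "KUBELET_EXTRA_ARGS"}

def defined_vars(path):
    names = set()
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if "=" in line and not line.startswith("#"):
                names.add(line.split("=", 1)[0])
    return names

for path in CANDIDATES:
    if not os.path.exists(path):
        print(f"{path}: missing")
        continue
    found = sorted(defined_vars(path) & WANTED)
    print(f"{path}: defines {found if found else 'none of the referenced variables'}")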
May 15 12:51:39.047551 (kubelet)[2039]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:51:39.079843 kubelet[2039]: E0515 12:51:39.079781 2039 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:51:39.081652 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:51:39.081772 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:51:39.082083 systemd[1]: kubelet.service: Consumed 108ms CPU time, 103.9M memory peak. May 15 12:51:49.203789 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 21. May 15 12:51:49.205290 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:51:49.323661 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:51:49.328257 (kubelet)[2055]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:51:49.363125 kubelet[2055]: E0515 12:51:49.363072 2055 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:51:49.365172 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:51:49.365289 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:51:49.365545 systemd[1]: kubelet.service: Consumed 117ms CPU time, 103M memory peak. May 15 12:51:59.453623 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 22. May 15 12:51:59.455015 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:51:59.579608 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:51:59.588141 (kubelet)[2069]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:51:59.623544 kubelet[2069]: E0515 12:51:59.623491 2069 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:51:59.625490 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:51:59.625600 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:51:59.625835 systemd[1]: kubelet.service: Consumed 128ms CPU time, 103.7M memory peak. May 15 12:52:09.704260 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 23. May 15 12:52:09.707046 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:52:09.850098 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:52:09.852651 (kubelet)[2085]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:52:09.881192 kubelet[2085]: E0515 12:52:09.881113 2085 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:52:09.883124 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:52:09.883235 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:52:09.883651 systemd[1]: kubelet.service: Consumed 131ms CPU time, 103.7M memory peak. May 15 12:52:19.953541 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 24. May 15 12:52:19.956033 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:52:20.052439 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:52:20.054932 (kubelet)[2099]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:52:20.082874 kubelet[2099]: E0515 12:52:20.082822 2099 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:52:20.084736 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:52:20.084863 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:52:20.085136 systemd[1]: kubelet.service: Consumed 94ms CPU time, 103.1M memory peak. May 15 12:52:30.204325 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 25. May 15 12:52:30.206981 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:52:30.333166 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:52:30.344139 (kubelet)[2114]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:52:30.374686 kubelet[2114]: E0515 12:52:30.374614 2114 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:52:30.375989 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:52:30.376188 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:52:30.376557 systemd[1]: kubelet.service: Consumed 120ms CPU time, 103M memory peak. May 15 12:52:40.453756 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 26. May 15 12:52:40.455788 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:52:40.553482 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:52:40.558156 (kubelet)[2129]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:52:40.590773 kubelet[2129]: E0515 12:52:40.590708 2129 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:52:40.593122 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:52:40.593249 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:52:40.593506 systemd[1]: kubelet.service: Consumed 113ms CPU time, 103.7M memory peak. May 15 12:52:50.704189 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 27. May 15 12:52:50.706387 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:52:50.857118 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:52:50.866169 (kubelet)[2144]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:52:50.906221 kubelet[2144]: E0515 12:52:50.906163 2144 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:52:50.909128 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:52:50.909279 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:52:50.909679 systemd[1]: kubelet.service: Consumed 147ms CPU time, 103.8M memory peak. May 15 12:53:00.953865 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 28. May 15 12:53:00.955473 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:53:01.069729 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:53:01.072403 (kubelet)[2159]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:53:01.108160 kubelet[2159]: E0515 12:53:01.108109 2159 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:53:01.110383 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:53:01.110646 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:53:01.111189 systemd[1]: kubelet.service: Consumed 114ms CPU time, 103.7M memory peak. May 15 12:53:11.204184 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 29. May 15 12:53:11.206790 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:53:11.351764 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:53:11.363096 (kubelet)[2175]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:53:11.393838 kubelet[2175]: E0515 12:53:11.393791 2175 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:53:11.395669 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:53:11.395782 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:53:11.396053 systemd[1]: kubelet.service: Consumed 137ms CPU time, 103.7M memory peak. May 15 12:53:21.453650 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 30. May 15 12:53:21.454984 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:53:21.577499 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:53:21.585261 (kubelet)[2192]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:53:21.626388 kubelet[2192]: E0515 12:53:21.626313 2192 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:53:21.628941 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:53:21.629073 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:53:21.629353 systemd[1]: kubelet.service: Consumed 128ms CPU time, 101M memory peak. May 15 12:53:31.703628 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 31. May 15 12:53:31.705076 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:53:31.825730 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:53:31.834127 (kubelet)[2208]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:53:31.866239 kubelet[2208]: E0515 12:53:31.866186 2208 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:53:31.868024 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:53:31.868207 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:53:31.868530 systemd[1]: kubelet.service: Consumed 115ms CPU time, 101.8M memory peak. May 15 12:53:41.954287 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 32. May 15 12:53:41.956623 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:53:42.069472 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:53:42.081153 (kubelet)[2226]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:53:42.113469 kubelet[2226]: E0515 12:53:42.113416 2226 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:53:42.115817 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:53:42.116128 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:53:42.116395 systemd[1]: kubelet.service: Consumed 115ms CPU time, 103.7M memory peak. May 15 12:53:52.204277 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 33. May 15 12:53:52.206935 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:53:52.357671 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:53:52.366289 (kubelet)[2241]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:53:52.402194 kubelet[2241]: E0515 12:53:52.402133 2241 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:53:52.404005 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:53:52.404231 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:53:52.404656 systemd[1]: kubelet.service: Consumed 149ms CPU time, 103.6M memory peak. May 15 12:53:53.010418 systemd[1]: Started sshd@1-157.180.34.115:22-85.209.134.43:56818.service - OpenSSH per-connection server daemon (85.209.134.43:56818). May 15 12:53:53.575698 sshd[2250]: Invalid user ftpuser from 85.209.134.43 port 56818 May 15 12:53:53.676826 sshd[2250]: Received disconnect from 85.209.134.43 port 56818:11: Bye Bye [preauth] May 15 12:53:53.676826 sshd[2250]: Disconnected from invalid user ftpuser 85.209.134.43 port 56818 [preauth] May 15 12:53:53.680365 systemd[1]: sshd@1-157.180.34.115:22-85.209.134.43:56818.service: Deactivated successfully. May 15 12:54:02.453735 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 34. May 15 12:54:02.455390 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:54:02.573801 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
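The sshd entries interleaved with the kubelet loop are routine pre-auth probes: 85.209.134.43 has now tried the usernames "pranav" and "ftpuser" and disconnected before authenticating each time. A sketch, with patterns keyed to the exact sshd wording above, that tallies such probes per source address from journal text:

#!/usr/bin/env python3
"""Tally pre-auth SSH probes per source address from journal text (sketch).
Matches only the sshd line shapes visible in the log above; anything else
is ignored. Reads journal text on stdin."""
import re
import sys
from collections import Counter

PATTERNS = [
    re.compile(r'Invalid user (?P<user>\S+) from (?P<addr>[\d.]+) port \d+'),
    re.compile(r'Disconnected from (?:invalid user |authenticating user )?'
               r'(?P<user>\S+) (?P<addr>[\d.]+) port \d+'),
]

def main():
    text = sys.stdin.read()
    per_addr = Counter()
    users = {}
    for pat in PATTERNS:
        for m in pat.finditer(text):
            per_addr[m["addr"]] += 1
            users.setdefault(m["addr"], set()).add(m["user"])
    for addr, count in per_addr.most_common():
        print(f"{addr}: {count} events, usernames tried: {sorted(users[addr])}")

if __name__ == "__main__":
    main()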
May 15 12:54:02.590209 (kubelet)[2262]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:54:02.622146 kubelet[2262]: E0515 12:54:02.622101 2262 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:54:02.624087 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:54:02.624228 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:54:02.624522 systemd[1]: kubelet.service: Consumed 119ms CPU time, 101.6M memory peak. May 15 12:54:12.703796 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 35. May 15 12:54:12.705299 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:54:12.850175 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:54:12.854009 (kubelet)[2277]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:54:12.891612 kubelet[2277]: E0515 12:54:12.891547 2277 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:54:12.893787 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:54:12.893989 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:54:12.894412 systemd[1]: kubelet.service: Consumed 142ms CPU time, 101.7M memory peak. May 15 12:54:22.953787 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 36. May 15 12:54:22.955370 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:54:23.051331 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:54:23.058132 (kubelet)[2292]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:54:23.094084 kubelet[2292]: E0515 12:54:23.094029 2292 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:54:23.096316 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:54:23.096437 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:54:23.096711 systemd[1]: kubelet.service: Consumed 118ms CPU time, 101.9M memory peak. May 15 12:54:33.204320 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 37. May 15 12:54:33.206559 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:54:33.325665 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:54:33.329006 (kubelet)[2307]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:54:33.358940 kubelet[2307]: E0515 12:54:33.358870 2307 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:54:33.360749 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:54:33.360866 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:54:33.361142 systemd[1]: kubelet.service: Consumed 112ms CPU time, 105.6M memory peak. May 15 12:54:43.454296 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 38. May 15 12:54:43.456520 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:54:43.605831 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:54:43.611144 (kubelet)[2322]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:54:43.646673 kubelet[2322]: E0515 12:54:43.646594 2322 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:54:43.648555 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:54:43.648665 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:54:43.648885 systemd[1]: kubelet.service: Consumed 139ms CPU time, 103.6M memory peak. May 15 12:54:53.703650 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 39. May 15 12:54:53.705566 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:54:53.808673 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:54:53.821160 (kubelet)[2337]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:54:53.851623 kubelet[2337]: E0515 12:54:53.851572 2337 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:54:53.853300 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:54:53.853410 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:54:53.853789 systemd[1]: kubelet.service: Consumed 111ms CPU time, 103.8M memory peak. May 15 12:55:03.954028 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 40. May 15 12:55:03.955459 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:55:04.057635 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:55:04.060042 (kubelet)[2352]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:55:04.088107 kubelet[2352]: E0515 12:55:04.088067 2352 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:55:04.090156 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:55:04.090359 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:55:04.090598 systemd[1]: kubelet.service: Consumed 103ms CPU time, 103.8M memory peak. May 15 12:55:14.203693 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 41. May 15 12:55:14.205627 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:55:14.313595 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:55:14.316424 (kubelet)[2367]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:55:14.346868 kubelet[2367]: E0515 12:55:14.346813 2367 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:55:14.347915 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:55:14.348043 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:55:14.348307 systemd[1]: kubelet.service: Consumed 109ms CPU time, 102.1M memory peak. May 15 12:55:24.454306 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 42. May 15 12:55:24.456493 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:55:24.573950 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:55:24.579334 (kubelet)[2382]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:55:24.609347 kubelet[2382]: E0515 12:55:24.609292 2382 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:55:24.611091 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:55:24.611201 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:55:24.611424 systemd[1]: kubelet.service: Consumed 111ms CPU time, 101.7M memory peak. May 15 12:55:34.703607 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 43. May 15 12:55:34.705501 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:55:34.815019 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:55:34.823088 (kubelet)[2397]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:55:34.863744 kubelet[2397]: E0515 12:55:34.863669 2397 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:55:34.866523 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:55:34.866647 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:55:34.866965 systemd[1]: kubelet.service: Consumed 120ms CPU time, 101.9M memory peak. May 15 12:55:44.953964 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 44. May 15 12:55:44.955517 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:55:45.056729 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:55:45.065086 (kubelet)[2413]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:55:45.093149 kubelet[2413]: E0515 12:55:45.093097 2413 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:55:45.095067 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:55:45.095183 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:55:45.095420 systemd[1]: kubelet.service: Consumed 109ms CPU time, 103.5M memory peak. May 15 12:55:55.204183 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 45. May 15 12:55:55.206703 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:55:55.372812 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:55:55.379073 (kubelet)[2428]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:55:55.416067 kubelet[2428]: E0515 12:55:55.416002 2428 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:55:55.418934 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:55:55.419101 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:55:55.419391 systemd[1]: kubelet.service: Consumed 147ms CPU time, 103.6M memory peak. May 15 12:56:05.453677 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 46. May 15 12:56:05.455354 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:56:05.561789 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:56:05.570089 (kubelet)[2443]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:56:05.601150 kubelet[2443]: E0515 12:56:05.601096 2443 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:56:05.603094 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:56:05.603216 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:56:05.603459 systemd[1]: kubelet.service: Consumed 113ms CPU time, 103.6M memory peak. May 15 12:56:15.703774 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 47. May 15 12:56:15.705351 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:56:15.832344 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:56:15.845279 (kubelet)[2458]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:56:15.881946 kubelet[2458]: E0515 12:56:15.881862 2458 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:56:15.884100 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:56:15.884235 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:56:15.884567 systemd[1]: kubelet.service: Consumed 129ms CPU time, 103.6M memory peak. May 15 12:56:25.953707 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 48. May 15 12:56:25.955315 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:56:26.071468 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:56:26.086248 (kubelet)[2473]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:56:26.126045 kubelet[2473]: E0515 12:56:26.125994 2473 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:56:26.127621 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:56:26.127751 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:56:26.128064 systemd[1]: kubelet.service: Consumed 131ms CPU time, 103.6M memory peak. May 15 12:56:36.203789 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 49. May 15 12:56:36.205279 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:56:36.311648 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:56:36.322133 (kubelet)[2488]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:56:36.356074 kubelet[2488]: E0515 12:56:36.356020 2488 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:56:36.357991 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:56:36.358124 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:56:36.358418 systemd[1]: kubelet.service: Consumed 113ms CPU time, 101.4M memory peak. May 15 12:56:46.453971 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 50. May 15 12:56:46.455877 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:56:46.573512 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:56:46.575957 (kubelet)[2503]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:56:46.610552 kubelet[2503]: E0515 12:56:46.610491 2503 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:56:46.613727 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:56:46.613913 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:56:46.614463 systemd[1]: kubelet.service: Consumed 118ms CPU time, 103.8M memory peak. May 15 12:56:56.703673 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 51. May 15 12:56:56.705610 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:56:56.816698 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:56:56.822080 (kubelet)[2518]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:56:56.852030 kubelet[2518]: E0515 12:56:56.851975 2518 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:56:56.853959 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:56:56.854073 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:56:56.854329 systemd[1]: kubelet.service: Consumed 111ms CPU time, 103.8M memory peak. May 15 12:57:06.953833 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 52. May 15 12:57:06.955377 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:57:07.058456 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
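Each failed attempt ends with an accounting line of the form "Consumed ~100-150ms CPU time, ~101-106M memory peak", so the crash loop costs little CPU per cycle but repeatedly initialises the full kubelet process before it bails out. A sketch that sums those accounting lines to put a number on the cumulative cost:

#!/usr/bin/env python3
"""Sum the CPU time consumed by failed kubelet start attempts (sketch).
Parses the 'kubelet.service: Consumed NNNms CPU time, NNN.NM memory peak.'
accounting lines seen above; other units are not handled."""
import re
import sys

CONSUMED = re.compile(
    r'kubelet\.service: Consumed (?P<cpu>\d+)ms CPU time, '
    r'(?P<mem>[\d.]+)M memory peak\.'
)

def main():
    text = sys.stdin.read()
    cpu_ms, peaks = 0, []
    for m in CONSUMED.finditer(text):
        cpu_ms += int(m["cpu"])
        peaks.append(float(m["mem"]))
    if peaks:
        print(f"{len(peaks)} failed attempts, {cpu_ms}ms CPU total, "
              f"max memory peak {max(peaks):.1f}M")

if __name__ == "__main__":
    main()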
May 15 12:57:07.071189 (kubelet)[2533]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:57:07.104621 kubelet[2533]: E0515 12:57:07.104554 2533 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:57:07.106991 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:57:07.107120 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:57:07.107367 systemd[1]: kubelet.service: Consumed 114ms CPU time, 103.7M memory peak. May 15 12:57:17.203631 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 53. May 15 12:57:17.205102 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:57:17.304955 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:57:17.311112 (kubelet)[2548]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:57:17.339288 kubelet[2548]: E0515 12:57:17.339223 2548 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:57:17.341695 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:57:17.341825 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:57:17.342173 systemd[1]: kubelet.service: Consumed 109ms CPU time, 103.7M memory peak. May 15 12:57:27.454463 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 54. May 15 12:57:27.457688 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:57:27.576772 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:57:27.579851 (kubelet)[2563]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:57:27.612077 kubelet[2563]: E0515 12:57:27.612019 2563 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:57:27.614177 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:57:27.614329 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:57:27.614594 systemd[1]: kubelet.service: Consumed 121ms CPU time, 103.2M memory peak. May 15 12:57:30.634334 systemd[1]: Started sshd@2-157.180.34.115:22-185.156.73.233:16800.service - OpenSSH per-connection server daemon (185.156.73.233:16800). 
May 15 12:57:31.429834 sshd[2571]: Invalid user admin from 185.156.73.233 port 16800 May 15 12:57:31.471030 sshd[2571]: Connection closed by invalid user admin 185.156.73.233 port 16800 [preauth] May 15 12:57:31.472658 systemd[1]: sshd@2-157.180.34.115:22-185.156.73.233:16800.service: Deactivated successfully. May 15 12:57:37.703667 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 55. May 15 12:57:37.705621 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:57:37.818807 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:57:37.825091 (kubelet)[2583]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:57:37.857436 kubelet[2583]: E0515 12:57:37.857395 2583 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:57:37.859285 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:57:37.859403 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:57:37.859654 systemd[1]: kubelet.service: Consumed 114ms CPU time, 103.8M memory peak. May 15 12:57:47.953759 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 56. May 15 12:57:47.955444 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:57:48.062773 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:57:48.071095 (kubelet)[2598]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:57:48.099429 kubelet[2598]: E0515 12:57:48.099394 2598 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:57:48.101273 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:57:48.101406 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:57:48.101714 systemd[1]: kubelet.service: Consumed 106ms CPU time, 103.8M memory peak. May 15 12:57:58.203675 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 57. May 15 12:57:58.205443 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:57:58.342613 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 12:57:58.345255 (kubelet)[2613]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:57:58.374167 kubelet[2613]: E0515 12:57:58.374117 2613 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:57:58.376239 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:57:58.376358 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:57:58.376580 systemd[1]: kubelet.service: Consumed 108ms CPU time, 101.6M memory peak. May 15 12:57:58.865568 systemd[1]: Started sshd@3-157.180.34.115:22-85.209.134.43:52630.service - OpenSSH per-connection server daemon (85.209.134.43:52630). May 15 12:57:59.523503 sshd[2621]: Received disconnect from 85.209.134.43 port 52630:11: Bye Bye [preauth] May 15 12:57:59.523503 sshd[2621]: Disconnected from authenticating user root 85.209.134.43 port 52630 [preauth] May 15 12:57:59.525565 systemd[1]: sshd@3-157.180.34.115:22-85.209.134.43:52630.service: Deactivated successfully. May 15 12:58:08.453654 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 58. May 15 12:58:08.455343 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:58:08.561721 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:58:08.570105 (kubelet)[2633]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:58:08.602809 kubelet[2633]: E0515 12:58:08.602742 2633 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:58:08.603942 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:58:08.604068 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:58:08.604342 systemd[1]: kubelet.service: Consumed 112ms CPU time, 104.1M memory peak. May 15 12:58:18.703846 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 59. May 15 12:58:18.705636 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:58:18.814780 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:58:18.819146 (kubelet)[2650]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:58:18.849123 kubelet[2650]: E0515 12:58:18.849059 2650 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:58:18.851961 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:58:18.852083 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
May 15 12:58:18.852364 systemd[1]: kubelet.service: Consumed 113ms CPU time, 103.6M memory peak. May 15 12:58:28.953679 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 60. May 15 12:58:28.955267 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:58:29.095425 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:58:29.101223 (kubelet)[2665]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:58:29.147167 kubelet[2665]: E0515 12:58:29.147100 2665 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:58:29.149312 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:58:29.149472 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:58:29.149785 systemd[1]: kubelet.service: Consumed 139ms CPU time, 103.5M memory peak. May 15 12:58:39.203800 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 61. May 15 12:58:39.205548 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:58:39.340709 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:58:39.353130 (kubelet)[2680]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:58:39.383963 kubelet[2680]: E0515 12:58:39.383911 2680 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:58:39.386030 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:58:39.386147 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:58:39.386400 systemd[1]: kubelet.service: Consumed 136ms CPU time, 103.2M memory peak. May 15 12:58:49.454252 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 62. May 15 12:58:49.456609 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:58:49.602500 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:58:49.620120 (kubelet)[2697]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:58:49.651757 kubelet[2697]: E0515 12:58:49.651710 2697 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:58:49.653582 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:58:49.653694 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:58:49.653955 systemd[1]: kubelet.service: Consumed 135ms CPU time, 101.6M memory peak. 
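
Annotation: the block above and the blocks that follow repeat one pattern. kubelet exits immediately because /var/lib/kubelet/config.yaml does not exist (on a kubeadm-managed node that file is normally written by `kubeadm init` or `kubeadm join`), and systemd reschedules the unit roughly every ten seconds, incrementing the restart counter. The KUBELET_EXTRA_ARGS / KUBELET_KUBEADM_ARGS notice only means the environment files referenced by the unit's drop-in are empty or missing, which is expected before kubeadm has run. Below is a minimal sketch, not part of the log, for summarizing such a crash loop from exported journal text; the input file and the exact line shapes are assumptions taken from the entries above.

#!/usr/bin/env python3
"""Summarize a kubelet restart loop from plain-text journal output.

Assumed line shapes, copied from the entries above:
  "systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 60."
  'kubelet[2665]: E0515 ... "command failed" err="failed to load kubelet config file, ..."'
"""
import re
import sys
from collections import Counter

RESTART_RE = re.compile(r"kubelet\.service: Scheduled restart job, restart counter is at (\d+)\.")
# err="..." may contain escaped quotes (\"), so allow backslash escapes inside the value.
ERROR_RE = re.compile(r'"command failed" err="((?:[^"\\]|\\.)*)"')

def summarize(text: str) -> None:
    counters = [int(m.group(1)) for m in RESTART_RE.finditer(text)]
    errors = Counter(m.group(1) for m in ERROR_RE.finditer(text))
    if counters:
        print(f"restart counter ran from {min(counters)} to {max(counters)} "
              f"({len(counters)} scheduled restarts in this excerpt)")
    for err, n in errors.most_common():
        print(f"{n}x {err}")

if __name__ == "__main__":
    # e.g.: journalctl -u kubelet --no-pager > kubelet.log && python3 summarize.py kubelet.log
    summarize(open(sys.argv[1], encoding="utf-8", errors="replace").read())
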
May 15 12:58:59.703663 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 63. May 15 12:58:59.705079 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:58:59.814634 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:58:59.820076 (kubelet)[2712]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:58:59.849857 kubelet[2712]: E0515 12:58:59.849822 2712 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:58:59.851643 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:58:59.851774 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:58:59.852247 systemd[1]: kubelet.service: Consumed 110ms CPU time, 101.9M memory peak. May 15 12:59:09.954059 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 64. May 15 12:59:09.955550 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:59:10.069642 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:59:10.075122 (kubelet)[2728]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:59:10.102406 kubelet[2728]: E0515 12:59:10.102361 2728 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:59:10.104519 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:59:10.104752 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:59:10.105179 systemd[1]: kubelet.service: Consumed 114ms CPU time, 103.8M memory peak. May 15 12:59:20.203632 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 65. May 15 12:59:20.205407 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:59:20.325864 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:59:20.332140 (kubelet)[2744]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:59:20.364767 kubelet[2744]: E0515 12:59:20.364711 2744 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:59:20.366873 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:59:20.367018 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:59:20.367283 systemd[1]: kubelet.service: Consumed 116ms CPU time, 103.7M memory peak. May 15 12:59:30.453849 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 66. 
May 15 12:59:30.455731 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:59:30.566841 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:59:30.569057 (kubelet)[2759]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:59:30.595128 kubelet[2759]: E0515 12:59:30.595074 2759 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:59:30.597253 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:59:30.597374 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:59:30.597607 systemd[1]: kubelet.service: Consumed 106ms CPU time, 103.7M memory peak. May 15 12:59:40.703837 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 67. May 15 12:59:40.705562 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:59:40.833530 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:59:40.836274 (kubelet)[2774]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:59:40.867298 kubelet[2774]: E0515 12:59:40.867252 2774 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:59:40.869329 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:59:40.869452 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:59:40.869886 systemd[1]: kubelet.service: Consumed 114ms CPU time, 103.7M memory peak. May 15 12:59:50.953613 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 68. May 15 12:59:50.954940 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 12:59:51.077880 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 12:59:51.084108 (kubelet)[2790]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 12:59:51.120246 kubelet[2790]: E0515 12:59:51.120188 2790 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 12:59:51.122186 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 12:59:51.122321 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 12:59:51.122679 systemd[1]: kubelet.service: Consumed 114ms CPU time, 105.6M memory peak. May 15 13:00:01.203771 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 69. May 15 13:00:01.205202 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 15 13:00:01.308703 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:00:01.316146 (kubelet)[2805]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:00:01.348757 kubelet[2805]: E0515 13:00:01.348706 2805 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:00:01.350555 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:00:01.350672 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:00:01.350960 systemd[1]: kubelet.service: Consumed 111ms CPU time, 103.8M memory peak. May 15 13:00:11.453692 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 70. May 15 13:00:11.455339 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:00:11.582551 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:00:11.594215 (kubelet)[2820]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:00:11.624126 kubelet[2820]: E0515 13:00:11.624070 2820 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:00:11.626087 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:00:11.626206 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:00:11.626446 systemd[1]: kubelet.service: Consumed 111ms CPU time, 101.7M memory peak. May 15 13:00:21.703866 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 71. May 15 13:00:21.705416 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:00:21.812115 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:00:21.816137 (kubelet)[2836]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:00:21.848577 kubelet[2836]: E0515 13:00:21.848510 2836 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:00:21.851263 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:00:21.851393 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:00:21.851661 systemd[1]: kubelet.service: Consumed 112ms CPU time, 102.9M memory peak. May 15 13:00:31.953653 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 72. May 15 13:00:31.955533 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:00:32.056018 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 13:00:32.062131 (kubelet)[2851]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:00:32.093634 kubelet[2851]: E0515 13:00:32.093577 2851 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:00:32.095459 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:00:32.095623 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:00:32.095998 systemd[1]: kubelet.service: Consumed 109ms CPU time, 103.8M memory peak. May 15 13:00:42.203817 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 73. May 15 13:00:42.205279 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:00:42.338707 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:00:42.351146 (kubelet)[2866]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:00:42.381407 kubelet[2866]: E0515 13:00:42.381351 2866 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:00:42.383107 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:00:42.383225 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:00:42.383467 systemd[1]: kubelet.service: Consumed 112ms CPU time, 103.9M memory peak. May 15 13:00:52.453719 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 74. May 15 13:00:52.455010 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:00:52.575007 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:00:52.581126 (kubelet)[2882]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:00:52.610600 kubelet[2882]: E0515 13:00:52.610544 2882 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:00:52.612681 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:00:52.612809 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:00:52.613252 systemd[1]: kubelet.service: Consumed 123ms CPU time, 103.6M memory peak. May 15 13:01:02.703672 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 75. May 15 13:01:02.705461 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:01:02.852933 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 13:01:02.857185 (kubelet)[2898]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:01:02.892902 kubelet[2898]: E0515 13:01:02.892829 2898 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:01:02.894139 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:01:02.894264 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:01:02.894499 systemd[1]: kubelet.service: Consumed 119ms CPU time, 102M memory peak. May 15 13:01:12.954360 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 76. May 15 13:01:12.957508 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:01:13.116220 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:01:13.119174 (kubelet)[2913]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:01:13.148762 kubelet[2913]: E0515 13:01:13.148702 2913 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:01:13.150763 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:01:13.150876 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:01:13.151146 systemd[1]: kubelet.service: Consumed 145ms CPU time, 103.3M memory peak. May 15 13:01:23.204211 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 77. May 15 13:01:23.207156 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:01:23.359640 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:01:23.364131 (kubelet)[2929]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:01:23.395432 kubelet[2929]: E0515 13:01:23.395369 2929 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:01:23.397442 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:01:23.397601 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:01:23.397997 systemd[1]: kubelet.service: Consumed 132ms CPU time, 103.2M memory peak. May 15 13:01:33.453794 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 78. May 15 13:01:33.455531 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:01:33.551618 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 13:01:33.563140 (kubelet)[2945]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:01:33.592449 kubelet[2945]: E0515 13:01:33.592388 2945 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:01:33.593865 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:01:33.594007 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:01:33.594388 systemd[1]: kubelet.service: Consumed 108ms CPU time, 103.7M memory peak. May 15 13:01:43.704131 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 79. May 15 13:01:43.706305 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:01:43.873509 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:01:43.876026 (kubelet)[2960]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:01:43.902704 kubelet[2960]: E0515 13:01:43.902633 2960 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:01:43.904601 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:01:43.904712 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:01:43.904978 systemd[1]: kubelet.service: Consumed 139ms CPU time, 103.7M memory peak. May 15 13:01:53.953977 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 80. May 15 13:01:53.955696 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:01:54.055775 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:01:54.058269 (kubelet)[2975]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:01:54.087671 kubelet[2975]: E0515 13:01:54.087609 2975 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:01:54.090130 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:01:54.090253 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:01:54.090500 systemd[1]: kubelet.service: Consumed 105ms CPU time, 103.5M memory peak. May 15 13:02:00.046155 systemd[1]: Started sshd@4-157.180.34.115:22-85.209.134.43:51244.service - OpenSSH per-connection server daemon (85.209.134.43:51244). 
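
Annotation: sshd@2 through sshd@4 above and just below are transient per-connection units for unauthenticated probes: an invalid "admin" user from 185.156.73.233, a pre-auth disconnect while authenticating as root from 85.209.134.43, and an invalid "joey" user from the same host. Each connection ends before authentication and the unit is deactivated. As an illustrative sketch under the same assumptions as above (plain-text journal input), such attempts can be listed like this:

#!/usr/bin/env python3
"""List failed pre-auth SSH attempts from plain-text journal output.

Assumed line shapes, copied from the sshd entries in this log:
  "sshd[2571]: Invalid user admin from 185.156.73.233 port 16800"
  "sshd[2621]: Disconnected from authenticating user root 85.209.134.43 port 52630 [preauth]"
"""
import re
import sys

INVALID_RE = re.compile(r"sshd\[\d+\]: Invalid user (\S+) from (\S+) port (\d+)")
PREAUTH_RE = re.compile(
    r"sshd\[\d+\]: Disconnected from (?:invalid|authenticating) user (\S+) (\S+) port (\d+) \[preauth\]"
)

def report(text: str) -> None:
    for regex, kind in ((INVALID_RE, "invalid user"), (PREAUTH_RE, "preauth disconnect")):
        for user, addr, port in regex.findall(text):
            print(f"{kind}: user={user} from={addr} port={port}")

if __name__ == "__main__":
    report(open(sys.argv[1], encoding="utf-8", errors="replace").read())
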
May 15 13:02:00.633878 sshd[2983]: Invalid user joey from 85.209.134.43 port 51244 May 15 13:02:00.734369 sshd[2983]: Received disconnect from 85.209.134.43 port 51244:11: Bye Bye [preauth] May 15 13:02:00.734369 sshd[2983]: Disconnected from invalid user joey 85.209.134.43 port 51244 [preauth] May 15 13:02:00.737025 systemd[1]: sshd@4-157.180.34.115:22-85.209.134.43:51244.service: Deactivated successfully. May 15 13:02:04.203832 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 81. May 15 13:02:04.205446 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:02:04.326731 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:02:04.337215 (kubelet)[2995]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:02:04.375561 kubelet[2995]: E0515 13:02:04.375501 2995 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:02:04.377827 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:02:04.377973 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:02:04.378215 systemd[1]: kubelet.service: Consumed 124ms CPU time, 103.7M memory peak. May 15 13:02:14.453730 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 82. May 15 13:02:14.455383 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:02:14.562374 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:02:14.568107 (kubelet)[3010]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:02:14.598324 kubelet[3010]: E0515 13:02:14.598254 3010 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:02:14.599996 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:02:14.600139 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:02:14.600480 systemd[1]: kubelet.service: Consumed 108ms CPU time, 103.8M memory peak. May 15 13:02:24.703980 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 83. May 15 13:02:24.705550 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:02:24.804408 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 13:02:24.810118 (kubelet)[3025]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:02:24.839515 kubelet[3025]: E0515 13:02:24.839463 3025 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:02:24.841283 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:02:24.841415 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:02:24.841698 systemd[1]: kubelet.service: Consumed 107ms CPU time, 103.8M memory peak. May 15 13:02:34.954098 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 84. May 15 13:02:34.956286 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:02:35.087847 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:02:35.093378 (kubelet)[3040]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:02:35.133822 kubelet[3040]: E0515 13:02:35.133760 3040 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:02:35.136289 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:02:35.136415 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:02:35.136688 systemd[1]: kubelet.service: Consumed 131ms CPU time, 103.5M memory peak. May 15 13:02:45.204351 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 85. May 15 13:02:45.206956 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:02:45.359768 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:02:45.362456 (kubelet)[3055]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:02:45.394467 kubelet[3055]: E0515 13:02:45.394407 3055 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:02:45.396359 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:02:45.396485 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:02:45.396730 systemd[1]: kubelet.service: Consumed 139ms CPU time, 103.7M memory peak. May 15 13:02:55.454192 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 86. May 15 13:02:55.456473 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:02:55.599485 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 13:02:55.602223 (kubelet)[3070]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:02:55.633312 kubelet[3070]: E0515 13:02:55.633261 3070 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:02:55.635221 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:02:55.635420 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:02:55.635782 systemd[1]: kubelet.service: Consumed 135ms CPU time, 100.9M memory peak. May 15 13:02:57.706063 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories... May 15 13:02:57.757159 systemd-tmpfiles[3077]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. May 15 13:02:57.757193 systemd-tmpfiles[3077]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. May 15 13:02:57.757456 systemd-tmpfiles[3077]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 15 13:02:57.757917 systemd-tmpfiles[3077]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 15 13:02:57.758847 systemd-tmpfiles[3077]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 15 13:02:57.759237 systemd-tmpfiles[3077]: ACLs are not supported, ignoring. May 15 13:02:57.759326 systemd-tmpfiles[3077]: ACLs are not supported, ignoring. May 15 13:02:57.766212 systemd-tmpfiles[3077]: Detected autofs mount point /boot during canonicalization of boot. May 15 13:02:57.766234 systemd-tmpfiles[3077]: Skipping /boot May 15 13:02:57.771835 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully. May 15 13:02:57.772177 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories. May 15 13:02:57.775084 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully. May 15 13:03:05.703840 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 87. May 15 13:03:05.705651 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:03:05.828069 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:03:05.838113 (kubelet)[3088]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:03:05.870623 kubelet[3088]: E0515 13:03:05.870572 3088 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:03:05.872678 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:03:05.872798 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:03:05.873068 systemd[1]: kubelet.service: Consumed 116ms CPU time, 103.8M memory peak. May 15 13:03:15.953770 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 88. 
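
Annotation: the systemd-tmpfiles-clean run above only produces warnings: several tmpfiles.d fragments declare the same paths (/var/lib/nfs/sm, /var/lib/nfs/sm.bak, /root, /var/log/journal, /var/lib/systemd), ACLs are not supported on the filesystem, and the autofs-mounted /boot is skipped during canonicalization; the service still finishes successfully. A rough sketch, with simplified parsing and an assumed directory list, for spotting such duplicate path declarations:

#!/usr/bin/env python3
"""Report paths declared by more than one tmpfiles.d line (simplified).

Real systemd-tmpfiles precedence (per-file overrides, specifiers, globs) is
more involved; this only flags a literal path field that appears repeatedly.
"""
from collections import defaultdict
from pathlib import Path

# Directories systemd-tmpfiles normally reads; adjust for the system at hand.
TMPFILES_DIRS = ("/usr/lib/tmpfiles.d", "/etc/tmpfiles.d", "/run/tmpfiles.d")

def find_duplicates() -> dict:
    seen = defaultdict(list)
    for d in TMPFILES_DIRS:
        directory = Path(d)
        if not directory.is_dir():
            continue
        for conf in sorted(directory.glob("*.conf")):
            for line in conf.read_text(errors="replace").splitlines():
                fields = line.split()
                if len(fields) >= 2 and not line.lstrip().startswith("#"):
                    seen[fields[1]].append(conf.name)  # field 1 is the path column
    return {path: files for path, files in seen.items() if len(files) > 1}

if __name__ == "__main__":
    for path, files in sorted(find_duplicates().items()):
        print(f"{path}: declared by {', '.join(files)}")
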
May 15 13:03:15.955773 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:03:16.094363 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:03:16.097387 (kubelet)[3104]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:03:16.133003 kubelet[3104]: E0515 13:03:16.132947 3104 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:03:16.134810 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:03:16.134992 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:03:16.135324 systemd[1]: kubelet.service: Consumed 135ms CPU time, 103.4M memory peak. May 15 13:03:26.203866 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 89. May 15 13:03:26.205374 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:03:26.342508 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:03:26.353101 (kubelet)[3120]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:03:26.385272 kubelet[3120]: E0515 13:03:26.385205 3120 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:03:26.387783 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:03:26.388076 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:03:26.388588 systemd[1]: kubelet.service: Consumed 141ms CPU time, 101.6M memory peak. May 15 13:03:36.453836 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 90. May 15 13:03:36.455840 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:03:36.565738 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:03:36.579148 (kubelet)[3135]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:03:36.612762 kubelet[3135]: E0515 13:03:36.612711 3135 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:03:36.614851 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:03:36.615035 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:03:36.615384 systemd[1]: kubelet.service: Consumed 114ms CPU time, 103.5M memory peak. May 15 13:03:46.704136 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 91. May 15 13:03:46.706449 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 15 13:03:46.841647 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:03:46.854152 (kubelet)[3150]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:03:46.894086 kubelet[3150]: E0515 13:03:46.894035 3150 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:03:46.896056 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:03:46.896167 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:03:46.896493 systemd[1]: kubelet.service: Consumed 142ms CPU time, 101.9M memory peak. May 15 13:03:56.953772 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 92. May 15 13:03:56.955372 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:03:57.073847 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:03:57.081113 (kubelet)[3165]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:03:57.112793 kubelet[3165]: E0515 13:03:57.112740 3165 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:03:57.114464 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:03:57.114603 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:03:57.115072 systemd[1]: kubelet.service: Consumed 115ms CPU time, 102M memory peak. May 15 13:04:07.203690 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 93. May 15 13:04:07.205183 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:04:07.333905 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:04:07.340153 (kubelet)[3182]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:04:07.374078 kubelet[3182]: E0515 13:04:07.374025 3182 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:04:07.376072 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:04:07.376186 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:04:07.376417 systemd[1]: kubelet.service: Consumed 113ms CPU time, 101.9M memory peak. May 15 13:04:17.453661 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 94. May 15 13:04:17.455224 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:04:17.556394 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 15 13:04:17.571202 (kubelet)[3197]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:04:17.601946 kubelet[3197]: E0515 13:04:17.601871 3197 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:04:17.604270 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:04:17.604394 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:04:17.604655 systemd[1]: kubelet.service: Consumed 112ms CPU time, 101.7M memory peak. May 15 13:04:27.704269 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 95. May 15 13:04:27.706756 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:04:27.843244 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:04:27.852095 (kubelet)[3212]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:04:27.879966 kubelet[3212]: E0515 13:04:27.879867 3212 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:04:27.881826 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:04:27.882012 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:04:27.882323 systemd[1]: kubelet.service: Consumed 128ms CPU time, 103.6M memory peak. May 15 13:04:37.953639 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 96. May 15 13:04:37.955073 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:04:38.071513 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:04:38.073807 (kubelet)[3228]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:04:38.099645 kubelet[3228]: E0515 13:04:38.099605 3228 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:04:38.101422 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:04:38.101533 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:04:38.101779 systemd[1]: kubelet.service: Consumed 117ms CPU time, 102.1M memory peak. May 15 13:04:46.204935 systemd[1]: Started sshd@5-157.180.34.115:22-147.75.109.163:45534.service - OpenSSH per-connection server daemon (147.75.109.163:45534). 
May 15 13:04:47.196552 sshd[3236]: Accepted publickey for core from 147.75.109.163 port 45534 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:04:47.199697 sshd-session[3236]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:04:47.210526 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 15 13:04:47.212054 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 15 13:04:47.227437 systemd-logind[1536]: New session 1 of user core. May 15 13:04:47.241552 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 15 13:04:47.245560 systemd[1]: Starting user@500.service - User Manager for UID 500... May 15 13:04:47.267585 (systemd)[3240]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 15 13:04:47.271528 systemd-logind[1536]: New session c1 of user core. May 15 13:04:47.414817 systemd[3240]: Queued start job for default target default.target. May 15 13:04:47.420610 systemd[3240]: Created slice app.slice - User Application Slice. May 15 13:04:47.420636 systemd[3240]: Reached target paths.target - Paths. May 15 13:04:47.420668 systemd[3240]: Reached target timers.target - Timers. May 15 13:04:47.421661 systemd[3240]: Starting dbus.socket - D-Bus User Message Bus Socket... May 15 13:04:47.431526 systemd[3240]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 15 13:04:47.431570 systemd[3240]: Reached target sockets.target - Sockets. May 15 13:04:47.431602 systemd[3240]: Reached target basic.target - Basic System. May 15 13:04:47.431630 systemd[3240]: Reached target default.target - Main User Target. May 15 13:04:47.431650 systemd[3240]: Startup finished in 151ms. May 15 13:04:47.432034 systemd[1]: Started user@500.service - User Manager for UID 500. May 15 13:04:47.447035 systemd[1]: Started session-1.scope - Session 1 of User core. May 15 13:04:48.147234 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 97. May 15 13:04:48.150837 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:04:48.154332 systemd[1]: Started sshd@6-157.180.34.115:22-147.75.109.163:45544.service - OpenSSH per-connection server daemon (147.75.109.163:45544). May 15 13:04:48.287216 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:04:48.297092 (kubelet)[3261]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:04:48.334231 kubelet[3261]: E0515 13:04:48.334155 3261 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:04:48.337289 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:04:48.337419 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:04:48.337703 systemd[1]: kubelet.service: Consumed 140ms CPU time, 101.7M memory peak. 
May 15 13:04:49.147534 sshd[3252]: Accepted publickey for core from 147.75.109.163 port 45544 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:04:49.148864 sshd-session[3252]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:04:49.154411 systemd-logind[1536]: New session 2 of user core. May 15 13:04:49.162033 systemd[1]: Started session-2.scope - Session 2 of User core. May 15 13:04:49.819164 sshd[3268]: Connection closed by 147.75.109.163 port 45544 May 15 13:04:49.819719 sshd-session[3252]: pam_unix(sshd:session): session closed for user core May 15 13:04:49.822238 systemd[1]: sshd@6-157.180.34.115:22-147.75.109.163:45544.service: Deactivated successfully. May 15 13:04:49.824452 systemd[1]: session-2.scope: Deactivated successfully. May 15 13:04:49.825474 systemd-logind[1536]: Session 2 logged out. Waiting for processes to exit. May 15 13:04:49.826431 systemd-logind[1536]: Removed session 2. May 15 13:04:49.991563 systemd[1]: Started sshd@7-157.180.34.115:22-147.75.109.163:35686.service - OpenSSH per-connection server daemon (147.75.109.163:35686). May 15 13:04:50.978965 sshd[3275]: Accepted publickey for core from 147.75.109.163 port 35686 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:04:50.980264 sshd-session[3275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:04:50.986775 systemd-logind[1536]: New session 3 of user core. May 15 13:04:50.995103 systemd[1]: Started session-3.scope - Session 3 of User core. May 15 13:04:51.648677 sshd[3277]: Connection closed by 147.75.109.163 port 35686 May 15 13:04:51.649227 sshd-session[3275]: pam_unix(sshd:session): session closed for user core May 15 13:04:51.651728 systemd[1]: sshd@7-157.180.34.115:22-147.75.109.163:35686.service: Deactivated successfully. May 15 13:04:51.653575 systemd-logind[1536]: Session 3 logged out. Waiting for processes to exit. May 15 13:04:51.654355 systemd[1]: session-3.scope: Deactivated successfully. May 15 13:04:51.655775 systemd-logind[1536]: Removed session 3. May 15 13:04:51.814787 systemd[1]: Started sshd@8-157.180.34.115:22-147.75.109.163:35698.service - OpenSSH per-connection server daemon (147.75.109.163:35698). May 15 13:04:52.796792 sshd[3284]: Accepted publickey for core from 147.75.109.163 port 35698 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:04:52.798101 sshd-session[3284]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:04:52.803174 systemd-logind[1536]: New session 4 of user core. May 15 13:04:52.814122 systemd[1]: Started session-4.scope - Session 4 of User core. May 15 13:04:53.468363 sshd[3286]: Connection closed by 147.75.109.163 port 35698 May 15 13:04:53.469047 sshd-session[3284]: pam_unix(sshd:session): session closed for user core May 15 13:04:53.471756 systemd[1]: sshd@8-157.180.34.115:22-147.75.109.163:35698.service: Deactivated successfully. May 15 13:04:53.473760 systemd-logind[1536]: Session 4 logged out. Waiting for processes to exit. May 15 13:04:53.474158 systemd[1]: session-4.scope: Deactivated successfully. May 15 13:04:53.475547 systemd-logind[1536]: Removed session 4. May 15 13:04:53.635592 systemd[1]: Started sshd@9-157.180.34.115:22-147.75.109.163:35704.service - OpenSSH per-connection server daemon (147.75.109.163:35704). 
May 15 13:04:54.617165 sshd[3292]: Accepted publickey for core from 147.75.109.163 port 35704 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:04:54.618636 sshd-session[3292]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:04:54.623926 systemd-logind[1536]: New session 5 of user core. May 15 13:04:54.630050 systemd[1]: Started session-5.scope - Session 5 of User core. May 15 13:04:55.141483 sudo[3295]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 15 13:04:55.141731 sudo[3295]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 13:04:55.161590 sudo[3295]: pam_unix(sudo:session): session closed for user root May 15 13:04:55.318619 sshd[3294]: Connection closed by 147.75.109.163 port 35704 May 15 13:04:55.319358 sshd-session[3292]: pam_unix(sshd:session): session closed for user core May 15 13:04:55.323024 systemd-logind[1536]: Session 5 logged out. Waiting for processes to exit. May 15 13:04:55.323652 systemd[1]: sshd@9-157.180.34.115:22-147.75.109.163:35704.service: Deactivated successfully. May 15 13:04:55.325222 systemd[1]: session-5.scope: Deactivated successfully. May 15 13:04:55.326826 systemd-logind[1536]: Removed session 5. May 15 13:04:55.485461 systemd[1]: Started sshd@10-157.180.34.115:22-147.75.109.163:35714.service - OpenSSH per-connection server daemon (147.75.109.163:35714). May 15 13:04:56.471827 sshd[3301]: Accepted publickey for core from 147.75.109.163 port 35714 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:04:56.473250 sshd-session[3301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:04:56.480504 systemd-logind[1536]: New session 6 of user core. May 15 13:04:56.488102 systemd[1]: Started session-6.scope - Session 6 of User core. May 15 13:04:56.993160 sudo[3305]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 15 13:04:56.993606 sudo[3305]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 13:04:57.000938 sudo[3305]: pam_unix(sudo:session): session closed for user root May 15 13:04:57.010487 sudo[3304]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 15 13:04:57.011168 sudo[3304]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 13:04:57.026712 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 15 13:04:57.074146 augenrules[3327]: No rules May 15 13:04:57.075253 systemd[1]: audit-rules.service: Deactivated successfully. May 15 13:04:57.075605 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 15 13:04:57.076641 sudo[3304]: pam_unix(sudo:session): session closed for user root May 15 13:04:57.233846 sshd[3303]: Connection closed by 147.75.109.163 port 35714 May 15 13:04:57.234416 sshd-session[3301]: pam_unix(sshd:session): session closed for user core May 15 13:04:57.237332 systemd[1]: sshd@10-157.180.34.115:22-147.75.109.163:35714.service: Deactivated successfully. May 15 13:04:57.238936 systemd-logind[1536]: Session 6 logged out. Waiting for processes to exit. May 15 13:04:57.238976 systemd[1]: session-6.scope: Deactivated successfully. May 15 13:04:57.241082 systemd-logind[1536]: Removed session 6. May 15 13:04:57.413280 systemd[1]: Started sshd@11-157.180.34.115:22-147.75.109.163:35728.service - OpenSSH per-connection server daemon (147.75.109.163:35728). 
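The sudo session above removes the two shipped audit rule files and restarts audit-rules, after which augenrules finds nothing to load and the service finishes with "No rules". A small standard-library sketch of the same check, assuming only the /etc/audit/rules.d directory implied by the removed paths:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory the deleted files (80-selinux.rules, 99-default.rules) lived in.
	rules, err := filepath.Glob(filepath.Join("/etc/audit/rules.d", "*.rules"))
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	if len(rules) == 0 {
		fmt.Println("No rules") // matches the augenrules[3327] entry above
		return
	}
	for _, r := range rules {
		fmt.Println("would load:", r)
	}
}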
May 15 13:04:58.400455 sshd[3336]: Accepted publickey for core from 147.75.109.163 port 35728 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:04:58.401628 sshd-session[3336]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:04:58.402574 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 98. May 15 13:04:58.403860 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:04:58.407509 systemd-logind[1536]: New session 7 of user core. May 15 13:04:58.411783 systemd[1]: Started session-7.scope - Session 7 of User core. May 15 13:04:58.516434 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:04:58.525177 (kubelet)[3347]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:04:58.555112 kubelet[3347]: E0515 13:04:58.555065 3347 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:04:58.556989 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:04:58.557100 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:04:58.557559 systemd[1]: kubelet.service: Consumed 109ms CPU time, 103.8M memory peak. May 15 13:04:58.919311 sudo[3354]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 15 13:04:58.919646 sudo[3354]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 13:04:59.170480 systemd[1]: Starting docker.service - Docker Application Container Engine... May 15 13:04:59.180200 (dockerd)[3371]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 15 13:04:59.343488 dockerd[3371]: time="2025-05-15T13:04:59.343430914Z" level=info msg="Starting up" May 15 13:04:59.344685 dockerd[3371]: time="2025-05-15T13:04:59.344665676Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 15 13:04:59.366986 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2957604280-merged.mount: Deactivated successfully. May 15 13:04:59.386696 dockerd[3371]: time="2025-05-15T13:04:59.386537470Z" level=info msg="Loading containers: start." May 15 13:04:59.395939 kernel: Initializing XFRM netlink socket May 15 13:04:59.569163 systemd-networkd[1472]: docker0: Link UP May 15 13:04:59.572364 dockerd[3371]: time="2025-05-15T13:04:59.572318791Z" level=info msg="Loading containers: done." 
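The install.sh invocation above is followed immediately by the Docker daemon starting: it probes overlayfs support, creates the XFRM netlink socket, brings up the docker0 bridge, and (in the next entries) reports "API listen on /run/docker.sock". A quick liveness check against that socket is the Engine API's GET /_ping endpoint; the sketch below uses only the standard library and assumes the socket path shown in the log:

package main

import (
	"context"
	"fmt"
	"io"
	"net"
	"net/http"
	"os"
)

func main() {
	// Speak HTTP over the unix socket the daemon listens on.
	client := &http.Client{
		Transport: &http.Transport{
			DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
				return (&net.Dialer{}).DialContext(ctx, "unix", "/run/docker.sock")
			},
		},
	}
	// The host in the URL is a placeholder; the dialer above ignores it.
	resp, err := client.Get("http://docker/_ping")
	if err != nil {
		fmt.Fprintln(os.Stderr, "daemon not reachable:", err)
		os.Exit(1)
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Printf("GET /_ping -> %s %q\n", resp.Status, body) // a healthy daemon answers 200 OK with "OK"
}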
May 15 13:04:59.584801 dockerd[3371]: time="2025-05-15T13:04:59.584757453Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 15 13:04:59.584940 dockerd[3371]: time="2025-05-15T13:04:59.584845870Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 15 13:04:59.584975 dockerd[3371]: time="2025-05-15T13:04:59.584950717Z" level=info msg="Initializing buildkit" May 15 13:04:59.601653 dockerd[3371]: time="2025-05-15T13:04:59.601614495Z" level=info msg="Completed buildkit initialization" May 15 13:04:59.608979 dockerd[3371]: time="2025-05-15T13:04:59.608947836Z" level=info msg="Daemon has completed initialization" May 15 13:04:59.609820 dockerd[3371]: time="2025-05-15T13:04:59.609090184Z" level=info msg="API listen on /run/docker.sock" May 15 13:04:59.609174 systemd[1]: Started docker.service - Docker Application Container Engine. May 15 13:05:00.363617 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3553465624-merged.mount: Deactivated successfully. May 15 13:05:00.638813 containerd[1551]: time="2025-05-15T13:05:00.638688811Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\"" May 15 13:05:01.120463 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount686631689.mount: Deactivated successfully. May 15 13:05:02.270905 containerd[1551]: time="2025-05-15T13:05:02.270832335Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:02.271796 containerd[1551]: time="2025-05-15T13:05:02.271753387Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.4: active requests=0, bytes read=28682973" May 15 13:05:02.272849 containerd[1551]: time="2025-05-15T13:05:02.272805796Z" level=info msg="ImageCreate event name:\"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:02.274979 containerd[1551]: time="2025-05-15T13:05:02.274921135Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:02.275747 containerd[1551]: time="2025-05-15T13:05:02.275586627Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.4\" with image id \"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\", size \"28679679\" in 1.636855688s" May 15 13:05:02.275747 containerd[1551]: time="2025-05-15T13:05:02.275615100Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\" returns image reference \"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\"" May 15 13:05:02.276200 containerd[1551]: time="2025-05-15T13:05:02.276141149Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\"" May 15 13:05:03.487125 containerd[1551]: time="2025-05-15T13:05:03.487070356Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:03.488137 
containerd[1551]: time="2025-05-15T13:05:03.487908974Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.4: active requests=0, bytes read=24779611" May 15 13:05:03.489067 containerd[1551]: time="2025-05-15T13:05:03.488999854Z" level=info msg="ImageCreate event name:\"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:03.491426 containerd[1551]: time="2025-05-15T13:05:03.491373500Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:03.492030 containerd[1551]: time="2025-05-15T13:05:03.491985460Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.4\" with image id \"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\", size \"26267962\" in 1.215708114s" May 15 13:05:03.492077 containerd[1551]: time="2025-05-15T13:05:03.492032077Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\" returns image reference \"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\"" May 15 13:05:03.492759 containerd[1551]: time="2025-05-15T13:05:03.492544862Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\"" May 15 13:05:04.476363 containerd[1551]: time="2025-05-15T13:05:04.476318790Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:04.477018 containerd[1551]: time="2025-05-15T13:05:04.476995563Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.4: active requests=0, bytes read=19169960" May 15 13:05:04.477822 containerd[1551]: time="2025-05-15T13:05:04.477783635Z" level=info msg="ImageCreate event name:\"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:04.479606 containerd[1551]: time="2025-05-15T13:05:04.479566277Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:04.480325 containerd[1551]: time="2025-05-15T13:05:04.480188388Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.4\" with image id \"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\", size \"20658329\" in 987.600535ms" May 15 13:05:04.480325 containerd[1551]: time="2025-05-15T13:05:04.480218664Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\" returns image reference \"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\"" May 15 13:05:04.480721 containerd[1551]: time="2025-05-15T13:05:04.480684389Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\"" May 15 13:05:05.474973 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3120388706.mount: Deactivated successfully. 
May 15 13:05:05.748941 containerd[1551]: time="2025-05-15T13:05:05.748805803Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:05.749958 containerd[1551]: time="2025-05-15T13:05:05.749917754Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.4: active requests=0, bytes read=30917884" May 15 13:05:05.750754 containerd[1551]: time="2025-05-15T13:05:05.750712179Z" level=info msg="ImageCreate event name:\"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:05.752099 containerd[1551]: time="2025-05-15T13:05:05.752063981Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:05.752579 containerd[1551]: time="2025-05-15T13:05:05.752424279Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.4\" with image id \"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\", repo tag \"registry.k8s.io/kube-proxy:v1.32.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\", size \"30916875\" in 1.271716014s" May 15 13:05:05.752579 containerd[1551]: time="2025-05-15T13:05:05.752455988Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\" returns image reference \"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\"" May 15 13:05:05.753086 containerd[1551]: time="2025-05-15T13:05:05.753049564Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 15 13:05:06.248116 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2205142547.mount: Deactivated successfully. 
May 15 13:05:06.961575 containerd[1551]: time="2025-05-15T13:05:06.961506332Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:06.962526 containerd[1551]: time="2025-05-15T13:05:06.962375977Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335" May 15 13:05:06.963262 containerd[1551]: time="2025-05-15T13:05:06.963231586Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:06.965606 containerd[1551]: time="2025-05-15T13:05:06.965570525Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:06.967000 containerd[1551]: time="2025-05-15T13:05:06.966964887Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.213879415s" May 15 13:05:06.967081 containerd[1551]: time="2025-05-15T13:05:06.967068291Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 15 13:05:06.969004 containerd[1551]: time="2025-05-15T13:05:06.968765343Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 15 13:05:07.400494 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount679370852.mount: Deactivated successfully. 
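Each "Pulled image" entry above reports both the transferred byte count and the wall-clock duration, so effective pull throughput can be read straight off the log; the coredns pull, for instance, moved 18,562,039 bytes in 1.214 s, about 15.3 MB/s. The sketch below simply replays that arithmetic over the values logged so far:

package main

import "fmt"

func main() {
	// Sizes (bytes) and durations (seconds) exactly as reported in the log above.
	pulls := []struct {
		image   string
		bytes   float64
		seconds float64
	}{
		{"registry.k8s.io/kube-apiserver:v1.32.4", 28679679, 1.636855688},
		{"registry.k8s.io/kube-controller-manager:v1.32.4", 26267962, 1.215708114},
		{"registry.k8s.io/kube-scheduler:v1.32.4", 20658329, 0.987600535},
		{"registry.k8s.io/kube-proxy:v1.32.4", 30916875, 1.271716014},
		{"registry.k8s.io/coredns/coredns:v1.11.3", 18562039, 1.213879415},
	}
	for _, p := range pulls {
		fmt.Printf("%-50s %6.2f MB/s\n", p.image, p.bytes/p.seconds/1e6) // decimal MB/s
	}
}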
May 15 13:05:07.406792 containerd[1551]: time="2025-05-15T13:05:07.406752082Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 13:05:07.407492 containerd[1551]: time="2025-05-15T13:05:07.407457219Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" May 15 13:05:07.408480 containerd[1551]: time="2025-05-15T13:05:07.408438012Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 13:05:07.410119 containerd[1551]: time="2025-05-15T13:05:07.410048361Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 13:05:07.410842 containerd[1551]: time="2025-05-15T13:05:07.410718191Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 441.905278ms" May 15 13:05:07.410842 containerd[1551]: time="2025-05-15T13:05:07.410746013Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 15 13:05:07.411378 containerd[1551]: time="2025-05-15T13:05:07.411185399Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" May 15 13:05:07.935688 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2168570130.mount: Deactivated successfully. May 15 13:05:08.704146 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 99. May 15 13:05:08.707247 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:05:08.840984 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:05:08.846131 (kubelet)[3762]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 13:05:08.877847 kubelet[3762]: E0515 13:05:08.877762 3762 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 13:05:08.879462 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 13:05:08.879575 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 13:05:08.879811 systemd[1]: kubelet.service: Consumed 116ms CPU time, 101.8M memory peak. 
May 15 13:05:09.448517 containerd[1551]: time="2025-05-15T13:05:09.448462960Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:09.449678 containerd[1551]: time="2025-05-15T13:05:09.449657256Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551430" May 15 13:05:09.450752 containerd[1551]: time="2025-05-15T13:05:09.450718873Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:09.453232 containerd[1551]: time="2025-05-15T13:05:09.453200129Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:09.454493 containerd[1551]: time="2025-05-15T13:05:09.454304265Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.043094269s" May 15 13:05:09.454493 containerd[1551]: time="2025-05-15T13:05:09.454348218Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" May 15 13:05:12.496194 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:05:12.496645 systemd[1]: kubelet.service: Consumed 116ms CPU time, 101.8M memory peak. May 15 13:05:12.499006 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:05:12.521971 systemd[1]: Reload requested from client PID 3798 ('systemctl') (unit session-7.scope)... May 15 13:05:12.521987 systemd[1]: Reloading... May 15 13:05:12.610972 zram_generator::config[3848]: No configuration found. May 15 13:05:12.672069 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 13:05:12.763251 systemd[1]: Reloading finished in 240 ms. May 15 13:05:12.800283 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 15 13:05:12.800356 systemd[1]: kubelet.service: Failed with result 'signal'. May 15 13:05:12.800520 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:05:12.800567 systemd[1]: kubelet.service: Consumed 65ms CPU time, 91.8M memory peak. May 15 13:05:12.801600 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:05:12.896856 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:05:12.903089 (kubelet)[3896]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 13:05:12.937418 kubelet[3896]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 13:05:12.937418 kubelet[3896]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. May 15 13:05:12.937418 kubelet[3896]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 13:05:12.937801 kubelet[3896]: I0515 13:05:12.937484 3896 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 13:05:13.231074 kubelet[3896]: I0515 13:05:13.231040 3896 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" May 15 13:05:13.231074 kubelet[3896]: I0515 13:05:13.231062 3896 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 13:05:13.231297 kubelet[3896]: I0515 13:05:13.231288 3896 server.go:954] "Client rotation is on, will bootstrap in background" May 15 13:05:13.258324 kubelet[3896]: I0515 13:05:13.258255 3896 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 13:05:13.260547 kubelet[3896]: E0515 13:05:13.260517 3896 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://157.180.34.115:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 157.180.34.115:6443: connect: connection refused" logger="UnhandledError" May 15 13:05:13.273862 kubelet[3896]: I0515 13:05:13.273835 3896 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 15 13:05:13.278495 kubelet[3896]: I0515 13:05:13.278472 3896 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 15 13:05:13.279832 kubelet[3896]: I0515 13:05:13.279798 3896 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 13:05:13.280006 kubelet[3896]: I0515 13:05:13.279830 3896 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334-0-0-a-250489a463","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 15 13:05:13.280006 kubelet[3896]: I0515 13:05:13.280004 3896 topology_manager.go:138] "Creating topology manager with none policy" May 15 13:05:13.280115 kubelet[3896]: I0515 13:05:13.280014 3896 container_manager_linux.go:304] "Creating device plugin manager" May 15 13:05:13.280115 kubelet[3896]: I0515 13:05:13.280112 3896 state_mem.go:36] "Initialized new in-memory state store" May 15 13:05:13.283670 kubelet[3896]: I0515 13:05:13.283651 3896 kubelet.go:446] "Attempting to sync node with API server" May 15 13:05:13.283670 kubelet[3896]: I0515 13:05:13.283671 3896 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 13:05:13.284925 kubelet[3896]: I0515 13:05:13.284748 3896 kubelet.go:352] "Adding apiserver pod source" May 15 13:05:13.284925 kubelet[3896]: I0515 13:05:13.284768 3896 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 13:05:13.287471 kubelet[3896]: W0515 13:05:13.287439 3896 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://157.180.34.115:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334-0-0-a-250489a463&limit=500&resourceVersion=0": dial tcp 157.180.34.115:6443: connect: connection refused May 15 13:05:13.287662 kubelet[3896]: E0515 13:05:13.287647 3896 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://157.180.34.115:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4334-0-0-a-250489a463&limit=500&resourceVersion=0\": dial tcp 157.180.34.115:6443: connect: connection refused" logger="UnhandledError" May 15 13:05:13.287819 
kubelet[3896]: I0515 13:05:13.287785 3896 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 15 13:05:13.290651 kubelet[3896]: I0515 13:05:13.290577 3896 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 13:05:13.291311 kubelet[3896]: W0515 13:05:13.291274 3896 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 15 13:05:13.293788 kubelet[3896]: I0515 13:05:13.293575 3896 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 15 13:05:13.293788 kubelet[3896]: I0515 13:05:13.293601 3896 server.go:1287] "Started kubelet" May 15 13:05:13.299519 kubelet[3896]: W0515 13:05:13.299314 3896 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://157.180.34.115:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 157.180.34.115:6443: connect: connection refused May 15 13:05:13.299519 kubelet[3896]: E0515 13:05:13.299353 3896 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://157.180.34.115:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.34.115:6443: connect: connection refused" logger="UnhandledError" May 15 13:05:13.302478 kubelet[3896]: I0515 13:05:13.301972 3896 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 13:05:13.304476 kubelet[3896]: E0515 13:05:13.300930 3896 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://157.180.34.115:6443/api/v1/namespaces/default/events\": dial tcp 157.180.34.115:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4334-0-0-a-250489a463.183fb51c3ad1e009 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334-0-0-a-250489a463,UID:ci-4334-0-0-a-250489a463,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334-0-0-a-250489a463,},FirstTimestamp:2025-05-15 13:05:13.293586441 +0000 UTC m=+0.386895006,LastTimestamp:2025-05-15 13:05:13.293586441 +0000 UTC m=+0.386895006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334-0-0-a-250489a463,}" May 15 13:05:13.304638 kubelet[3896]: I0515 13:05:13.304614 3896 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 15 13:05:13.307265 kubelet[3896]: I0515 13:05:13.307254 3896 volume_manager.go:297] "Starting Kubelet Volume Manager" May 15 13:05:13.307487 kubelet[3896]: E0515 13:05:13.307473 3896 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4334-0-0-a-250489a463\" not found" May 15 13:05:13.310227 kubelet[3896]: I0515 13:05:13.310215 3896 server.go:490] "Adding debug handlers to kubelet server" May 15 13:05:13.310915 kubelet[3896]: I0515 13:05:13.310857 3896 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 13:05:13.311095 kubelet[3896]: I0515 13:05:13.311083 3896 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 13:05:13.311999 kubelet[3896]: I0515 13:05:13.310939 3896 reconciler.go:26] 
"Reconciler: start to sync state" May 15 13:05:13.312176 kubelet[3896]: I0515 13:05:13.312163 3896 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 15 13:05:13.312256 kubelet[3896]: I0515 13:05:13.310903 3896 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 15 13:05:13.312829 kubelet[3896]: E0515 13:05:13.312810 3896 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.34.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334-0-0-a-250489a463?timeout=10s\": dial tcp 157.180.34.115:6443: connect: connection refused" interval="200ms" May 15 13:05:13.313241 kubelet[3896]: W0515 13:05:13.313213 3896 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://157.180.34.115:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 157.180.34.115:6443: connect: connection refused May 15 13:05:13.313657 kubelet[3896]: E0515 13:05:13.313643 3896 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://157.180.34.115:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.34.115:6443: connect: connection refused" logger="UnhandledError" May 15 13:05:13.314052 kubelet[3896]: I0515 13:05:13.314024 3896 factory.go:221] Registration of the systemd container factory successfully May 15 13:05:13.314108 kubelet[3896]: I0515 13:05:13.314094 3896 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 13:05:13.314935 kubelet[3896]: E0515 13:05:13.314871 3896 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 13:05:13.315294 kubelet[3896]: I0515 13:05:13.315278 3896 factory.go:221] Registration of the containerd container factory successfully May 15 13:05:13.317187 kubelet[3896]: I0515 13:05:13.317106 3896 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 15 13:05:13.318002 kubelet[3896]: I0515 13:05:13.317953 3896 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 15 13:05:13.318073 kubelet[3896]: I0515 13:05:13.318064 3896 status_manager.go:227] "Starting to sync pod status with apiserver" May 15 13:05:13.318128 kubelet[3896]: I0515 13:05:13.318119 3896 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 15 13:05:13.318166 kubelet[3896]: I0515 13:05:13.318161 3896 kubelet.go:2388] "Starting kubelet main sync loop" May 15 13:05:13.318253 kubelet[3896]: E0515 13:05:13.318232 3896 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 13:05:13.323539 kubelet[3896]: W0515 13:05:13.323349 3896 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://157.180.34.115:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 157.180.34.115:6443: connect: connection refused May 15 13:05:13.323539 kubelet[3896]: E0515 13:05:13.323384 3896 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://157.180.34.115:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.34.115:6443: connect: connection refused" logger="UnhandledError" May 15 13:05:13.341294 kubelet[3896]: I0515 13:05:13.341257 3896 cpu_manager.go:221] "Starting CPU manager" policy="none" May 15 13:05:13.341294 kubelet[3896]: I0515 13:05:13.341271 3896 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 15 13:05:13.341294 kubelet[3896]: I0515 13:05:13.341294 3896 state_mem.go:36] "Initialized new in-memory state store" May 15 13:05:13.343849 kubelet[3896]: I0515 13:05:13.343825 3896 policy_none.go:49] "None policy: Start" May 15 13:05:13.343849 kubelet[3896]: I0515 13:05:13.343846 3896 memory_manager.go:186] "Starting memorymanager" policy="None" May 15 13:05:13.343921 kubelet[3896]: I0515 13:05:13.343857 3896 state_mem.go:35] "Initializing new in-memory state store" May 15 13:05:13.349861 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 15 13:05:13.358238 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 15 13:05:13.360792 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 15 13:05:13.370779 kubelet[3896]: I0515 13:05:13.370561 3896 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 13:05:13.371217 kubelet[3896]: I0515 13:05:13.370949 3896 eviction_manager.go:189] "Eviction manager: starting control loop" May 15 13:05:13.371217 kubelet[3896]: I0515 13:05:13.370963 3896 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 13:05:13.371481 kubelet[3896]: I0515 13:05:13.371432 3896 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 13:05:13.372750 kubelet[3896]: E0515 13:05:13.372729 3896 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 15 13:05:13.372792 kubelet[3896]: E0515 13:05:13.372761 3896 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4334-0-0-a-250489a463\" not found" May 15 13:05:13.430297 systemd[1]: Created slice kubepods-burstable-pod313fc59f4f5615138e3e747b2e80d428.slice - libcontainer container kubepods-burstable-pod313fc59f4f5615138e3e747b2e80d428.slice. 
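Every "connection refused" error in the kubelet output above points at the same endpoint, https://157.180.34.115:6443: the kubelet comes up before the static kube-apiserver pod (whose cgroup slices were just created) is serving, so its reflectors, lease controller and event recorder all fail and retry, with the lease retry interval visibly doubling (200ms here, then 400ms and 800ms in the entries below). A sketch of the same probe-and-retry pattern, assuming nothing beyond the host:port taken from the log; the backoff cap is arbitrary and only for illustration:

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// The endpoint every failing call in the log is aimed at.
	const apiServer = "157.180.34.115:6443"

	backoff := 200 * time.Millisecond // first retry interval seen in the log
	for attempt := 1; ; attempt++ {
		conn, err := net.DialTimeout("tcp", apiServer, 2*time.Second)
		if err == nil {
			conn.Close()
			fmt.Printf("attempt %d: apiserver is accepting connections\n", attempt)
			return
		}
		// While the static pod is not yet running this prints
		// "connect: connection refused", just like the kubelet entries above.
		fmt.Printf("attempt %d: %v; retrying in %s\n", attempt, err, backoff)
		time.Sleep(backoff)
		if backoff < 5*time.Second { // arbitrary cap for the sketch
			backoff *= 2
		}
	}
}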
May 15 13:05:13.438802 kubelet[3896]: E0515 13:05:13.438775 3896 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:13.442043 systemd[1]: Created slice kubepods-burstable-pod495ca9b7566e390d523bd92a3e96536c.slice - libcontainer container kubepods-burstable-pod495ca9b7566e390d523bd92a3e96536c.slice. May 15 13:05:13.445113 kubelet[3896]: E0515 13:05:13.445075 3896 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:13.446432 systemd[1]: Created slice kubepods-burstable-pod25043387969ff4861dd26cda3352c0a1.slice - libcontainer container kubepods-burstable-pod25043387969ff4861dd26cda3352c0a1.slice. May 15 13:05:13.449277 kubelet[3896]: E0515 13:05:13.449187 3896 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:13.472473 kubelet[3896]: I0515 13:05:13.472457 3896 kubelet_node_status.go:76] "Attempting to register node" node="ci-4334-0-0-a-250489a463" May 15 13:05:13.472922 kubelet[3896]: E0515 13:05:13.472874 3896 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://157.180.34.115:6443/api/v1/nodes\": dial tcp 157.180.34.115:6443: connect: connection refused" node="ci-4334-0-0-a-250489a463" May 15 13:05:13.513876 kubelet[3896]: I0515 13:05:13.513552 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/25043387969ff4861dd26cda3352c0a1-kubeconfig\") pod \"kube-scheduler-ci-4334-0-0-a-250489a463\" (UID: \"25043387969ff4861dd26cda3352c0a1\") " pod="kube-system/kube-scheduler-ci-4334-0-0-a-250489a463" May 15 13:05:13.513876 kubelet[3896]: I0515 13:05:13.513589 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/313fc59f4f5615138e3e747b2e80d428-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334-0-0-a-250489a463\" (UID: \"313fc59f4f5615138e3e747b2e80d428\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:13.513876 kubelet[3896]: I0515 13:05:13.513612 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-flexvolume-dir\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:13.513876 kubelet[3896]: I0515 13:05:13.513628 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-k8s-certs\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:13.513876 kubelet[3896]: I0515 13:05:13.513645 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:13.514279 kubelet[3896]: I0515 13:05:13.513659 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/313fc59f4f5615138e3e747b2e80d428-ca-certs\") pod \"kube-apiserver-ci-4334-0-0-a-250489a463\" (UID: \"313fc59f4f5615138e3e747b2e80d428\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:13.514279 kubelet[3896]: I0515 13:05:13.513675 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/313fc59f4f5615138e3e747b2e80d428-k8s-certs\") pod \"kube-apiserver-ci-4334-0-0-a-250489a463\" (UID: \"313fc59f4f5615138e3e747b2e80d428\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:13.514279 kubelet[3896]: I0515 13:05:13.513689 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-ca-certs\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:13.514279 kubelet[3896]: I0515 13:05:13.513703 3896 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-kubeconfig\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:13.514279 kubelet[3896]: E0515 13:05:13.513807 3896 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.34.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334-0-0-a-250489a463?timeout=10s\": dial tcp 157.180.34.115:6443: connect: connection refused" interval="400ms" May 15 13:05:13.675406 kubelet[3896]: I0515 13:05:13.675371 3896 kubelet_node_status.go:76] "Attempting to register node" node="ci-4334-0-0-a-250489a463" May 15 13:05:13.675743 kubelet[3896]: E0515 13:05:13.675712 3896 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://157.180.34.115:6443/api/v1/nodes\": dial tcp 157.180.34.115:6443: connect: connection refused" node="ci-4334-0-0-a-250489a463" May 15 13:05:13.740767 containerd[1551]: time="2025-05-15T13:05:13.740706376Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334-0-0-a-250489a463,Uid:313fc59f4f5615138e3e747b2e80d428,Namespace:kube-system,Attempt:0,}" May 15 13:05:13.752661 containerd[1551]: time="2025-05-15T13:05:13.752614290Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334-0-0-a-250489a463,Uid:495ca9b7566e390d523bd92a3e96536c,Namespace:kube-system,Attempt:0,}" May 15 13:05:13.753319 containerd[1551]: time="2025-05-15T13:05:13.752620011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334-0-0-a-250489a463,Uid:25043387969ff4861dd26cda3352c0a1,Namespace:kube-system,Attempt:0,}" May 15 13:05:13.817813 kubelet[3896]: E0515 13:05:13.817399 3896 event.go:368] 
"Unable to write event (may retry after sleeping)" err="Post \"https://157.180.34.115:6443/api/v1/namespaces/default/events\": dial tcp 157.180.34.115:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4334-0-0-a-250489a463.183fb51c3ad1e009 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4334-0-0-a-250489a463,UID:ci-4334-0-0-a-250489a463,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4334-0-0-a-250489a463,},FirstTimestamp:2025-05-15 13:05:13.293586441 +0000 UTC m=+0.386895006,LastTimestamp:2025-05-15 13:05:13.293586441 +0000 UTC m=+0.386895006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4334-0-0-a-250489a463,}" May 15 13:05:13.851277 containerd[1551]: time="2025-05-15T13:05:13.850881842Z" level=info msg="connecting to shim a698b378280ab49fba631e40a0138558baf033e6a389d4f87e34bafcaf852621" address="unix:///run/containerd/s/55201435fc61ab9fbe2dd3fb024733fe9e1a4bcf25c7713303575cdd153c213e" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:13.853630 containerd[1551]: time="2025-05-15T13:05:13.853537257Z" level=info msg="connecting to shim 85ae5497e9707db9f31abda29ff6608596d47e58502089d8eb3e8533ab0c1749" address="unix:///run/containerd/s/9c0785773d35a226674c018d79a61ceac1890b67312214e05ca1b35f24e4530f" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:13.856475 containerd[1551]: time="2025-05-15T13:05:13.856449053Z" level=info msg="connecting to shim 7271a027245ac672c52d7e3c71edb75aeca08a11eecf807e53c14dd13614ba9c" address="unix:///run/containerd/s/d3ca043f7a5e7ccd863e57ebe0c0d205be3306ffb44fb1e57f0b1853be2fc8b2" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:13.914391 kubelet[3896]: E0515 13:05:13.914354 3896 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.34.115:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4334-0-0-a-250489a463?timeout=10s\": dial tcp 157.180.34.115:6443: connect: connection refused" interval="800ms" May 15 13:05:13.916015 systemd[1]: Started cri-containerd-7271a027245ac672c52d7e3c71edb75aeca08a11eecf807e53c14dd13614ba9c.scope - libcontainer container 7271a027245ac672c52d7e3c71edb75aeca08a11eecf807e53c14dd13614ba9c. May 15 13:05:13.917205 systemd[1]: Started cri-containerd-a698b378280ab49fba631e40a0138558baf033e6a389d4f87e34bafcaf852621.scope - libcontainer container a698b378280ab49fba631e40a0138558baf033e6a389d4f87e34bafcaf852621. May 15 13:05:13.921322 systemd[1]: Started cri-containerd-85ae5497e9707db9f31abda29ff6608596d47e58502089d8eb3e8533ab0c1749.scope - libcontainer container 85ae5497e9707db9f31abda29ff6608596d47e58502089d8eb3e8533ab0c1749. 
May 15 13:05:13.976341 containerd[1551]: time="2025-05-15T13:05:13.974875075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4334-0-0-a-250489a463,Uid:25043387969ff4861dd26cda3352c0a1,Namespace:kube-system,Attempt:0,} returns sandbox id \"7271a027245ac672c52d7e3c71edb75aeca08a11eecf807e53c14dd13614ba9c\"" May 15 13:05:13.981240 containerd[1551]: time="2025-05-15T13:05:13.981213074Z" level=info msg="CreateContainer within sandbox \"7271a027245ac672c52d7e3c71edb75aeca08a11eecf807e53c14dd13614ba9c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 15 13:05:13.992429 containerd[1551]: time="2025-05-15T13:05:13.992402817Z" level=info msg="Container c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:13.992930 containerd[1551]: time="2025-05-15T13:05:13.992821865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4334-0-0-a-250489a463,Uid:313fc59f4f5615138e3e747b2e80d428,Namespace:kube-system,Attempt:0,} returns sandbox id \"a698b378280ab49fba631e40a0138558baf033e6a389d4f87e34bafcaf852621\"" May 15 13:05:13.995205 containerd[1551]: time="2025-05-15T13:05:13.995180972Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4334-0-0-a-250489a463,Uid:495ca9b7566e390d523bd92a3e96536c,Namespace:kube-system,Attempt:0,} returns sandbox id \"85ae5497e9707db9f31abda29ff6608596d47e58502089d8eb3e8533ab0c1749\"" May 15 13:05:13.996517 containerd[1551]: time="2025-05-15T13:05:13.996458865Z" level=info msg="CreateContainer within sandbox \"a698b378280ab49fba631e40a0138558baf033e6a389d4f87e34bafcaf852621\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 15 13:05:13.998410 containerd[1551]: time="2025-05-15T13:05:13.998280431Z" level=info msg="CreateContainer within sandbox \"85ae5497e9707db9f31abda29ff6608596d47e58502089d8eb3e8533ab0c1749\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 15 13:05:13.999904 containerd[1551]: time="2025-05-15T13:05:13.999833942Z" level=info msg="CreateContainer within sandbox \"7271a027245ac672c52d7e3c71edb75aeca08a11eecf807e53c14dd13614ba9c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628\"" May 15 13:05:14.000381 containerd[1551]: time="2025-05-15T13:05:14.000356675Z" level=info msg="StartContainer for \"c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628\"" May 15 13:05:14.001139 containerd[1551]: time="2025-05-15T13:05:14.001089143Z" level=info msg="connecting to shim c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628" address="unix:///run/containerd/s/d3ca043f7a5e7ccd863e57ebe0c0d205be3306ffb44fb1e57f0b1853be2fc8b2" protocol=ttrpc version=3 May 15 13:05:14.005397 containerd[1551]: time="2025-05-15T13:05:14.005372519Z" level=info msg="Container b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:14.008161 containerd[1551]: time="2025-05-15T13:05:14.008106039Z" level=info msg="Container 1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:14.010697 containerd[1551]: time="2025-05-15T13:05:14.010640936Z" level=info msg="CreateContainer within sandbox \"a698b378280ab49fba631e40a0138558baf033e6a389d4f87e34bafcaf852621\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns 
container id \"b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b\"" May 15 13:05:14.011901 containerd[1551]: time="2025-05-15T13:05:14.011523486Z" level=info msg="StartContainer for \"b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b\"" May 15 13:05:14.012857 containerd[1551]: time="2025-05-15T13:05:14.012833229Z" level=info msg="connecting to shim b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b" address="unix:///run/containerd/s/55201435fc61ab9fbe2dd3fb024733fe9e1a4bcf25c7713303575cdd153c213e" protocol=ttrpc version=3 May 15 13:05:14.014521 containerd[1551]: time="2025-05-15T13:05:14.014502668Z" level=info msg="CreateContainer within sandbox \"85ae5497e9707db9f31abda29ff6608596d47e58502089d8eb3e8533ab0c1749\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e\"" May 15 13:05:14.015229 containerd[1551]: time="2025-05-15T13:05:14.015214617Z" level=info msg="StartContainer for \"1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e\"" May 15 13:05:14.021045 containerd[1551]: time="2025-05-15T13:05:14.021021327Z" level=info msg="connecting to shim 1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e" address="unix:///run/containerd/s/9c0785773d35a226674c018d79a61ceac1890b67312214e05ca1b35f24e4530f" protocol=ttrpc version=3 May 15 13:05:14.021219 systemd[1]: Started cri-containerd-c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628.scope - libcontainer container c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628. May 15 13:05:14.052013 systemd[1]: Started cri-containerd-1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e.scope - libcontainer container 1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e. May 15 13:05:14.053402 systemd[1]: Started cri-containerd-b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b.scope - libcontainer container b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b. 
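The sandbox and container lifecycle in the entries above and below (RunPodSandbox returning a sandbox id, CreateContainer within that sandbox, then StartContainer) is the standard CRI sequence the kubelet drives against containerd. The sketch below issues the same three calls directly; it is not the kubelet's code, the names and image are placeholders, and it assumes the k8s.io/cri-api Go module, a recent grpc-go, and containerd's default CRI socket:

package main

import (
	"context"
	"log"
	"time"

	"google.golang.org/grpc"
	"google.golang.org/grpc/credentials/insecure"
	runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
)

func main() {
	// containerd serves the CRI RuntimeService on its default socket.
	conn, err := grpc.NewClient("unix:///run/containerd/containerd.sock",
		grpc.WithTransportCredentials(insecure.NewCredentials()))
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()
	rt := runtimeapi.NewRuntimeServiceClient(conn)

	ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
	defer cancel()

	sandboxMeta := &runtimeapi.PodSandboxMetadata{ // placeholder metadata
		Name: "demo-sandbox", Uid: "demo-uid", Namespace: "default", Attempt: 0,
	}

	// 1. RunPodSandbox -- the "returns sandbox id" entries above.
	sb, err := rt.RunPodSandbox(ctx, &runtimeapi.RunPodSandboxRequest{
		Config: &runtimeapi.PodSandboxConfig{Metadata: sandboxMeta},
	})
	if err != nil {
		log.Fatal(err)
	}

	// 2. CreateContainer within that sandbox.
	created, err := rt.CreateContainer(ctx, &runtimeapi.CreateContainerRequest{
		PodSandboxId: sb.PodSandboxId,
		Config: &runtimeapi.ContainerConfig{
			Metadata: &runtimeapi.ContainerMetadata{Name: "demo", Attempt: 0},
			Image:    &runtimeapi.ImageSpec{Image: "registry.k8s.io/pause:3.10"}, // pulled earlier in the log
		},
		SandboxConfig: &runtimeapi.PodSandboxConfig{Metadata: sandboxMeta},
	})
	if err != nil {
		log.Fatal(err)
	}

	// 3. StartContainer -- the "StartContainer ... returns successfully" step.
	if _, err := rt.StartContainer(ctx, &runtimeapi.StartContainerRequest{ContainerId: created.ContainerId}); err != nil {
		log.Fatal(err)
	}
	log.Println("started container", created.ContainerId)
}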
May 15 13:05:14.079574 kubelet[3896]: I0515 13:05:14.079053 3896 kubelet_node_status.go:76] "Attempting to register node" node="ci-4334-0-0-a-250489a463" May 15 13:05:14.079574 kubelet[3896]: E0515 13:05:14.079339 3896 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://157.180.34.115:6443/api/v1/nodes\": dial tcp 157.180.34.115:6443: connect: connection refused" node="ci-4334-0-0-a-250489a463" May 15 13:05:14.088980 containerd[1551]: time="2025-05-15T13:05:14.088951327Z" level=info msg="StartContainer for \"c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628\" returns successfully" May 15 13:05:14.119154 containerd[1551]: time="2025-05-15T13:05:14.119046854Z" level=info msg="StartContainer for \"b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b\" returns successfully" May 15 13:05:14.134030 containerd[1551]: time="2025-05-15T13:05:14.133987963Z" level=info msg="StartContainer for \"1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e\" returns successfully" May 15 13:05:14.330176 kubelet[3896]: E0515 13:05:14.330101 3896 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:14.334553 kubelet[3896]: E0515 13:05:14.334521 3896 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:14.336183 kubelet[3896]: E0515 13:05:14.336165 3896 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:14.880821 kubelet[3896]: I0515 13:05:14.880793 3896 kubelet_node_status.go:76] "Attempting to register node" node="ci-4334-0-0-a-250489a463" May 15 13:05:15.337356 kubelet[3896]: E0515 13:05:15.337328 3896 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:15.337977 kubelet[3896]: E0515 13:05:15.337960 3896 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:15.704965 kubelet[3896]: E0515 13:05:15.704927 3896 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4334-0-0-a-250489a463\" not found" node="ci-4334-0-0-a-250489a463" May 15 13:05:15.798312 kubelet[3896]: I0515 13:05:15.798266 3896 kubelet_node_status.go:79] "Successfully registered node" node="ci-4334-0-0-a-250489a463" May 15 13:05:15.808628 kubelet[3896]: I0515 13:05:15.808601 3896 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:15.813417 kubelet[3896]: E0515 13:05:15.813379 3896 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4334-0-0-a-250489a463\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:15.813417 kubelet[3896]: I0515 13:05:15.813408 3896 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:15.814696 kubelet[3896]: E0515 13:05:15.814671 3896 
kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4334-0-0-a-250489a463\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:15.814696 kubelet[3896]: I0515 13:05:15.814693 3896 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4334-0-0-a-250489a463" May 15 13:05:15.816412 kubelet[3896]: E0515 13:05:15.816371 3896 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4334-0-0-a-250489a463\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4334-0-0-a-250489a463" May 15 13:05:16.295736 kubelet[3896]: I0515 13:05:16.295676 3896 apiserver.go:52] "Watching apiserver" May 15 13:05:16.313216 kubelet[3896]: I0515 13:05:16.313176 3896 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 15 13:05:17.864352 systemd[1]: Reload requested from client PID 4170 ('systemctl') (unit session-7.scope)... May 15 13:05:17.864389 systemd[1]: Reloading... May 15 13:05:17.948947 zram_generator::config[4210]: No configuration found. May 15 13:05:18.015696 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 13:05:18.121665 systemd[1]: Reloading finished in 256 ms. May 15 13:05:18.147257 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:05:18.166434 systemd[1]: kubelet.service: Deactivated successfully. May 15 13:05:18.166584 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:05:18.166616 systemd[1]: kubelet.service: Consumed 672ms CPU time, 122.1M memory peak. May 15 13:05:18.168518 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 13:05:18.273536 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 13:05:18.279294 (kubelet)[4265]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 13:05:18.312626 kubelet[4265]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 13:05:18.312626 kubelet[4265]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 15 13:05:18.312626 kubelet[4265]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 15 13:05:18.312626 kubelet[4265]: I0515 13:05:18.312403 4265 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 13:05:18.318075 kubelet[4265]: I0515 13:05:18.318059 4265 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" May 15 13:05:18.318163 kubelet[4265]: I0515 13:05:18.318155 4265 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 13:05:18.318400 kubelet[4265]: I0515 13:05:18.318388 4265 server.go:954] "Client rotation is on, will bootstrap in background" May 15 13:05:18.322600 kubelet[4265]: I0515 13:05:18.322584 4265 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 15 13:05:18.335102 kubelet[4265]: I0515 13:05:18.334963 4265 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 13:05:18.340202 kubelet[4265]: I0515 13:05:18.340181 4265 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 15 13:05:18.343622 kubelet[4265]: I0515 13:05:18.343595 4265 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 15 13:05:18.343838 kubelet[4265]: I0515 13:05:18.343803 4265 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 13:05:18.344047 kubelet[4265]: I0515 13:05:18.343834 4265 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4334-0-0-a-250489a463","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 15 13:05:18.344047 kubelet[4265]: I0515 13:05:18.344043 4265 topology_manager.go:138] "Creating topology manager with none policy" May 15 13:05:18.344154 kubelet[4265]: I0515 13:05:18.344054 4265 container_manager_linux.go:304] "Creating device plugin manager" May 15 13:05:18.344154 kubelet[4265]: I0515 13:05:18.344091 4265 state_mem.go:36] "Initialized new in-memory state store" May 15 13:05:18.344272 
kubelet[4265]: I0515 13:05:18.344239 4265 kubelet.go:446] "Attempting to sync node with API server" May 15 13:05:18.344660 kubelet[4265]: I0515 13:05:18.344262 4265 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 13:05:18.344686 kubelet[4265]: I0515 13:05:18.344673 4265 kubelet.go:352] "Adding apiserver pod source" May 15 13:05:18.344704 kubelet[4265]: I0515 13:05:18.344687 4265 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 13:05:18.347134 kubelet[4265]: I0515 13:05:18.347042 4265 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 15 13:05:18.347466 kubelet[4265]: I0515 13:05:18.347448 4265 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 13:05:18.356963 kubelet[4265]: I0515 13:05:18.356946 4265 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 15 13:05:18.357015 kubelet[4265]: I0515 13:05:18.356977 4265 server.go:1287] "Started kubelet" May 15 13:05:18.358127 kubelet[4265]: I0515 13:05:18.358108 4265 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 13:05:18.359811 kubelet[4265]: I0515 13:05:18.359775 4265 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 15 13:05:18.362177 kubelet[4265]: I0515 13:05:18.362143 4265 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 13:05:18.362529 kubelet[4265]: I0515 13:05:18.362517 4265 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 13:05:18.363118 kubelet[4265]: I0515 13:05:18.363105 4265 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 15 13:05:18.365545 kubelet[4265]: I0515 13:05:18.365523 4265 server.go:490] "Adding debug handlers to kubelet server" May 15 13:05:18.366757 kubelet[4265]: I0515 13:05:18.366727 4265 volume_manager.go:297] "Starting Kubelet Volume Manager" May 15 13:05:18.367489 kubelet[4265]: E0515 13:05:18.367476 4265 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 13:05:18.367871 kubelet[4265]: I0515 13:05:18.367520 4265 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 15 13:05:18.367871 kubelet[4265]: I0515 13:05:18.367597 4265 reconciler.go:26] "Reconciler: start to sync state" May 15 13:05:18.371001 kubelet[4265]: I0515 13:05:18.370979 4265 factory.go:221] Registration of the systemd container factory successfully May 15 13:05:18.371072 kubelet[4265]: I0515 13:05:18.371057 4265 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 13:05:18.373016 kubelet[4265]: I0515 13:05:18.372952 4265 factory.go:221] Registration of the containerd container factory successfully May 15 13:05:18.376951 kubelet[4265]: I0515 13:05:18.376882 4265 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 15 13:05:18.377669 kubelet[4265]: I0515 13:05:18.377640 4265 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 15 13:05:18.377669 kubelet[4265]: I0515 13:05:18.377663 4265 status_manager.go:227] "Starting to sync pod status with apiserver" May 15 13:05:18.377730 kubelet[4265]: I0515 13:05:18.377678 4265 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 15 13:05:18.377730 kubelet[4265]: I0515 13:05:18.377683 4265 kubelet.go:2388] "Starting kubelet main sync loop" May 15 13:05:18.377730 kubelet[4265]: E0515 13:05:18.377712 4265 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 13:05:18.412652 kubelet[4265]: I0515 13:05:18.412631 4265 cpu_manager.go:221] "Starting CPU manager" policy="none" May 15 13:05:18.412914 kubelet[4265]: I0515 13:05:18.412843 4265 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 15 13:05:18.412914 kubelet[4265]: I0515 13:05:18.412876 4265 state_mem.go:36] "Initialized new in-memory state store" May 15 13:05:18.413128 kubelet[4265]: I0515 13:05:18.413115 4265 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 15 13:05:18.413196 kubelet[4265]: I0515 13:05:18.413177 4265 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 15 13:05:18.413238 kubelet[4265]: I0515 13:05:18.413232 4265 policy_none.go:49] "None policy: Start" May 15 13:05:18.413295 kubelet[4265]: I0515 13:05:18.413288 4265 memory_manager.go:186] "Starting memorymanager" policy="None" May 15 13:05:18.413338 kubelet[4265]: I0515 13:05:18.413333 4265 state_mem.go:35] "Initializing new in-memory state store" May 15 13:05:18.413462 kubelet[4265]: I0515 13:05:18.413452 4265 state_mem.go:75] "Updated machine memory state" May 15 13:05:18.419057 kubelet[4265]: I0515 13:05:18.418379 4265 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 13:05:18.419057 kubelet[4265]: I0515 13:05:18.418606 4265 eviction_manager.go:189] "Eviction manager: starting control loop" May 15 13:05:18.419057 kubelet[4265]: I0515 13:05:18.418627 4265 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 13:05:18.419306 kubelet[4265]: I0515 13:05:18.419275 4265 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 13:05:18.421969 kubelet[4265]: E0515 13:05:18.421949 4265 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 15 13:05:18.478724 kubelet[4265]: I0515 13:05:18.478685 4265 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4334-0-0-a-250489a463" May 15 13:05:18.480563 kubelet[4265]: I0515 13:05:18.480542 4265 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:18.481277 kubelet[4265]: I0515 13:05:18.481146 4265 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:18.527035 kubelet[4265]: I0515 13:05:18.527003 4265 kubelet_node_status.go:76] "Attempting to register node" node="ci-4334-0-0-a-250489a463" May 15 13:05:18.532841 kubelet[4265]: I0515 13:05:18.532693 4265 kubelet_node_status.go:125] "Node was previously registered" node="ci-4334-0-0-a-250489a463" May 15 13:05:18.532841 kubelet[4265]: I0515 13:05:18.532751 4265 kubelet_node_status.go:79] "Successfully registered node" node="ci-4334-0-0-a-250489a463" May 15 13:05:18.671168 kubelet[4265]: I0515 13:05:18.671050 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-ca-certs\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:18.671168 kubelet[4265]: I0515 13:05:18.671093 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-k8s-certs\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:18.671168 kubelet[4265]: I0515 13:05:18.671117 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-kubeconfig\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:18.671168 kubelet[4265]: I0515 13:05:18.671135 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:18.671168 kubelet[4265]: I0515 13:05:18.671154 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/25043387969ff4861dd26cda3352c0a1-kubeconfig\") pod \"kube-scheduler-ci-4334-0-0-a-250489a463\" (UID: \"25043387969ff4861dd26cda3352c0a1\") " pod="kube-system/kube-scheduler-ci-4334-0-0-a-250489a463" May 15 13:05:18.671394 kubelet[4265]: I0515 13:05:18.671168 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/313fc59f4f5615138e3e747b2e80d428-ca-certs\") pod \"kube-apiserver-ci-4334-0-0-a-250489a463\" (UID: 
\"313fc59f4f5615138e3e747b2e80d428\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:18.671394 kubelet[4265]: I0515 13:05:18.671188 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/313fc59f4f5615138e3e747b2e80d428-k8s-certs\") pod \"kube-apiserver-ci-4334-0-0-a-250489a463\" (UID: \"313fc59f4f5615138e3e747b2e80d428\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:18.671394 kubelet[4265]: I0515 13:05:18.671204 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/313fc59f4f5615138e3e747b2e80d428-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4334-0-0-a-250489a463\" (UID: \"313fc59f4f5615138e3e747b2e80d428\") " pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:18.671394 kubelet[4265]: I0515 13:05:18.671219 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/495ca9b7566e390d523bd92a3e96536c-flexvolume-dir\") pod \"kube-controller-manager-ci-4334-0-0-a-250489a463\" (UID: \"495ca9b7566e390d523bd92a3e96536c\") " pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" May 15 13:05:19.350227 kubelet[4265]: I0515 13:05:19.350178 4265 apiserver.go:52] "Watching apiserver" May 15 13:05:19.368168 kubelet[4265]: I0515 13:05:19.368140 4265 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 15 13:05:19.403992 kubelet[4265]: I0515 13:05:19.403878 4265 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4334-0-0-a-250489a463" May 15 13:05:19.405921 kubelet[4265]: I0515 13:05:19.404135 4265 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:19.409723 kubelet[4265]: E0515 13:05:19.409561 4265 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4334-0-0-a-250489a463\" already exists" pod="kube-system/kube-scheduler-ci-4334-0-0-a-250489a463" May 15 13:05:19.410524 kubelet[4265]: E0515 13:05:19.410382 4265 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4334-0-0-a-250489a463\" already exists" pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" May 15 13:05:19.430254 kubelet[4265]: I0515 13:05:19.430177 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4334-0-0-a-250489a463" podStartSLOduration=1.430159871 podStartE2EDuration="1.430159871s" podCreationTimestamp="2025-05-15 13:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 13:05:19.423402573 +0000 UTC m=+1.139780487" watchObservedRunningTime="2025-05-15 13:05:19.430159871 +0000 UTC m=+1.146537784" May 15 13:05:19.437067 kubelet[4265]: I0515 13:05:19.437024 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4334-0-0-a-250489a463" podStartSLOduration=1.437013561 podStartE2EDuration="1.437013561s" podCreationTimestamp="2025-05-15 13:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 13:05:19.430654872 +0000 UTC 
m=+1.147032785" watchObservedRunningTime="2025-05-15 13:05:19.437013561 +0000 UTC m=+1.153391463" May 15 13:05:19.437171 kubelet[4265]: I0515 13:05:19.437086 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4334-0-0-a-250489a463" podStartSLOduration=1.4370828 podStartE2EDuration="1.4370828s" podCreationTimestamp="2025-05-15 13:05:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 13:05:19.436866333 +0000 UTC m=+1.153244246" watchObservedRunningTime="2025-05-15 13:05:19.4370828 +0000 UTC m=+1.153460713" May 15 13:05:23.358238 sudo[3354]: pam_unix(sudo:session): session closed for user root May 15 13:05:23.522276 sshd[3341]: Connection closed by 147.75.109.163 port 35728 May 15 13:05:23.524819 sshd-session[3336]: pam_unix(sshd:session): session closed for user core May 15 13:05:23.528625 systemd-logind[1536]: Session 7 logged out. Waiting for processes to exit. May 15 13:05:23.529271 systemd[1]: sshd@11-157.180.34.115:22-147.75.109.163:35728.service: Deactivated successfully. May 15 13:05:23.531337 systemd[1]: session-7.scope: Deactivated successfully. May 15 13:05:23.531507 systemd[1]: session-7.scope: Consumed 4.060s CPU time, 157.9M memory peak. May 15 13:05:23.533323 systemd-logind[1536]: Removed session 7. May 15 13:05:24.132281 kubelet[4265]: I0515 13:05:24.132184 4265 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 15 13:05:24.133179 containerd[1551]: time="2025-05-15T13:05:24.133086550Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 15 13:05:24.134525 kubelet[4265]: I0515 13:05:24.134369 4265 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 15 13:05:25.045128 systemd[1]: Created slice kubepods-besteffort-pod846b93bb_ebc6_4584_96d5_66a9d4e1c3be.slice - libcontainer container kubepods-besteffort-pod846b93bb_ebc6_4584_96d5_66a9d4e1c3be.slice. 
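The kubelet and containerd entries above record the node receiving its Pod CIDR (192.168.0.0/24) from the control plane and updating the runtime config accordingly. Purely as an illustrative aside (not part of the log), a quick look at what that /24 block provides for pod addressing on this node, using Python's standard ipaddress module:

```python
# Illustrative only: inspect the Pod CIDR reported in the kubelet_network.go entry above.
import ipaddress

pod_cidr = ipaddress.ip_network("192.168.0.0/24")   # value taken from the log
print(pod_cidr.num_addresses)                       # 256 addresses in the block
print(pod_cidr[1], "-", pod_cidr[-2])               # 192.168.0.1 - 192.168.0.254 typically usable for pods
```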
May 15 13:05:25.112658 kubelet[4265]: I0515 13:05:25.112524 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2f86c\" (UniqueName: \"kubernetes.io/projected/846b93bb-ebc6-4584-96d5-66a9d4e1c3be-kube-api-access-2f86c\") pod \"kube-proxy-4829d\" (UID: \"846b93bb-ebc6-4584-96d5-66a9d4e1c3be\") " pod="kube-system/kube-proxy-4829d" May 15 13:05:25.112658 kubelet[4265]: I0515 13:05:25.112571 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/846b93bb-ebc6-4584-96d5-66a9d4e1c3be-kube-proxy\") pod \"kube-proxy-4829d\" (UID: \"846b93bb-ebc6-4584-96d5-66a9d4e1c3be\") " pod="kube-system/kube-proxy-4829d" May 15 13:05:25.112658 kubelet[4265]: I0515 13:05:25.112589 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/846b93bb-ebc6-4584-96d5-66a9d4e1c3be-xtables-lock\") pod \"kube-proxy-4829d\" (UID: \"846b93bb-ebc6-4584-96d5-66a9d4e1c3be\") " pod="kube-system/kube-proxy-4829d" May 15 13:05:25.112658 kubelet[4265]: I0515 13:05:25.112600 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/846b93bb-ebc6-4584-96d5-66a9d4e1c3be-lib-modules\") pod \"kube-proxy-4829d\" (UID: \"846b93bb-ebc6-4584-96d5-66a9d4e1c3be\") " pod="kube-system/kube-proxy-4829d" May 15 13:05:25.268879 systemd[1]: Created slice kubepods-besteffort-podb4af0d04_44d0_4257_82fa_6c98b7ca1b97.slice - libcontainer container kubepods-besteffort-podb4af0d04_44d0_4257_82fa_6c98b7ca1b97.slice. May 15 13:05:25.270088 kubelet[4265]: I0515 13:05:25.270039 4265 status_manager.go:890] "Failed to get status for pod" podUID="b4af0d04-44d0-4257-82fa-6c98b7ca1b97" pod="tigera-operator/tigera-operator-789496d6f5-87vqw" err="pods \"tigera-operator-789496d6f5-87vqw\" is forbidden: User \"system:node:ci-4334-0-0-a-250489a463\" cannot get resource \"pods\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4334-0-0-a-250489a463' and this object" May 15 13:05:25.270411 kubelet[4265]: W0515 13:05:25.270115 4265 reflector.go:569] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4334-0-0-a-250489a463" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4334-0-0-a-250489a463' and this object May 15 13:05:25.270411 kubelet[4265]: E0515 13:05:25.270138 4265 reflector.go:166] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:ci-4334-0-0-a-250489a463\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4334-0-0-a-250489a463' and this object" logger="UnhandledError" May 15 13:05:25.270411 kubelet[4265]: W0515 13:05:25.270169 4265 reflector.go:569] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4334-0-0-a-250489a463" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 
'ci-4334-0-0-a-250489a463' and this object May 15 13:05:25.270411 kubelet[4265]: E0515 13:05:25.270182 4265 reflector.go:166] "Unhandled Error" err="object-\"tigera-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4334-0-0-a-250489a463\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4334-0-0-a-250489a463' and this object" logger="UnhandledError" May 15 13:05:25.315962 kubelet[4265]: I0515 13:05:25.314759 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b4af0d04-44d0-4257-82fa-6c98b7ca1b97-var-lib-calico\") pod \"tigera-operator-789496d6f5-87vqw\" (UID: \"b4af0d04-44d0-4257-82fa-6c98b7ca1b97\") " pod="tigera-operator/tigera-operator-789496d6f5-87vqw" May 15 13:05:25.315962 kubelet[4265]: I0515 13:05:25.314815 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp2mr\" (UniqueName: \"kubernetes.io/projected/b4af0d04-44d0-4257-82fa-6c98b7ca1b97-kube-api-access-pp2mr\") pod \"tigera-operator-789496d6f5-87vqw\" (UID: \"b4af0d04-44d0-4257-82fa-6c98b7ca1b97\") " pod="tigera-operator/tigera-operator-789496d6f5-87vqw" May 15 13:05:25.352961 containerd[1551]: time="2025-05-15T13:05:25.352881701Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4829d,Uid:846b93bb-ebc6-4584-96d5-66a9d4e1c3be,Namespace:kube-system,Attempt:0,}" May 15 13:05:25.366296 containerd[1551]: time="2025-05-15T13:05:25.366263547Z" level=info msg="connecting to shim df2210d8b6506f4017f41924b4ad1c4cc4a3e97bbdc90edc50e14be58132f8bd" address="unix:///run/containerd/s/f0e4509489f3d5845a693095010d03bb17926eb14bebfb9bae5c09778ea01246" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:25.397012 systemd[1]: Started cri-containerd-df2210d8b6506f4017f41924b4ad1c4cc4a3e97bbdc90edc50e14be58132f8bd.scope - libcontainer container df2210d8b6506f4017f41924b4ad1c4cc4a3e97bbdc90edc50e14be58132f8bd. May 15 13:05:25.422658 containerd[1551]: time="2025-05-15T13:05:25.422617867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4829d,Uid:846b93bb-ebc6-4584-96d5-66a9d4e1c3be,Namespace:kube-system,Attempt:0,} returns sandbox id \"df2210d8b6506f4017f41924b4ad1c4cc4a3e97bbdc90edc50e14be58132f8bd\"" May 15 13:05:25.425397 containerd[1551]: time="2025-05-15T13:05:25.424914827Z" level=info msg="CreateContainer within sandbox \"df2210d8b6506f4017f41924b4ad1c4cc4a3e97bbdc90edc50e14be58132f8bd\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 15 13:05:25.434703 containerd[1551]: time="2025-05-15T13:05:25.434646919Z" level=info msg="Container f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:25.436785 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2882280258.mount: Deactivated successfully. 
May 15 13:05:25.441443 containerd[1551]: time="2025-05-15T13:05:25.441410829Z" level=info msg="CreateContainer within sandbox \"df2210d8b6506f4017f41924b4ad1c4cc4a3e97bbdc90edc50e14be58132f8bd\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574\"" May 15 13:05:25.442514 containerd[1551]: time="2025-05-15T13:05:25.442485511Z" level=info msg="StartContainer for \"f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574\"" May 15 13:05:25.443526 containerd[1551]: time="2025-05-15T13:05:25.443499467Z" level=info msg="connecting to shim f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574" address="unix:///run/containerd/s/f0e4509489f3d5845a693095010d03bb17926eb14bebfb9bae5c09778ea01246" protocol=ttrpc version=3 May 15 13:05:25.464017 systemd[1]: Started cri-containerd-f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574.scope - libcontainer container f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574. May 15 13:05:25.496779 containerd[1551]: time="2025-05-15T13:05:25.496741194Z" level=info msg="StartContainer for \"f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574\" returns successfully" May 15 13:05:26.422640 kubelet[4265]: E0515 13:05:26.422590 4265 projected.go:288] Couldn't get configMap tigera-operator/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition May 15 13:05:26.424077 kubelet[4265]: E0515 13:05:26.422739 4265 projected.go:194] Error preparing data for projected volume kube-api-access-pp2mr for pod tigera-operator/tigera-operator-789496d6f5-87vqw: failed to sync configmap cache: timed out waiting for the condition May 15 13:05:26.424077 kubelet[4265]: E0515 13:05:26.422939 4265 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b4af0d04-44d0-4257-82fa-6c98b7ca1b97-kube-api-access-pp2mr podName:b4af0d04-44d0-4257-82fa-6c98b7ca1b97 nodeName:}" failed. No retries permitted until 2025-05-15 13:05:26.922916122 +0000 UTC m=+8.639294035 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-pp2mr" (UniqueName: "kubernetes.io/projected/b4af0d04-44d0-4257-82fa-6c98b7ca1b97-kube-api-access-pp2mr") pod "tigera-operator-789496d6f5-87vqw" (UID: "b4af0d04-44d0-4257-82fa-6c98b7ca1b97") : failed to sync configmap cache: timed out waiting for the condition May 15 13:05:26.431644 kubelet[4265]: I0515 13:05:26.431497 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-4829d" podStartSLOduration=1.43148184 podStartE2EDuration="1.43148184s" podCreationTimestamp="2025-05-15 13:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 13:05:26.429441464 +0000 UTC m=+8.145819406" watchObservedRunningTime="2025-05-15 13:05:26.43148184 +0000 UTC m=+8.147859743" May 15 13:05:27.076160 containerd[1551]: time="2025-05-15T13:05:27.076062471Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-87vqw,Uid:b4af0d04-44d0-4257-82fa-6c98b7ca1b97,Namespace:tigera-operator,Attempt:0,}" May 15 13:05:27.100752 containerd[1551]: time="2025-05-15T13:05:27.100620785Z" level=info msg="connecting to shim 8aa344073ef65faa2332c163318935ece4604289d3ecb7b125ded7b52eba6186" address="unix:///run/containerd/s/6670297c956be2c5507629a9b1c5ee97be0d98ad9be64b02b5343a619bab6f99" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:27.142123 systemd[1]: Started cri-containerd-8aa344073ef65faa2332c163318935ece4604289d3ecb7b125ded7b52eba6186.scope - libcontainer container 8aa344073ef65faa2332c163318935ece4604289d3ecb7b125ded7b52eba6186. May 15 13:05:27.197454 containerd[1551]: time="2025-05-15T13:05:27.197410448Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-87vqw,Uid:b4af0d04-44d0-4257-82fa-6c98b7ca1b97,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"8aa344073ef65faa2332c163318935ece4604289d3ecb7b125ded7b52eba6186\"" May 15 13:05:27.199027 containerd[1551]: time="2025-05-15T13:05:27.198991872Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 15 13:05:28.995480 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3431436829.mount: Deactivated successfully. 
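The MountVolume.SetUp failure above is not fatal: kubelet schedules a retry ("No retries permitted until ... durationBeforeRetry 500ms") and the mount succeeds once the configmap cache syncs. A minimal sketch of that doubling-backoff pattern follows; only the initial 500 ms delay is taken from the log, and the upper bound is an assumption for illustration:

```python
# Sketch of a doubling retry backoff like the one implied by
# "durationBeforeRetry 500ms" above. Only the 500 ms initial delay comes
# from the log; the cap is an assumed value for illustration.
from datetime import timedelta
from typing import Optional

INITIAL_DELAY = timedelta(milliseconds=500)  # seen in the log entry above
MAX_DELAY = timedelta(minutes=2)             # assumed upper bound

def next_delay(previous: Optional[timedelta]) -> timedelta:
    """Delay to wait before the next mount attempt."""
    if previous is None:
        return INITIAL_DELAY
    return min(previous * 2, MAX_DELAY)

# First few retries: 0.5s, 1s, 2s, 4s, ...
delay, delays = None, []
for _ in range(4):
    delay = next_delay(delay)
    delays.append(delay.total_seconds())
print(delays)  # [0.5, 1.0, 2.0, 4.0]
```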
May 15 13:05:29.304039 containerd[1551]: time="2025-05-15T13:05:29.303768902Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:29.305036 containerd[1551]: time="2025-05-15T13:05:29.304998394Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 15 13:05:29.306580 containerd[1551]: time="2025-05-15T13:05:29.306177552Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:29.309568 containerd[1551]: time="2025-05-15T13:05:29.309537431Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:29.310234 containerd[1551]: time="2025-05-15T13:05:29.310197512Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.11104148s" May 15 13:05:29.310234 containerd[1551]: time="2025-05-15T13:05:29.310235033Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 15 13:05:29.313706 containerd[1551]: time="2025-05-15T13:05:29.313667418Z" level=info msg="CreateContainer within sandbox \"8aa344073ef65faa2332c163318935ece4604289d3ecb7b125ded7b52eba6186\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 15 13:05:29.319690 containerd[1551]: time="2025-05-15T13:05:29.319634278Z" level=info msg="Container 80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:29.325575 containerd[1551]: time="2025-05-15T13:05:29.325527292Z" level=info msg="CreateContainer within sandbox \"8aa344073ef65faa2332c163318935ece4604289d3ecb7b125ded7b52eba6186\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96\"" May 15 13:05:29.326294 containerd[1551]: time="2025-05-15T13:05:29.326260831Z" level=info msg="StartContainer for \"80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96\"" May 15 13:05:29.326807 containerd[1551]: time="2025-05-15T13:05:29.326779486Z" level=info msg="connecting to shim 80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96" address="unix:///run/containerd/s/6670297c956be2c5507629a9b1c5ee97be0d98ad9be64b02b5343a619bab6f99" protocol=ttrpc version=3 May 15 13:05:29.345029 systemd[1]: Started cri-containerd-80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96.scope - libcontainer container 80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96. 
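The PullImage entry above reports 21998657 bytes fetched in about 2.11 s for quay.io/tigera/operator:v1.36.7. As a back-of-the-envelope check on those two figures only (no information beyond what the log states):

```python
# Throughput implied by the PullImage entry above.
size_bytes = 21_998_657     # size "21998657" in the log
duration_s = 2.11104148     # "in 2.11104148s" in the log

throughput_mb_s = size_bytes / duration_s / 1_000_000
print(f"~{throughput_mb_s:.1f} MB/s")  # roughly 10.4 MB/s
```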
May 15 13:05:29.370913 containerd[1551]: time="2025-05-15T13:05:29.370836350Z" level=info msg="StartContainer for \"80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96\" returns successfully" May 15 13:05:29.435808 kubelet[4265]: I0515 13:05:29.435524 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-789496d6f5-87vqw" podStartSLOduration=2.32280735 podStartE2EDuration="4.435509003s" podCreationTimestamp="2025-05-15 13:05:25 +0000 UTC" firstStartedPulling="2025-05-15 13:05:27.198588784 +0000 UTC m=+8.914966687" lastFinishedPulling="2025-05-15 13:05:29.311290437 +0000 UTC m=+11.027668340" observedRunningTime="2025-05-15 13:05:29.435003472 +0000 UTC m=+11.151381375" watchObservedRunningTime="2025-05-15 13:05:29.435509003 +0000 UTC m=+11.151886896" May 15 13:05:32.292054 systemd[1]: Created slice kubepods-besteffort-pod6e664fc3_54fe_4c40_a80e_10bf5e2b5155.slice - libcontainer container kubepods-besteffort-pod6e664fc3_54fe_4c40_a80e_10bf5e2b5155.slice. May 15 13:05:32.340077 systemd[1]: Created slice kubepods-besteffort-podd516580f_11ef_4406_9486_8d7f1d13d3b3.slice - libcontainer container kubepods-besteffort-podd516580f_11ef_4406_9486_8d7f1d13d3b3.slice. May 15 13:05:32.362871 kubelet[4265]: I0515 13:05:32.362810 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6e664fc3-54fe-4c40-a80e-10bf5e2b5155-tigera-ca-bundle\") pod \"calico-typha-67b48869f7-pwb9d\" (UID: \"6e664fc3-54fe-4c40-a80e-10bf5e2b5155\") " pod="calico-system/calico-typha-67b48869f7-pwb9d" May 15 13:05:32.362871 kubelet[4265]: I0515 13:05:32.362850 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-cni-net-dir\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.362871 kubelet[4265]: I0515 13:05:32.362866 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-cni-log-dir\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.362871 kubelet[4265]: I0515 13:05:32.362879 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/6e664fc3-54fe-4c40-a80e-10bf5e2b5155-typha-certs\") pod \"calico-typha-67b48869f7-pwb9d\" (UID: \"6e664fc3-54fe-4c40-a80e-10bf5e2b5155\") " pod="calico-system/calico-typha-67b48869f7-pwb9d" May 15 13:05:32.363556 kubelet[4265]: I0515 13:05:32.362907 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d516580f-11ef-4406-9486-8d7f1d13d3b3-tigera-ca-bundle\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363556 kubelet[4265]: I0515 13:05:32.362921 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d516580f-11ef-4406-9486-8d7f1d13d3b3-node-certs\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " 
pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363556 kubelet[4265]: I0515 13:05:32.362934 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-var-lib-calico\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363556 kubelet[4265]: I0515 13:05:32.362948 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x56j\" (UniqueName: \"kubernetes.io/projected/6e664fc3-54fe-4c40-a80e-10bf5e2b5155-kube-api-access-9x56j\") pod \"calico-typha-67b48869f7-pwb9d\" (UID: \"6e664fc3-54fe-4c40-a80e-10bf5e2b5155\") " pod="calico-system/calico-typha-67b48869f7-pwb9d" May 15 13:05:32.363556 kubelet[4265]: I0515 13:05:32.362960 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-cni-bin-dir\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363649 kubelet[4265]: I0515 13:05:32.362977 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-lib-modules\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363649 kubelet[4265]: I0515 13:05:32.362989 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-xtables-lock\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363649 kubelet[4265]: I0515 13:05:32.363002 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-var-run-calico\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363649 kubelet[4265]: I0515 13:05:32.363063 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-flexvol-driver-host\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363649 kubelet[4265]: I0515 13:05:32.363103 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l56wd\" (UniqueName: \"kubernetes.io/projected/d516580f-11ef-4406-9486-8d7f1d13d3b3-kube-api-access-l56wd\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " pod="calico-system/calico-node-l4wnb" May 15 13:05:32.363794 kubelet[4265]: I0515 13:05:32.363127 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d516580f-11ef-4406-9486-8d7f1d13d3b3-policysync\") pod \"calico-node-l4wnb\" (UID: \"d516580f-11ef-4406-9486-8d7f1d13d3b3\") " 
pod="calico-system/calico-node-l4wnb" May 15 13:05:32.475978 kubelet[4265]: E0515 13:05:32.475709 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.475978 kubelet[4265]: W0515 13:05:32.475730 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.475978 kubelet[4265]: E0515 13:05:32.475752 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.480196 kubelet[4265]: E0515 13:05:32.480031 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.480196 kubelet[4265]: W0515 13:05:32.480057 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.480196 kubelet[4265]: E0515 13:05:32.480079 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.488275 kubelet[4265]: E0515 13:05:32.488226 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.488275 kubelet[4265]: W0515 13:05:32.488247 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.488368 kubelet[4265]: E0515 13:05:32.488278 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.500580 kubelet[4265]: E0515 13:05:32.498588 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.500580 kubelet[4265]: W0515 13:05:32.498605 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.500580 kubelet[4265]: E0515 13:05:32.498617 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.506486 kubelet[4265]: E0515 13:05:32.506442 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tx8fx" podUID="bb669bdd-390f-4594-a774-599bed5593aa" May 15 13:05:32.547564 kubelet[4265]: E0515 13:05:32.546953 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.547564 kubelet[4265]: W0515 13:05:32.546972 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.547564 kubelet[4265]: E0515 13:05:32.547006 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.547564 kubelet[4265]: E0515 13:05:32.547198 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.547564 kubelet[4265]: W0515 13:05:32.547205 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.547564 kubelet[4265]: E0515 13:05:32.547212 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.547564 kubelet[4265]: E0515 13:05:32.547371 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.547564 kubelet[4265]: W0515 13:05:32.547377 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.547564 kubelet[4265]: E0515 13:05:32.547385 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.547564 kubelet[4265]: E0515 13:05:32.547549 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.547849 kubelet[4265]: W0515 13:05:32.547556 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.547849 kubelet[4265]: E0515 13:05:32.547562 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.547849 kubelet[4265]: E0515 13:05:32.547696 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.547849 kubelet[4265]: W0515 13:05:32.547703 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.547849 kubelet[4265]: E0515 13:05:32.547709 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.548445 kubelet[4265]: E0515 13:05:32.548403 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.548445 kubelet[4265]: W0515 13:05:32.548419 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.548445 kubelet[4265]: E0515 13:05:32.548427 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.548593 kubelet[4265]: E0515 13:05:32.548575 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.548593 kubelet[4265]: W0515 13:05:32.548588 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.548593 kubelet[4265]: E0515 13:05:32.548594 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.549022 kubelet[4265]: E0515 13:05:32.549001 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.549022 kubelet[4265]: W0515 13:05:32.549015 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.549022 kubelet[4265]: E0515 13:05:32.549023 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.549566 kubelet[4265]: E0515 13:05:32.549544 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.549566 kubelet[4265]: W0515 13:05:32.549558 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.549566 kubelet[4265]: E0515 13:05:32.549568 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.550240 kubelet[4265]: E0515 13:05:32.550082 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.550240 kubelet[4265]: W0515 13:05:32.550233 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.550330 kubelet[4265]: E0515 13:05:32.550243 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.550477 kubelet[4265]: E0515 13:05:32.550451 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.550517 kubelet[4265]: W0515 13:05:32.550488 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.550517 kubelet[4265]: E0515 13:05:32.550497 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.550807 kubelet[4265]: E0515 13:05:32.550744 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.550807 kubelet[4265]: W0515 13:05:32.550755 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.550807 kubelet[4265]: E0515 13:05:32.550763 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.551055 kubelet[4265]: E0515 13:05:32.550998 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.551055 kubelet[4265]: W0515 13:05:32.551012 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.551055 kubelet[4265]: E0515 13:05:32.551026 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.551450 kubelet[4265]: E0515 13:05:32.551428 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.551450 kubelet[4265]: W0515 13:05:32.551443 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.551450 kubelet[4265]: E0515 13:05:32.551450 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.552030 kubelet[4265]: E0515 13:05:32.552009 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.552030 kubelet[4265]: W0515 13:05:32.552025 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.552092 kubelet[4265]: E0515 13:05:32.552033 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.552457 kubelet[4265]: E0515 13:05:32.552436 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.552457 kubelet[4265]: W0515 13:05:32.552450 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.552457 kubelet[4265]: E0515 13:05:32.552459 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.552840 kubelet[4265]: E0515 13:05:32.552809 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.552840 kubelet[4265]: W0515 13:05:32.552824 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.552840 kubelet[4265]: E0515 13:05:32.552832 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.553314 kubelet[4265]: E0515 13:05:32.553295 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.553314 kubelet[4265]: W0515 13:05:32.553308 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.553314 kubelet[4265]: E0515 13:05:32.553316 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.553477 kubelet[4265]: E0515 13:05:32.553459 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.553477 kubelet[4265]: W0515 13:05:32.553472 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.553531 kubelet[4265]: E0515 13:05:32.553480 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.554082 kubelet[4265]: E0515 13:05:32.554061 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.554082 kubelet[4265]: W0515 13:05:32.554076 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.554132 kubelet[4265]: E0515 13:05:32.554085 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.565763 kubelet[4265]: E0515 13:05:32.565742 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.565763 kubelet[4265]: W0515 13:05:32.565758 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.565845 kubelet[4265]: E0515 13:05:32.565768 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.565845 kubelet[4265]: I0515 13:05:32.565790 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bb669bdd-390f-4594-a774-599bed5593aa-kubelet-dir\") pod \"csi-node-driver-tx8fx\" (UID: \"bb669bdd-390f-4594-a774-599bed5593aa\") " pod="calico-system/csi-node-driver-tx8fx" May 15 13:05:32.566103 kubelet[4265]: E0515 13:05:32.565926 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.566103 kubelet[4265]: W0515 13:05:32.565938 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.566103 kubelet[4265]: E0515 13:05:32.565945 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.566103 kubelet[4265]: I0515 13:05:32.565956 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn5s9\" (UniqueName: \"kubernetes.io/projected/bb669bdd-390f-4594-a774-599bed5593aa-kube-api-access-hn5s9\") pod \"csi-node-driver-tx8fx\" (UID: \"bb669bdd-390f-4594-a774-599bed5593aa\") " pod="calico-system/csi-node-driver-tx8fx" May 15 13:05:32.566103 kubelet[4265]: E0515 13:05:32.566081 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.566103 kubelet[4265]: W0515 13:05:32.566089 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.566103 kubelet[4265]: E0515 13:05:32.566097 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.566418 kubelet[4265]: I0515 13:05:32.566109 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/bb669bdd-390f-4594-a774-599bed5593aa-varrun\") pod \"csi-node-driver-tx8fx\" (UID: \"bb669bdd-390f-4594-a774-599bed5593aa\") " pod="calico-system/csi-node-driver-tx8fx" May 15 13:05:32.566554 kubelet[4265]: E0515 13:05:32.566535 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.566554 kubelet[4265]: W0515 13:05:32.566549 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.566607 kubelet[4265]: E0515 13:05:32.566571 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.566607 kubelet[4265]: I0515 13:05:32.566585 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/bb669bdd-390f-4594-a774-599bed5593aa-socket-dir\") pod \"csi-node-driver-tx8fx\" (UID: \"bb669bdd-390f-4594-a774-599bed5593aa\") " pod="calico-system/csi-node-driver-tx8fx" May 15 13:05:32.567628 kubelet[4265]: E0515 13:05:32.567608 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.567628 kubelet[4265]: W0515 13:05:32.567623 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.567707 kubelet[4265]: E0515 13:05:32.567636 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.567707 kubelet[4265]: I0515 13:05:32.567649 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/bb669bdd-390f-4594-a774-599bed5593aa-registration-dir\") pod \"csi-node-driver-tx8fx\" (UID: \"bb669bdd-390f-4594-a774-599bed5593aa\") " pod="calico-system/csi-node-driver-tx8fx" May 15 13:05:32.568046 kubelet[4265]: E0515 13:05:32.568026 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.568046 kubelet[4265]: W0515 13:05:32.568041 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.568200 kubelet[4265]: E0515 13:05:32.568182 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.568200 kubelet[4265]: W0515 13:05:32.568196 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.568341 kubelet[4265]: E0515 13:05:32.568187 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.568399 kubelet[4265]: E0515 13:05:32.568390 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.569241 kubelet[4265]: E0515 13:05:32.569220 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.569241 kubelet[4265]: W0515 13:05:32.569234 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.569341 kubelet[4265]: E0515 13:05:32.569322 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.569471 kubelet[4265]: E0515 13:05:32.569455 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.569471 kubelet[4265]: W0515 13:05:32.569467 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.569576 kubelet[4265]: E0515 13:05:32.569550 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.569622 kubelet[4265]: E0515 13:05:32.569604 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.569622 kubelet[4265]: W0515 13:05:32.569619 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.569693 kubelet[4265]: E0515 13:05:32.569675 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.569832 kubelet[4265]: E0515 13:05:32.569812 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.569832 kubelet[4265]: W0515 13:05:32.569825 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.569832 kubelet[4265]: E0515 13:05:32.569833 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.570114 kubelet[4265]: E0515 13:05:32.570097 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.570114 kubelet[4265]: W0515 13:05:32.570109 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.570180 kubelet[4265]: E0515 13:05:32.570117 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.570387 kubelet[4265]: E0515 13:05:32.570326 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.570387 kubelet[4265]: W0515 13:05:32.570337 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.570387 kubelet[4265]: E0515 13:05:32.570344 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.570509 kubelet[4265]: E0515 13:05:32.570491 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.570509 kubelet[4265]: W0515 13:05:32.570502 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.570509 kubelet[4265]: E0515 13:05:32.570509 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.570796 kubelet[4265]: E0515 13:05:32.570633 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.570796 kubelet[4265]: W0515 13:05:32.570640 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.570796 kubelet[4265]: E0515 13:05:32.570646 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.603625 containerd[1551]: time="2025-05-15T13:05:32.603334706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-67b48869f7-pwb9d,Uid:6e664fc3-54fe-4c40-a80e-10bf5e2b5155,Namespace:calico-system,Attempt:0,}" May 15 13:05:32.632519 containerd[1551]: time="2025-05-15T13:05:32.632441324Z" level=info msg="connecting to shim 7aa347eee207050fcb2de8f00c25e930765c09cdcea1058bd289429ddf5c88b5" address="unix:///run/containerd/s/f635b14606e72fd482e837c1e71f4cff5af4ae1f48d58045b29ebf3912a40e31" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:32.644523 containerd[1551]: time="2025-05-15T13:05:32.644502365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-l4wnb,Uid:d516580f-11ef-4406-9486-8d7f1d13d3b3,Namespace:calico-system,Attempt:0,}" May 15 13:05:32.657019 systemd[1]: Started cri-containerd-7aa347eee207050fcb2de8f00c25e930765c09cdcea1058bd289429ddf5c88b5.scope - libcontainer container 7aa347eee207050fcb2de8f00c25e930765c09cdcea1058bd289429ddf5c88b5. May 15 13:05:32.668598 kubelet[4265]: E0515 13:05:32.668396 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.668598 kubelet[4265]: W0515 13:05:32.668441 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.670005 kubelet[4265]: E0515 13:05:32.669927 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.670456 kubelet[4265]: E0515 13:05:32.670445 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.670665 kubelet[4265]: W0515 13:05:32.670514 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.670665 kubelet[4265]: E0515 13:05:32.670544 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.670843 kubelet[4265]: E0515 13:05:32.670821 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.670843 kubelet[4265]: W0515 13:05:32.670831 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.671031 kubelet[4265]: E0515 13:05:32.670955 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.671334 kubelet[4265]: E0515 13:05:32.671324 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.671475 kubelet[4265]: W0515 13:05:32.671411 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.671584 kubelet[4265]: E0515 13:05:32.671426 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.671763 kubelet[4265]: E0515 13:05:32.671755 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.671947 kubelet[4265]: W0515 13:05:32.671930 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.672120 kubelet[4265]: E0515 13:05:32.672018 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.672423 kubelet[4265]: E0515 13:05:32.672414 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.672518 kubelet[4265]: W0515 13:05:32.672468 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.672572 kubelet[4265]: E0515 13:05:32.672563 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.672789 kubelet[4265]: E0515 13:05:32.672775 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.672869 kubelet[4265]: W0515 13:05:32.672861 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.673099 kubelet[4265]: E0515 13:05:32.672990 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.673576 kubelet[4265]: E0515 13:05:32.673260 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.673576 kubelet[4265]: W0515 13:05:32.673475 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.673576 kubelet[4265]: E0515 13:05:32.673515 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.674207 kubelet[4265]: E0515 13:05:32.674012 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.674207 kubelet[4265]: W0515 13:05:32.674047 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.674207 kubelet[4265]: E0515 13:05:32.674074 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.674484 kubelet[4265]: E0515 13:05:32.674354 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.674484 kubelet[4265]: W0515 13:05:32.674383 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.674484 kubelet[4265]: E0515 13:05:32.674397 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.675032 kubelet[4265]: E0515 13:05:32.674859 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.675032 kubelet[4265]: W0515 13:05:32.674868 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.675032 kubelet[4265]: E0515 13:05:32.674877 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.675300 kubelet[4265]: E0515 13:05:32.675273 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.675448 kubelet[4265]: W0515 13:05:32.675343 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.675448 kubelet[4265]: E0515 13:05:32.675355 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.675992 kubelet[4265]: E0515 13:05:32.675788 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.675992 kubelet[4265]: W0515 13:05:32.675796 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.675992 kubelet[4265]: E0515 13:05:32.675866 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.676452 kubelet[4265]: E0515 13:05:32.676430 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.676452 kubelet[4265]: W0515 13:05:32.676440 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.676744 kubelet[4265]: E0515 13:05:32.676589 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.676937 kubelet[4265]: E0515 13:05:32.676924 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.677106 kubelet[4265]: W0515 13:05:32.676997 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.677106 kubelet[4265]: E0515 13:05:32.677028 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.677453 kubelet[4265]: E0515 13:05:32.677417 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.677453 kubelet[4265]: W0515 13:05:32.677435 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.677778 kubelet[4265]: E0515 13:05:32.677709 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.677930 kubelet[4265]: E0515 13:05:32.677874 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.678885 kubelet[4265]: W0515 13:05:32.677884 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.679044 kubelet[4265]: E0515 13:05:32.679005 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.679135 kubelet[4265]: E0515 13:05:32.679126 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.679249 kubelet[4265]: W0515 13:05:32.679198 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.679308 kubelet[4265]: E0515 13:05:32.679299 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.679656 kubelet[4265]: E0515 13:05:32.679646 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.679763 kubelet[4265]: W0515 13:05:32.679697 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.680694 kubelet[4265]: E0515 13:05:32.679803 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.680694 kubelet[4265]: E0515 13:05:32.680379 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.680694 kubelet[4265]: W0515 13:05:32.680387 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.680694 kubelet[4265]: E0515 13:05:32.680421 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.680694 kubelet[4265]: E0515 13:05:32.680526 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.680694 kubelet[4265]: W0515 13:05:32.680532 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.680694 kubelet[4265]: E0515 13:05:32.680611 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.680972 kubelet[4265]: E0515 13:05:32.680852 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.680972 kubelet[4265]: W0515 13:05:32.680859 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.680972 kubelet[4265]: E0515 13:05:32.680869 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:32.681188 kubelet[4265]: E0515 13:05:32.681146 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.681188 kubelet[4265]: W0515 13:05:32.681165 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.681383 kubelet[4265]: E0515 13:05:32.681374 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.681477 kubelet[4265]: W0515 13:05:32.681466 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.681553 kubelet[4265]: E0515 13:05:32.681541 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.681716 kubelet[4265]: E0515 13:05:32.681510 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.681799 kubelet[4265]: E0515 13:05:32.681774 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.681846 kubelet[4265]: W0515 13:05:32.681838 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.682112 kubelet[4265]: E0515 13:05:32.681916 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.695814 kubelet[4265]: E0515 13:05:32.695765 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:32.696090 kubelet[4265]: W0515 13:05:32.696040 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:32.696090 kubelet[4265]: E0515 13:05:32.696061 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:32.696188 containerd[1551]: time="2025-05-15T13:05:32.695689939Z" level=info msg="connecting to shim d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20" address="unix:///run/containerd/s/54c78eb104a21fdcb502ec570dacef3f38352012fe017f0d09b8000e76d188bd" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:32.725263 systemd[1]: Started cri-containerd-d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20.scope - libcontainer container d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20. 
May 15 13:05:32.751664 containerd[1551]: time="2025-05-15T13:05:32.751578013Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-67b48869f7-pwb9d,Uid:6e664fc3-54fe-4c40-a80e-10bf5e2b5155,Namespace:calico-system,Attempt:0,} returns sandbox id \"7aa347eee207050fcb2de8f00c25e930765c09cdcea1058bd289429ddf5c88b5\"" May 15 13:05:32.756218 containerd[1551]: time="2025-05-15T13:05:32.756189685Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 15 13:05:32.767580 containerd[1551]: time="2025-05-15T13:05:32.767475349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-l4wnb,Uid:d516580f-11ef-4406-9486-8d7f1d13d3b3,Namespace:calico-system,Attempt:0,} returns sandbox id \"d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20\"" May 15 13:05:34.379290 kubelet[4265]: E0515 13:05:34.378033 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tx8fx" podUID="bb669bdd-390f-4594-a774-599bed5593aa" May 15 13:05:35.505215 containerd[1551]: time="2025-05-15T13:05:35.505127646Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:35.506112 containerd[1551]: time="2025-05-15T13:05:35.506083424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 15 13:05:35.506952 containerd[1551]: time="2025-05-15T13:05:35.506911810Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:35.508539 containerd[1551]: time="2025-05-15T13:05:35.508516008Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:35.509073 containerd[1551]: time="2025-05-15T13:05:35.509041125Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 2.7528193s" May 15 13:05:35.509121 containerd[1551]: time="2025-05-15T13:05:35.509077644Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 15 13:05:35.510763 containerd[1551]: time="2025-05-15T13:05:35.510721145Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 15 13:05:35.516742 containerd[1551]: time="2025-05-15T13:05:35.516701111Z" level=info msg="CreateContainer within sandbox \"7aa347eee207050fcb2de8f00c25e930765c09cdcea1058bd289429ddf5c88b5\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 15 13:05:35.526852 containerd[1551]: time="2025-05-15T13:05:35.526308939Z" level=info msg="Container e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:35.537243 containerd[1551]: time="2025-05-15T13:05:35.537185634Z" level=info 
msg="CreateContainer within sandbox \"7aa347eee207050fcb2de8f00c25e930765c09cdcea1058bd289429ddf5c88b5\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd\"" May 15 13:05:35.538859 containerd[1551]: time="2025-05-15T13:05:35.537865974Z" level=info msg="StartContainer for \"e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd\"" May 15 13:05:35.538859 containerd[1551]: time="2025-05-15T13:05:35.538608109Z" level=info msg="connecting to shim e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd" address="unix:///run/containerd/s/f635b14606e72fd482e837c1e71f4cff5af4ae1f48d58045b29ebf3912a40e31" protocol=ttrpc version=3 May 15 13:05:35.563018 systemd[1]: Started cri-containerd-e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd.scope - libcontainer container e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd. May 15 13:05:35.604561 containerd[1551]: time="2025-05-15T13:05:35.604516016Z" level=info msg="StartContainer for \"e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd\" returns successfully" May 15 13:05:36.378924 kubelet[4265]: E0515 13:05:36.378747 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tx8fx" podUID="bb669bdd-390f-4594-a774-599bed5593aa" May 15 13:05:36.457841 kubelet[4265]: I0515 13:05:36.457755 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-67b48869f7-pwb9d" podStartSLOduration=1.701139271 podStartE2EDuration="4.457736786s" podCreationTimestamp="2025-05-15 13:05:32 +0000 UTC" firstStartedPulling="2025-05-15 13:05:32.753705964 +0000 UTC m=+14.470083867" lastFinishedPulling="2025-05-15 13:05:35.51030348 +0000 UTC m=+17.226681382" observedRunningTime="2025-05-15 13:05:36.456517784 +0000 UTC m=+18.172895697" watchObservedRunningTime="2025-05-15 13:05:36.457736786 +0000 UTC m=+18.174114699" May 15 13:05:36.479719 kubelet[4265]: E0515 13:05:36.479689 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.479719 kubelet[4265]: W0515 13:05:36.479707 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.479928 kubelet[4265]: E0515 13:05:36.479728 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.479928 kubelet[4265]: E0515 13:05:36.479868 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.479928 kubelet[4265]: W0515 13:05:36.479875 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.479928 kubelet[4265]: E0515 13:05:36.479884 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:36.480142 kubelet[4265]: E0515 13:05:36.480058 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.480142 kubelet[4265]: W0515 13:05:36.480066 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.480142 kubelet[4265]: E0515 13:05:36.480093 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.480316 kubelet[4265]: E0515 13:05:36.480303 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.480316 kubelet[4265]: W0515 13:05:36.480314 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.480407 kubelet[4265]: E0515 13:05:36.480341 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.480552 kubelet[4265]: E0515 13:05:36.480523 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.480552 kubelet[4265]: W0515 13:05:36.480534 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.480552 kubelet[4265]: E0515 13:05:36.480542 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.480712 kubelet[4265]: E0515 13:05:36.480697 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.480712 kubelet[4265]: W0515 13:05:36.480706 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.480796 kubelet[4265]: E0515 13:05:36.480715 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.480880 kubelet[4265]: E0515 13:05:36.480865 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.480880 kubelet[4265]: W0515 13:05:36.480875 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.480998 kubelet[4265]: E0515 13:05:36.480885 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:36.481041 kubelet[4265]: E0515 13:05:36.481013 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.481041 kubelet[4265]: W0515 13:05:36.481020 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.481041 kubelet[4265]: E0515 13:05:36.481029 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.481185 kubelet[4265]: E0515 13:05:36.481166 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.481185 kubelet[4265]: W0515 13:05:36.481176 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.481185 kubelet[4265]: E0515 13:05:36.481184 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.481399 kubelet[4265]: E0515 13:05:36.481286 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.481399 kubelet[4265]: W0515 13:05:36.481293 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.481399 kubelet[4265]: E0515 13:05:36.481300 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.481583 kubelet[4265]: E0515 13:05:36.481404 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.481583 kubelet[4265]: W0515 13:05:36.481411 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.481583 kubelet[4265]: E0515 13:05:36.481418 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.481583 kubelet[4265]: E0515 13:05:36.481522 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.481583 kubelet[4265]: W0515 13:05:36.481528 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.481583 kubelet[4265]: E0515 13:05:36.481535 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:36.482003 kubelet[4265]: E0515 13:05:36.481651 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.482003 kubelet[4265]: W0515 13:05:36.481657 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.482003 kubelet[4265]: E0515 13:05:36.481664 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.482003 kubelet[4265]: E0515 13:05:36.481770 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.482003 kubelet[4265]: W0515 13:05:36.481776 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.482003 kubelet[4265]: E0515 13:05:36.481783 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.482003 kubelet[4265]: E0515 13:05:36.481913 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.482003 kubelet[4265]: W0515 13:05:36.481920 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.482003 kubelet[4265]: E0515 13:05:36.481927 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.504463 kubelet[4265]: E0515 13:05:36.504416 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.504463 kubelet[4265]: W0515 13:05:36.504441 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.504463 kubelet[4265]: E0515 13:05:36.504459 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.504847 kubelet[4265]: E0515 13:05:36.504793 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.504847 kubelet[4265]: W0515 13:05:36.504841 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.504979 kubelet[4265]: E0515 13:05:36.504861 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:36.505249 kubelet[4265]: E0515 13:05:36.505218 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.505356 kubelet[4265]: W0515 13:05:36.505321 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.505419 kubelet[4265]: E0515 13:05:36.505400 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.505727 kubelet[4265]: E0515 13:05:36.505701 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.505727 kubelet[4265]: W0515 13:05:36.505720 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.505808 kubelet[4265]: E0515 13:05:36.505740 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.506070 kubelet[4265]: E0515 13:05:36.506043 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.506070 kubelet[4265]: W0515 13:05:36.506062 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.506990 kubelet[4265]: E0515 13:05:36.506082 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.506990 kubelet[4265]: E0515 13:05:36.506447 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.506990 kubelet[4265]: W0515 13:05:36.506460 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.506990 kubelet[4265]: E0515 13:05:36.506504 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:36.506990 kubelet[4265]: E0515 13:05:36.506652 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.506990 kubelet[4265]: W0515 13:05:36.506662 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.506990 kubelet[4265]: E0515 13:05:36.506815 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.506990 kubelet[4265]: W0515 13:05:36.506825 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.506990 kubelet[4265]: E0515 13:05:36.506839 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.506990 kubelet[4265]: E0515 13:05:36.506919 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.507391 kubelet[4265]: E0515 13:05:36.507359 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.507391 kubelet[4265]: W0515 13:05:36.507380 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.507549 kubelet[4265]: E0515 13:05:36.507401 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.507670 kubelet[4265]: E0515 13:05:36.507645 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.507670 kubelet[4265]: W0515 13:05:36.507661 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.507857 kubelet[4265]: E0515 13:05:36.507678 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.507857 kubelet[4265]: E0515 13:05:36.507818 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.507857 kubelet[4265]: W0515 13:05:36.507827 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.507857 kubelet[4265]: E0515 13:05:36.507847 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:36.508324 kubelet[4265]: E0515 13:05:36.508058 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.508324 kubelet[4265]: W0515 13:05:36.508068 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.508324 kubelet[4265]: E0515 13:05:36.508089 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.508604 kubelet[4265]: E0515 13:05:36.508524 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.508604 kubelet[4265]: W0515 13:05:36.508543 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.508604 kubelet[4265]: E0515 13:05:36.508568 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.508807 kubelet[4265]: E0515 13:05:36.508733 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.508807 kubelet[4265]: W0515 13:05:36.508744 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.508807 kubelet[4265]: E0515 13:05:36.508764 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.509011 kubelet[4265]: E0515 13:05:36.508956 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.509011 kubelet[4265]: W0515 13:05:36.508966 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.509011 kubelet[4265]: E0515 13:05:36.508980 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.509199 kubelet[4265]: E0515 13:05:36.509149 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.509199 kubelet[4265]: W0515 13:05:36.509159 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.509199 kubelet[4265]: E0515 13:05:36.509168 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:36.509357 kubelet[4265]: E0515 13:05:36.509331 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.509357 kubelet[4265]: W0515 13:05:36.509344 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.509357 kubelet[4265]: E0515 13:05:36.509355 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:36.509745 kubelet[4265]: E0515 13:05:36.509714 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:36.509745 kubelet[4265]: W0515 13:05:36.509736 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:36.509869 kubelet[4265]: E0515 13:05:36.509752 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.449379 kubelet[4265]: I0515 13:05:37.449336 4265 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 13:05:37.489453 kubelet[4265]: E0515 13:05:37.489425 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.489453 kubelet[4265]: W0515 13:05:37.489442 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.489714 kubelet[4265]: E0515 13:05:37.489464 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.489714 kubelet[4265]: E0515 13:05:37.489617 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.489714 kubelet[4265]: W0515 13:05:37.489624 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.489714 kubelet[4265]: E0515 13:05:37.489680 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.489866 kubelet[4265]: E0515 13:05:37.489841 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.489866 kubelet[4265]: W0515 13:05:37.489857 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.489866 kubelet[4265]: E0515 13:05:37.489867 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:37.490087 kubelet[4265]: E0515 13:05:37.490063 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.490087 kubelet[4265]: W0515 13:05:37.490078 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.490087 kubelet[4265]: E0515 13:05:37.490088 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.490253 kubelet[4265]: E0515 13:05:37.490236 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.490253 kubelet[4265]: W0515 13:05:37.490248 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.490253 kubelet[4265]: E0515 13:05:37.490256 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.490398 kubelet[4265]: E0515 13:05:37.490386 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.490398 kubelet[4265]: W0515 13:05:37.490395 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.490456 kubelet[4265]: E0515 13:05:37.490403 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.490528 kubelet[4265]: E0515 13:05:37.490511 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.490528 kubelet[4265]: W0515 13:05:37.490525 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.490604 kubelet[4265]: E0515 13:05:37.490533 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.490670 kubelet[4265]: E0515 13:05:37.490654 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.490670 kubelet[4265]: W0515 13:05:37.490666 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.490715 kubelet[4265]: E0515 13:05:37.490673 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:37.490802 kubelet[4265]: E0515 13:05:37.490781 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.490802 kubelet[4265]: W0515 13:05:37.490793 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.490802 kubelet[4265]: E0515 13:05:37.490801 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.490952 kubelet[4265]: E0515 13:05:37.490926 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.490952 kubelet[4265]: W0515 13:05:37.490933 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.490952 kubelet[4265]: E0515 13:05:37.490940 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.491077 kubelet[4265]: E0515 13:05:37.491049 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.491077 kubelet[4265]: W0515 13:05:37.491060 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.491077 kubelet[4265]: E0515 13:05:37.491068 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.491228 kubelet[4265]: E0515 13:05:37.491214 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.491228 kubelet[4265]: W0515 13:05:37.491225 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.491300 kubelet[4265]: E0515 13:05:37.491244 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.491373 kubelet[4265]: E0515 13:05:37.491364 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.491373 kubelet[4265]: W0515 13:05:37.491373 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.491448 kubelet[4265]: E0515 13:05:37.491380 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:37.491532 kubelet[4265]: E0515 13:05:37.491515 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.491532 kubelet[4265]: W0515 13:05:37.491524 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.491532 kubelet[4265]: E0515 13:05:37.491531 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.491707 kubelet[4265]: E0515 13:05:37.491668 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.491707 kubelet[4265]: W0515 13:05:37.491680 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.491809 kubelet[4265]: E0515 13:05:37.491689 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.514300 kubelet[4265]: E0515 13:05:37.514273 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.514300 kubelet[4265]: W0515 13:05:37.514294 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.514391 kubelet[4265]: E0515 13:05:37.514312 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.514592 kubelet[4265]: E0515 13:05:37.514573 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.514592 kubelet[4265]: W0515 13:05:37.514590 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.514711 kubelet[4265]: E0515 13:05:37.514611 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.514917 kubelet[4265]: E0515 13:05:37.514833 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.514917 kubelet[4265]: W0515 13:05:37.514844 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.514917 kubelet[4265]: E0515 13:05:37.514859 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:37.515199 kubelet[4265]: E0515 13:05:37.515187 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.515368 kubelet[4265]: W0515 13:05:37.515267 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.515368 kubelet[4265]: E0515 13:05:37.515309 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.515590 kubelet[4265]: E0515 13:05:37.515571 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.515590 kubelet[4265]: W0515 13:05:37.515582 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.515664 kubelet[4265]: E0515 13:05:37.515593 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.515744 kubelet[4265]: E0515 13:05:37.515725 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.515744 kubelet[4265]: W0515 13:05:37.515735 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.515831 kubelet[4265]: E0515 13:05:37.515819 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.516343 kubelet[4265]: E0515 13:05:37.516261 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.516343 kubelet[4265]: W0515 13:05:37.516275 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.516413 kubelet[4265]: E0515 13:05:37.516404 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:37.516579 kubelet[4265]: E0515 13:05:37.516510 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.516579 kubelet[4265]: W0515 13:05:37.516523 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.516909 kubelet[4265]: E0515 13:05:37.516706 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.516909 kubelet[4265]: W0515 13:05:37.516723 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.516909 kubelet[4265]: E0515 13:05:37.516734 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.517136 kubelet[4265]: E0515 13:05:37.517109 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.517201 kubelet[4265]: W0515 13:05:37.517190 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.517249 kubelet[4265]: E0515 13:05:37.517241 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.517522 kubelet[4265]: E0515 13:05:37.517446 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.517522 kubelet[4265]: W0515 13:05:37.517456 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.517522 kubelet[4265]: E0515 13:05:37.517465 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.517759 kubelet[4265]: E0515 13:05:37.517656 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.517759 kubelet[4265]: W0515 13:05:37.517666 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.517759 kubelet[4265]: E0515 13:05:37.517675 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:37.518286 kubelet[4265]: E0515 13:05:37.517964 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.518286 kubelet[4265]: W0515 13:05:37.518030 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.518286 kubelet[4265]: E0515 13:05:37.518041 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.518502 kubelet[4265]: E0515 13:05:37.518491 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.518585 kubelet[4265]: W0515 13:05:37.518576 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.518745 kubelet[4265]: E0515 13:05:37.518734 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.519969 kubelet[4265]: E0515 13:05:37.519784 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.520584 kubelet[4265]: E0515 13:05:37.520394 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.520584 kubelet[4265]: W0515 13:05:37.520406 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.520584 kubelet[4265]: E0515 13:05:37.520435 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.520912 kubelet[4265]: E0515 13:05:37.520810 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.520912 kubelet[4265]: W0515 13:05:37.520824 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.520912 kubelet[4265]: E0515 13:05:37.520834 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 13:05:37.521333 kubelet[4265]: E0515 13:05:37.521294 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.521333 kubelet[4265]: W0515 13:05:37.521306 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.521570 kubelet[4265]: E0515 13:05:37.521405 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.522385 kubelet[4265]: E0515 13:05:37.522016 4265 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 13:05:37.522385 kubelet[4265]: W0515 13:05:37.522047 4265 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 13:05:37.522385 kubelet[4265]: E0515 13:05:37.522057 4265 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 13:05:37.615129 containerd[1551]: time="2025-05-15T13:05:37.615069569Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:37.636920 containerd[1551]: time="2025-05-15T13:05:37.615922554Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 15 13:05:37.636920 containerd[1551]: time="2025-05-15T13:05:37.617139752Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:37.643187 containerd[1551]: time="2025-05-15T13:05:37.643143796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:37.643660 containerd[1551]: time="2025-05-15T13:05:37.643388887Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.132643255s" May 15 13:05:37.643660 containerd[1551]: time="2025-05-15T13:05:37.643413513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 15 13:05:37.646517 containerd[1551]: time="2025-05-15T13:05:37.645065690Z" level=info msg="CreateContainer within sandbox \"d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 15 13:05:37.675959 containerd[1551]: time="2025-05-15T13:05:37.675701995Z" level=info msg="Container f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7: 
CDI devices from CRI Config.CDIDevices: []" May 15 13:05:37.693775 containerd[1551]: time="2025-05-15T13:05:37.693747501Z" level=info msg="CreateContainer within sandbox \"d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7\"" May 15 13:05:37.694267 containerd[1551]: time="2025-05-15T13:05:37.694212065Z" level=info msg="StartContainer for \"f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7\"" May 15 13:05:37.696491 containerd[1551]: time="2025-05-15T13:05:37.696474490Z" level=info msg="connecting to shim f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7" address="unix:///run/containerd/s/54c78eb104a21fdcb502ec570dacef3f38352012fe017f0d09b8000e76d188bd" protocol=ttrpc version=3 May 15 13:05:37.721005 systemd[1]: Started cri-containerd-f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7.scope - libcontainer container f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7. May 15 13:05:37.758447 containerd[1551]: time="2025-05-15T13:05:37.758390509Z" level=info msg="StartContainer for \"f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7\" returns successfully" May 15 13:05:37.767028 systemd[1]: cri-containerd-f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7.scope: Deactivated successfully. May 15 13:05:37.803658 containerd[1551]: time="2025-05-15T13:05:37.803237900Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7\" id:\"f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7\" pid:4941 exited_at:{seconds:1747314337 nanos:768811157}" May 15 13:05:37.803988 containerd[1551]: time="2025-05-15T13:05:37.803834021Z" level=info msg="received exit event container_id:\"f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7\" id:\"f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7\" pid:4941 exited_at:{seconds:1747314337 nanos:768811157}" May 15 13:05:37.828700 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7-rootfs.mount: Deactivated successfully. 
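The repeated kubelet errors above come from FlexVolume plugin probing: the kubelet execs the driver binary under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/<vendor~driver>/<driver> with a subcommand ("init" at probe time) and expects a JSON status object on stdout. Because the nodeagent~uds/uds executable does not exist yet, the call produces empty output and the JSON unmarshal in driver-call.go fails. The flexvol-driver init container started just above (image ghcr.io/flatcar/calico/pod2daemon-flexvol) appears to be the component that installs that binary, so the probe errors are expected to stop once it has run. As a hedged illustration only, not the real uds driver, a minimal program satisfying the init contract could look like this:

// flexvol_init_sketch.go: a minimal sketch of the FlexVolume driver contract referenced
// by the driver-call.go errors above. The kubelet passes a subcommand as the first
// argument and parses the JSON printed on stdout; empty output is what triggers
// "unexpected end of JSON input".
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

type driverStatus struct {
	Status       string          `json:"status"`                 // "Success", "Failure", or "Not supported"
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"` // only meaningful for "init"
}

func main() {
	cmd := ""
	if len(os.Args) > 1 {
		cmd = os.Args[1]
	}
	var out driverStatus
	switch cmd {
	case "init":
		// Report that this driver does not implement attach/detach.
		out = driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}}
	default:
		out = driverStatus{Status: "Not supported", Message: fmt.Sprintf("command %q not implemented in this sketch", cmd)}
	}
	b, _ := json.Marshal(out)
	fmt.Println(string(b))
}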
May 15 13:05:38.380528 kubelet[4265]: E0515 13:05:38.379503 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tx8fx" podUID="bb669bdd-390f-4594-a774-599bed5593aa" May 15 13:05:38.457131 containerd[1551]: time="2025-05-15T13:05:38.457061542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 15 13:05:40.378723 kubelet[4265]: E0515 13:05:40.378366 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tx8fx" podUID="bb669bdd-390f-4594-a774-599bed5593aa" May 15 13:05:42.378916 kubelet[4265]: E0515 13:05:42.378603 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tx8fx" podUID="bb669bdd-390f-4594-a774-599bed5593aa" May 15 13:05:44.379448 kubelet[4265]: E0515 13:05:44.378413 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tx8fx" podUID="bb669bdd-390f-4594-a774-599bed5593aa" May 15 13:05:44.539128 containerd[1551]: time="2025-05-15T13:05:44.539061143Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:44.539846 containerd[1551]: time="2025-05-15T13:05:44.539729810Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 15 13:05:44.540573 containerd[1551]: time="2025-05-15T13:05:44.540545625Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:44.542181 containerd[1551]: time="2025-05-15T13:05:44.542155682Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:44.542816 containerd[1551]: time="2025-05-15T13:05:44.542545476Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 6.085438228s" May 15 13:05:44.542816 containerd[1551]: time="2025-05-15T13:05:44.542571744Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 15 13:05:44.544600 containerd[1551]: time="2025-05-15T13:05:44.544561356Z" level=info msg="CreateContainer within sandbox \"d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20\" for container 
&ContainerMetadata{Name:install-cni,Attempt:0,}" May 15 13:05:44.553706 containerd[1551]: time="2025-05-15T13:05:44.552996960Z" level=info msg="Container aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:44.564603 containerd[1551]: time="2025-05-15T13:05:44.564512467Z" level=info msg="CreateContainer within sandbox \"d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30\"" May 15 13:05:44.565050 containerd[1551]: time="2025-05-15T13:05:44.565037544Z" level=info msg="StartContainer for \"aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30\"" May 15 13:05:44.566276 containerd[1551]: time="2025-05-15T13:05:44.566186585Z" level=info msg="connecting to shim aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30" address="unix:///run/containerd/s/54c78eb104a21fdcb502ec570dacef3f38352012fe017f0d09b8000e76d188bd" protocol=ttrpc version=3 May 15 13:05:44.588008 systemd[1]: Started cri-containerd-aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30.scope - libcontainer container aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30. May 15 13:05:44.630297 containerd[1551]: time="2025-05-15T13:05:44.630182866Z" level=info msg="StartContainer for \"aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30\" returns successfully" May 15 13:05:44.943953 systemd[1]: cri-containerd-aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30.scope: Deactivated successfully. May 15 13:05:44.944448 systemd[1]: cri-containerd-aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30.scope: Consumed 339ms CPU time, 149.2M memory peak, 3.7M read from disk, 154M written to disk. May 15 13:05:44.967393 containerd[1551]: time="2025-05-15T13:05:44.967188731Z" level=info msg="received exit event container_id:\"aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30\" id:\"aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30\" pid:4999 exited_at:{seconds:1747314344 nanos:947988994}" May 15 13:05:44.969242 containerd[1551]: time="2025-05-15T13:05:44.969213729Z" level=info msg="TaskExit event in podsandbox handler container_id:\"aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30\" id:\"aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30\" pid:4999 exited_at:{seconds:1747314344 nanos:947988994}" May 15 13:05:44.994097 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30-rootfs.mount: Deactivated successfully. May 15 13:05:45.001914 kubelet[4265]: I0515 13:05:45.001868 4265 kubelet_node_status.go:502] "Fast updating node status as it just became ready" May 15 13:05:45.066859 systemd[1]: Created slice kubepods-burstable-podf7da6f75_940d_4f80_9f35_daf3db6856b4.slice - libcontainer container kubepods-burstable-podf7da6f75_940d_4f80_9f35_daf3db6856b4.slice. May 15 13:05:45.075555 systemd[1]: Created slice kubepods-besteffort-pod623640fb_bde7_4522_803c_1a32b7748b5a.slice - libcontainer container kubepods-besteffort-pod623640fb_bde7_4522_803c_1a32b7748b5a.slice. May 15 13:05:45.083922 systemd[1]: Created slice kubepods-burstable-pod3d5bfd57_7739_422c_9643_72aa5ec0fc19.slice - libcontainer container kubepods-burstable-pod3d5bfd57_7739_422c_9643_72aa5ec0fc19.slice. 
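The kubepods-*-pod*.slice units created above encode the pod QoS class and UID: with the systemd cgroup driver, the UID's dashes are replaced by underscores and the result is nested under the QoS slice. A small sketch of that naming, assumed from convention and matching the slices and pod UIDs visible in these lines:

// slice_name_sketch.go: illustrative only; reproduces the slice names logged above
// from the pod QoS class and UID.
package main

import (
	"fmt"
	"strings"
)

// podSlice builds names like "kubepods-burstable-podf7da6f75_940d_4f80_9f35_daf3db6856b4.slice".
func podSlice(qos, uid string) string {
	return fmt.Sprintf("kubepods-%s-pod%s.slice", qos, strings.ReplaceAll(uid, "-", "_"))
}

func main() {
	fmt.Println(podSlice("burstable", "f7da6f75-940d-4f80-9f35-daf3db6856b4"))
	fmt.Println(podSlice("besteffort", "623640fb-bde7-4522-803c-1a32b7748b5a"))
}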
May 15 13:05:45.090050 systemd[1]: Created slice kubepods-besteffort-pod716fa121_5d06_4112_9e4d_16489f87729d.slice - libcontainer container kubepods-besteffort-pod716fa121_5d06_4112_9e4d_16489f87729d.slice. May 15 13:05:45.096156 systemd[1]: Created slice kubepods-besteffort-podde8ec6ba_558b_4695_ae8f_69de59951448.slice - libcontainer container kubepods-besteffort-podde8ec6ba_558b_4695_ae8f_69de59951448.slice. May 15 13:05:45.169143 kubelet[4265]: I0515 13:05:45.169106 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3d5bfd57-7739-422c-9643-72aa5ec0fc19-config-volume\") pod \"coredns-668d6bf9bc-ldncb\" (UID: \"3d5bfd57-7739-422c-9643-72aa5ec0fc19\") " pod="kube-system/coredns-668d6bf9bc-ldncb" May 15 13:05:45.169143 kubelet[4265]: I0515 13:05:45.169158 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fss8\" (UniqueName: \"kubernetes.io/projected/f7da6f75-940d-4f80-9f35-daf3db6856b4-kube-api-access-5fss8\") pod \"coredns-668d6bf9bc-tmm62\" (UID: \"f7da6f75-940d-4f80-9f35-daf3db6856b4\") " pod="kube-system/coredns-668d6bf9bc-tmm62" May 15 13:05:45.169334 kubelet[4265]: I0515 13:05:45.169182 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/de8ec6ba-558b-4695-ae8f-69de59951448-calico-apiserver-certs\") pod \"calico-apiserver-7d48cd9d65-l57k5\" (UID: \"de8ec6ba-558b-4695-ae8f-69de59951448\") " pod="calico-apiserver/calico-apiserver-7d48cd9d65-l57k5" May 15 13:05:45.169334 kubelet[4265]: I0515 13:05:45.169197 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/716fa121-5d06-4112-9e4d-16489f87729d-tigera-ca-bundle\") pod \"calico-kube-controllers-5455d8c489-zdjgc\" (UID: \"716fa121-5d06-4112-9e4d-16489f87729d\") " pod="calico-system/calico-kube-controllers-5455d8c489-zdjgc" May 15 13:05:45.169334 kubelet[4265]: I0515 13:05:45.169217 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzcq9\" (UniqueName: \"kubernetes.io/projected/3d5bfd57-7739-422c-9643-72aa5ec0fc19-kube-api-access-xzcq9\") pod \"coredns-668d6bf9bc-ldncb\" (UID: \"3d5bfd57-7739-422c-9643-72aa5ec0fc19\") " pod="kube-system/coredns-668d6bf9bc-ldncb" May 15 13:05:45.169334 kubelet[4265]: I0515 13:05:45.169245 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/623640fb-bde7-4522-803c-1a32b7748b5a-calico-apiserver-certs\") pod \"calico-apiserver-7d48cd9d65-sqsnl\" (UID: \"623640fb-bde7-4522-803c-1a32b7748b5a\") " pod="calico-apiserver/calico-apiserver-7d48cd9d65-sqsnl" May 15 13:05:45.169334 kubelet[4265]: I0515 13:05:45.169261 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f7da6f75-940d-4f80-9f35-daf3db6856b4-config-volume\") pod \"coredns-668d6bf9bc-tmm62\" (UID: \"f7da6f75-940d-4f80-9f35-daf3db6856b4\") " pod="kube-system/coredns-668d6bf9bc-tmm62" May 15 13:05:45.170135 kubelet[4265]: I0515 13:05:45.169275 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2lrj\" (UniqueName: 
\"kubernetes.io/projected/716fa121-5d06-4112-9e4d-16489f87729d-kube-api-access-s2lrj\") pod \"calico-kube-controllers-5455d8c489-zdjgc\" (UID: \"716fa121-5d06-4112-9e4d-16489f87729d\") " pod="calico-system/calico-kube-controllers-5455d8c489-zdjgc" May 15 13:05:45.170135 kubelet[4265]: I0515 13:05:45.169287 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zrm8\" (UniqueName: \"kubernetes.io/projected/de8ec6ba-558b-4695-ae8f-69de59951448-kube-api-access-9zrm8\") pod \"calico-apiserver-7d48cd9d65-l57k5\" (UID: \"de8ec6ba-558b-4695-ae8f-69de59951448\") " pod="calico-apiserver/calico-apiserver-7d48cd9d65-l57k5" May 15 13:05:45.170135 kubelet[4265]: I0515 13:05:45.169304 4265 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqzdj\" (UniqueName: \"kubernetes.io/projected/623640fb-bde7-4522-803c-1a32b7748b5a-kube-api-access-dqzdj\") pod \"calico-apiserver-7d48cd9d65-sqsnl\" (UID: \"623640fb-bde7-4522-803c-1a32b7748b5a\") " pod="calico-apiserver/calico-apiserver-7d48cd9d65-sqsnl" May 15 13:05:45.373868 containerd[1551]: time="2025-05-15T13:05:45.373799114Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tmm62,Uid:f7da6f75-940d-4f80-9f35-daf3db6856b4,Namespace:kube-system,Attempt:0,}" May 15 13:05:45.391538 containerd[1551]: time="2025-05-15T13:05:45.391485805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ldncb,Uid:3d5bfd57-7739-422c-9643-72aa5ec0fc19,Namespace:kube-system,Attempt:0,}" May 15 13:05:45.391841 containerd[1551]: time="2025-05-15T13:05:45.391782674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d48cd9d65-sqsnl,Uid:623640fb-bde7-4522-803c-1a32b7748b5a,Namespace:calico-apiserver,Attempt:0,}" May 15 13:05:45.395263 containerd[1551]: time="2025-05-15T13:05:45.395215700Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5455d8c489-zdjgc,Uid:716fa121-5d06-4112-9e4d-16489f87729d,Namespace:calico-system,Attempt:0,}" May 15 13:05:45.402589 containerd[1551]: time="2025-05-15T13:05:45.402565181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d48cd9d65-l57k5,Uid:de8ec6ba-558b-4695-ae8f-69de59951448,Namespace:calico-apiserver,Attempt:0,}" May 15 13:05:45.477993 containerd[1551]: time="2025-05-15T13:05:45.477942727Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 15 13:05:45.600514 containerd[1551]: time="2025-05-15T13:05:45.600455304Z" level=error msg="Failed to destroy network for sandbox \"7495db28a44e81ee2113780c25c2297a535e8cfa1c1f8957609b9c51f80cd8fd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.603545 systemd[1]: run-netns-cni\x2dc44781f8\x2d784f\x2d6a2b\x2d7b60\x2dba7c8c6e86bc.mount: Deactivated successfully. 
May 15 13:05:45.607919 containerd[1551]: time="2025-05-15T13:05:45.605427204Z" level=error msg="Failed to destroy network for sandbox \"8f815761df42b40def1e027162081289f2d6a45dca493bde7311002f30247486\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.607919 containerd[1551]: time="2025-05-15T13:05:45.605547691Z" level=error msg="Failed to destroy network for sandbox \"78d920b56553db287de7d955e5b5dc36a6e5ec616789b86eaa6bd8227924eb73\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.607919 containerd[1551]: time="2025-05-15T13:05:45.605656775Z" level=error msg="Failed to destroy network for sandbox \"144ecf3d3fd96465fcfc846123f66dcb5fa77c9f4e65b628eb238828178e32c6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.609927 containerd[1551]: time="2025-05-15T13:05:45.608878465Z" level=error msg="Failed to destroy network for sandbox \"ebd543fcffa57eb65160ea3d1ae1e1b702d4350cfe3aa616c0b83a9114ced7e0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.609582 systemd[1]: run-netns-cni\x2d9e84510a\x2d0f35\x2d1eef\x2d39b0\x2d4b7c86806503.mount: Deactivated successfully. May 15 13:05:45.609657 systemd[1]: run-netns-cni\x2d49adb505\x2dc70a\x2dbe63\x2d26a2\x2d9f82c8f5a735.mount: Deactivated successfully. May 15 13:05:45.609702 systemd[1]: run-netns-cni\x2dd7a76b74\x2d50cd\x2d3719\x2da265\x2d4c6fdc17b12d.mount: Deactivated successfully. May 15 13:05:45.614169 systemd[1]: run-netns-cni\x2d3b0fd82f\x2d4392\x2d77e3\x2dd48f\x2d622d99199fb2.mount: Deactivated successfully. 
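The run-netns-cni\x2d... mount units being deactivated above correspond to the per-sandbox network namespaces that containerd mounts under /run/netns and removes again when sandbox setup fails; the \x2d sequences are systemd's escaping of literal dashes in unit names derived from paths. A rough stand-in for systemd-escape --path, covering only the characters that appear in these unit names and assumed here for illustration:

// unit_escape_sketch.go: shows why /run/netns/cni-<id> appears as
// "run-netns-cni\x2d<id-with-escaped-dashes>.mount" in the systemd messages above.
package main

import (
	"fmt"
	"strings"
)

// escapePath maps "/" to "-" and a literal "-" to `\x2d`, the two rules needed
// to reproduce the unit names in this log.
func escapePath(p string) string {
	p = strings.Trim(p, "/")
	var b strings.Builder
	for _, r := range p {
		switch r {
		case '/':
			b.WriteByte('-')
		case '-':
			b.WriteString(`\x2d`)
		default:
			b.WriteRune(r)
		}
	}
	return b.String()
}

func main() {
	// Prints: run-netns-cni\x2dc44781f8\x2d784f\x2d6a2b\x2d7b60\x2dba7c8c6e86bc.mount
	fmt.Println(escapePath("/run/netns/cni-c44781f8-784f-6a2b-7b60-ba7c8c6e86bc") + ".mount")
}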
May 15 13:05:45.618346 containerd[1551]: time="2025-05-15T13:05:45.611335245Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d48cd9d65-sqsnl,Uid:623640fb-bde7-4522-803c-1a32b7748b5a,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7495db28a44e81ee2113780c25c2297a535e8cfa1c1f8957609b9c51f80cd8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.618518 containerd[1551]: time="2025-05-15T13:05:45.614549239Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ldncb,Uid:3d5bfd57-7739-422c-9643-72aa5ec0fc19,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"78d920b56553db287de7d955e5b5dc36a6e5ec616789b86eaa6bd8227924eb73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.618518 containerd[1551]: time="2025-05-15T13:05:45.616782860Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tmm62,Uid:f7da6f75-940d-4f80-9f35-daf3db6856b4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f815761df42b40def1e027162081289f2d6a45dca493bde7311002f30247486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.618518 containerd[1551]: time="2025-05-15T13:05:45.617660951Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5455d8c489-zdjgc,Uid:716fa121-5d06-4112-9e4d-16489f87729d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"144ecf3d3fd96465fcfc846123f66dcb5fa77c9f4e65b628eb238828178e32c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.618781 containerd[1551]: time="2025-05-15T13:05:45.618673856Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d48cd9d65-l57k5,Uid:de8ec6ba-558b-4695-ae8f-69de59951448,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebd543fcffa57eb65160ea3d1ae1e1b702d4350cfe3aa616c0b83a9114ced7e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.619921 kubelet[4265]: E0515 13:05:45.619575 4265 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"144ecf3d3fd96465fcfc846123f66dcb5fa77c9f4e65b628eb238828178e32c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.619921 kubelet[4265]: E0515 13:05:45.619583 4265 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network 
for sandbox \"7495db28a44e81ee2113780c25c2297a535e8cfa1c1f8957609b9c51f80cd8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.619921 kubelet[4265]: E0515 13:05:45.619655 4265 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"144ecf3d3fd96465fcfc846123f66dcb5fa77c9f4e65b628eb238828178e32c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5455d8c489-zdjgc" May 15 13:05:45.619921 kubelet[4265]: E0515 13:05:45.619676 4265 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"144ecf3d3fd96465fcfc846123f66dcb5fa77c9f4e65b628eb238828178e32c6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5455d8c489-zdjgc" May 15 13:05:45.620240 kubelet[4265]: E0515 13:05:45.619674 4265 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7495db28a44e81ee2113780c25c2297a535e8cfa1c1f8957609b9c51f80cd8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d48cd9d65-sqsnl" May 15 13:05:45.620240 kubelet[4265]: E0515 13:05:45.619699 4265 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7495db28a44e81ee2113780c25c2297a535e8cfa1c1f8957609b9c51f80cd8fd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d48cd9d65-sqsnl" May 15 13:05:45.620240 kubelet[4265]: E0515 13:05:45.619714 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5455d8c489-zdjgc_calico-system(716fa121-5d06-4112-9e4d-16489f87729d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5455d8c489-zdjgc_calico-system(716fa121-5d06-4112-9e4d-16489f87729d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"144ecf3d3fd96465fcfc846123f66dcb5fa77c9f4e65b628eb238828178e32c6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5455d8c489-zdjgc" podUID="716fa121-5d06-4112-9e4d-16489f87729d" May 15 13:05:45.620320 kubelet[4265]: E0515 13:05:45.619744 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d48cd9d65-sqsnl_calico-apiserver(623640fb-bde7-4522-803c-1a32b7748b5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d48cd9d65-sqsnl_calico-apiserver(623640fb-bde7-4522-803c-1a32b7748b5a)\\\": rpc 
error: code = Unknown desc = failed to setup network for sandbox \\\"7495db28a44e81ee2113780c25c2297a535e8cfa1c1f8957609b9c51f80cd8fd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d48cd9d65-sqsnl" podUID="623640fb-bde7-4522-803c-1a32b7748b5a" May 15 13:05:45.620320 kubelet[4265]: E0515 13:05:45.619846 4265 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebd543fcffa57eb65160ea3d1ae1e1b702d4350cfe3aa616c0b83a9114ced7e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.620320 kubelet[4265]: E0515 13:05:45.619866 4265 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebd543fcffa57eb65160ea3d1ae1e1b702d4350cfe3aa616c0b83a9114ced7e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d48cd9d65-l57k5" May 15 13:05:45.620498 kubelet[4265]: E0515 13:05:45.619881 4265 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ebd543fcffa57eb65160ea3d1ae1e1b702d4350cfe3aa616c0b83a9114ced7e0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7d48cd9d65-l57k5" May 15 13:05:45.620581 kubelet[4265]: E0515 13:05:45.620462 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7d48cd9d65-l57k5_calico-apiserver(de8ec6ba-558b-4695-ae8f-69de59951448)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7d48cd9d65-l57k5_calico-apiserver(de8ec6ba-558b-4695-ae8f-69de59951448)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ebd543fcffa57eb65160ea3d1ae1e1b702d4350cfe3aa616c0b83a9114ced7e0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7d48cd9d65-l57k5" podUID="de8ec6ba-558b-4695-ae8f-69de59951448" May 15 13:05:45.620637 kubelet[4265]: E0515 13:05:45.620572 4265 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78d920b56553db287de7d955e5b5dc36a6e5ec616789b86eaa6bd8227924eb73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.620637 kubelet[4265]: E0515 13:05:45.620600 4265 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78d920b56553db287de7d955e5b5dc36a6e5ec616789b86eaa6bd8227924eb73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ldncb" May 15 13:05:45.620637 kubelet[4265]: E0515 13:05:45.620615 4265 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78d920b56553db287de7d955e5b5dc36a6e5ec616789b86eaa6bd8227924eb73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-ldncb" May 15 13:05:45.620723 kubelet[4265]: E0515 13:05:45.620658 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-ldncb_kube-system(3d5bfd57-7739-422c-9643-72aa5ec0fc19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-ldncb_kube-system(3d5bfd57-7739-422c-9643-72aa5ec0fc19)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"78d920b56553db287de7d955e5b5dc36a6e5ec616789b86eaa6bd8227924eb73\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-ldncb" podUID="3d5bfd57-7739-422c-9643-72aa5ec0fc19" May 15 13:05:45.622071 kubelet[4265]: E0515 13:05:45.621813 4265 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f815761df42b40def1e027162081289f2d6a45dca493bde7311002f30247486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:45.622071 kubelet[4265]: E0515 13:05:45.621865 4265 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f815761df42b40def1e027162081289f2d6a45dca493bde7311002f30247486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-tmm62" May 15 13:05:45.622071 kubelet[4265]: E0515 13:05:45.621907 4265 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f815761df42b40def1e027162081289f2d6a45dca493bde7311002f30247486\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-tmm62" May 15 13:05:45.622212 kubelet[4265]: E0515 13:05:45.621949 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-tmm62_kube-system(f7da6f75-940d-4f80-9f35-daf3db6856b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-tmm62_kube-system(f7da6f75-940d-4f80-9f35-daf3db6856b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8f815761df42b40def1e027162081289f2d6a45dca493bde7311002f30247486\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-668d6bf9bc-tmm62" podUID="f7da6f75-940d-4f80-9f35-daf3db6856b4" May 15 13:05:46.386484 systemd[1]: Created slice kubepods-besteffort-podbb669bdd_390f_4594_a774_599bed5593aa.slice - libcontainer container kubepods-besteffort-podbb669bdd_390f_4594_a774_599bed5593aa.slice. May 15 13:05:46.389179 containerd[1551]: time="2025-05-15T13:05:46.389128686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tx8fx,Uid:bb669bdd-390f-4594-a774-599bed5593aa,Namespace:calico-system,Attempt:0,}" May 15 13:05:46.436070 containerd[1551]: time="2025-05-15T13:05:46.436004540Z" level=error msg="Failed to destroy network for sandbox \"19da61fbfc7fdbd8383c9801f7df72acb40d9dbba937a610ba80b48ccb08fc60\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:46.437024 containerd[1551]: time="2025-05-15T13:05:46.436981729Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tx8fx,Uid:bb669bdd-390f-4594-a774-599bed5593aa,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"19da61fbfc7fdbd8383c9801f7df72acb40d9dbba937a610ba80b48ccb08fc60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:46.437234 kubelet[4265]: E0515 13:05:46.437167 4265 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19da61fbfc7fdbd8383c9801f7df72acb40d9dbba937a610ba80b48ccb08fc60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 13:05:46.437770 kubelet[4265]: E0515 13:05:46.437276 4265 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19da61fbfc7fdbd8383c9801f7df72acb40d9dbba937a610ba80b48ccb08fc60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tx8fx" May 15 13:05:46.437770 kubelet[4265]: E0515 13:05:46.437305 4265 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"19da61fbfc7fdbd8383c9801f7df72acb40d9dbba937a610ba80b48ccb08fc60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tx8fx" May 15 13:05:46.437770 kubelet[4265]: E0515 13:05:46.437350 4265 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tx8fx_calico-system(bb669bdd-390f-4594-a774-599bed5593aa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tx8fx_calico-system(bb669bdd-390f-4594-a774-599bed5593aa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"19da61fbfc7fdbd8383c9801f7df72acb40d9dbba937a610ba80b48ccb08fc60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check 
that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tx8fx" podUID="bb669bdd-390f-4594-a774-599bed5593aa" May 15 13:05:46.555059 systemd[1]: run-netns-cni\x2d2bb51642\x2dc382\x2dd627\x2decea\x2d157c9a5ce3c0.mount: Deactivated successfully. May 15 13:05:52.583842 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2905976529.mount: Deactivated successfully. May 15 13:05:52.612398 containerd[1551]: time="2025-05-15T13:05:52.610376444Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 15 13:05:52.613133 containerd[1551]: time="2025-05-15T13:05:52.608348121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:52.613704 containerd[1551]: time="2025-05-15T13:05:52.613669307Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:52.614293 containerd[1551]: time="2025-05-15T13:05:52.614121668Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:05:52.614855 containerd[1551]: time="2025-05-15T13:05:52.614557277Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 7.136565638s" May 15 13:05:52.614855 containerd[1551]: time="2025-05-15T13:05:52.614581603Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 15 13:05:52.629971 containerd[1551]: time="2025-05-15T13:05:52.629943944Z" level=info msg="CreateContainer within sandbox \"d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 15 13:05:52.640688 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2656346776.mount: Deactivated successfully. 
May 15 13:05:52.641772 containerd[1551]: time="2025-05-15T13:05:52.641676358Z" level=info msg="Container 4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:52.659795 containerd[1551]: time="2025-05-15T13:05:52.659748153Z" level=info msg="CreateContainer within sandbox \"d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\"" May 15 13:05:52.660599 containerd[1551]: time="2025-05-15T13:05:52.660559049Z" level=info msg="StartContainer for \"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\"" May 15 13:05:52.668573 containerd[1551]: time="2025-05-15T13:05:52.668536431Z" level=info msg="connecting to shim 4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e" address="unix:///run/containerd/s/54c78eb104a21fdcb502ec570dacef3f38352012fe017f0d09b8000e76d188bd" protocol=ttrpc version=3 May 15 13:05:52.792047 systemd[1]: Started cri-containerd-4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e.scope - libcontainer container 4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e. May 15 13:05:52.843127 containerd[1551]: time="2025-05-15T13:05:52.842875362Z" level=info msg="StartContainer for \"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" returns successfully" May 15 13:05:53.104641 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 15 13:05:53.104742 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 15 13:05:53.525665 kubelet[4265]: I0515 13:05:53.522862 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-l4wnb" podStartSLOduration=1.671838376 podStartE2EDuration="21.522845256s" podCreationTimestamp="2025-05-15 13:05:32 +0000 UTC" firstStartedPulling="2025-05-15 13:05:32.768637976 +0000 UTC m=+14.485015878" lastFinishedPulling="2025-05-15 13:05:52.619644865 +0000 UTC m=+34.336022758" observedRunningTime="2025-05-15 13:05:53.518727282 +0000 UTC m=+35.235105195" watchObservedRunningTime="2025-05-15 13:05:53.522845256 +0000 UTC m=+35.239223160" May 15 13:05:53.664314 containerd[1551]: time="2025-05-15T13:05:53.664241346Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"0cdeef310173224ea2aa00d8265bd9483f6a40940e6992be93c0df99623df6e8\" pid:5302 exit_status:1 exited_at:{seconds:1747314353 nanos:649031122}" May 15 13:05:54.600731 containerd[1551]: time="2025-05-15T13:05:54.600690375Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"446a10b43adbb502c403ebfa08ac7d32cff44d194cb3135a49bc7f8ba8c4bb27\" pid:5415 exit_status:1 exited_at:{seconds:1747314354 nanos:600446837}" May 15 13:05:55.552653 containerd[1551]: time="2025-05-15T13:05:55.552610390Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"1e1697e40b4936b4c71cdbe3415901072c004bd59a33d16f18b756137ef4a47c\" pid:5445 exit_status:1 exited_at:{seconds:1747314355 nanos:551916935}" May 15 13:05:57.379160 containerd[1551]: time="2025-05-15T13:05:57.379051550Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-5455d8c489-zdjgc,Uid:716fa121-5d06-4112-9e4d-16489f87729d,Namespace:calico-system,Attempt:0,}" May 15 13:05:57.379558 containerd[1551]: time="2025-05-15T13:05:57.379051670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d48cd9d65-l57k5,Uid:de8ec6ba-558b-4695-ae8f-69de59951448,Namespace:calico-apiserver,Attempt:0,}" May 15 13:05:57.667742 systemd-networkd[1472]: calia118f3e1bdb: Link UP May 15 13:05:57.668531 systemd-networkd[1472]: calia118f3e1bdb: Gained carrier May 15 13:05:57.682450 containerd[1551]: 2025-05-15 13:05:57.425 [INFO][5505] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 15 13:05:57.682450 containerd[1551]: 2025-05-15 13:05:57.442 [INFO][5505] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0 calico-apiserver-7d48cd9d65- calico-apiserver de8ec6ba-558b-4695-ae8f-69de59951448 670 0 2025-05-15 13:05:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d48cd9d65 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334-0-0-a-250489a463 calico-apiserver-7d48cd9d65-l57k5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia118f3e1bdb [] []}} ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-l57k5" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-" May 15 13:05:57.682450 containerd[1551]: 2025-05-15 13:05:57.442 [INFO][5505] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-l57k5" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" May 15 13:05:57.682450 containerd[1551]: 2025-05-15 13:05:57.603 [INFO][5526] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" HandleID="k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Workload="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.621 [INFO][5526] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" HandleID="k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Workload="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000311cc0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334-0-0-a-250489a463", "pod":"calico-apiserver-7d48cd9d65-l57k5", "timestamp":"2025-05-15 13:05:57.603787419 +0000 UTC"}, Hostname:"ci-4334-0-0-a-250489a463", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.621 [INFO][5526] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM 
lock. May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.622 [INFO][5526] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.622 [INFO][5526] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-250489a463' May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.625 [INFO][5526] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.631 [INFO][5526] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-250489a463" May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.637 [INFO][5526] ipam/ipam.go 489: Trying affinity for 192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.639 [INFO][5526] ipam/ipam.go 155: Attempting to load block cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:57.682822 containerd[1551]: 2025-05-15 13:05:57.641 [INFO][5526] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:57.685006 containerd[1551]: 2025-05-15 13:05:57.641 [INFO][5526] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.685006 containerd[1551]: 2025-05-15 13:05:57.645 [INFO][5526] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf May 15 13:05:57.685006 containerd[1551]: 2025-05-15 13:05:57.649 [INFO][5526] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.685006 containerd[1551]: 2025-05-15 13:05:57.654 [INFO][5526] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.27.193/26] block=192.168.27.192/26 handle="k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.685006 containerd[1551]: 2025-05-15 13:05:57.655 [INFO][5526] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.27.193/26] handle="k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.685006 containerd[1551]: 2025-05-15 13:05:57.655 [INFO][5526] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 13:05:57.685006 containerd[1551]: 2025-05-15 13:05:57.655 [INFO][5526] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.27.193/26] IPv6=[] ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" HandleID="k8s-pod-network.45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Workload="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" May 15 13:05:57.685122 containerd[1551]: 2025-05-15 13:05:57.657 [INFO][5505] cni-plugin/k8s.go 386: Populated endpoint ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-l57k5" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0", GenerateName:"calico-apiserver-7d48cd9d65-", Namespace:"calico-apiserver", SelfLink:"", UID:"de8ec6ba-558b-4695-ae8f-69de59951448", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d48cd9d65", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"", Pod:"calico-apiserver-7d48cd9d65-l57k5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia118f3e1bdb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:05:57.685173 containerd[1551]: 2025-05-15 13:05:57.657 [INFO][5505] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.27.193/32] ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-l57k5" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" May 15 13:05:57.685173 containerd[1551]: 2025-05-15 13:05:57.657 [INFO][5505] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia118f3e1bdb ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-l57k5" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" May 15 13:05:57.685173 containerd[1551]: 2025-05-15 13:05:57.668 [INFO][5505] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-l57k5" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" May 15 13:05:57.685249 containerd[1551]: 2025-05-15 13:05:57.669 [INFO][5505] cni-plugin/k8s.go 414: 
Added Mac, interface name, and active container ID to endpoint ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-l57k5" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0", GenerateName:"calico-apiserver-7d48cd9d65-", Namespace:"calico-apiserver", SelfLink:"", UID:"de8ec6ba-558b-4695-ae8f-69de59951448", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d48cd9d65", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf", Pod:"calico-apiserver-7d48cd9d65-l57k5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia118f3e1bdb", MAC:"7e:5f:ab:15:71:53", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:05:57.685303 containerd[1551]: 2025-05-15 13:05:57.677 [INFO][5505] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-l57k5" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--l57k5-eth0" May 15 13:05:57.752710 containerd[1551]: time="2025-05-15T13:05:57.752547979Z" level=info msg="connecting to shim 45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf" address="unix:///run/containerd/s/fc1e54925f16dc7e13c4dcc7ca3e4f57a598e3518f52de1fc6c0a556ac3e8dcc" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:57.796383 systemd-networkd[1472]: cali4782bd50a2f: Link UP May 15 13:05:57.796565 systemd-networkd[1472]: cali4782bd50a2f: Gained carrier May 15 13:05:57.808335 systemd[1]: Started cri-containerd-45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf.scope - libcontainer container 45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf. 
May 15 13:05:57.826878 containerd[1551]: 2025-05-15 13:05:57.414 [INFO][5500] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 15 13:05:57.826878 containerd[1551]: 2025-05-15 13:05:57.441 [INFO][5500] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0 calico-kube-controllers-5455d8c489- calico-system 716fa121-5d06-4112-9e4d-16489f87729d 669 0 2025-05-15 13:05:32 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5455d8c489 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4334-0-0-a-250489a463 calico-kube-controllers-5455d8c489-zdjgc eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali4782bd50a2f [] []}} ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Namespace="calico-system" Pod="calico-kube-controllers-5455d8c489-zdjgc" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-" May 15 13:05:57.826878 containerd[1551]: 2025-05-15 13:05:57.441 [INFO][5500] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Namespace="calico-system" Pod="calico-kube-controllers-5455d8c489-zdjgc" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" May 15 13:05:57.826878 containerd[1551]: 2025-05-15 13:05:57.603 [INFO][5528] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" HandleID="k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Workload="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.621 [INFO][5528] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" HandleID="k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Workload="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003116c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334-0-0-a-250489a463", "pod":"calico-kube-controllers-5455d8c489-zdjgc", "timestamp":"2025-05-15 13:05:57.603691839 +0000 UTC"}, Hostname:"ci-4334-0-0-a-250489a463", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.621 [INFO][5528] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.655 [INFO][5528] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.655 [INFO][5528] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-250489a463' May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.727 [INFO][5528] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.741 [INFO][5528] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.752 [INFO][5528] ipam/ipam.go 489: Trying affinity for 192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.755 [INFO][5528] ipam/ipam.go 155: Attempting to load block cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827108 containerd[1551]: 2025-05-15 13:05:57.759 [INFO][5528] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827264 containerd[1551]: 2025-05-15 13:05:57.759 [INFO][5528] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827264 containerd[1551]: 2025-05-15 13:05:57.761 [INFO][5528] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105 May 15 13:05:57.827264 containerd[1551]: 2025-05-15 13:05:57.771 [INFO][5528] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827264 containerd[1551]: 2025-05-15 13:05:57.781 [INFO][5528] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.27.194/26] block=192.168.27.192/26 handle="k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827264 containerd[1551]: 2025-05-15 13:05:57.781 [INFO][5528] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.27.194/26] handle="k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" host="ci-4334-0-0-a-250489a463" May 15 13:05:57.827264 containerd[1551]: 2025-05-15 13:05:57.781 [INFO][5528] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 13:05:57.827264 containerd[1551]: 2025-05-15 13:05:57.781 [INFO][5528] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.27.194/26] IPv6=[] ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" HandleID="k8s-pod-network.904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Workload="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" May 15 13:05:57.827388 containerd[1551]: 2025-05-15 13:05:57.792 [INFO][5500] cni-plugin/k8s.go 386: Populated endpoint ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Namespace="calico-system" Pod="calico-kube-controllers-5455d8c489-zdjgc" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0", GenerateName:"calico-kube-controllers-5455d8c489-", Namespace:"calico-system", SelfLink:"", UID:"716fa121-5d06-4112-9e4d-16489f87729d", ResourceVersion:"669", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5455d8c489", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"", Pod:"calico-kube-controllers-5455d8c489-zdjgc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.27.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4782bd50a2f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:05:57.827433 containerd[1551]: 2025-05-15 13:05:57.793 [INFO][5500] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.27.194/32] ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Namespace="calico-system" Pod="calico-kube-controllers-5455d8c489-zdjgc" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" May 15 13:05:57.827433 containerd[1551]: 2025-05-15 13:05:57.793 [INFO][5500] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4782bd50a2f ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Namespace="calico-system" Pod="calico-kube-controllers-5455d8c489-zdjgc" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" May 15 13:05:57.827433 containerd[1551]: 2025-05-15 13:05:57.799 [INFO][5500] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Namespace="calico-system" Pod="calico-kube-controllers-5455d8c489-zdjgc" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" May 15 
13:05:57.827526 containerd[1551]: 2025-05-15 13:05:57.799 [INFO][5500] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Namespace="calico-system" Pod="calico-kube-controllers-5455d8c489-zdjgc" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0", GenerateName:"calico-kube-controllers-5455d8c489-", Namespace:"calico-system", SelfLink:"", UID:"716fa121-5d06-4112-9e4d-16489f87729d", ResourceVersion:"669", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5455d8c489", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105", Pod:"calico-kube-controllers-5455d8c489-zdjgc", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.27.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali4782bd50a2f", MAC:"aa:12:e9:cf:63:16", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:05:57.827572 containerd[1551]: 2025-05-15 13:05:57.819 [INFO][5500] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" Namespace="calico-system" Pod="calico-kube-controllers-5455d8c489-zdjgc" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--kube--controllers--5455d8c489--zdjgc-eth0" May 15 13:05:57.857731 containerd[1551]: time="2025-05-15T13:05:57.857698204Z" level=info msg="connecting to shim 904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105" address="unix:///run/containerd/s/deb6bba0a504fb8e594f13344781086ca214b6fb1354f00b02d660879f49a1f6" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:57.881115 systemd[1]: Started cri-containerd-904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105.scope - libcontainer container 904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105. 
May 15 13:05:57.883541 containerd[1551]: time="2025-05-15T13:05:57.883412232Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d48cd9d65-l57k5,Uid:de8ec6ba-558b-4695-ae8f-69de59951448,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf\"" May 15 13:05:57.886342 containerd[1551]: time="2025-05-15T13:05:57.886314019Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 13:05:57.921575 containerd[1551]: time="2025-05-15T13:05:57.921426773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5455d8c489-zdjgc,Uid:716fa121-5d06-4112-9e4d-16489f87729d,Namespace:calico-system,Attempt:0,} returns sandbox id \"904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105\"" May 15 13:05:58.380735 containerd[1551]: time="2025-05-15T13:05:58.380168894Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tmm62,Uid:f7da6f75-940d-4f80-9f35-daf3db6856b4,Namespace:kube-system,Attempt:0,}" May 15 13:05:58.380735 containerd[1551]: time="2025-05-15T13:05:58.380498244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tx8fx,Uid:bb669bdd-390f-4594-a774-599bed5593aa,Namespace:calico-system,Attempt:0,}" May 15 13:05:58.529110 systemd-networkd[1472]: cali4f2419625b3: Link UP May 15 13:05:58.529664 systemd-networkd[1472]: cali4f2419625b3: Gained carrier May 15 13:05:58.543728 containerd[1551]: 2025-05-15 13:05:58.429 [INFO][5674] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 15 13:05:58.543728 containerd[1551]: 2025-05-15 13:05:58.443 [INFO][5674] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0 coredns-668d6bf9bc- kube-system f7da6f75-940d-4f80-9f35-daf3db6856b4 664 0 2025-05-15 13:05:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334-0-0-a-250489a463 coredns-668d6bf9bc-tmm62 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4f2419625b3 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-tmm62" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-" May 15 13:05:58.543728 containerd[1551]: 2025-05-15 13:05:58.443 [INFO][5674] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-tmm62" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" May 15 13:05:58.543728 containerd[1551]: 2025-05-15 13:05:58.477 [INFO][5700] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" HandleID="k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Workload="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.485 [INFO][5700] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" 
HandleID="k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Workload="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051bf0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334-0-0-a-250489a463", "pod":"coredns-668d6bf9bc-tmm62", "timestamp":"2025-05-15 13:05:58.477745267 +0000 UTC"}, Hostname:"ci-4334-0-0-a-250489a463", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.485 [INFO][5700] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.486 [INFO][5700] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.486 [INFO][5700] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-250489a463' May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.489 [INFO][5700] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.495 [INFO][5700] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.503 [INFO][5700] ipam/ipam.go 489: Trying affinity for 192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.505 [INFO][5700] ipam/ipam.go 155: Attempting to load block cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544109 containerd[1551]: 2025-05-15 13:05:58.510 [INFO][5700] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544445 containerd[1551]: 2025-05-15 13:05:58.510 [INFO][5700] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544445 containerd[1551]: 2025-05-15 13:05:58.511 [INFO][5700] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9 May 15 13:05:58.544445 containerd[1551]: 2025-05-15 13:05:58.516 [INFO][5700] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544445 containerd[1551]: 2025-05-15 13:05:58.523 [INFO][5700] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.27.195/26] block=192.168.27.192/26 handle="k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544445 containerd[1551]: 2025-05-15 13:05:58.523 [INFO][5700] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.27.195/26] handle="k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.544445 containerd[1551]: 2025-05-15 13:05:58.523 [INFO][5700] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 13:05:58.544445 containerd[1551]: 2025-05-15 13:05:58.523 [INFO][5700] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.27.195/26] IPv6=[] ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" HandleID="k8s-pod-network.d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Workload="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" May 15 13:05:58.544837 containerd[1551]: 2025-05-15 13:05:58.526 [INFO][5674] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-tmm62" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f7da6f75-940d-4f80-9f35-daf3db6856b4", ResourceVersion:"664", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"", Pod:"coredns-668d6bf9bc-tmm62", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4f2419625b3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:05:58.544837 containerd[1551]: 2025-05-15 13:05:58.526 [INFO][5674] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.27.195/32] ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-tmm62" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" May 15 13:05:58.544837 containerd[1551]: 2025-05-15 13:05:58.526 [INFO][5674] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4f2419625b3 ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-tmm62" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" May 15 13:05:58.544837 containerd[1551]: 2025-05-15 13:05:58.530 [INFO][5674] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-tmm62" 
WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" May 15 13:05:58.544837 containerd[1551]: 2025-05-15 13:05:58.530 [INFO][5674] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-tmm62" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f7da6f75-940d-4f80-9f35-daf3db6856b4", ResourceVersion:"664", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9", Pod:"coredns-668d6bf9bc-tmm62", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4f2419625b3", MAC:"ca:15:c4:67:d7:5d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:05:58.544837 containerd[1551]: 2025-05-15 13:05:58.540 [INFO][5674] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" Namespace="kube-system" Pod="coredns-668d6bf9bc-tmm62" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--tmm62-eth0" May 15 13:05:58.563871 containerd[1551]: time="2025-05-15T13:05:58.563834888Z" level=info msg="connecting to shim d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9" address="unix:///run/containerd/s/84c62b77ce3beff3d9e2f986d99a11819b8875ca7be469587876fb0e342cb71d" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:58.584001 systemd[1]: Started cri-containerd-d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9.scope - libcontainer container d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9. 
May 15 13:05:58.628506 systemd-networkd[1472]: calibac7f2ae8c3: Link UP May 15 13:05:58.629575 systemd-networkd[1472]: calibac7f2ae8c3: Gained carrier May 15 13:05:58.635674 containerd[1551]: time="2025-05-15T13:05:58.635598215Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-tmm62,Uid:f7da6f75-940d-4f80-9f35-daf3db6856b4,Namespace:kube-system,Attempt:0,} returns sandbox id \"d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9\"" May 15 13:05:58.639270 containerd[1551]: time="2025-05-15T13:05:58.639240705Z" level=info msg="CreateContainer within sandbox \"d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.441 [INFO][5677] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.451 [INFO][5677] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0 csi-node-driver- calico-system bb669bdd-390f-4594-a774-599bed5593aa 581 0 2025-05-15 13:05:32 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5b5cc68cd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4334-0-0-a-250489a463 csi-node-driver-tx8fx eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calibac7f2ae8c3 [] []}} ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Namespace="calico-system" Pod="csi-node-driver-tx8fx" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.451 [INFO][5677] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Namespace="calico-system" Pod="csi-node-driver-tx8fx" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.489 [INFO][5706] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" HandleID="k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Workload="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.499 [INFO][5706] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" HandleID="k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Workload="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b340), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4334-0-0-a-250489a463", "pod":"csi-node-driver-tx8fx", "timestamp":"2025-05-15 13:05:58.489734394 +0000 UTC"}, Hostname:"ci-4334-0-0-a-250489a463", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 13:05:58.646420 containerd[1551]: 2025-05-15 
13:05:58.500 [INFO][5706] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.523 [INFO][5706] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.524 [INFO][5706] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-250489a463' May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.590 [INFO][5706] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.595 [INFO][5706] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.601 [INFO][5706] ipam/ipam.go 489: Trying affinity for 192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.603 [INFO][5706] ipam/ipam.go 155: Attempting to load block cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.605 [INFO][5706] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.605 [INFO][5706] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.607 [INFO][5706] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43 May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.611 [INFO][5706] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.619 [INFO][5706] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.27.196/26] block=192.168.27.192/26 handle="k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.619 [INFO][5706] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.27.196/26] handle="k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" host="ci-4334-0-0-a-250489a463" May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.619 [INFO][5706] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 13:05:58.646420 containerd[1551]: 2025-05-15 13:05:58.619 [INFO][5706] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.27.196/26] IPv6=[] ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" HandleID="k8s-pod-network.3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Workload="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" May 15 13:05:58.647603 containerd[1551]: 2025-05-15 13:05:58.622 [INFO][5677] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Namespace="calico-system" Pod="csi-node-driver-tx8fx" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bb669bdd-390f-4594-a774-599bed5593aa", ResourceVersion:"581", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"", Pod:"csi-node-driver-tx8fx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.27.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calibac7f2ae8c3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:05:58.647603 containerd[1551]: 2025-05-15 13:05:58.622 [INFO][5677] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.27.196/32] ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Namespace="calico-system" Pod="csi-node-driver-tx8fx" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" May 15 13:05:58.647603 containerd[1551]: 2025-05-15 13:05:58.623 [INFO][5677] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibac7f2ae8c3 ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Namespace="calico-system" Pod="csi-node-driver-tx8fx" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" May 15 13:05:58.647603 containerd[1551]: 2025-05-15 13:05:58.630 [INFO][5677] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Namespace="calico-system" Pod="csi-node-driver-tx8fx" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" May 15 13:05:58.647603 containerd[1551]: 2025-05-15 13:05:58.630 [INFO][5677] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Namespace="calico-system" Pod="csi-node-driver-tx8fx" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"bb669bdd-390f-4594-a774-599bed5593aa", ResourceVersion:"581", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43", Pod:"csi-node-driver-tx8fx", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.27.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calibac7f2ae8c3", MAC:"9a:7c:67:2e:05:92", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:05:58.647603 containerd[1551]: 2025-05-15 13:05:58.641 [INFO][5677] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" Namespace="calico-system" Pod="csi-node-driver-tx8fx" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-csi--node--driver--tx8fx-eth0" May 15 13:05:58.659188 containerd[1551]: time="2025-05-15T13:05:58.659145639Z" level=info msg="Container c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04: CDI devices from CRI Config.CDIDevices: []" May 15 13:05:58.664706 containerd[1551]: time="2025-05-15T13:05:58.664684816Z" level=info msg="CreateContainer within sandbox \"d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04\"" May 15 13:05:58.667397 containerd[1551]: time="2025-05-15T13:05:58.667361200Z" level=info msg="StartContainer for \"c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04\"" May 15 13:05:58.667903 containerd[1551]: time="2025-05-15T13:05:58.667874335Z" level=info msg="connecting to shim 3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43" address="unix:///run/containerd/s/b659da34bb26bd54767e2e7a804f1c5724898168e9c558302d2ca3a9bc040960" namespace=k8s.io protocol=ttrpc version=3 May 15 13:05:58.669043 containerd[1551]: time="2025-05-15T13:05:58.669025429Z" level=info msg="connecting to shim c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04" address="unix:///run/containerd/s/84c62b77ce3beff3d9e2f986d99a11819b8875ca7be469587876fb0e342cb71d" protocol=ttrpc version=3 May 15 13:05:58.686012 systemd[1]: Started 
cri-containerd-3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43.scope - libcontainer container 3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43. May 15 13:05:58.686679 systemd[1]: Started cri-containerd-c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04.scope - libcontainer container c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04. May 15 13:05:58.721170 containerd[1551]: time="2025-05-15T13:05:58.721116221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tx8fx,Uid:bb669bdd-390f-4594-a774-599bed5593aa,Namespace:calico-system,Attempt:0,} returns sandbox id \"3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43\"" May 15 13:05:58.722012 containerd[1551]: time="2025-05-15T13:05:58.721996817Z" level=info msg="StartContainer for \"c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04\" returns successfully" May 15 13:05:59.194166 systemd-networkd[1472]: calia118f3e1bdb: Gained IPv6LL May 15 13:05:59.195144 systemd-networkd[1472]: cali4782bd50a2f: Gained IPv6LL May 15 13:05:59.391907 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4051049348.mount: Deactivated successfully. May 15 13:05:59.539366 kubelet[4265]: I0515 13:05:59.539190 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-tmm62" podStartSLOduration=34.539162953 podStartE2EDuration="34.539162953s" podCreationTimestamp="2025-05-15 13:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 13:05:59.526700415 +0000 UTC m=+41.243078328" watchObservedRunningTime="2025-05-15 13:05:59.539162953 +0000 UTC m=+41.255540876" May 15 13:05:59.962012 systemd-networkd[1472]: cali4f2419625b3: Gained IPv6LL May 15 13:06:00.090143 systemd-networkd[1472]: calibac7f2ae8c3: Gained IPv6LL May 15 13:06:00.387397 kubelet[4265]: I0515 13:06:00.387196 4265 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 13:06:00.394532 containerd[1551]: time="2025-05-15T13:06:00.394383263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d48cd9d65-sqsnl,Uid:623640fb-bde7-4522-803c-1a32b7748b5a,Namespace:calico-apiserver,Attempt:0,}" May 15 13:06:00.530720 systemd-networkd[1472]: calid62f1bd4917: Link UP May 15 13:06:00.530848 systemd-networkd[1472]: calid62f1bd4917: Gained carrier May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.451 [INFO][5909] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.460 [INFO][5909] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0 calico-apiserver-7d48cd9d65- calico-apiserver 623640fb-bde7-4522-803c-1a32b7748b5a 667 0 2025-05-15 13:05:32 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7d48cd9d65 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4334-0-0-a-250489a463 calico-apiserver-7d48cd9d65-sqsnl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid62f1bd4917 [] []}} ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Namespace="calico-apiserver" 
Pod="calico-apiserver-7d48cd9d65-sqsnl" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.460 [INFO][5909] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-sqsnl" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.490 [INFO][5919] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" HandleID="k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Workload="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.501 [INFO][5919] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" HandleID="k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Workload="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b560), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4334-0-0-a-250489a463", "pod":"calico-apiserver-7d48cd9d65-sqsnl", "timestamp":"2025-05-15 13:06:00.490253396 +0000 UTC"}, Hostname:"ci-4334-0-0-a-250489a463", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.501 [INFO][5919] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.501 [INFO][5919] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.501 [INFO][5919] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-250489a463' May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.503 [INFO][5919] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.506 [INFO][5919] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.510 [INFO][5919] ipam/ipam.go 489: Trying affinity for 192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.511 [INFO][5919] ipam/ipam.go 155: Attempting to load block cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.513 [INFO][5919] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.513 [INFO][5919] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.516 [INFO][5919] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.519 [INFO][5919] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.525 [INFO][5919] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.27.197/26] block=192.168.27.192/26 handle="k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.525 [INFO][5919] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.27.197/26] handle="k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" host="ci-4334-0-0-a-250489a463" May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.525 [INFO][5919] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 13:06:00.544256 containerd[1551]: 2025-05-15 13:06:00.525 [INFO][5919] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.27.197/26] IPv6=[] ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" HandleID="k8s-pod-network.a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Workload="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" May 15 13:06:00.544834 containerd[1551]: 2025-05-15 13:06:00.528 [INFO][5909] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-sqsnl" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0", GenerateName:"calico-apiserver-7d48cd9d65-", Namespace:"calico-apiserver", SelfLink:"", UID:"623640fb-bde7-4522-803c-1a32b7748b5a", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d48cd9d65", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"", Pod:"calico-apiserver-7d48cd9d65-sqsnl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid62f1bd4917", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:06:00.544834 containerd[1551]: 2025-05-15 13:06:00.528 [INFO][5909] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.27.197/32] ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-sqsnl" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" May 15 13:06:00.544834 containerd[1551]: 2025-05-15 13:06:00.528 [INFO][5909] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid62f1bd4917 ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-sqsnl" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" May 15 13:06:00.544834 containerd[1551]: 2025-05-15 13:06:00.530 [INFO][5909] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-sqsnl" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" May 15 13:06:00.544834 containerd[1551]: 2025-05-15 13:06:00.530 [INFO][5909] cni-plugin/k8s.go 414: 
Added Mac, interface name, and active container ID to endpoint ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-sqsnl" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0", GenerateName:"calico-apiserver-7d48cd9d65-", Namespace:"calico-apiserver", SelfLink:"", UID:"623640fb-bde7-4522-803c-1a32b7748b5a", ResourceVersion:"667", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7d48cd9d65", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea", Pod:"calico-apiserver-7d48cd9d65-sqsnl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.27.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid62f1bd4917", MAC:"92:94:b5:05:dd:0c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:06:00.544834 containerd[1551]: 2025-05-15 13:06:00.541 [INFO][5909] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" Namespace="calico-apiserver" Pod="calico-apiserver-7d48cd9d65-sqsnl" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-calico--apiserver--7d48cd9d65--sqsnl-eth0" May 15 13:06:00.567398 containerd[1551]: time="2025-05-15T13:06:00.567343993Z" level=info msg="connecting to shim a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea" address="unix:///run/containerd/s/7d4af0cd4a23768a80485c31b84cbee2f85feee27ca56862628ae52ee29a6588" namespace=k8s.io protocol=ttrpc version=3 May 15 13:06:00.592053 systemd[1]: Started cri-containerd-a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea.scope - libcontainer container a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea. May 15 13:06:00.647545 containerd[1551]: time="2025-05-15T13:06:00.647431415Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7d48cd9d65-sqsnl,Uid:623640fb-bde7-4522-803c-1a32b7748b5a,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea\"" May 15 13:06:01.265701 systemd[1]: Started sshd@12-157.180.34.115:22-85.209.134.43:39044.service - OpenSSH per-connection server daemon (85.209.134.43:39044). 
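The "connecting to shim ... address=unix:///run/containerd/s/<id>" entries above give the per-sandbox shim socket path. A small, hedged sketch of a reachability probe against such a socket is shown below; it only strips the `unix://` scheme and dials the socket, it does not speak the ttrpc protocol the real containerd client uses. The helper name and the hard-coded address (copied from the log line above, valid only while that shim runs) are assumptions for the example.

```go
package main

import (
	"fmt"
	"net"
	"strings"
	"time"
)

// dialShimSocket takes an address exactly as containerd logs it
// ("unix:///run/containerd/s/<id>") and checks that the unix socket accepts a
// connection. Debugging aid only; no ttrpc traffic is exchanged.
func dialShimSocket(address string) error {
	path := strings.TrimPrefix(address, "unix://")
	conn, err := net.DialTimeout("unix", path, 2*time.Second)
	if err != nil {
		return err
	}
	return conn.Close()
}

func main() {
	addr := "unix:///run/containerd/s/7d4af0cd4a23768a80485c31b84cbee2f85feee27ca56862628ae52ee29a6588"
	if err := dialShimSocket(addr); err != nil {
		fmt.Println("shim socket not reachable:", err)
		return
	}
	fmt.Println("shim socket reachable")
}
```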
May 15 13:06:01.321288 systemd-networkd[1472]: vxlan.calico: Link UP May 15 13:06:01.321295 systemd-networkd[1472]: vxlan.calico: Gained carrier May 15 13:06:01.404948 containerd[1551]: time="2025-05-15T13:06:01.404903579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ldncb,Uid:3d5bfd57-7739-422c-9643-72aa5ec0fc19,Namespace:kube-system,Attempt:0,}" May 15 13:06:01.693005 systemd-networkd[1472]: cali11b97e0a2f6: Link UP May 15 13:06:01.696548 systemd-networkd[1472]: cali11b97e0a2f6: Gained carrier May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.509 [INFO][6082] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0 coredns-668d6bf9bc- kube-system 3d5bfd57-7739-422c-9643-72aa5ec0fc19 668 0 2025-05-15 13:05:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4334-0-0-a-250489a463 coredns-668d6bf9bc-ldncb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali11b97e0a2f6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Namespace="kube-system" Pod="coredns-668d6bf9bc-ldncb" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.509 [INFO][6082] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Namespace="kube-system" Pod="coredns-668d6bf9bc-ldncb" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.611 [INFO][6098] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" HandleID="k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Workload="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.626 [INFO][6098] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" HandleID="k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Workload="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002fa000), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4334-0-0-a-250489a463", "pod":"coredns-668d6bf9bc-ldncb", "timestamp":"2025-05-15 13:06:01.611640781 +0000 UTC"}, Hostname:"ci-4334-0-0-a-250489a463", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.627 [INFO][6098] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.627 [INFO][6098] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.627 [INFO][6098] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4334-0-0-a-250489a463' May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.631 [INFO][6098] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.639 [INFO][6098] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.645 [INFO][6098] ipam/ipam.go 489: Trying affinity for 192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.648 [INFO][6098] ipam/ipam.go 155: Attempting to load block cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.650 [INFO][6098] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.27.192/26 host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.650 [INFO][6098] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.27.192/26 handle="k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.652 [INFO][6098] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371 May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.657 [INFO][6098] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.27.192/26 handle="k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.668 [INFO][6098] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.27.198/26] block=192.168.27.192/26 handle="k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.668 [INFO][6098] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.27.198/26] handle="k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" host="ci-4334-0-0-a-250489a463" May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.668 [INFO][6098] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
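Each CNI add ends with an "ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[...] ... ContainerID=..." line like the ones above and below. The sketch that follows extracts a container-ID-to-IP mapping from journal text on stdin; the regular expression assumes exactly the line format shown in this log (pipe in something like `journalctl -u containerd --no-pager`, unit name may differ on other systems).

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Matches the "Calico CNI IPAM assigned addresses" lines and captures the
// IPv4 list and the ContainerID.
var assignRe = regexp.MustCompile(
	`Calico CNI IPAM assigned addresses IPv4=\[([^\]]*)\].*?ContainerID="([0-9a-f]+)"`)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		if m := assignRe.FindStringSubmatch(sc.Text()); m != nil {
			// Short container ID -> assigned IP(s), e.g. b9a3b3033652 -> 192.168.27.198/26
			fmt.Printf("%s -> %s\n", m[2][:12], m[1])
		}
	}
}
```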
May 15 13:06:01.733588 containerd[1551]: 2025-05-15 13:06:01.668 [INFO][6098] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.27.198/26] IPv6=[] ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" HandleID="k8s-pod-network.b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Workload="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" May 15 13:06:01.735691 containerd[1551]: 2025-05-15 13:06:01.680 [INFO][6082] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Namespace="kube-system" Pod="coredns-668d6bf9bc-ldncb" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"3d5bfd57-7739-422c-9643-72aa5ec0fc19", ResourceVersion:"668", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"", Pod:"coredns-668d6bf9bc-ldncb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali11b97e0a2f6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:06:01.735691 containerd[1551]: 2025-05-15 13:06:01.682 [INFO][6082] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.27.198/32] ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Namespace="kube-system" Pod="coredns-668d6bf9bc-ldncb" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" May 15 13:06:01.735691 containerd[1551]: 2025-05-15 13:06:01.682 [INFO][6082] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali11b97e0a2f6 ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Namespace="kube-system" Pod="coredns-668d6bf9bc-ldncb" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" May 15 13:06:01.735691 containerd[1551]: 2025-05-15 13:06:01.695 [INFO][6082] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Namespace="kube-system" Pod="coredns-668d6bf9bc-ldncb" 
WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" May 15 13:06:01.735691 containerd[1551]: 2025-05-15 13:06:01.698 [INFO][6082] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Namespace="kube-system" Pod="coredns-668d6bf9bc-ldncb" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"3d5bfd57-7739-422c-9643-72aa5ec0fc19", ResourceVersion:"668", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 13, 5, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4334-0-0-a-250489a463", ContainerID:"b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371", Pod:"coredns-668d6bf9bc-ldncb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.27.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali11b97e0a2f6", MAC:"f2:20:ca:b0:9d:66", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 13:06:01.735691 containerd[1551]: 2025-05-15 13:06:01.722 [INFO][6082] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" Namespace="kube-system" Pod="coredns-668d6bf9bc-ldncb" WorkloadEndpoint="ci--4334--0--0--a--250489a463-k8s-coredns--668d6bf9bc--ldncb-eth0" May 15 13:06:01.825720 containerd[1551]: time="2025-05-15T13:06:01.825644757Z" level=info msg="connecting to shim b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371" address="unix:///run/containerd/s/d1193302a80c209060b8d85db6e5154fd89c011ad03ca20ab68dadd7a6aecd4a" namespace=k8s.io protocol=ttrpc version=3 May 15 13:06:01.873695 sshd[6056]: Invalid user tomas from 85.209.134.43 port 39044 May 15 13:06:01.879061 systemd[1]: Started cri-containerd-b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371.scope - libcontainer container b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371. 
May 15 13:06:01.935825 containerd[1551]: time="2025-05-15T13:06:01.935796528Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-ldncb,Uid:3d5bfd57-7739-422c-9643-72aa5ec0fc19,Namespace:kube-system,Attempt:0,} returns sandbox id \"b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371\"" May 15 13:06:01.940642 containerd[1551]: time="2025-05-15T13:06:01.940560078Z" level=info msg="CreateContainer within sandbox \"b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 13:06:01.951954 containerd[1551]: time="2025-05-15T13:06:01.951849728Z" level=info msg="Container 99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134: CDI devices from CRI Config.CDIDevices: []" May 15 13:06:01.953904 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount863104392.mount: Deactivated successfully. May 15 13:06:01.966685 containerd[1551]: time="2025-05-15T13:06:01.966551507Z" level=info msg="CreateContainer within sandbox \"b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134\"" May 15 13:06:01.967709 containerd[1551]: time="2025-05-15T13:06:01.967513696Z" level=info msg="StartContainer for \"99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134\"" May 15 13:06:01.968489 containerd[1551]: time="2025-05-15T13:06:01.968454385Z" level=info msg="connecting to shim 99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134" address="unix:///run/containerd/s/d1193302a80c209060b8d85db6e5154fd89c011ad03ca20ab68dadd7a6aecd4a" protocol=ttrpc version=3 May 15 13:06:01.985661 sshd[6056]: Received disconnect from 85.209.134.43 port 39044:11: Bye Bye [preauth] May 15 13:06:01.985963 sshd[6056]: Disconnected from invalid user tomas 85.209.134.43 port 39044 [preauth] May 15 13:06:01.990096 systemd[1]: Started cri-containerd-99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134.scope - libcontainer container 99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134. May 15 13:06:01.990421 systemd[1]: sshd@12-157.180.34.115:22-85.209.134.43:39044.service: Deactivated successfully. 
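The sshd entries above ("Invalid user tomas from 85.209.134.43 port 39044", followed by a pre-auth disconnect) are background SSH probes against the node. A small sketch for tallying such probes per source address from journal text on stdin is shown below; it assumes the message format seen in this log, and the syslog identifier used to filter the journal (e.g. `journalctl -t sshd`) may vary by OpenSSH version.

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Counts failed pre-auth probes like "Invalid user tomas from 85.209.134.43
// port 39044", grouped by source address.
var invalidRe = regexp.MustCompile(`Invalid user (\S+) from (\S+) port \d+`)

func main() {
	counts := map[string]int{}
	sc := bufio.NewScanner(os.Stdin)
	for sc.Scan() {
		if m := invalidRe.FindStringSubmatch(sc.Text()); m != nil {
			counts[m[2]]++
		}
	}
	for ip, n := range counts {
		fmt.Printf("%-16s %d invalid-user attempts\n", ip, n)
	}
}
```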
May 15 13:06:02.042193 containerd[1551]: time="2025-05-15T13:06:02.041008852Z" level=info msg="StartContainer for \"99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134\" returns successfully" May 15 13:06:02.202278 containerd[1551]: time="2025-05-15T13:06:02.202179400Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 15 13:06:02.214618 containerd[1551]: time="2025-05-15T13:06:02.214552639Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 4.323971865s" May 15 13:06:02.214618 containerd[1551]: time="2025-05-15T13:06:02.214604145Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 13:06:02.216706 containerd[1551]: time="2025-05-15T13:06:02.216138682Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 15 13:06:02.224702 containerd[1551]: time="2025-05-15T13:06:02.224377295Z" level=info msg="CreateContainer within sandbox \"45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 13:06:02.232228 containerd[1551]: time="2025-05-15T13:06:02.231836443Z" level=info msg="Container 27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa: CDI devices from CRI Config.CDIDevices: []" May 15 13:06:02.232411 containerd[1551]: time="2025-05-15T13:06:02.232389583Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:02.238339 containerd[1551]: time="2025-05-15T13:06:02.238311921Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:02.238781 containerd[1551]: time="2025-05-15T13:06:02.238763850Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:02.250801 containerd[1551]: time="2025-05-15T13:06:02.250753277Z" level=info msg="CreateContainer within sandbox \"45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa\"" May 15 13:06:02.251593 containerd[1551]: time="2025-05-15T13:06:02.251458844Z" level=info msg="StartContainer for \"27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa\"" May 15 13:06:02.253102 containerd[1551]: time="2025-05-15T13:06:02.253061628Z" level=info msg="connecting to shim 27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa" address="unix:///run/containerd/s/fc1e54925f16dc7e13c4dcc7ca3e4f57a598e3518f52de1fc6c0a556ac3e8dcc" protocol=ttrpc version=3 May 15 13:06:02.276114 systemd[1]: Started cri-containerd-27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa.scope - libcontainer container 
27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa. May 15 13:06:02.323237 containerd[1551]: time="2025-05-15T13:06:02.323152061Z" level=info msg="StartContainer for \"27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa\" returns successfully" May 15 13:06:02.458090 systemd-networkd[1472]: vxlan.calico: Gained IPv6LL May 15 13:06:02.459015 systemd-networkd[1472]: calid62f1bd4917: Gained IPv6LL May 15 13:06:02.589730 kubelet[4265]: I0515 13:06:02.589255 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d48cd9d65-l57k5" podStartSLOduration=26.259186024 podStartE2EDuration="30.589234447s" podCreationTimestamp="2025-05-15 13:05:32 +0000 UTC" firstStartedPulling="2025-05-15 13:05:57.885440256 +0000 UTC m=+39.601818159" lastFinishedPulling="2025-05-15 13:06:02.215488679 +0000 UTC m=+43.931866582" observedRunningTime="2025-05-15 13:06:02.571180113 +0000 UTC m=+44.287558016" watchObservedRunningTime="2025-05-15 13:06:02.589234447 +0000 UTC m=+44.305612349" May 15 13:06:02.589730 kubelet[4265]: I0515 13:06:02.589363 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-ldncb" podStartSLOduration=37.589356647 podStartE2EDuration="37.589356647s" podCreationTimestamp="2025-05-15 13:05:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 13:06:02.587759172 +0000 UTC m=+44.304137075" watchObservedRunningTime="2025-05-15 13:06:02.589356647 +0000 UTC m=+44.305734550" May 15 13:06:03.354442 systemd-networkd[1472]: cali11b97e0a2f6: Gained IPv6LL May 15 13:06:03.555155 kubelet[4265]: I0515 13:06:03.555112 4265 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 13:06:05.536024 containerd[1551]: time="2025-05-15T13:06:05.535942455Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:05.536731 containerd[1551]: time="2025-05-15T13:06:05.536702114Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 15 13:06:05.537392 containerd[1551]: time="2025-05-15T13:06:05.537350023Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:05.538841 containerd[1551]: time="2025-05-15T13:06:05.538805060Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:05.539503 containerd[1551]: time="2025-05-15T13:06:05.539348531Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 3.323184983s" May 15 13:06:05.539503 containerd[1551]: time="2025-05-15T13:06:05.539379339Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 
15 13:06:05.540609 containerd[1551]: time="2025-05-15T13:06:05.540470592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 15 13:06:05.560621 containerd[1551]: time="2025-05-15T13:06:05.560572044Z" level=info msg="CreateContainer within sandbox \"904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 15 13:06:05.567822 containerd[1551]: time="2025-05-15T13:06:05.567800127Z" level=info msg="Container 2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6: CDI devices from CRI Config.CDIDevices: []" May 15 13:06:05.574620 containerd[1551]: time="2025-05-15T13:06:05.574524774Z" level=info msg="CreateContainer within sandbox \"904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\"" May 15 13:06:05.575866 containerd[1551]: time="2025-05-15T13:06:05.575809370Z" level=info msg="StartContainer for \"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\"" May 15 13:06:05.576817 containerd[1551]: time="2025-05-15T13:06:05.576770437Z" level=info msg="connecting to shim 2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6" address="unix:///run/containerd/s/deb6bba0a504fb8e594f13344781086ca214b6fb1354f00b02d660879f49a1f6" protocol=ttrpc version=3 May 15 13:06:05.597025 systemd[1]: Started cri-containerd-2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6.scope - libcontainer container 2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6. May 15 13:06:05.648639 containerd[1551]: time="2025-05-15T13:06:05.648556628Z" level=info msg="StartContainer for \"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" returns successfully" May 15 13:06:07.439663 containerd[1551]: time="2025-05-15T13:06:07.439553773Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"853f980e42b3f7b8dd7c17676d31cf182357b432cda85721ec939b6a5e8cabb4\" pid:6336 exited_at:{seconds:1747314367 nanos:431828966}" May 15 13:06:07.475559 kubelet[4265]: I0515 13:06:07.475406 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5455d8c489-zdjgc" podStartSLOduration=27.858180468 podStartE2EDuration="35.475388934s" podCreationTimestamp="2025-05-15 13:05:32 +0000 UTC" firstStartedPulling="2025-05-15 13:05:57.922861331 +0000 UTC m=+39.639239234" lastFinishedPulling="2025-05-15 13:06:05.540069798 +0000 UTC m=+47.256447700" observedRunningTime="2025-05-15 13:06:06.576623965 +0000 UTC m=+48.293001878" watchObservedRunningTime="2025-05-15 13:06:07.475388934 +0000 UTC m=+49.191766827" May 15 13:06:07.496745 containerd[1551]: time="2025-05-15T13:06:07.496657403Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"129845b9ab7b03bed58d39b8940b6818c2155aaf251b302fe6596ec1491873e8\" pid:6358 exited_at:{seconds:1747314367 nanos:496339714}" May 15 13:06:08.543233 containerd[1551]: time="2025-05-15T13:06:08.543185627Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:08.544630 containerd[1551]: time="2025-05-15T13:06:08.544586009Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 15 13:06:08.545576 containerd[1551]: time="2025-05-15T13:06:08.545548681Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:08.547768 containerd[1551]: time="2025-05-15T13:06:08.547708262Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:08.548363 containerd[1551]: time="2025-05-15T13:06:08.548337545Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 3.007825095s" May 15 13:06:08.548634 containerd[1551]: time="2025-05-15T13:06:08.548434106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 15 13:06:08.551785 containerd[1551]: time="2025-05-15T13:06:08.551550818Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 13:06:08.555736 containerd[1551]: time="2025-05-15T13:06:08.554266976Z" level=info msg="CreateContainer within sandbox \"3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 15 13:06:08.566541 containerd[1551]: time="2025-05-15T13:06:08.566511022Z" level=info msg="Container 435dd2e9f8a9b2cb14238b9714ebc3d6ba1e0d01ca9e011dabd7b501b330f970: CDI devices from CRI Config.CDIDevices: []" May 15 13:06:08.581351 containerd[1551]: time="2025-05-15T13:06:08.581304403Z" level=info msg="CreateContainer within sandbox \"3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"435dd2e9f8a9b2cb14238b9714ebc3d6ba1e0d01ca9e011dabd7b501b330f970\"" May 15 13:06:08.581789 containerd[1551]: time="2025-05-15T13:06:08.581676783Z" level=info msg="StartContainer for \"435dd2e9f8a9b2cb14238b9714ebc3d6ba1e0d01ca9e011dabd7b501b330f970\"" May 15 13:06:08.583395 containerd[1551]: time="2025-05-15T13:06:08.583353957Z" level=info msg="connecting to shim 435dd2e9f8a9b2cb14238b9714ebc3d6ba1e0d01ca9e011dabd7b501b330f970" address="unix:///run/containerd/s/b659da34bb26bd54767e2e7a804f1c5724898168e9c558302d2ca3a9bc040960" protocol=ttrpc version=3 May 15 13:06:08.604097 systemd[1]: Started cri-containerd-435dd2e9f8a9b2cb14238b9714ebc3d6ba1e0d01ca9e011dabd7b501b330f970.scope - libcontainer container 435dd2e9f8a9b2cb14238b9714ebc3d6ba1e0d01ca9e011dabd7b501b330f970. 
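The csi:v3.29.3 pull above reports the transferred byte count ("bytes read=7912898") and the elapsed time ("in 3.007825095s") in separate entries. The sketch below just recombines those two reported values into a rough throughput figure; the constants are copied from the log lines above and the calculation says nothing about how containerd measures them internally.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Values copied from the ghcr.io/flatcar/calico/csi:v3.29.3 pull above.
	const bytesRead = 7912898
	elapsed, err := time.ParseDuration("3.007825095s")
	if err != nil {
		panic(err)
	}
	mib := float64(bytesRead) / (1 << 20)
	fmt.Printf("pulled %.2f MiB in %s (%.2f MiB/s)\n", mib, elapsed, mib/elapsed.Seconds())
}
```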
May 15 13:06:08.648183 containerd[1551]: time="2025-05-15T13:06:08.645936310Z" level=info msg="StartContainer for \"435dd2e9f8a9b2cb14238b9714ebc3d6ba1e0d01ca9e011dabd7b501b330f970\" returns successfully" May 15 13:06:09.137215 containerd[1551]: time="2025-05-15T13:06:09.137047766Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:09.138115 containerd[1551]: time="2025-05-15T13:06:09.138073434Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 15 13:06:09.139535 containerd[1551]: time="2025-05-15T13:06:09.139498043Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 587.895037ms" May 15 13:06:09.139535 containerd[1551]: time="2025-05-15T13:06:09.139528310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 13:06:09.140414 containerd[1551]: time="2025-05-15T13:06:09.140393698Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 15 13:06:09.142719 containerd[1551]: time="2025-05-15T13:06:09.142615486Z" level=info msg="CreateContainer within sandbox \"a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 13:06:09.152252 containerd[1551]: time="2025-05-15T13:06:09.150945170Z" level=info msg="Container e824037266c6d28996f64fab189c6ccdd3fb98d69a05fe6df67ba1a3ac25d2bc: CDI devices from CRI Config.CDIDevices: []" May 15 13:06:09.165958 containerd[1551]: time="2025-05-15T13:06:09.165918489Z" level=info msg="CreateContainer within sandbox \"a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e824037266c6d28996f64fab189c6ccdd3fb98d69a05fe6df67ba1a3ac25d2bc\"" May 15 13:06:09.166942 containerd[1551]: time="2025-05-15T13:06:09.166477802Z" level=info msg="StartContainer for \"e824037266c6d28996f64fab189c6ccdd3fb98d69a05fe6df67ba1a3ac25d2bc\"" May 15 13:06:09.167593 containerd[1551]: time="2025-05-15T13:06:09.167539959Z" level=info msg="connecting to shim e824037266c6d28996f64fab189c6ccdd3fb98d69a05fe6df67ba1a3ac25d2bc" address="unix:///run/containerd/s/7d4af0cd4a23768a80485c31b84cbee2f85feee27ca56862628ae52ee29a6588" protocol=ttrpc version=3 May 15 13:06:09.187031 systemd[1]: Started cri-containerd-e824037266c6d28996f64fab189c6ccdd3fb98d69a05fe6df67ba1a3ac25d2bc.scope - libcontainer container e824037266c6d28996f64fab189c6ccdd3fb98d69a05fe6df67ba1a3ac25d2bc. 
May 15 13:06:09.233818 containerd[1551]: time="2025-05-15T13:06:09.233785371Z" level=info msg="StartContainer for \"e824037266c6d28996f64fab189c6ccdd3fb98d69a05fe6df67ba1a3ac25d2bc\" returns successfully" May 15 13:06:10.194032 kubelet[4265]: I0515 13:06:10.193313 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7d48cd9d65-sqsnl" podStartSLOduration=29.728286043 podStartE2EDuration="38.193298333s" podCreationTimestamp="2025-05-15 13:05:32 +0000 UTC" firstStartedPulling="2025-05-15 13:06:00.675303682 +0000 UTC m=+42.391681585" lastFinishedPulling="2025-05-15 13:06:09.140315962 +0000 UTC m=+50.856693875" observedRunningTime="2025-05-15 13:06:09.60267749 +0000 UTC m=+51.319055393" watchObservedRunningTime="2025-05-15 13:06:10.193298333 +0000 UTC m=+51.909676236" May 15 13:06:11.348298 containerd[1551]: time="2025-05-15T13:06:11.348238427Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:11.349165 containerd[1551]: time="2025-05-15T13:06:11.349130135Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 15 13:06:11.350120 containerd[1551]: time="2025-05-15T13:06:11.349951408Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:11.353855 containerd[1551]: time="2025-05-15T13:06:11.353786101Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 13:06:11.357793 containerd[1551]: time="2025-05-15T13:06:11.357679542Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.217259475s" May 15 13:06:11.357793 containerd[1551]: time="2025-05-15T13:06:11.357708857Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 15 13:06:11.360410 containerd[1551]: time="2025-05-15T13:06:11.360340597Z" level=info msg="CreateContainer within sandbox \"3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 15 13:06:11.376087 containerd[1551]: time="2025-05-15T13:06:11.376068925Z" level=info msg="Container 202fa36d5140c638d188f67d1bb0f1306d89d8bd874396d9234f1fc40f484ffa: CDI devices from CRI Config.CDIDevices: []" May 15 13:06:11.381529 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1488671807.mount: Deactivated successfully. 
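The kubelet "Observed pod startup duration" line above packs several timestamps into key="value" fields. The sketch below recomputes two of the reported figures from those fields: the end-to-end startup time (observedRunningTime minus podCreationTimestamp, which reproduces the logged podStartE2EDuration of 38.193298333s) and the image-pull window; the difference between the two is consistent with the logged podStartSLOduration. The trailing " m=+..." monotonic-clock suffix on kubelet timestamps has to be stripped before parsing; errors are ignored for brevity in this sketch.

```go
package main

import (
	"fmt"
	"strings"
	"time"
)

// parseKubeletTime handles timestamps as kubelet prints them, e.g.
// "2025-05-15 13:06:10.193298333 +0000 UTC m=+51.909676236".
func parseKubeletTime(s string) (time.Time, error) {
	if i := strings.Index(s, " m=+"); i >= 0 {
		s = s[:i] // drop the monotonic-clock reading
	}
	return time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
}

func main() {
	// Values copied from the calico-apiserver-7d48cd9d65-sqsnl line above.
	created, _ := parseKubeletTime("2025-05-15 13:05:32 +0000 UTC")
	firstPull, _ := parseKubeletTime("2025-05-15 13:06:00.675303682 +0000 UTC m=+42.391681585")
	lastPull, _ := parseKubeletTime("2025-05-15 13:06:09.140315962 +0000 UTC m=+50.856693875")
	running, _ := parseKubeletTime("2025-05-15 13:06:10.193298333 +0000 UTC m=+51.909676236")

	fmt.Println("end-to-end startup:", running.Sub(created)) // 38.193298333s, matches podStartE2EDuration
	fmt.Println("image pull window: ", lastPull.Sub(firstPull))
}
```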
May 15 13:06:11.394492 containerd[1551]: time="2025-05-15T13:06:11.394439384Z" level=info msg="CreateContainer within sandbox \"3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"202fa36d5140c638d188f67d1bb0f1306d89d8bd874396d9234f1fc40f484ffa\"" May 15 13:06:11.395654 containerd[1551]: time="2025-05-15T13:06:11.395082011Z" level=info msg="StartContainer for \"202fa36d5140c638d188f67d1bb0f1306d89d8bd874396d9234f1fc40f484ffa\"" May 15 13:06:11.397208 containerd[1551]: time="2025-05-15T13:06:11.397029875Z" level=info msg="connecting to shim 202fa36d5140c638d188f67d1bb0f1306d89d8bd874396d9234f1fc40f484ffa" address="unix:///run/containerd/s/b659da34bb26bd54767e2e7a804f1c5724898168e9c558302d2ca3a9bc040960" protocol=ttrpc version=3 May 15 13:06:11.437014 systemd[1]: Started cri-containerd-202fa36d5140c638d188f67d1bb0f1306d89d8bd874396d9234f1fc40f484ffa.scope - libcontainer container 202fa36d5140c638d188f67d1bb0f1306d89d8bd874396d9234f1fc40f484ffa. May 15 13:06:11.470755 containerd[1551]: time="2025-05-15T13:06:11.470707763Z" level=info msg="StartContainer for \"202fa36d5140c638d188f67d1bb0f1306d89d8bd874396d9234f1fc40f484ffa\" returns successfully" May 15 13:06:11.608787 kubelet[4265]: I0515 13:06:11.608606 4265 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-tx8fx" podStartSLOduration=26.973579595 podStartE2EDuration="39.608588242s" podCreationTimestamp="2025-05-15 13:05:32 +0000 UTC" firstStartedPulling="2025-05-15 13:05:58.723441614 +0000 UTC m=+40.439819517" lastFinishedPulling="2025-05-15 13:06:11.358450262 +0000 UTC m=+53.074828164" observedRunningTime="2025-05-15 13:06:11.607176717 +0000 UTC m=+53.323554620" watchObservedRunningTime="2025-05-15 13:06:11.608588242 +0000 UTC m=+53.324966145" May 15 13:06:12.600946 kubelet[4265]: I0515 13:06:12.600901 4265 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 15 13:06:12.604093 kubelet[4265]: I0515 13:06:12.604063 4265 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 15 13:06:25.583638 containerd[1551]: time="2025-05-15T13:06:25.583596339Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"870ba36480c4e1f920bbd660594ddbc675b83b7d0dd81806708ea9a036d55577\" pid:6509 exited_at:{seconds:1747314385 nanos:577599562}" May 15 13:06:28.329208 kubelet[4265]: I0515 13:06:28.328999 4265 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 13:06:31.240451 systemd[1]: Started sshd@13-157.180.34.115:22-185.156.73.234:48202.service - OpenSSH per-connection server daemon (185.156.73.234:48202). May 15 13:06:32.011142 sshd[6528]: Invalid user config from 185.156.73.234 port 48202 May 15 13:06:32.101165 sshd[6528]: Connection closed by invalid user config 185.156.73.234 port 48202 [preauth] May 15 13:06:32.102982 systemd[1]: sshd@13-157.180.34.115:22-185.156.73.234:48202.service: Deactivated successfully. 
May 15 13:06:37.540454 containerd[1551]: time="2025-05-15T13:06:37.540297619Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"f6cda7416e2f54adf3c415853f0ec84f88667b7d546d0e00a56b1becd9716ddf\" pid:6545 exited_at:{seconds:1747314397 nanos:539932923}"
May 15 13:06:46.903089 containerd[1551]: time="2025-05-15T13:06:46.903045600Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"9bb472853318e8b0e61169e294f9d42d9c3c90a75a0e986149cb403c025f02d6\" pid:6574 exited_at:{seconds:1747314406 nanos:902463584}"
May 15 13:06:55.562270 containerd[1551]: time="2025-05-15T13:06:55.562217565Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"deed3f3b2e58b7579240573312b744cd9939545a032e892707bd7109f850cc37\" pid:6593 exited_at:{seconds:1747314415 nanos:561946576}"
May 15 13:07:07.510521 containerd[1551]: time="2025-05-15T13:07:07.510463141Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"d7780f9615a26079b70943cbc4ba31529a17f063ba99e3b8a87263282b77c1bb\" pid:6618 exited_at:{seconds:1747314427 nanos:510242897}"
May 15 13:07:25.552066 containerd[1551]: time="2025-05-15T13:07:25.551998843Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"04f935e6517cbbefb396cdb0e8ae6e82c3a4cac50b2f9598cdcd573b1a8bae3c\" pid:6649 exited_at:{seconds:1747314445 nanos:551364650}"
May 15 13:07:37.491917 containerd[1551]: time="2025-05-15T13:07:37.491846694Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"33d2c4990e90c57b0843d1722f8a161c5690413707edb9f1b590c2714f4c1e37\" pid:6693 exited_at:{seconds:1747314457 nanos:491655425}"
May 15 13:07:46.906340 containerd[1551]: time="2025-05-15T13:07:46.906249719Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"d37fa8284880a7a5889eee03a6d628397293cfc9b7b9bb8679c4f912fe9b03c0\" pid:6716 exited_at:{seconds:1747314466 nanos:905645133}"
May 15 13:07:55.550965 containerd[1551]: time="2025-05-15T13:07:55.550874370Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"f40153e4b2895421d4d211704d9497240e903f89fdecb1b35f853a6516355bb5\" pid:6738 exited_at:{seconds:1747314475 nanos:550402171}"
May 15 13:08:07.495628 containerd[1551]: time="2025-05-15T13:08:07.495571687Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"1ddc077d5d8789741d0f7dafd3a1b175447fee76c12980c995818ff2e2b9aaf3\" pid:6765 exited_at:{seconds:1747314487 nanos:495229283}"
May 15 13:08:25.557243 containerd[1551]: time="2025-05-15T13:08:25.557200975Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"87727cbfff66eb9338255103c8c0d1a4694c630a78c458753bb15452a9ef683d\" pid:6788 exited_at:{seconds:1747314505 nanos:556718258}"
May 15 13:08:37.513687 containerd[1551]: time="2025-05-15T13:08:37.513613905Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"891e8c0bfdcd6998140f281b78113180827e2efca9b20005f0c1d36f968d82de\" pid:6816 exited_at:{seconds:1747314517 nanos:513351242}"
May 15 13:08:46.905088 containerd[1551]: time="2025-05-15T13:08:46.905044980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"a069dde47b2dce8ef7c869333d9a6c5cce7feb12432998e91b50eba3b438104e\" pid:6845 exited_at:{seconds:1747314526 nanos:904515315}"
May 15 13:08:55.561144 containerd[1551]: time="2025-05-15T13:08:55.561092483Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"4735f722b9ab3c2b0fc1c783a2984f804c65d7b9cf23cc6ecfc246b41a48cb2c\" pid:6866 exited_at:{seconds:1747314535 nanos:560507472}"
May 15 13:09:07.496645 containerd[1551]: time="2025-05-15T13:09:07.496591294Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"796e2a9690388ad49c294a758c518e184c3d938b021960ebec15927c74f516d7\" pid:6899 exited_at:{seconds:1747314547 nanos:495832708}"
May 15 13:09:25.557981 containerd[1551]: time="2025-05-15T13:09:25.557719392Z" level=warning msg="container event discarded" container=46dbfc04c2efe2a61a23705cb73a58f148f6acde92b1af57faed410d3e91c3bf
May 15 13:09:25.557981 containerd[1551]: time="2025-05-15T13:09:25.557719392Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"46dbfc04c2efe2a61a23705cb73a58f148f6acde92b1af57faed410d3e91c3bf\" pid:6935 exited_at:{seconds:1747314565 nanos:557241592}"
May 15 13:09:37.490828 containerd[1551]: time="2025-05-15T13:09:37.490780469Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"d57c10f45d4cfe8cbc13ade72641839b081578c07b1ebcf4ee1dd1400d820db6\" pid:6962 exited_at:{seconds:1747314577 nanos:490478861}"
May 15 13:09:46.929952 containerd[1551]: time="2025-05-15T13:09:46.929266987Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"ad11bc0af4ac5b8593fa4da90ca17ac1cfc68bd181c76fc133118a3e894e97eb\" pid:6986 exited_at:{seconds:1747314586 nanos:929062172}"
May 15 13:09:55.547161 containerd[1551]: time="2025-05-15T13:09:55.547083931Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"9cdca7516ed4a06c55612e17bbf726a92021818def74140f7692fce1f65a899a\" pid:7007 exited_at:{seconds:1747314595 nanos:546675523}"
May 15 13:09:58.091323 systemd[1]: Started sshd@14-157.180.34.115:22-85.209.134.43:43632.service - OpenSSH per-connection server daemon (85.209.134.43:43632).
May 15 13:09:58.698494 sshd[7023]: Invalid user deployer from 85.209.134.43 port 43632
May 15 13:09:58.800423 sshd[7023]: Received disconnect from 85.209.134.43 port 43632:11: Bye Bye [preauth]
May 15 13:09:58.800423 sshd[7023]: Disconnected from invalid user deployer 85.209.134.43 port 43632 [preauth]
May 15 13:09:58.802597 systemd[1]: sshd@14-157.180.34.115:22-85.209.134.43:43632.service: Deactivated successfully.
May 15 13:09:59.142974 systemd[1]: Started sshd@15-157.180.34.115:22-117.50.184.148:50840.service - OpenSSH per-connection server daemon (117.50.184.148:50840).
May 15 13:09:59.407660 systemd[1]: Started sshd@16-157.180.34.115:22-147.75.109.163:35370.service - OpenSSH per-connection server daemon (147.75.109.163:35370).
May 15 13:10:00.404410 sshd[7033]: Accepted publickey for core from 147.75.109.163 port 35370 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:00.407057 sshd-session[7033]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:00.416016 systemd-logind[1536]: New session 8 of user core. May 15 13:10:00.421058 systemd[1]: Started session-8.scope - Session 8 of User core. May 15 13:10:01.547549 sshd[7035]: Connection closed by 147.75.109.163 port 35370 May 15 13:10:01.548319 sshd-session[7033]: pam_unix(sshd:session): session closed for user core May 15 13:10:01.557946 systemd-logind[1536]: Session 8 logged out. Waiting for processes to exit. May 15 13:10:01.559485 systemd[1]: sshd@16-157.180.34.115:22-147.75.109.163:35370.service: Deactivated successfully. May 15 13:10:01.562354 systemd[1]: session-8.scope: Deactivated successfully. May 15 13:10:01.566645 systemd-logind[1536]: Removed session 8. May 15 13:10:06.720342 systemd[1]: Started sshd@17-157.180.34.115:22-147.75.109.163:35380.service - OpenSSH per-connection server daemon (147.75.109.163:35380). May 15 13:10:07.493673 containerd[1551]: time="2025-05-15T13:10:07.493625923Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"b7ae391ca9b5523b401bbd082b04b670fd62590f7bc7251ce47a42dc4f0c4f20\" pid:7063 exited_at:{seconds:1747314607 nanos:493289430}" May 15 13:10:07.720663 sshd[7049]: Accepted publickey for core from 147.75.109.163 port 35380 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:07.722173 sshd-session[7049]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:07.727507 systemd-logind[1536]: New session 9 of user core. May 15 13:10:07.732042 systemd[1]: Started session-9.scope - Session 9 of User core. May 15 13:10:08.458429 sshd[7073]: Connection closed by 147.75.109.163 port 35380 May 15 13:10:08.460793 sshd-session[7049]: pam_unix(sshd:session): session closed for user core May 15 13:10:08.464604 systemd-logind[1536]: Session 9 logged out. Waiting for processes to exit. May 15 13:10:08.465302 systemd[1]: sshd@17-157.180.34.115:22-147.75.109.163:35380.service: Deactivated successfully. May 15 13:10:08.467705 systemd[1]: session-9.scope: Deactivated successfully. May 15 13:10:08.469260 systemd-logind[1536]: Removed session 9. May 15 13:10:13.627512 systemd[1]: Started sshd@18-157.180.34.115:22-147.75.109.163:51300.service - OpenSSH per-connection server daemon (147.75.109.163:51300). 
May 15 13:10:13.978682 containerd[1551]: time="2025-05-15T13:10:13.974848801Z" level=warning msg="container event discarded" container=7271a027245ac672c52d7e3c71edb75aeca08a11eecf807e53c14dd13614ba9c type=CONTAINER_CREATED_EVENT May 15 13:10:13.991944 containerd[1551]: time="2025-05-15T13:10:13.991865993Z" level=warning msg="container event discarded" container=7271a027245ac672c52d7e3c71edb75aeca08a11eecf807e53c14dd13614ba9c type=CONTAINER_STARTED_EVENT May 15 13:10:14.003195 containerd[1551]: time="2025-05-15T13:10:14.003148351Z" level=warning msg="container event discarded" container=a698b378280ab49fba631e40a0138558baf033e6a389d4f87e34bafcaf852621 type=CONTAINER_CREATED_EVENT May 15 13:10:14.003195 containerd[1551]: time="2025-05-15T13:10:14.003183818Z" level=warning msg="container event discarded" container=a698b378280ab49fba631e40a0138558baf033e6a389d4f87e34bafcaf852621 type=CONTAINER_STARTED_EVENT May 15 13:10:14.003195 containerd[1551]: time="2025-05-15T13:10:14.003194017Z" level=warning msg="container event discarded" container=85ae5497e9707db9f31abda29ff6608596d47e58502089d8eb3e8533ab0c1749 type=CONTAINER_CREATED_EVENT May 15 13:10:14.003872 containerd[1551]: time="2025-05-15T13:10:14.003201731Z" level=warning msg="container event discarded" container=85ae5497e9707db9f31abda29ff6608596d47e58502089d8eb3e8533ab0c1749 type=CONTAINER_STARTED_EVENT May 15 13:10:14.003872 containerd[1551]: time="2025-05-15T13:10:14.003208704Z" level=warning msg="container event discarded" container=c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628 type=CONTAINER_CREATED_EVENT May 15 13:10:14.020450 containerd[1551]: time="2025-05-15T13:10:14.020396757Z" level=warning msg="container event discarded" container=b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b type=CONTAINER_CREATED_EVENT May 15 13:10:14.020450 containerd[1551]: time="2025-05-15T13:10:14.020437655Z" level=warning msg="container event discarded" container=1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e type=CONTAINER_CREATED_EVENT May 15 13:10:14.098968 containerd[1551]: time="2025-05-15T13:10:14.098870757Z" level=warning msg="container event discarded" container=c763ab4549febb67e3de4879c6b2edae4268b8a506555942698e1465d06c1628 type=CONTAINER_STARTED_EVENT May 15 13:10:14.128127 containerd[1551]: time="2025-05-15T13:10:14.128076681Z" level=warning msg="container event discarded" container=b9512b66e8b32a7cd3fd936bc245a8f256ca7e961eb14d96a54ec0f7999ee06b type=CONTAINER_STARTED_EVENT May 15 13:10:14.143354 containerd[1551]: time="2025-05-15T13:10:14.143299669Z" level=warning msg="container event discarded" container=1a6df3a8be706ceb2636ab7e168b6ce2e45586bcedce0fe7c61ad76dc53b7b6e type=CONTAINER_STARTED_EVENT May 15 13:10:14.624587 sshd[7086]: Accepted publickey for core from 147.75.109.163 port 51300 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:14.626502 sshd-session[7086]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:14.633335 systemd-logind[1536]: New session 10 of user core. May 15 13:10:14.643226 systemd[1]: Started session-10.scope - Session 10 of User core. May 15 13:10:15.361171 sshd[7088]: Connection closed by 147.75.109.163 port 51300 May 15 13:10:15.361783 sshd-session[7086]: pam_unix(sshd:session): session closed for user core May 15 13:10:15.365010 systemd[1]: sshd@18-157.180.34.115:22-147.75.109.163:51300.service: Deactivated successfully. 
May 15 13:10:15.366881 systemd[1]: session-10.scope: Deactivated successfully. May 15 13:10:15.370074 systemd-logind[1536]: Session 10 logged out. Waiting for processes to exit. May 15 13:10:15.371555 systemd-logind[1536]: Removed session 10. May 15 13:10:15.535182 systemd[1]: Started sshd@19-157.180.34.115:22-147.75.109.163:51302.service - OpenSSH per-connection server daemon (147.75.109.163:51302). May 15 13:10:16.520271 sshd[7102]: Accepted publickey for core from 147.75.109.163 port 51302 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:16.521834 sshd-session[7102]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:16.527631 systemd-logind[1536]: New session 11 of user core. May 15 13:10:16.533060 systemd[1]: Started session-11.scope - Session 11 of User core. May 15 13:10:17.287556 sshd[7104]: Connection closed by 147.75.109.163 port 51302 May 15 13:10:17.288424 sshd-session[7102]: pam_unix(sshd:session): session closed for user core May 15 13:10:17.292686 systemd[1]: sshd@19-157.180.34.115:22-147.75.109.163:51302.service: Deactivated successfully. May 15 13:10:17.295829 systemd[1]: session-11.scope: Deactivated successfully. May 15 13:10:17.298227 systemd-logind[1536]: Session 11 logged out. Waiting for processes to exit. May 15 13:10:17.300790 systemd-logind[1536]: Removed session 11. May 15 13:10:17.462584 systemd[1]: Started sshd@20-157.180.34.115:22-147.75.109.163:51304.service - OpenSSH per-connection server daemon (147.75.109.163:51304). May 15 13:10:18.467292 sshd[7114]: Accepted publickey for core from 147.75.109.163 port 51304 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:18.468479 sshd-session[7114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:18.473373 systemd-logind[1536]: New session 12 of user core. May 15 13:10:18.476140 systemd[1]: Started session-12.scope - Session 12 of User core. May 15 13:10:19.241228 sshd[7118]: Connection closed by 147.75.109.163 port 51304 May 15 13:10:19.244086 sshd-session[7114]: pam_unix(sshd:session): session closed for user core May 15 13:10:19.247938 systemd[1]: sshd@20-157.180.34.115:22-147.75.109.163:51304.service: Deactivated successfully. May 15 13:10:19.250371 systemd-logind[1536]: Session 12 logged out. Waiting for processes to exit. May 15 13:10:19.251449 systemd[1]: session-12.scope: Deactivated successfully. May 15 13:10:19.254693 systemd-logind[1536]: Removed session 12. May 15 13:10:24.418740 systemd[1]: Started sshd@21-157.180.34.115:22-147.75.109.163:57714.service - OpenSSH per-connection server daemon (147.75.109.163:57714). May 15 13:10:25.407798 sshd[7135]: Accepted publickey for core from 147.75.109.163 port 57714 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:25.409815 sshd-session[7135]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:25.417846 systemd-logind[1536]: New session 13 of user core. May 15 13:10:25.422122 systemd[1]: Started session-13.scope - Session 13 of User core. 
May 15 13:10:25.433184 containerd[1551]: time="2025-05-15T13:10:25.433059004Z" level=warning msg="container event discarded" container=df2210d8b6506f4017f41924b4ad1c4cc4a3e97bbdc90edc50e14be58132f8bd type=CONTAINER_CREATED_EVENT May 15 13:10:25.433184 containerd[1551]: time="2025-05-15T13:10:25.433155125Z" level=warning msg="container event discarded" container=df2210d8b6506f4017f41924b4ad1c4cc4a3e97bbdc90edc50e14be58132f8bd type=CONTAINER_STARTED_EVENT May 15 13:10:25.451493 containerd[1551]: time="2025-05-15T13:10:25.451442236Z" level=warning msg="container event discarded" container=f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574 type=CONTAINER_CREATED_EVENT May 15 13:10:25.505645 containerd[1551]: time="2025-05-15T13:10:25.505593342Z" level=warning msg="container event discarded" container=f49ab8a39d9737810993be250704cc47b903087925f43c0a78afe6465f47a574 type=CONTAINER_STARTED_EVENT May 15 13:10:25.583791 containerd[1551]: time="2025-05-15T13:10:25.583546571Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"afff4d71ae0d35bf4c71f636e0d8bfc8229f4e44f3d21c65d8ce9cb50a307258\" pid:7151 exited_at:{seconds:1747314625 nanos:583129547}" May 15 13:10:26.200803 sshd[7137]: Connection closed by 147.75.109.163 port 57714 May 15 13:10:26.205603 sshd-session[7135]: pam_unix(sshd:session): session closed for user core May 15 13:10:26.211311 systemd-logind[1536]: Session 13 logged out. Waiting for processes to exit. May 15 13:10:26.211412 systemd[1]: sshd@21-157.180.34.115:22-147.75.109.163:57714.service: Deactivated successfully. May 15 13:10:26.214689 systemd[1]: session-13.scope: Deactivated successfully. May 15 13:10:26.218420 systemd-logind[1536]: Removed session 13. May 15 13:10:27.207754 containerd[1551]: time="2025-05-15T13:10:27.207679180Z" level=warning msg="container event discarded" container=8aa344073ef65faa2332c163318935ece4604289d3ecb7b125ded7b52eba6186 type=CONTAINER_CREATED_EVENT May 15 13:10:27.207754 containerd[1551]: time="2025-05-15T13:10:27.207724184Z" level=warning msg="container event discarded" container=8aa344073ef65faa2332c163318935ece4604289d3ecb7b125ded7b52eba6186 type=CONTAINER_STARTED_EVENT May 15 13:10:29.335397 containerd[1551]: time="2025-05-15T13:10:29.335304227Z" level=warning msg="container event discarded" container=80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96 type=CONTAINER_CREATED_EVENT May 15 13:10:29.380637 containerd[1551]: time="2025-05-15T13:10:29.380568470Z" level=warning msg="container event discarded" container=80ce2792319634f1a703173d233792af6e62ee1bdf85ecab8187137dca9f9e96 type=CONTAINER_STARTED_EVENT May 15 13:10:31.370378 systemd[1]: Started sshd@22-157.180.34.115:22-147.75.109.163:37536.service - OpenSSH per-connection server daemon (147.75.109.163:37536). May 15 13:10:32.373082 sshd[7176]: Accepted publickey for core from 147.75.109.163 port 37536 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:32.375300 sshd-session[7176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:32.383524 systemd-logind[1536]: New session 14 of user core. May 15 13:10:32.388011 systemd[1]: Started session-14.scope - Session 14 of User core. 
May 15 13:10:32.761782 containerd[1551]: time="2025-05-15T13:10:32.761655634Z" level=warning msg="container event discarded" container=7aa347eee207050fcb2de8f00c25e930765c09cdcea1058bd289429ddf5c88b5 type=CONTAINER_CREATED_EVENT May 15 13:10:32.761782 containerd[1551]: time="2025-05-15T13:10:32.761731096Z" level=warning msg="container event discarded" container=7aa347eee207050fcb2de8f00c25e930765c09cdcea1058bd289429ddf5c88b5 type=CONTAINER_STARTED_EVENT May 15 13:10:32.777777 containerd[1551]: time="2025-05-15T13:10:32.777655623Z" level=warning msg="container event discarded" container=d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20 type=CONTAINER_CREATED_EVENT May 15 13:10:32.778109 containerd[1551]: time="2025-05-15T13:10:32.778013777Z" level=warning msg="container event discarded" container=d917d8462de08a1e45f19f2ac44024df9d7df3776b9df3654d591d398e02cc20 type=CONTAINER_STARTED_EVENT May 15 13:10:33.138141 sshd[7178]: Connection closed by 147.75.109.163 port 37536 May 15 13:10:33.139619 sshd-session[7176]: pam_unix(sshd:session): session closed for user core May 15 13:10:33.145306 systemd[1]: sshd@22-157.180.34.115:22-147.75.109.163:37536.service: Deactivated successfully. May 15 13:10:33.147977 systemd[1]: session-14.scope: Deactivated successfully. May 15 13:10:33.150755 systemd-logind[1536]: Session 14 logged out. Waiting for processes to exit. May 15 13:10:33.152916 systemd-logind[1536]: Removed session 14. May 15 13:10:35.547102 containerd[1551]: time="2025-05-15T13:10:35.547037078Z" level=warning msg="container event discarded" container=e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd type=CONTAINER_CREATED_EVENT May 15 13:10:35.614563 containerd[1551]: time="2025-05-15T13:10:35.614492545Z" level=warning msg="container event discarded" container=e7efe68380ee028dc44dbf9fabd6f0d0ffbae383ba072dc38e1b869bb9b4acdd type=CONTAINER_STARTED_EVENT May 15 13:10:37.490448 containerd[1551]: time="2025-05-15T13:10:37.490406795Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"5006a9782d787084c533ec796296e5c3743a79f8c285c785e2b4c16927f82bbc\" pid:7201 exited_at:{seconds:1747314637 nanos:490183545}" May 15 13:10:37.703267 containerd[1551]: time="2025-05-15T13:10:37.703187110Z" level=warning msg="container event discarded" container=f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7 type=CONTAINER_CREATED_EVENT May 15 13:10:37.766590 containerd[1551]: time="2025-05-15T13:10:37.766469981Z" level=warning msg="container event discarded" container=f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7 type=CONTAINER_STARTED_EVENT May 15 13:10:37.888131 containerd[1551]: time="2025-05-15T13:10:37.888061065Z" level=warning msg="container event discarded" container=f1db6f1ead85ca59db616a48ed872a75ce3c4baaa3b6aeca502650870bec2cc7 type=CONTAINER_STOPPED_EVENT May 15 13:10:38.312187 systemd[1]: Started sshd@23-157.180.34.115:22-147.75.109.163:34678.service - OpenSSH per-connection server daemon (147.75.109.163:34678). May 15 13:10:39.305107 sshd[7216]: Accepted publickey for core from 147.75.109.163 port 34678 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:39.306269 sshd-session[7216]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:39.310842 systemd-logind[1536]: New session 15 of user core. May 15 13:10:39.315052 systemd[1]: Started session-15.scope - Session 15 of User core. 
May 15 13:10:40.052552 sshd[7218]: Connection closed by 147.75.109.163 port 34678 May 15 13:10:40.053564 sshd-session[7216]: pam_unix(sshd:session): session closed for user core May 15 13:10:40.058976 systemd-logind[1536]: Session 15 logged out. Waiting for processes to exit. May 15 13:10:40.059361 systemd[1]: sshd@23-157.180.34.115:22-147.75.109.163:34678.service: Deactivated successfully. May 15 13:10:40.061715 systemd[1]: session-15.scope: Deactivated successfully. May 15 13:10:40.064676 systemd-logind[1536]: Removed session 15. May 15 13:10:40.217616 systemd[1]: Started sshd@24-157.180.34.115:22-147.75.109.163:34684.service - OpenSSH per-connection server daemon (147.75.109.163:34684). May 15 13:10:41.192083 sshd[7230]: Accepted publickey for core from 147.75.109.163 port 34684 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:41.193430 sshd-session[7230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:41.197940 systemd-logind[1536]: New session 16 of user core. May 15 13:10:41.204049 systemd[1]: Started session-16.scope - Session 16 of User core. May 15 13:10:42.128632 sshd[7232]: Connection closed by 147.75.109.163 port 34684 May 15 13:10:42.132856 sshd-session[7230]: pam_unix(sshd:session): session closed for user core May 15 13:10:42.140173 systemd[1]: sshd@24-157.180.34.115:22-147.75.109.163:34684.service: Deactivated successfully. May 15 13:10:42.141963 systemd[1]: session-16.scope: Deactivated successfully. May 15 13:10:42.143086 systemd-logind[1536]: Session 16 logged out. Waiting for processes to exit. May 15 13:10:42.144519 systemd-logind[1536]: Removed session 16. May 15 13:10:42.305118 systemd[1]: Started sshd@25-157.180.34.115:22-147.75.109.163:34696.service - OpenSSH per-connection server daemon (147.75.109.163:34696). May 15 13:10:43.312487 sshd[7254]: Accepted publickey for core from 147.75.109.163 port 34696 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:43.313767 sshd-session[7254]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:43.318520 systemd-logind[1536]: New session 17 of user core. May 15 13:10:43.326013 systemd[1]: Started session-17.scope - Session 17 of User core. May 15 13:10:44.574172 containerd[1551]: time="2025-05-15T13:10:44.574047420Z" level=warning msg="container event discarded" container=aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30 type=CONTAINER_CREATED_EVENT May 15 13:10:44.639394 containerd[1551]: time="2025-05-15T13:10:44.639328227Z" level=warning msg="container event discarded" container=aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30 type=CONTAINER_STARTED_EVENT May 15 13:10:44.990126 sshd[7256]: Connection closed by 147.75.109.163 port 34696 May 15 13:10:44.992830 sshd-session[7254]: pam_unix(sshd:session): session closed for user core May 15 13:10:44.996746 systemd[1]: sshd@25-157.180.34.115:22-147.75.109.163:34696.service: Deactivated successfully. May 15 13:10:44.998955 systemd[1]: session-17.scope: Deactivated successfully. May 15 13:10:44.999654 systemd-logind[1536]: Session 17 logged out. Waiting for processes to exit. May 15 13:10:45.001026 systemd-logind[1536]: Removed session 17. 
May 15 13:10:45.064365 containerd[1551]: time="2025-05-15T13:10:45.064304809Z" level=warning msg="container event discarded" container=aad21d7fd89dea840edb3f42765758cc86e11dd3d407317c69c7fa973cd55a30 type=CONTAINER_STOPPED_EVENT May 15 13:10:45.162271 systemd[1]: Started sshd@26-157.180.34.115:22-147.75.109.163:34710.service - OpenSSH per-connection server daemon (147.75.109.163:34710). May 15 13:10:46.149343 sshd[7273]: Accepted publickey for core from 147.75.109.163 port 34710 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:46.150904 sshd-session[7273]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:46.155874 systemd-logind[1536]: New session 18 of user core. May 15 13:10:46.164057 systemd[1]: Started session-18.scope - Session 18 of User core. May 15 13:10:46.914619 containerd[1551]: time="2025-05-15T13:10:46.914555777Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6\" id:\"d60c1a503140ac6a54476060ffa65a213e5f6b8f716faa4245944edbde3b9473\" pid:7293 exited_at:{seconds:1747314646 nanos:914223040}" May 15 13:10:47.044031 sshd[7275]: Connection closed by 147.75.109.163 port 34710 May 15 13:10:47.044635 sshd-session[7273]: pam_unix(sshd:session): session closed for user core May 15 13:10:47.048150 systemd-logind[1536]: Session 18 logged out. Waiting for processes to exit. May 15 13:10:47.048258 systemd[1]: sshd@26-157.180.34.115:22-147.75.109.163:34710.service: Deactivated successfully. May 15 13:10:47.050087 systemd[1]: session-18.scope: Deactivated successfully. May 15 13:10:47.051478 systemd-logind[1536]: Removed session 18. May 15 13:10:47.215032 systemd[1]: Started sshd@27-157.180.34.115:22-147.75.109.163:34724.service - OpenSSH per-connection server daemon (147.75.109.163:34724). May 15 13:10:48.201726 sshd[7306]: Accepted publickey for core from 147.75.109.163 port 34724 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:48.203516 sshd-session[7306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:48.209520 systemd-logind[1536]: New session 19 of user core. May 15 13:10:48.213026 systemd[1]: Started session-19.scope - Session 19 of User core. May 15 13:10:48.934007 sshd[7308]: Connection closed by 147.75.109.163 port 34724 May 15 13:10:48.934563 sshd-session[7306]: pam_unix(sshd:session): session closed for user core May 15 13:10:48.938002 systemd-logind[1536]: Session 19 logged out. Waiting for processes to exit. May 15 13:10:48.938093 systemd[1]: sshd@27-157.180.34.115:22-147.75.109.163:34724.service: Deactivated successfully. May 15 13:10:48.939780 systemd[1]: session-19.scope: Deactivated successfully. May 15 13:10:48.941165 systemd-logind[1536]: Removed session 19. May 15 13:10:52.669100 containerd[1551]: time="2025-05-15T13:10:52.669027704Z" level=warning msg="container event discarded" container=4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e type=CONTAINER_CREATED_EVENT May 15 13:10:52.851783 containerd[1551]: time="2025-05-15T13:10:52.851704716Z" level=warning msg="container event discarded" container=4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e type=CONTAINER_STARTED_EVENT May 15 13:10:54.102980 systemd[1]: Started sshd@28-157.180.34.115:22-147.75.109.163:50628.service - OpenSSH per-connection server daemon (147.75.109.163:50628). 
May 15 13:10:55.091085 sshd[7322]: Accepted publickey for core from 147.75.109.163 port 50628 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:10:55.092315 sshd-session[7322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:10:55.096947 systemd-logind[1536]: New session 20 of user core. May 15 13:10:55.102020 systemd[1]: Started session-20.scope - Session 20 of User core. May 15 13:10:55.559493 containerd[1551]: time="2025-05-15T13:10:55.559439581Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4f961edc75ef95ac086ccfc69fd0d46c2ea7c617b362e85a0f73a5de9f57344e\" id:\"f6863101e7ba55e1382bedd43c85c426ee6445eb49d0e8f42beefc00de66726b\" pid:7337 exited_at:{seconds:1747314655 nanos:559080195}" May 15 13:10:55.855270 sshd[7324]: Connection closed by 147.75.109.163 port 50628 May 15 13:10:55.855874 sshd-session[7322]: pam_unix(sshd:session): session closed for user core May 15 13:10:55.859380 systemd-logind[1536]: Session 20 logged out. Waiting for processes to exit. May 15 13:10:55.859453 systemd[1]: sshd@28-157.180.34.115:22-147.75.109.163:50628.service: Deactivated successfully. May 15 13:10:55.861113 systemd[1]: session-20.scope: Deactivated successfully. May 15 13:10:55.862514 systemd-logind[1536]: Removed session 20. May 15 13:10:57.893814 containerd[1551]: time="2025-05-15T13:10:57.893747988Z" level=warning msg="container event discarded" container=45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf type=CONTAINER_CREATED_EVENT May 15 13:10:57.893814 containerd[1551]: time="2025-05-15T13:10:57.893791722Z" level=warning msg="container event discarded" container=45bb1f6c14d218ec8dc935a51a8ff701884bed034c9894fcc7a5d5c1206e3ddf type=CONTAINER_STARTED_EVENT May 15 13:10:57.932074 containerd[1551]: time="2025-05-15T13:10:57.932018821Z" level=warning msg="container event discarded" container=904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105 type=CONTAINER_CREATED_EVENT May 15 13:10:57.932074 containerd[1551]: time="2025-05-15T13:10:57.932054899Z" level=warning msg="container event discarded" container=904a76a40abd5e4ec7065ed6f188f61ddab58913bffbe065741ea8df79ea4105 type=CONTAINER_STARTED_EVENT May 15 13:10:58.646047 containerd[1551]: time="2025-05-15T13:10:58.645970040Z" level=warning msg="container event discarded" container=d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9 type=CONTAINER_CREATED_EVENT May 15 13:10:58.646047 containerd[1551]: time="2025-05-15T13:10:58.646031466Z" level=warning msg="container event discarded" container=d1dafe21616ede8372a3321adcb11ae1601f65b93d144dcea8a52042fcfa33c9 type=CONTAINER_STARTED_EVENT May 15 13:10:58.675330 containerd[1551]: time="2025-05-15T13:10:58.675261425Z" level=warning msg="container event discarded" container=c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04 type=CONTAINER_CREATED_EVENT May 15 13:10:58.731782 containerd[1551]: time="2025-05-15T13:10:58.731694503Z" level=warning msg="container event discarded" container=c29ff7d6fa35dbaaea3d39c3611c39265b2ac6616bd769805e8ef2c5a17a8a04 type=CONTAINER_STARTED_EVENT May 15 13:10:58.731782 containerd[1551]: time="2025-05-15T13:10:58.731758283Z" level=warning msg="container event discarded" container=3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43 type=CONTAINER_CREATED_EVENT May 15 13:10:58.731782 containerd[1551]: time="2025-05-15T13:10:58.731770075Z" level=warning msg="container event discarded" 
container=3e490ba5ca60ed777b12426bf74c3a9d9373b5b0c20c66d95e8585974c3d0a43 type=CONTAINER_STARTED_EVENT May 15 13:11:00.658239 containerd[1551]: time="2025-05-15T13:11:00.658142448Z" level=warning msg="container event discarded" container=a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea type=CONTAINER_CREATED_EVENT May 15 13:11:00.658239 containerd[1551]: time="2025-05-15T13:11:00.658227717Z" level=warning msg="container event discarded" container=a1fc357060d5abbad4388d0261c9f7f5fbcbfc27be565ff1bf3571bf839441ea type=CONTAINER_STARTED_EVENT May 15 13:11:01.028376 systemd[1]: Started sshd@29-157.180.34.115:22-147.75.109.163:48902.service - OpenSSH per-connection server daemon (147.75.109.163:48902). May 15 13:11:01.946838 containerd[1551]: time="2025-05-15T13:11:01.946772689Z" level=warning msg="container event discarded" container=b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371 type=CONTAINER_CREATED_EVENT May 15 13:11:01.946838 containerd[1551]: time="2025-05-15T13:11:01.946826430Z" level=warning msg="container event discarded" container=b9a3b30336527343a9ea13b75374af7c53d83c33bba1d851c00b4c1edfc81371 type=CONTAINER_STARTED_EVENT May 15 13:11:01.976878 containerd[1551]: time="2025-05-15T13:11:01.976824354Z" level=warning msg="container event discarded" container=99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134 type=CONTAINER_CREATED_EVENT May 15 13:11:02.023865 sshd[7363]: Accepted publickey for core from 147.75.109.163 port 48902 ssh2: RSA SHA256:7lmdsVx4mXdsPeXYTlWGXBW4TnKrdBGlv6Lg029Y6yo May 15 13:11:02.024854 sshd-session[7363]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 13:11:02.029565 systemd-logind[1536]: New session 21 of user core. May 15 13:11:02.037032 systemd[1]: Started session-21.scope - Session 21 of User core. May 15 13:11:02.048418 containerd[1551]: time="2025-05-15T13:11:02.048376384Z" level=warning msg="container event discarded" container=99813ba49c12467f50157e97533d7d1a8855e61055b1399cc1a02aa6c771c134 type=CONTAINER_STARTED_EVENT May 15 13:11:02.253176 containerd[1551]: time="2025-05-15T13:11:02.253118634Z" level=warning msg="container event discarded" container=27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa type=CONTAINER_CREATED_EVENT May 15 13:11:02.328747 containerd[1551]: time="2025-05-15T13:11:02.328687970Z" level=warning msg="container event discarded" container=27f21c8449abc27dffff8214053afa33924c7a356d22df6a623176941d00f5fa type=CONTAINER_STARTED_EVENT May 15 13:11:02.760091 sshd[7365]: Connection closed by 147.75.109.163 port 48902 May 15 13:11:02.760674 sshd-session[7363]: pam_unix(sshd:session): session closed for user core May 15 13:11:02.763903 systemd[1]: sshd@29-157.180.34.115:22-147.75.109.163:48902.service: Deactivated successfully. May 15 13:11:02.765438 systemd[1]: session-21.scope: Deactivated successfully. May 15 13:11:02.766498 systemd-logind[1536]: Session 21 logged out. Waiting for processes to exit. May 15 13:11:02.767735 systemd-logind[1536]: Removed session 21. May 15 13:11:05.584519 containerd[1551]: time="2025-05-15T13:11:05.584415593Z" level=warning msg="container event discarded" container=2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6 type=CONTAINER_CREATED_EVENT May 15 13:11:05.657799 containerd[1551]: time="2025-05-15T13:11:05.657714729Z" level=warning msg="container event discarded" container=2fbd0fbdab491d22b17bbdbf8b7272ad97d978bd14bd63cd6682c029b6bdf8f6 type=CONTAINER_STARTED_EVENT