Jul 15 05:14:55.795015 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue Jul 15 03:28:48 -00 2025
Jul 15 05:14:55.795033 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:14:55.795039 kernel: BIOS-provided physical RAM map:
Jul 15 05:14:55.795044 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jul 15 05:14:55.795049 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jul 15 05:14:55.795053 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jul 15 05:14:55.795060 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000007cfdbfff] usable
Jul 15 05:14:55.795064 kernel: BIOS-e820: [mem 0x000000007cfdc000-0x000000007cffffff] reserved
Jul 15 05:14:55.795069 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved
Jul 15 05:14:55.795073 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved
Jul 15 05:14:55.795078 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jul 15 05:14:55.795082 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jul 15 05:14:55.795096 kernel: NX (Execute Disable) protection: active
Jul 15 05:14:55.795101 kernel: APIC: Static calls initialized
Jul 15 05:14:55.795122 kernel: SMBIOS 2.8 present.
Jul 15 05:14:55.795137 kernel: DMI: Hetzner vServer/Standard PC (Q35 + ICH9, 2009), BIOS 20171111 11/11/2017
Jul 15 05:14:55.795145 kernel: DMI: Memory slots populated: 1/1
Jul 15 05:14:55.795160 kernel: Hypervisor detected: KVM
Jul 15 05:14:55.795173 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jul 15 05:14:55.795184 kernel: kvm-clock: using sched offset of 3843710913 cycles
Jul 15 05:14:55.795189 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jul 15 05:14:55.795194 kernel: tsc: Detected 2400.000 MHz processor
Jul 15 05:14:55.795201 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jul 15 05:14:55.795206 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jul 15 05:14:55.795211 kernel: last_pfn = 0x7cfdc max_arch_pfn = 0x400000000
Jul 15 05:14:55.795216 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jul 15 05:14:55.795221 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
Jul 15 05:14:55.795226 kernel: Using GB pages for direct mapping
Jul 15 05:14:55.795231 kernel: ACPI: Early table checksum verification disabled
Jul 15 05:14:55.795235 kernel: ACPI: RSDP 0x00000000000F5270 000014 (v00 BOCHS )
Jul 15 05:14:55.795240 kernel: ACPI: RSDT 0x000000007CFE2693 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:14:55.795498 kernel: ACPI: FACP 0x000000007CFE2483 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:14:55.795509 kernel: ACPI: DSDT 0x000000007CFE0040 002443 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:14:55.795515 kernel: ACPI: FACS 0x000000007CFE0000 000040
Jul 15 05:14:55.795520 kernel: ACPI: APIC 0x000000007CFE2577 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:14:55.795525 kernel: ACPI: HPET 0x000000007CFE25F7 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:14:55.795530 kernel: ACPI: MCFG 0x000000007CFE262F 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:14:55.795535 kernel: ACPI: WAET 0x000000007CFE266B 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 05:14:55.795540 kernel: ACPI: Reserving FACP table memory at [mem 0x7cfe2483-0x7cfe2576]
Jul 15 05:14:55.795545 kernel: ACPI: Reserving DSDT table memory at [mem 0x7cfe0040-0x7cfe2482]
Jul 15 05:14:55.795554 kernel: ACPI: Reserving FACS table memory at [mem 0x7cfe0000-0x7cfe003f]
Jul 15 05:14:55.795559 kernel: ACPI: Reserving APIC table memory at [mem 0x7cfe2577-0x7cfe25f6]
Jul 15 05:14:55.795564 kernel: ACPI: Reserving HPET table memory at [mem 0x7cfe25f7-0x7cfe262e]
Jul 15 05:14:55.795569 kernel: ACPI: Reserving MCFG table memory at [mem 0x7cfe262f-0x7cfe266a]
Jul 15 05:14:55.795574 kernel: ACPI: Reserving WAET table memory at [mem 0x7cfe266b-0x7cfe2692]
Jul 15 05:14:55.795581 kernel: No NUMA configuration found
Jul 15 05:14:55.795586 kernel: Faking a node at [mem 0x0000000000000000-0x000000007cfdbfff]
Jul 15 05:14:55.795591 kernel: NODE_DATA(0) allocated [mem 0x7cfd4dc0-0x7cfdbfff]
Jul 15 05:14:55.795596 kernel: Zone ranges:
Jul 15 05:14:55.795602 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
Jul 15 05:14:55.795607 kernel: DMA32 [mem 0x0000000001000000-0x000000007cfdbfff]
Jul 15 05:14:55.795612 kernel: Normal empty
Jul 15 05:14:55.795618 kernel: Device empty
Jul 15 05:14:55.795623 kernel: Movable zone start for each node
Jul 15 05:14:55.795628 kernel: Early memory node ranges
Jul 15 05:14:55.795635 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
Jul 15 05:14:55.795640 kernel: node 0: [mem 0x0000000000100000-0x000000007cfdbfff]
Jul 15 05:14:55.795645 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000007cfdbfff]
Jul 15 05:14:55.795650 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jul 15 05:14:55.795655 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jul 15 05:14:55.795660 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges
Jul 15 05:14:55.795665 kernel: ACPI: PM-Timer IO Port: 0x608
Jul 15 05:14:55.795670 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jul 15 05:14:55.795675 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jul 15 05:14:55.795682 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jul 15 05:14:55.795687 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jul 15 05:14:55.795692 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jul 15 05:14:55.795697 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jul 15 05:14:55.795702 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jul 15 05:14:55.795707 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jul 15 05:14:55.795712 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000
Jul 15 05:14:55.795717 kernel: CPU topo: Max. logical packages: 1
Jul 15 05:14:55.795722 kernel: CPU topo: Max. logical dies: 1
Jul 15 05:14:55.795728 kernel: CPU topo: Max. dies per package: 1
Jul 15 05:14:55.795733 kernel: CPU topo: Max. threads per core: 1
Jul 15 05:14:55.795738 kernel: CPU topo: Num. cores per package: 2
Jul 15 05:14:55.795743 kernel: CPU topo: Num. threads per package: 2
Jul 15 05:14:55.795748 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
Jul 15 05:14:55.795753 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jul 15 05:14:55.795758 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices
Jul 15 05:14:55.795763 kernel: Booting paravirtualized kernel on KVM
Jul 15 05:14:55.795791 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jul 15 05:14:55.795796 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
Jul 15 05:14:55.795803 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
Jul 15 05:14:55.795808 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
Jul 15 05:14:55.795813 kernel: pcpu-alloc: [0] 0 1
Jul 15 05:14:55.795818 kernel: kvm-guest: PV spinlocks disabled, no host support
Jul 15 05:14:55.795824 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:14:55.795830 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jul 15 05:14:55.795835 kernel: random: crng init done
Jul 15 05:14:55.795840 kernel: Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jul 15 05:14:55.795846 kernel: Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jul 15 05:14:55.795851 kernel: Fallback order for Node 0: 0
Jul 15 05:14:55.795856 kernel: Built 1 zonelists, mobility grouping on. Total pages: 511866
Jul 15 05:14:55.795861 kernel: Policy zone: DMA32
Jul 15 05:14:55.795866 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jul 15 05:14:55.795871 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jul 15 05:14:55.795876 kernel: ftrace: allocating 40097 entries in 157 pages
Jul 15 05:14:55.795881 kernel: ftrace: allocated 157 pages with 5 groups
Jul 15 05:14:55.795886 kernel: Dynamic Preempt: voluntary
Jul 15 05:14:55.795893 kernel: rcu: Preemptible hierarchical RCU implementation.
Jul 15 05:14:55.795899 kernel: rcu: RCU event tracing is enabled.
Jul 15 05:14:55.795904 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jul 15 05:14:55.795909 kernel: Trampoline variant of Tasks RCU enabled.
Jul 15 05:14:55.795914 kernel: Rude variant of Tasks RCU enabled.
Jul 15 05:14:55.795919 kernel: Tracing variant of Tasks RCU enabled.
Jul 15 05:14:55.795924 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jul 15 05:14:55.795929 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jul 15 05:14:55.795934 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 05:14:55.795941 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 05:14:55.795946 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 05:14:55.795952 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
Jul 15 05:14:55.795957 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jul 15 05:14:55.795961 kernel: Console: colour VGA+ 80x25
Jul 15 05:14:55.795966 kernel: printk: legacy console [tty0] enabled
Jul 15 05:14:55.795971 kernel: printk: legacy console [ttyS0] enabled
Jul 15 05:14:55.795976 kernel: ACPI: Core revision 20240827
Jul 15 05:14:55.795981 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns
Jul 15 05:14:55.795993 kernel: APIC: Switch to symmetric I/O mode setup
Jul 15 05:14:55.795998 kernel: x2apic enabled
Jul 15 05:14:55.796003 kernel: APIC: Switched APIC routing to: physical x2apic
Jul 15 05:14:55.796010 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
Jul 15 05:14:55.796015 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x22983777dd9, max_idle_ns: 440795300422 ns
Jul 15 05:14:55.796021 kernel: Calibrating delay loop (skipped) preset value.. 4800.00 BogoMIPS (lpj=2400000)
Jul 15 05:14:55.796034 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jul 15 05:14:55.796058 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jul 15 05:14:55.796077 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jul 15 05:14:55.796086 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jul 15 05:14:55.796091 kernel: Spectre V2 : Mitigation: Retpolines
Jul 15 05:14:55.796097 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jul 15 05:14:55.796102 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jul 15 05:14:55.796107 kernel: RETBleed: Mitigation: untrained return thunk
Jul 15 05:14:55.796112 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jul 15 05:14:55.796118 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jul 15 05:14:55.796123 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jul 15 05:14:55.796130 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jul 15 05:14:55.796136 kernel: x86/bugs: return thunk changed
Jul 15 05:14:55.796144 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jul 15 05:14:55.796155 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jul 15 05:14:55.796167 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jul 15 05:14:55.796176 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jul 15 05:14:55.796184 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
Jul 15 05:14:55.796190 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jul 15 05:14:55.796199 kernel: Freeing SMP alternatives memory: 32K
Jul 15 05:14:55.796204 kernel: pid_max: default: 32768 minimum: 301
Jul 15 05:14:55.796210 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jul 15 05:14:55.796215 kernel: landlock: Up and running.
Jul 15 05:14:55.796230 kernel: SELinux: Initializing.
Jul 15 05:14:55.796243 kernel: Mount-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jul 15 05:14:55.796262 kernel: Mountpoint-cache hash table entries: 4096 (order: 3, 32768 bytes, linear)
Jul 15 05:14:55.796267 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jul 15 05:14:55.796273 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jul 15 05:14:55.796281 kernel: ... version: 0
Jul 15 05:14:55.796286 kernel: ... bit width: 48
Jul 15 05:14:55.796291 kernel: ... generic registers: 6
Jul 15 05:14:55.796297 kernel: ... value mask: 0000ffffffffffff
Jul 15 05:14:55.796302 kernel: ... max period: 00007fffffffffff
Jul 15 05:14:55.796307 kernel: ... fixed-purpose events: 0
Jul 15 05:14:55.796312 kernel: ... event mask: 000000000000003f
Jul 15 05:14:55.796318 kernel: signal: max sigframe size: 1776
Jul 15 05:14:55.796323 kernel: rcu: Hierarchical SRCU implementation.
Jul 15 05:14:55.796330 kernel: rcu: Max phase no-delay instances is 400.
Jul 15 05:14:55.796336 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jul 15 05:14:55.796341 kernel: smp: Bringing up secondary CPUs ...
Jul 15 05:14:55.796346 kernel: smpboot: x86: Booting SMP configuration:
Jul 15 05:14:55.796352 kernel: .... node #0, CPUs: #1
Jul 15 05:14:55.796357 kernel: smp: Brought up 1 node, 2 CPUs
Jul 15 05:14:55.796363 kernel: smpboot: Total of 2 processors activated (9600.00 BogoMIPS)
Jul 15 05:14:55.796368 kernel: Memory: 1917780K/2047464K available (14336K kernel code, 2430K rwdata, 9956K rodata, 54608K init, 2360K bss, 125140K reserved, 0K cma-reserved)
Jul 15 05:14:55.796374 kernel: devtmpfs: initialized
Jul 15 05:14:55.796389 kernel: x86/mm: Memory block size: 128MB
Jul 15 05:14:55.796395 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jul 15 05:14:55.796400 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jul 15 05:14:55.796406 kernel: pinctrl core: initialized pinctrl subsystem
Jul 15 05:14:55.796412 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jul 15 05:14:55.796417 kernel: audit: initializing netlink subsys (disabled)
Jul 15 05:14:55.796424 kernel: audit: type=2000 audit(1752556493.588:1): state=initialized audit_enabled=0 res=1
Jul 15 05:14:55.796430 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jul 15 05:14:55.796436 kernel: thermal_sys: Registered thermal governor 'user_space'
Jul 15 05:14:55.796443 kernel: cpuidle: using governor menu
Jul 15 05:14:55.796448 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jul 15 05:14:55.798277 kernel: dca service started, version 1.12.1
Jul 15 05:14:55.798284 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff]
Jul 15 05:14:55.798290 kernel: PCI: Using configuration type 1 for base access
Jul 15 05:14:55.798295 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jul 15 05:14:55.798301 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jul 15 05:14:55.798306 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jul 15 05:14:55.798311 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jul 15 05:14:55.798320 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jul 15 05:14:55.798325 kernel: ACPI: Added _OSI(Module Device)
Jul 15 05:14:55.798330 kernel: ACPI: Added _OSI(Processor Device)
Jul 15 05:14:55.798336 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jul 15 05:14:55.798345 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jul 15 05:14:55.798353 kernel: ACPI: Interpreter enabled
Jul 15 05:14:55.798361 kernel: ACPI: PM: (supports S0 S5)
Jul 15 05:14:55.798369 kernel: ACPI: Using IOAPIC for interrupt routing
Jul 15 05:14:55.798387 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jul 15 05:14:55.798403 kernel: PCI: Using E820 reservations for host bridge windows
Jul 15 05:14:55.798412 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F
Jul 15 05:14:55.798421 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jul 15 05:14:55.798572 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jul 15 05:14:55.798692 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR]
Jul 15 05:14:55.798784 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability]
Jul 15 05:14:55.798791 kernel: PCI host bridge to bus 0000:00
Jul 15 05:14:55.798887 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
Jul 15 05:14:55.798968 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
Jul 15 05:14:55.799048 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jul 15 05:14:55.799137 kernel: pci_bus 0000:00: root bus resource [mem 0x7d000000-0xafffffff window]
Jul 15 05:14:55.799217 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jul 15 05:14:55.799318 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window]
Jul 15 05:14:55.799408 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jul 15 05:14:55.799513 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint
Jul 15 05:14:55.799615 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jul 15 05:14:55.799704 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfb800000-0xfbffffff pref]
Jul 15 05:14:55.799790 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfd200000-0xfd203fff 64bit pref]
Jul 15 05:14:55.799876 kernel: pci 0000:00:01.0: BAR 4 [mem 0xfea10000-0xfea10fff]
Jul 15 05:14:55.799962 kernel: pci 0000:00:01.0: ROM [mem 0xfea00000-0xfea0ffff pref]
Jul 15 05:14:55.800048 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jul 15 05:14:55.800158 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.800317 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfea11000-0xfea11fff]
Jul 15 05:14:55.800424 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 05:14:55.800512 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Jul 15 05:14:55.800599 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Jul 15 05:14:55.800693 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.800783 kernel: pci 0000:00:02.1: BAR 0 [mem 0xfea12000-0xfea12fff]
Jul 15 05:14:55.800868 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 05:14:55.800953 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Jul 15 05:14:55.801040 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jul 15 05:14:55.801131 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.801218 kernel: pci 0000:00:02.2: BAR 0 [mem 0xfea13000-0xfea13fff]
Jul 15 05:14:55.801315 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 05:14:55.801411 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Jul 15 05:14:55.801497 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jul 15 05:14:55.801589 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.801674 kernel: pci 0000:00:02.3: BAR 0 [mem 0xfea14000-0xfea14fff]
Jul 15 05:14:55.801760 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 05:14:55.801845 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Jul 15 05:14:55.801931 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jul 15 05:14:55.802024 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.802111 kernel: pci 0000:00:02.4: BAR 0 [mem 0xfea15000-0xfea15fff]
Jul 15 05:14:55.802196 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 05:14:55.804287 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Jul 15 05:14:55.804431 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jul 15 05:14:55.804541 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.804629 kernel: pci 0000:00:02.5: BAR 0 [mem 0xfea16000-0xfea16fff]
Jul 15 05:14:55.804719 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 05:14:55.804805 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Jul 15 05:14:55.804892 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jul 15 05:14:55.804984 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.805069 kernel: pci 0000:00:02.6: BAR 0 [mem 0xfea17000-0xfea17fff]
Jul 15 05:14:55.805155 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 05:14:55.805241 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Jul 15 05:14:55.805350 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jul 15 05:14:55.805450 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.805537 kernel: pci 0000:00:02.7: BAR 0 [mem 0xfea18000-0xfea18fff]
Jul 15 05:14:55.805622 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 05:14:55.805707 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Jul 15 05:14:55.805793 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jul 15 05:14:55.805885 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 05:14:55.805974 kernel: pci 0000:00:03.0: BAR 0 [mem 0xfea19000-0xfea19fff]
Jul 15 05:14:55.806060 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 05:14:55.806146 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jul 15 05:14:55.806234 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jul 15 05:14:55.806397 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint
Jul 15 05:14:55.806492 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO
Jul 15 05:14:55.806585 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint
Jul 15 05:14:55.806694 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc040-0xc05f]
Jul 15 05:14:55.806799 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfea1a000-0xfea1afff]
Jul 15 05:14:55.806893 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint
Jul 15 05:14:55.806982 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f]
Jul 15 05:14:55.807081 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 15 05:14:55.807173 kernel: pci 0000:01:00.0: BAR 1 [mem 0xfe880000-0xfe880fff]
Jul 15 05:14:55.807290 kernel: pci 0000:01:00.0: BAR 4 [mem 0xfd000000-0xfd003fff 64bit pref]
Jul 15 05:14:55.807389 kernel: pci 0000:01:00.0: ROM [mem 0xfe800000-0xfe87ffff pref]
Jul 15 05:14:55.807476 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 05:14:55.807572 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jul 15 05:14:55.807662 kernel: pci 0000:02:00.0: BAR 0 [mem 0xfe600000-0xfe603fff 64bit]
Jul 15 05:14:55.807749 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 05:14:55.807847 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Jul 15 05:14:55.807940 kernel: pci 0000:03:00.0: BAR 1 [mem 0xfe400000-0xfe400fff]
Jul 15 05:14:55.808029 kernel: pci 0000:03:00.0: BAR 4 [mem 0xfcc00000-0xfcc03fff 64bit pref]
Jul 15 05:14:55.808115 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 05:14:55.808210 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jul 15 05:14:55.808343 kernel: pci 0000:04:00.0: BAR 4 [mem 0xfca00000-0xfca03fff 64bit pref]
Jul 15 05:14:55.808461 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 05:14:55.808562 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jul 15 05:14:55.808653 kernel: pci 0000:05:00.0: BAR 4 [mem 0xfc800000-0xfc803fff 64bit pref]
Jul 15 05:14:55.808740 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 05:14:55.808835 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Jul 15 05:14:55.808926 kernel: pci 0000:06:00.0: BAR 1 [mem 0xfde00000-0xfde00fff]
Jul 15 05:14:55.809017 kernel: pci 0000:06:00.0: BAR 4 [mem 0xfc600000-0xfc603fff 64bit pref]
Jul 15 05:14:55.809103 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 05:14:55.809112 kernel: acpiphp: Slot [0] registered
Jul 15 05:14:55.809210 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 15 05:14:55.809318 kernel: pci 0000:07:00.0: BAR 1 [mem 0xfdc80000-0xfdc80fff]
Jul 15 05:14:55.809417 kernel: pci 0000:07:00.0: BAR 4 [mem 0xfc400000-0xfc403fff 64bit pref]
Jul 15 05:14:55.809508 kernel: pci 0000:07:00.0: ROM [mem 0xfdc00000-0xfdc7ffff pref]
Jul 15 05:14:55.809594 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 05:14:55.809601 kernel: acpiphp: Slot [0-2] registered
Jul 15 05:14:55.809688 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 05:14:55.809695 kernel: acpiphp: Slot [0-3] registered
Jul 15 05:14:55.809780 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 05:14:55.809786 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jul 15 05:14:55.809792 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jul 15 05:14:55.809797 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jul 15 05:14:55.809803 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jul 15 05:14:55.809808 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10
Jul 15 05:14:55.809813 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10
Jul 15 05:14:55.809821 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11
Jul 15 05:14:55.809826 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11
Jul 15 05:14:55.809831 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16
Jul 15 05:14:55.809836 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17
Jul 15 05:14:55.809842 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18
Jul 15 05:14:55.809847 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19
Jul 15 05:14:55.809852 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20
Jul 15 05:14:55.809858 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21
Jul 15 05:14:55.809863 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22
Jul 15 05:14:55.809870 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23
Jul 15 05:14:55.809876 kernel: iommu: Default domain type: Translated
Jul 15 05:14:55.809881 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jul 15 05:14:55.809887 kernel: PCI: Using ACPI for IRQ routing
Jul 15 05:14:55.809892 kernel: PCI: pci_cache_line_size set to 64 bytes
Jul 15 05:14:55.809897 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jul 15 05:14:55.809903 kernel: e820: reserve RAM buffer [mem 0x7cfdc000-0x7fffffff]
Jul 15 05:14:55.809989 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device
Jul 15 05:14:55.810076 kernel: pci 0000:00:01.0: vgaarb: bridge control possible
Jul 15 05:14:55.810165 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jul 15 05:14:55.810172 kernel: vgaarb: loaded
Jul 15 05:14:55.810177 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0
Jul 15 05:14:55.810183 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter
Jul 15 05:14:55.810188 kernel: clocksource: Switched to clocksource kvm-clock
Jul 15 05:14:55.810193 kernel: VFS: Disk quotas dquot_6.6.0
Jul 15 05:14:55.810199 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jul 15 05:14:55.810204 kernel: pnp: PnP ACPI init
Jul 15 05:14:55.810312 kernel: system 00:04: [mem 0xb0000000-0xbfffffff window] has been reserved
Jul 15 05:14:55.810319 kernel: pnp: PnP ACPI: found 5 devices
Jul 15 05:14:55.810325 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jul 15 05:14:55.810330 kernel: NET: Registered PF_INET protocol family
Jul 15 05:14:55.810336 kernel: IP idents hash table entries: 32768 (order: 6, 262144 bytes, linear)
Jul 15 05:14:55.810341 kernel: tcp_listen_portaddr_hash hash table entries: 1024 (order: 2, 16384 bytes, linear)
Jul 15 05:14:55.810346 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jul 15 05:14:55.810352 kernel: TCP established hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jul 15 05:14:55.810359 kernel: TCP bind hash table entries: 16384 (order: 7, 524288 bytes, linear)
Jul 15 05:14:55.810365 kernel: TCP: Hash tables configured (established 16384 bind 16384)
Jul 15 05:14:55.810370 kernel: UDP hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jul 15 05:14:55.810375 kernel: UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes, linear)
Jul 15 05:14:55.810386 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jul 15 05:14:55.810391 kernel: NET: Registered PF_XDP protocol family
Jul 15 05:14:55.810479 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jul 15 05:14:55.810567 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jul 15 05:14:55.810657 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jul 15 05:14:55.810747 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]: assigned
Jul 15 05:14:55.810846 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]: assigned
Jul 15 05:14:55.810934 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]: assigned
Jul 15 05:14:55.811021 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 05:14:55.811107 kernel: pci 0000:00:02.0: bridge window [mem 0xfe800000-0xfe9fffff]
Jul 15 05:14:55.811194 kernel: pci 0000:00:02.0: bridge window [mem 0xfd000000-0xfd1fffff 64bit pref]
Jul 15 05:14:55.811302 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 05:14:55.811396 kernel: pci 0000:00:02.1: bridge window [mem 0xfe600000-0xfe7fffff]
Jul 15 05:14:55.811483 kernel: pci 0000:00:02.1: bridge window [mem 0xfce00000-0xfcffffff 64bit pref]
Jul 15 05:14:55.811573 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 05:14:55.811660 kernel: pci 0000:00:02.2: bridge window [mem 0xfe400000-0xfe5fffff]
Jul 15 05:14:55.811747 kernel: pci 0000:00:02.2: bridge window [mem 0xfcc00000-0xfcdfffff 64bit pref]
Jul 15 05:14:55.811832 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 05:14:55.811919 kernel: pci 0000:00:02.3: bridge window [mem 0xfe200000-0xfe3fffff]
Jul 15 05:14:55.812005 kernel: pci 0000:00:02.3: bridge window [mem 0xfca00000-0xfcbfffff 64bit pref]
Jul 15 05:14:55.812091 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 05:14:55.812177 kernel: pci 0000:00:02.4: bridge window [mem 0xfe000000-0xfe1fffff]
Jul 15 05:14:55.812277 kernel: pci 0000:00:02.4: bridge window [mem 0xfc800000-0xfc9fffff 64bit pref]
Jul 15 05:14:55.812364 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 05:14:55.812456 kernel: pci 0000:00:02.5: bridge window [mem 0xfde00000-0xfdffffff]
Jul 15 05:14:55.812542 kernel: pci 0000:00:02.5: bridge window [mem 0xfc600000-0xfc7fffff 64bit pref]
Jul 15 05:14:55.812628 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 05:14:55.812713 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x1fff]
Jul 15 05:14:55.812799 kernel: pci 0000:00:02.6: bridge window [mem 0xfdc00000-0xfddfffff]
Jul 15 05:14:55.812888 kernel: pci 0000:00:02.6: bridge window [mem 0xfc400000-0xfc5fffff 64bit pref]
Jul 15 05:14:55.812973 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 05:14:55.813060 kernel: pci 0000:00:02.7: bridge window [io 0x2000-0x2fff]
Jul 15 05:14:55.813146 kernel: pci 0000:00:02.7: bridge window [mem 0xfda00000-0xfdbfffff]
Jul 15 05:14:55.813232 kernel: pci 0000:00:02.7: bridge window [mem 0xfc200000-0xfc3fffff 64bit pref]
Jul 15 05:14:55.813340 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 05:14:55.813438 kernel: pci 0000:00:03.0: bridge window [io 0x3000-0x3fff]
Jul 15 05:14:55.813525 kernel: pci 0000:00:03.0: bridge window [mem 0xfd800000-0xfd9fffff]
Jul 15 05:14:55.813611 kernel: pci 0000:00:03.0: bridge window [mem 0xfc000000-0xfc1fffff 64bit pref]
Jul 15 05:14:55.813693 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
Jul 15 05:14:55.813775 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
Jul 15 05:14:55.813859 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jul 15 05:14:55.813940 kernel: pci_bus
0000:00: resource 7 [mem 0x7d000000-0xafffffff window] Jul 15 05:14:55.814020 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Jul 15 05:14:55.814102 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] Jul 15 05:14:55.814192 kernel: pci_bus 0000:01: resource 1 [mem 0xfe800000-0xfe9fffff] Jul 15 05:14:55.814304 kernel: pci_bus 0000:01: resource 2 [mem 0xfd000000-0xfd1fffff 64bit pref] Jul 15 05:14:55.814458 kernel: pci_bus 0000:02: resource 1 [mem 0xfe600000-0xfe7fffff] Jul 15 05:14:55.814559 kernel: pci_bus 0000:02: resource 2 [mem 0xfce00000-0xfcffffff 64bit pref] Jul 15 05:14:55.814651 kernel: pci_bus 0000:03: resource 1 [mem 0xfe400000-0xfe5fffff] Jul 15 05:14:55.814739 kernel: pci_bus 0000:03: resource 2 [mem 0xfcc00000-0xfcdfffff 64bit pref] Jul 15 05:14:55.814830 kernel: pci_bus 0000:04: resource 1 [mem 0xfe200000-0xfe3fffff] Jul 15 05:14:55.814915 kernel: pci_bus 0000:04: resource 2 [mem 0xfca00000-0xfcbfffff 64bit pref] Jul 15 05:14:55.815004 kernel: pci_bus 0000:05: resource 1 [mem 0xfe000000-0xfe1fffff] Jul 15 05:14:55.815087 kernel: pci_bus 0000:05: resource 2 [mem 0xfc800000-0xfc9fffff 64bit pref] Jul 15 05:14:55.815176 kernel: pci_bus 0000:06: resource 1 [mem 0xfde00000-0xfdffffff] Jul 15 05:14:55.815274 kernel: pci_bus 0000:06: resource 2 [mem 0xfc600000-0xfc7fffff 64bit pref] Jul 15 05:14:55.815368 kernel: pci_bus 0000:07: resource 0 [io 0x1000-0x1fff] Jul 15 05:14:55.815464 kernel: pci_bus 0000:07: resource 1 [mem 0xfdc00000-0xfddfffff] Jul 15 05:14:55.815547 kernel: pci_bus 0000:07: resource 2 [mem 0xfc400000-0xfc5fffff 64bit pref] Jul 15 05:14:55.815635 kernel: pci_bus 0000:08: resource 0 [io 0x2000-0x2fff] Jul 15 05:14:55.815719 kernel: pci_bus 0000:08: resource 1 [mem 0xfda00000-0xfdbfffff] Jul 15 05:14:55.815802 kernel: pci_bus 0000:08: resource 2 [mem 0xfc200000-0xfc3fffff 64bit pref] Jul 15 05:14:55.815894 kernel: pci_bus 0000:09: resource 0 [io 0x3000-0x3fff] Jul 15 05:14:55.815977 kernel: pci_bus 
0000:09: resource 1 [mem 0xfd800000-0xfd9fffff] Jul 15 05:14:55.816059 kernel: pci_bus 0000:09: resource 2 [mem 0xfc000000-0xfc1fffff 64bit pref] Jul 15 05:14:55.816066 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Jul 15 05:14:55.816072 kernel: PCI: CLS 0 bytes, default 64 Jul 15 05:14:55.816078 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x22983777dd9, max_idle_ns: 440795300422 ns Jul 15 05:14:55.816084 kernel: Initialise system trusted keyrings Jul 15 05:14:55.816090 kernel: workingset: timestamp_bits=39 max_order=19 bucket_order=0 Jul 15 05:14:55.816097 kernel: Key type asymmetric registered Jul 15 05:14:55.816103 kernel: Asymmetric key parser 'x509' registered Jul 15 05:14:55.816108 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) Jul 15 05:14:55.816114 kernel: io scheduler mq-deadline registered Jul 15 05:14:55.816120 kernel: io scheduler kyber registered Jul 15 05:14:55.816125 kernel: io scheduler bfq registered Jul 15 05:14:55.816214 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 24 Jul 15 05:14:55.816340 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 24 Jul 15 05:14:55.816457 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 25 Jul 15 05:14:55.816551 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 25 Jul 15 05:14:55.816638 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 26 Jul 15 05:14:55.816726 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 26 Jul 15 05:14:55.816813 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 27 Jul 15 05:14:55.816914 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 27 Jul 15 05:14:55.817002 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 28 Jul 15 05:14:55.817089 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 28 Jul 15 05:14:55.817175 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 29 Jul 15 05:14:55.817279 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 29 Jul 15 05:14:55.817367 kernel: pcieport 0000:00:02.6: PME: 
Signaling with IRQ 30 Jul 15 05:14:55.817461 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 30 Jul 15 05:14:55.817551 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 31 Jul 15 05:14:55.817637 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 31 Jul 15 05:14:55.817644 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Jul 15 05:14:55.817730 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 32 Jul 15 05:14:55.817818 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 32 Jul 15 05:14:55.817825 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Jul 15 05:14:55.817832 kernel: ACPI: \_SB_.GSIF: Enabled at IRQ 21 Jul 15 05:14:55.817837 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 15 05:14:55.817845 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Jul 15 05:14:55.817850 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Jul 15 05:14:55.817856 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Jul 15 05:14:55.817862 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Jul 15 05:14:55.817869 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Jul 15 05:14:55.817966 kernel: rtc_cmos 00:03: RTC can wake from S4 Jul 15 05:14:55.818054 kernel: rtc_cmos 00:03: registered as rtc0 Jul 15 05:14:55.818136 kernel: rtc_cmos 00:03: setting system clock to 2025-07-15T05:14:55 UTC (1752556495) Jul 15 05:14:55.818217 kernel: rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Jul 15 05:14:55.818224 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled Jul 15 05:14:55.818230 kernel: NET: Registered PF_INET6 protocol family Jul 15 05:14:55.818238 kernel: Segment Routing with IPv6 Jul 15 05:14:55.818243 kernel: In-situ OAM (IOAM) with IPv6 Jul 15 05:14:55.818269 kernel: NET: Registered PF_PACKET protocol family Jul 15 05:14:55.818274 kernel: Key type dns_resolver registered Jul 15 05:14:55.818280 kernel: IPI 
shorthand broadcast: enabled Jul 15 05:14:55.818285 kernel: sched_clock: Marking stable (2685015890, 129924300)->(2819026550, -4086360) Jul 15 05:14:55.818291 kernel: registered taskstats version 1 Jul 15 05:14:55.818297 kernel: Loading compiled-in X.509 certificates Jul 15 05:14:55.818302 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: a24478b628e55368911ce1800a2bd6bc158938c7' Jul 15 05:14:55.818308 kernel: Demotion targets for Node 0: null Jul 15 05:14:55.818316 kernel: Key type .fscrypt registered Jul 15 05:14:55.818321 kernel: Key type fscrypt-provisioning registered Jul 15 05:14:55.818327 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 15 05:14:55.818332 kernel: ima: Allocated hash algorithm: sha1 Jul 15 05:14:55.818338 kernel: ima: No architecture policies found Jul 15 05:14:55.818343 kernel: clk: Disabling unused clocks Jul 15 05:14:55.818349 kernel: Warning: unable to open an initial console. Jul 15 05:14:55.818355 kernel: Freeing unused kernel image (initmem) memory: 54608K Jul 15 05:14:55.818363 kernel: Write protecting the kernel read-only data: 24576k Jul 15 05:14:55.818368 kernel: Freeing unused kernel image (rodata/data gap) memory: 284K Jul 15 05:14:55.818374 kernel: Run /init as init process Jul 15 05:14:55.818386 kernel: with arguments: Jul 15 05:14:55.818392 kernel: /init Jul 15 05:14:55.818397 kernel: with environment: Jul 15 05:14:55.818402 kernel: HOME=/ Jul 15 05:14:55.818408 kernel: TERM=linux Jul 15 05:14:55.818413 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 15 05:14:55.818420 systemd[1]: Successfully made /usr/ read-only. 
Jul 15 05:14:55.818430 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jul 15 05:14:55.818437 systemd[1]: Detected virtualization kvm.
Jul 15 05:14:55.818442 systemd[1]: Detected architecture x86-64.
Jul 15 05:14:55.818448 systemd[1]: Running in initrd.
Jul 15 05:14:55.818454 systemd[1]: No hostname configured, using default hostname.
Jul 15 05:14:55.818460 systemd[1]: Hostname set to .
Jul 15 05:14:55.818468 systemd[1]: Initializing machine ID from VM UUID.
Jul 15 05:14:55.818474 systemd[1]: Queued start job for default target initrd.target.
Jul 15 05:14:55.818479 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 05:14:55.818485 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 05:14:55.818492 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Jul 15 05:14:55.818497 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 15 05:14:55.818504 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Jul 15 05:14:55.818510 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Jul 15 05:14:55.818518 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Jul 15 05:14:55.818524 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Jul 15 05:14:55.818530 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:14:55.818536 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:14:55.818542 systemd[1]: Reached target paths.target - Path Units.
Jul 15 05:14:55.818548 systemd[1]: Reached target slices.target - Slice Units.
Jul 15 05:14:55.818556 systemd[1]: Reached target swap.target - Swaps.
Jul 15 05:14:55.818562 systemd[1]: Reached target timers.target - Timer Units.
Jul 15 05:14:55.818569 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Jul 15 05:14:55.818575 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jul 15 05:14:55.818581 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Jul 15 05:14:55.818587 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Jul 15 05:14:55.818593 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:14:55.818598 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:14:55.818604 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:14:55.818610 systemd[1]: Reached target sockets.target - Socket Units.
Jul 15 05:14:55.818618 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Jul 15 05:14:55.818623 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 15 05:14:55.818629 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Jul 15 05:14:55.818635 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Jul 15 05:14:55.818641 systemd[1]: Starting systemd-fsck-usr.service...
Jul 15 05:14:55.818647 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 15 05:14:55.818653 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 15 05:14:55.818658 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:14:55.818666 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Jul 15 05:14:55.818672 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:14:55.818678 systemd[1]: Finished systemd-fsck-usr.service.
Jul 15 05:14:55.818684 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jul 15 05:14:55.818709 systemd-journald[217]: Collecting audit messages is disabled.
Jul 15 05:14:55.818726 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 05:14:55.818732 systemd-journald[217]: Journal started
Jul 15 05:14:55.818747 systemd-journald[217]: Runtime Journal (/run/log/journal/a09ae2d4a11541aab386b396b27aa7cd) is 4.8M, max 38.6M, 33.7M free.
Jul 15 05:14:55.796468 systemd-modules-load[218]: Inserted module 'overlay'
Jul 15 05:14:55.852962 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jul 15 05:14:55.852977 kernel: Bridge firewalling registered
Jul 15 05:14:55.852986 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 15 05:14:55.824145 systemd-modules-load[218]: Inserted module 'br_netfilter'
Jul 15 05:14:55.853573 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:14:55.854535 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:14:55.856996 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Jul 15 05:14:55.865709 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 05:14:55.867449 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 05:14:55.870875 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jul 15 05:14:55.882579 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:14:55.888756 systemd-tmpfiles[233]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Jul 15 05:14:55.891476 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:14:55.893225 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 15 05:14:55.895611 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Jul 15 05:14:55.897096 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:14:55.903877 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 15 05:14:55.911232 dracut-cmdline[253]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=hetzner verity.usrhash=926b029026d98240a9e8b6527b65fc026ae523bea87c3b77ffd7237bcc7be4fb
Jul 15 05:14:55.940519 systemd-resolved[256]: Positive Trust Anchors:
Jul 15 05:14:55.941060 systemd-resolved[256]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 15 05:14:55.941082 systemd-resolved[256]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 15 05:14:55.945317 systemd-resolved[256]: Defaulting to hostname 'linux'.
Jul 15 05:14:55.946328 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 15 05:14:55.947340 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:14:55.977285 kernel: SCSI subsystem initialized
Jul 15 05:14:55.984267 kernel: Loading iSCSI transport class v2.0-870.
Jul 15 05:14:55.992268 kernel: iscsi: registered transport (tcp)
Jul 15 05:14:56.007547 kernel: iscsi: registered transport (qla4xxx)
Jul 15 05:14:56.007578 kernel: QLogic iSCSI HBA Driver
Jul 15 05:14:56.021275 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 05:14:56.033757 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:14:56.034622 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 05:14:56.067697 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Jul 15 05:14:56.069103 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Jul 15 05:14:56.108277 kernel: raid6: avx2x4 gen() 48133 MB/s
Jul 15 05:14:56.125271 kernel: raid6: avx2x2 gen() 48900 MB/s
Jul 15 05:14:56.142283 kernel: raid6: avx2x1 gen() 39070 MB/s
Jul 15 05:14:56.142308 kernel: raid6: using algorithm avx2x2 gen() 48900 MB/s
Jul 15 05:14:56.160396 kernel: raid6: .... xor() 37228 MB/s, rmw enabled
Jul 15 05:14:56.160425 kernel: raid6: using avx2x2 recovery algorithm
Jul 15 05:14:56.176296 kernel: xor: automatically using best checksumming function avx
Jul 15 05:14:56.277299 kernel: Btrfs loaded, zoned=no, fsverity=no
Jul 15 05:14:56.284543 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Jul 15 05:14:56.288965 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:14:56.308604 systemd-udevd[465]: Using default interface naming scheme 'v255'.
Jul 15 05:14:56.313152 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:14:56.319481 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Jul 15 05:14:56.342799 dracut-pre-trigger[475]: rd.md=0: removing MD RAID activation
Jul 15 05:14:56.374112 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 15 05:14:56.376170 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 05:14:56.442119 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:14:56.449692 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Jul 15 05:14:56.551307 kernel: ACPI: bus type USB registered
Jul 15 05:14:56.551355 kernel: cryptd: max_cpu_qlen set to 1000
Jul 15 05:14:56.551364 kernel: libata version 3.00 loaded.
Jul 15 05:14:56.555295 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Jul 15 05:14:56.561619 kernel: usbcore: registered new interface driver usbfs
Jul 15 05:14:56.561632 kernel: usbcore: registered new interface driver hub
Jul 15 05:14:56.561640 kernel: usbcore: registered new device driver usb
Jul 15 05:14:56.563420 kernel: scsi host0: Virtio SCSI HBA
Jul 15 05:14:56.565275 kernel: scsi 0:0:0:0: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Jul 15 05:14:56.569590 kernel: ahci 0000:00:1f.2: version 3.0
Jul 15 05:14:56.569753 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16
Jul 15 05:14:56.571071 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 05:14:56.571377 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:14:56.579311 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode
Jul 15 05:14:56.579465 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f)
Jul 15 05:14:56.579570 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only
Jul 15 05:14:56.579671 kernel: scsi host1: ahci
Jul 15 05:14:56.579782 kernel: scsi host2: ahci
Jul 15 05:14:56.579884 kernel: scsi host3: ahci
Jul 15 05:14:56.579989 kernel: scsi host4: ahci
Jul 15 05:14:56.580091 kernel: scsi host5: ahci
Jul 15 05:14:56.574791 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:14:56.593196 kernel: scsi host6: ahci
Jul 15 05:14:56.593678 kernel: ata1: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a100 irq 46 lpm-pol 0
Jul 15 05:14:56.593688 kernel: ata2: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a180 irq 46 lpm-pol 0
Jul 15 05:14:56.593700 kernel: ata3: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a200 irq 46 lpm-pol 0
Jul 15 05:14:56.593707 kernel: ata4: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a280 irq 46 lpm-pol 0
Jul 15 05:14:56.593714 kernel: ata5: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a300 irq 46 lpm-pol 0
Jul 15 05:14:56.593721 kernel: ata6: SATA max UDMA/133 abar m4096@0xfea1a000 port 0xfea1a380 irq 46 lpm-pol 0
Jul 15 05:14:56.593921 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Jul 15 05:14:56.647031 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 05:14:56.892287 kernel: ata2: SATA link down (SStatus 0 SControl 300)
Jul 15 05:14:56.892375 kernel: ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300)
Jul 15 05:14:56.896278 kernel: ata4: SATA link down (SStatus 0 SControl 300)
Jul 15 05:14:56.899122 kernel: ata6: SATA link down (SStatus 0 SControl 300)
Jul 15 05:14:56.899151 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jul 15 05:14:56.902314 kernel: ata1.00: applying bridge limits
Jul 15 05:14:56.906480 kernel: ata3: SATA link down (SStatus 0 SControl 300)
Jul 15 05:14:56.907293 kernel: ata5: SATA link down (SStatus 0 SControl 300)
Jul 15 05:14:56.909302 kernel: ata1.00: configured for UDMA/100
Jul 15 05:14:56.916306 kernel: scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5
Jul 15 05:14:56.943427 kernel: AES CTR mode by8 optimization enabled
Jul 15 05:14:56.972869 kernel: sd 0:0:0:0: Power-on or device reset occurred
Jul 15 05:14:56.973046 kernel: sd 0:0:0:0: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Jul 15 05:14:56.975272 kernel: sd 0:0:0:0: [sda] Write Protect is off
Jul 15 05:14:56.975423 kernel: sd 0:0:0:0: [sda] Mode Sense: 63 00 00 08
Jul 15 05:14:56.975537 kernel: sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Jul 15 05:14:56.987268 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input2
Jul 15 05:14:56.997142 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Jul 15 05:14:56.997210 kernel: GPT:17805311 != 80003071
Jul 15 05:14:56.997235 kernel: GPT:Alternate GPT header not at the end of the disk.
Jul 15 05:14:56.997283 kernel: GPT:17805311 != 80003071
Jul 15 05:14:56.997322 kernel: GPT: Use GNU Parted to correct GPT errors.
Jul 15 05:14:56.997347 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jul 15 05:14:56.999283 kernel: sd 0:0:0:0: [sda] Attached SCSI disk
Jul 15 05:14:57.003268 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Jul 15 05:14:57.006913 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Jul 15 05:14:57.007052 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Jul 15 05:14:57.010037 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Jul 15 05:14:57.010183 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Jul 15 05:14:57.013139 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Jul 15 05:14:57.013313 kernel: hub 1-0:1.0: USB hub found
Jul 15 05:14:57.016419 kernel: hub 1-0:1.0: 4 ports detected
Jul 15 05:14:57.016591 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Jul 15 05:14:57.017279 kernel: sr 1:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jul 15 05:14:57.017431 kernel: hub 2-0:1.0: USB hub found
Jul 15 05:14:57.017548 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jul 15 05:14:57.019269 kernel: hub 2-0:1.0: 4 ports detected
Jul 15 05:14:57.029264 kernel: sr 1:0:0:0: Attached scsi CD-ROM sr0
Jul 15 05:14:57.059756 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Jul 15 05:14:57.070343 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Jul 15 05:14:57.076003 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Jul 15 05:14:57.076485 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Jul 15 05:14:57.083662 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Jul 15 05:14:57.085324 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Jul 15 05:14:57.099008 disk-uuid[628]: Primary Header is updated.
Jul 15 05:14:57.099008 disk-uuid[628]: Secondary Entries is updated.
Jul 15 05:14:57.099008 disk-uuid[628]: Secondary Header is updated.
Jul 15 05:14:57.106282 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jul 15 05:14:57.251274 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Jul 15 05:14:57.270387 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Jul 15 05:14:57.272817 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 15 05:14:57.274283 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:14:57.275712 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 05:14:57.278133 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Jul 15 05:14:57.303350 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Jul 15 05:14:57.397322 kernel: hid: raw HID events driver (C) Jiri Kosina
Jul 15 05:14:57.406530 kernel: usbcore: registered new interface driver usbhid
Jul 15 05:14:57.406579 kernel: usbhid: USB HID core driver
Jul 15 05:14:57.417308 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input3
Jul 15 05:14:57.417353 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Jul 15 05:14:58.123447 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Jul 15 05:14:58.126351 disk-uuid[629]: The operation has completed successfully.
Jul 15 05:14:58.212478 systemd[1]: disk-uuid.service: Deactivated successfully.
Jul 15 05:14:58.212565 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Jul 15 05:14:58.235910 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Jul 15 05:14:58.264529 sh[661]: Success
Jul 15 05:14:58.278436 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jul 15 05:14:58.278488 kernel: device-mapper: uevent: version 1.0.3
Jul 15 05:14:58.279899 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Jul 15 05:14:58.288310 kernel: device-mapper: verity: sha256 using shash "sha256-ni"
Jul 15 05:14:58.342925 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Jul 15 05:14:58.345966 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Jul 15 05:14:58.362548 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Jul 15 05:14:58.375436 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
Jul 15 05:14:58.375491 kernel: BTRFS: device fsid eb96c768-dac4-4ca9-ae1d-82815d4ce00b devid 1 transid 36 /dev/mapper/usr (254:0) scanned by mount (673)
Jul 15 05:14:58.382459 kernel: BTRFS info (device dm-0): first mount of filesystem eb96c768-dac4-4ca9-ae1d-82815d4ce00b
Jul 15 05:14:58.382482 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:14:58.385519 kernel: BTRFS info (device dm-0): using free-space-tree
Jul 15 05:14:58.397435 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Jul 15 05:14:58.399048 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Jul 15 05:14:58.400731 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Jul 15 05:14:58.403341 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Jul 15 05:14:58.406428 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Jul 15 05:14:58.449294 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (720)
Jul 15 05:14:58.453977 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:14:58.453999 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm
Jul 15 05:14:58.456317 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 05:14:58.464492 kernel: BTRFS info (device sda6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f
Jul 15 05:14:58.465012 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Jul 15 05:14:58.466670 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jul 15 05:14:58.497268 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 05:14:58.500384 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 05:14:58.540662 systemd-networkd[848]: lo: Link UP
Jul 15 05:14:58.541212 systemd-networkd[848]: lo: Gained carrier
Jul 15 05:14:58.544340 systemd-networkd[848]: Enumeration completed
Jul 15 05:14:58.544399 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 05:14:58.544871 systemd[1]: Reached target network.target - Network.
Jul 15 05:14:58.546189 systemd-networkd[848]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:14:58.546193 systemd-networkd[848]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 05:14:58.546662 systemd-networkd[848]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:14:58.546665 systemd-networkd[848]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 05:14:58.546927 systemd-networkd[848]: eth0: Link UP
Jul 15 05:14:58.546930 systemd-networkd[848]: eth0: Gained carrier
Jul 15 05:14:58.546935 systemd-networkd[848]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:14:58.551692 systemd-networkd[848]: eth1: Link UP
Jul 15 05:14:58.551695 systemd-networkd[848]: eth1: Gained carrier
Jul 15 05:14:58.551703 systemd-networkd[848]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:14:58.565936 ignition[799]: Ignition 2.21.0 Jul 15 05:14:58.566508 ignition[799]: Stage: fetch-offline Jul 15 05:14:58.566536 ignition[799]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:14:58.566543 ignition[799]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:14:58.566600 ignition[799]: parsed url from cmdline: "" Jul 15 05:14:58.566603 ignition[799]: no config URL provided Jul 15 05:14:58.566606 ignition[799]: reading system config file "/usr/lib/ignition/user.ign" Jul 15 05:14:58.568170 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jul 15 05:14:58.566611 ignition[799]: no config at "/usr/lib/ignition/user.ign" Jul 15 05:14:58.566615 ignition[799]: failed to fetch config: resource requires networking Jul 15 05:14:58.566734 ignition[799]: Ignition finished successfully Jul 15 05:14:58.571347 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jul 15 05:14:58.581499 systemd-networkd[848]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 15 05:14:58.592838 ignition[857]: Ignition 2.21.0 Jul 15 05:14:58.592847 ignition[857]: Stage: fetch Jul 15 05:14:58.592942 ignition[857]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:14:58.592949 ignition[857]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:14:58.593002 ignition[857]: parsed url from cmdline: "" Jul 15 05:14:58.593005 ignition[857]: no config URL provided Jul 15 05:14:58.593009 ignition[857]: reading system config file "/usr/lib/ignition/user.ign" Jul 15 05:14:58.593015 ignition[857]: no config at "/usr/lib/ignition/user.ign" Jul 15 05:14:58.593043 ignition[857]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jul 15 05:14:58.593188 ignition[857]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Jul 15 05:14:58.623293 systemd-networkd[848]: eth0: DHCPv4 address 157.180.39.85/32, gateway 
172.31.1.1 acquired from 172.31.1.1 Jul 15 05:14:58.793922 ignition[857]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Jul 15 05:14:58.804479 ignition[857]: GET result: OK Jul 15 05:14:58.804618 ignition[857]: parsing config with SHA512: dba7bc036d31c9b729daafab797038d7113955de64faa5f813ecc9b89cffa198ee23bf9abc91912bb4d625c9c13f2fc2742137958bbca4b9820044d9fe574523 Jul 15 05:14:58.819015 unknown[857]: fetched base config from "system" Jul 15 05:14:58.819044 unknown[857]: fetched base config from "system" Jul 15 05:14:58.819812 ignition[857]: fetch: fetch complete Jul 15 05:14:58.819061 unknown[857]: fetched user config from "hetzner" Jul 15 05:14:58.819826 ignition[857]: fetch: fetch passed Jul 15 05:14:58.819930 ignition[857]: Ignition finished successfully Jul 15 05:14:58.827797 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jul 15 05:14:58.832313 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jul 15 05:14:58.875162 ignition[864]: Ignition 2.21.0 Jul 15 05:14:58.875260 ignition[864]: Stage: kargs Jul 15 05:14:58.875679 ignition[864]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:14:58.881079 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jul 15 05:14:58.875701 ignition[864]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:14:58.884457 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jul 15 05:14:58.877090 ignition[864]: kargs: kargs passed Jul 15 05:14:58.877166 ignition[864]: Ignition finished successfully Jul 15 05:14:58.915321 ignition[870]: Ignition 2.21.0 Jul 15 05:14:58.916476 ignition[870]: Stage: disks Jul 15 05:14:58.916684 ignition[870]: no configs at "/usr/lib/ignition/base.d" Jul 15 05:14:58.916703 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:14:58.917672 ignition[870]: disks: disks passed Jul 15 05:14:58.920193 systemd[1]: Finished ignition-disks.service - Ignition (disks). 
Jul 15 05:14:58.917735 ignition[870]: Ignition finished successfully Jul 15 05:14:58.922961 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jul 15 05:14:58.924834 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jul 15 05:14:58.927003 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 15 05:14:58.929107 systemd[1]: Reached target sysinit.target - System Initialization. Jul 15 05:14:58.931096 systemd[1]: Reached target basic.target - Basic System. Jul 15 05:14:58.934891 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jul 15 05:14:58.973500 systemd-fsck[878]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Jul 15 05:14:58.977442 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jul 15 05:14:58.982224 systemd[1]: Mounting sysroot.mount - /sysroot... Jul 15 05:14:59.103282 kernel: EXT4-fs (sda9): mounted filesystem 277c3938-5262-4ab1-8fa3-62fde82f8257 r/w with ordered data mode. Quota mode: none. Jul 15 05:14:59.104006 systemd[1]: Mounted sysroot.mount - /sysroot. Jul 15 05:14:59.104797 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jul 15 05:14:59.109320 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 15 05:14:59.112190 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jul 15 05:14:59.115348 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jul 15 05:14:59.117330 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jul 15 05:14:59.117352 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jul 15 05:14:59.125653 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. 
Jul 15 05:14:59.130475 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jul 15 05:14:59.135296 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (886) Jul 15 05:14:59.141717 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:14:59.141738 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 15 05:14:59.141748 kernel: BTRFS info (device sda6): using free-space-tree Jul 15 05:14:59.156141 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 15 05:14:59.177669 coreos-metadata[888]: Jul 15 05:14:59.177 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jul 15 05:14:59.180002 coreos-metadata[888]: Jul 15 05:14:59.179 INFO Fetch successful Jul 15 05:14:59.181334 coreos-metadata[888]: Jul 15 05:14:59.180 INFO wrote hostname ci-4396-0-0-n-e83c776e20 to /sysroot/etc/hostname Jul 15 05:14:59.182498 initrd-setup-root[913]: cut: /sysroot/etc/passwd: No such file or directory Jul 15 05:14:59.184119 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jul 15 05:14:59.188240 initrd-setup-root[921]: cut: /sysroot/etc/group: No such file or directory Jul 15 05:14:59.191197 initrd-setup-root[928]: cut: /sysroot/etc/shadow: No such file or directory Jul 15 05:14:59.194529 initrd-setup-root[935]: cut: /sysroot/etc/gshadow: No such file or directory Jul 15 05:14:59.266084 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jul 15 05:14:59.267341 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jul 15 05:14:59.268706 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jul 15 05:14:59.286268 kernel: BTRFS info (device sda6): last unmount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:14:59.299907 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Jul 15 05:14:59.306001 ignition[1003]: INFO : Ignition 2.21.0 Jul 15 05:14:59.306001 ignition[1003]: INFO : Stage: mount Jul 15 05:14:59.307010 ignition[1003]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 15 05:14:59.307010 ignition[1003]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:14:59.307010 ignition[1003]: INFO : mount: mount passed Jul 15 05:14:59.307010 ignition[1003]: INFO : Ignition finished successfully Jul 15 05:14:59.307644 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jul 15 05:14:59.310333 systemd[1]: Starting ignition-files.service - Ignition (files)... Jul 15 05:14:59.375355 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jul 15 05:14:59.377328 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 15 05:14:59.419332 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1014) Jul 15 05:14:59.419392 kernel: BTRFS info (device sda6): first mount of filesystem 86e7a055-b4ff-48a6-9a0a-c301ff74862f Jul 15 05:14:59.423168 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Jul 15 05:14:59.427913 kernel: BTRFS info (device sda6): using free-space-tree Jul 15 05:14:59.435823 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Jul 15 05:14:59.475879 ignition[1030]: INFO : Ignition 2.21.0 Jul 15 05:14:59.475879 ignition[1030]: INFO : Stage: files Jul 15 05:14:59.478343 ignition[1030]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 15 05:14:59.478343 ignition[1030]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:14:59.478343 ignition[1030]: DEBUG : files: compiled without relabeling support, skipping Jul 15 05:14:59.482706 ignition[1030]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jul 15 05:14:59.482706 ignition[1030]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jul 15 05:14:59.482706 ignition[1030]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jul 15 05:14:59.487292 ignition[1030]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jul 15 05:14:59.487292 ignition[1030]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jul 15 05:14:59.487292 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jul 15 05:14:59.487292 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 Jul 15 05:14:59.483230 unknown[1030]: wrote ssh authorized keys file for user: core Jul 15 05:14:59.784218 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jul 15 05:14:59.857501 systemd-networkd[848]: eth0: Gained IPv6LL Jul 15 05:15:00.433521 systemd-networkd[848]: eth1: Gained IPv6LL Jul 15 05:15:01.763578 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" Jul 15 05:15:01.765776 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jul 15 
05:15:01.765776 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jul 15 05:15:01.765776 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jul 15 05:15:01.765776 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jul 15 05:15:01.765776 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 15 05:15:01.765776 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 15 05:15:01.765776 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 15 05:15:01.765776 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 15 05:15:01.779027 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jul 15 05:15:01.779027 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jul 15 05:15:01.779027 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jul 15 05:15:01.779027 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jul 15 05:15:01.779027 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file 
"/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jul 15 05:15:01.779027 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 Jul 15 05:15:01.983568 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jul 15 05:15:02.191972 ignition[1030]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" Jul 15 05:15:02.191972 ignition[1030]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jul 15 05:15:02.194706 ignition[1030]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 15 05:15:02.196696 ignition[1030]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 15 05:15:02.196696 ignition[1030]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jul 15 05:15:02.196696 ignition[1030]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jul 15 05:15:02.200050 ignition[1030]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jul 15 05:15:02.200050 ignition[1030]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jul 15 05:15:02.200050 ignition[1030]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jul 15 05:15:02.200050 ignition[1030]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jul 15 05:15:02.200050 ignition[1030]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" 
Jul 15 05:15:02.200050 ignition[1030]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jul 15 05:15:02.200050 ignition[1030]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jul 15 05:15:02.200050 ignition[1030]: INFO : files: files passed Jul 15 05:15:02.200050 ignition[1030]: INFO : Ignition finished successfully Jul 15 05:15:02.199115 systemd[1]: Finished ignition-files.service - Ignition (files). Jul 15 05:15:02.204364 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jul 15 05:15:02.207785 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jul 15 05:15:02.213947 systemd[1]: ignition-quench.service: Deactivated successfully. Jul 15 05:15:02.215328 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jul 15 05:15:02.220738 initrd-setup-root-after-ignition[1061]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 15 05:15:02.220738 initrd-setup-root-after-ignition[1061]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jul 15 05:15:02.222508 initrd-setup-root-after-ignition[1065]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 15 05:15:02.223546 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 15 05:15:02.224729 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jul 15 05:15:02.226371 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jul 15 05:15:02.277592 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jul 15 05:15:02.277809 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jul 15 05:15:02.279680 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Jul 15 05:15:02.281424 systemd[1]: Reached target initrd.target - Initrd Default Target. Jul 15 05:15:02.281973 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jul 15 05:15:02.282679 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jul 15 05:15:02.308712 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 15 05:15:02.312321 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jul 15 05:15:02.334873 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jul 15 05:15:02.335887 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 15 05:15:02.337361 systemd[1]: Stopped target timers.target - Timer Units. Jul 15 05:15:02.338715 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jul 15 05:15:02.338912 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 15 05:15:02.340786 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jul 15 05:15:02.341761 systemd[1]: Stopped target basic.target - Basic System. Jul 15 05:15:02.342905 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jul 15 05:15:02.344156 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jul 15 05:15:02.345638 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jul 15 05:15:02.347098 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jul 15 05:15:02.348474 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jul 15 05:15:02.349965 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jul 15 05:15:02.351410 systemd[1]: Stopped target sysinit.target - System Initialization. Jul 15 05:15:02.352872 systemd[1]: Stopped target local-fs.target - Local File Systems. 
Jul 15 05:15:02.353890 systemd[1]: Stopped target swap.target - Swaps. Jul 15 05:15:02.355198 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jul 15 05:15:02.355353 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jul 15 05:15:02.356459 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jul 15 05:15:02.357068 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 15 05:15:02.358025 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jul 15 05:15:02.358100 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 05:15:02.359006 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jul 15 05:15:02.359112 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jul 15 05:15:02.360318 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jul 15 05:15:02.360445 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 15 05:15:02.361546 systemd[1]: ignition-files.service: Deactivated successfully. Jul 15 05:15:02.361616 systemd[1]: Stopped ignition-files.service - Ignition (files). Jul 15 05:15:02.365802 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jul 15 05:15:02.365901 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jul 15 05:15:02.368323 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jul 15 05:15:02.371352 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jul 15 05:15:02.372790 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jul 15 05:15:02.372879 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jul 15 05:15:02.373763 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jul 15 05:15:02.373859 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. 
Jul 15 05:15:02.378354 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jul 15 05:15:02.378448 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jul 15 05:15:02.393266 ignition[1085]: INFO : Ignition 2.21.0 Jul 15 05:15:02.393266 ignition[1085]: INFO : Stage: umount Jul 15 05:15:02.393266 ignition[1085]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 15 05:15:02.393266 ignition[1085]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 15 05:15:02.396771 ignition[1085]: INFO : umount: umount passed Jul 15 05:15:02.396771 ignition[1085]: INFO : Ignition finished successfully Jul 15 05:15:02.395009 systemd[1]: ignition-mount.service: Deactivated successfully. Jul 15 05:15:02.395114 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jul 15 05:15:02.396597 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jul 15 05:15:02.396902 systemd[1]: ignition-disks.service: Deactivated successfully. Jul 15 05:15:02.396936 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jul 15 05:15:02.398592 systemd[1]: ignition-kargs.service: Deactivated successfully. Jul 15 05:15:02.398630 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jul 15 05:15:02.400864 systemd[1]: ignition-fetch.service: Deactivated successfully. Jul 15 05:15:02.400901 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jul 15 05:15:02.401708 systemd[1]: Stopped target network.target - Network. Jul 15 05:15:02.402502 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jul 15 05:15:02.402542 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jul 15 05:15:02.403469 systemd[1]: Stopped target paths.target - Path Units. Jul 15 05:15:02.404239 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jul 15 05:15:02.408289 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
Jul 15 05:15:02.408779 systemd[1]: Stopped target slices.target - Slice Units. Jul 15 05:15:02.409790 systemd[1]: Stopped target sockets.target - Socket Units. Jul 15 05:15:02.410635 systemd[1]: iscsid.socket: Deactivated successfully. Jul 15 05:15:02.410667 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jul 15 05:15:02.411455 systemd[1]: iscsiuio.socket: Deactivated successfully. Jul 15 05:15:02.411485 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 15 05:15:02.412311 systemd[1]: ignition-setup.service: Deactivated successfully. Jul 15 05:15:02.412359 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jul 15 05:15:02.413273 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jul 15 05:15:02.413321 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jul 15 05:15:02.414326 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jul 15 05:15:02.415343 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jul 15 05:15:02.416484 systemd[1]: sysroot-boot.service: Deactivated successfully. Jul 15 05:15:02.416573 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jul 15 05:15:02.417544 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jul 15 05:15:02.417640 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jul 15 05:15:02.422734 systemd[1]: systemd-resolved.service: Deactivated successfully. Jul 15 05:15:02.422840 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jul 15 05:15:02.425486 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jul 15 05:15:02.425716 systemd[1]: systemd-networkd.service: Deactivated successfully. Jul 15 05:15:02.425814 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jul 15 05:15:02.427587 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. 
Jul 15 05:15:02.428628 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jul 15 05:15:02.429122 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jul 15 05:15:02.429156 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jul 15 05:15:02.430723 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jul 15 05:15:02.431634 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jul 15 05:15:02.431675 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 15 05:15:02.432979 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jul 15 05:15:02.433018 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jul 15 05:15:02.435837 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jul 15 05:15:02.435874 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jul 15 05:15:02.436460 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jul 15 05:15:02.436496 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 15 05:15:02.437724 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 15 05:15:02.440297 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jul 15 05:15:02.440348 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jul 15 05:15:02.451855 systemd[1]: systemd-udevd.service: Deactivated successfully. Jul 15 05:15:02.453386 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 15 05:15:02.454048 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jul 15 05:15:02.454081 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jul 15 05:15:02.454611 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. 
Jul 15 05:15:02.454638 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jul 15 05:15:02.455596 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jul 15 05:15:02.455637 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jul 15 05:15:02.456967 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jul 15 05:15:02.457006 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jul 15 05:15:02.457972 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jul 15 05:15:02.458009 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 15 05:15:02.460339 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jul 15 05:15:02.460945 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jul 15 05:15:02.460988 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jul 15 05:15:02.463807 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jul 15 05:15:02.463861 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 15 05:15:02.464397 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Jul 15 05:15:02.464442 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 15 05:15:02.465917 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jul 15 05:15:02.465951 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jul 15 05:15:02.466577 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 15 05:15:02.466612 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:15:02.469066 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. 
Jul 15 05:15:02.469109 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Jul 15 05:15:02.469141 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Jul 15 05:15:02.469176 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 15 05:15:02.469502 systemd[1]: network-cleanup.service: Deactivated successfully. Jul 15 05:15:02.472347 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jul 15 05:15:02.477915 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jul 15 05:15:02.478011 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jul 15 05:15:02.478991 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jul 15 05:15:02.481347 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jul 15 05:15:02.498529 systemd[1]: Switching root. Jul 15 05:15:02.538724 systemd-journald[217]: Journal stopped Jul 15 05:15:03.596921 systemd-journald[217]: Received SIGTERM from PID 1 (systemd). 
Jul 15 05:15:03.597056 kernel: SELinux: policy capability network_peer_controls=1 Jul 15 05:15:03.597073 kernel: SELinux: policy capability open_perms=1 Jul 15 05:15:03.597082 kernel: SELinux: policy capability extended_socket_class=1 Jul 15 05:15:03.597093 kernel: SELinux: policy capability always_check_network=0 Jul 15 05:15:03.597102 kernel: SELinux: policy capability cgroup_seclabel=1 Jul 15 05:15:03.597111 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jul 15 05:15:03.597120 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jul 15 05:15:03.597129 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jul 15 05:15:03.597137 kernel: SELinux: policy capability userspace_initial_context=0 Jul 15 05:15:03.597151 kernel: audit: type=1403 audit(1752556502.762:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jul 15 05:15:03.597161 systemd[1]: Successfully loaded SELinux policy in 62.979ms. Jul 15 05:15:03.597175 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.377ms. Jul 15 05:15:03.597185 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 15 05:15:03.597194 systemd[1]: Detected virtualization kvm. Jul 15 05:15:03.597204 systemd[1]: Detected architecture x86-64. Jul 15 05:15:03.600293 systemd[1]: Detected first boot. Jul 15 05:15:03.600317 systemd[1]: Hostname set to . Jul 15 05:15:03.600331 systemd[1]: Initializing machine ID from VM UUID. Jul 15 05:15:03.600341 zram_generator::config[1133]: No configuration found. 
Jul 15 05:15:03.600352 kernel: Guest personality initialized and is inactive Jul 15 05:15:03.600361 kernel: VMCI host device registered (name=vmci, major=10, minor=125) Jul 15 05:15:03.600369 kernel: Initialized host personality Jul 15 05:15:03.600377 kernel: NET: Registered PF_VSOCK protocol family Jul 15 05:15:03.600389 systemd[1]: Populated /etc with preset unit settings. Jul 15 05:15:03.600399 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jul 15 05:15:03.600408 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jul 15 05:15:03.600416 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jul 15 05:15:03.600425 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jul 15 05:15:03.600434 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jul 15 05:15:03.600450 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jul 15 05:15:03.600459 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jul 15 05:15:03.600469 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jul 15 05:15:03.600478 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jul 15 05:15:03.600487 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jul 15 05:15:03.600499 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jul 15 05:15:03.600509 systemd[1]: Created slice user.slice - User and Session Slice. Jul 15 05:15:03.600518 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 05:15:03.600529 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 15 05:15:03.600538 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Jul 15 05:15:03.600547 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jul 15 05:15:03.600556 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jul 15 05:15:03.600565 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 15 05:15:03.600574 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
Jul 15 05:15:03.600585 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 05:15:03.600593 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 15 05:15:03.600602 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jul 15 05:15:03.600610 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jul 15 05:15:03.600619 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jul 15 05:15:03.600628 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jul 15 05:15:03.600637 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 05:15:03.600646 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 05:15:03.600655 systemd[1]: Reached target slices.target - Slice Units.
Jul 15 05:15:03.600665 systemd[1]: Reached target swap.target - Swaps.
Jul 15 05:15:03.600673 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jul 15 05:15:03.600682 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jul 15 05:15:03.600691 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jul 15 05:15:03.600699 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 05:15:03.600708 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 15 05:15:03.600720 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 05:15:03.600729 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jul 15 05:15:03.600737 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jul 15 05:15:03.600748 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jul 15 05:15:03.600757 systemd[1]: Mounting media.mount - External Media Directory...
Jul 15 05:15:03.600766 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:03.600774 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jul 15 05:15:03.600783 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jul 15 05:15:03.600791 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jul 15 05:15:03.600800 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jul 15 05:15:03.600809 systemd[1]: Reached target machines.target - Containers.
Jul 15 05:15:03.600817 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jul 15 05:15:03.600828 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:15:03.600836 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 15 05:15:03.600845 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jul 15 05:15:03.600854 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:15:03.600862 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 05:15:03.600871 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:15:03.600879 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jul 15 05:15:03.600888 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:15:03.600898 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jul 15 05:15:03.600907 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jul 15 05:15:03.600915 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jul 15 05:15:03.600924 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jul 15 05:15:03.600933 systemd[1]: Stopped systemd-fsck-usr.service.
Jul 15 05:15:03.600942 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:15:03.600951 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 15 05:15:03.600960 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 15 05:15:03.600970 kernel: loop: module loaded
Jul 15 05:15:03.600979 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 05:15:03.600988 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jul 15 05:15:03.600997 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jul 15 05:15:03.601007 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 05:15:03.601016 systemd[1]: verity-setup.service: Deactivated successfully.
Jul 15 05:15:03.601025 systemd[1]: Stopped verity-setup.service.
Jul 15 05:15:03.601035 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:03.601044 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jul 15 05:15:03.601053 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jul 15 05:15:03.601061 kernel: ACPI: bus type drm_connector registered
Jul 15 05:15:03.601071 systemd[1]: Mounted media.mount - External Media Directory.
Jul 15 05:15:03.601079 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jul 15 05:15:03.601090 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jul 15 05:15:03.601098 kernel: fuse: init (API version 7.41)
Jul 15 05:15:03.601107 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jul 15 05:15:03.601115 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jul 15 05:15:03.601124 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 05:15:03.601135 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jul 15 05:15:03.601145 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jul 15 05:15:03.601154 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:15:03.601163 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:15:03.601171 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 05:15:03.601203 systemd-journald[1217]: Collecting audit messages is disabled.
Jul 15 05:15:03.601220 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 05:15:03.601229 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:15:03.601240 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:15:03.602808 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jul 15 05:15:03.602826 systemd-journald[1217]: Journal started
Jul 15 05:15:03.602844 systemd-journald[1217]: Runtime Journal (/run/log/journal/a09ae2d4a11541aab386b396b27aa7cd) is 4.8M, max 38.6M, 33.7M free.
Jul 15 05:15:03.316977 systemd[1]: Queued start job for default target multi-user.target.
Jul 15 05:15:03.344553 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jul 15 05:15:03.345420 systemd[1]: systemd-journald.service: Deactivated successfully.
Jul 15 05:15:03.603527 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jul 15 05:15:03.606102 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 15 05:15:03.607560 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:15:03.607811 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:15:03.608626 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 15 05:15:03.609405 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 05:15:03.610190 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jul 15 05:15:03.610978 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jul 15 05:15:03.620610 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 05:15:03.623358 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jul 15 05:15:03.627354 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jul 15 05:15:03.628176 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jul 15 05:15:03.628197 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 05:15:03.630569 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jul 15 05:15:03.637958 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jul 15 05:15:03.638504 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:15:03.641406 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jul 15 05:15:03.644555 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jul 15 05:15:03.645379 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:15:03.647479 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jul 15 05:15:03.647969 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:15:03.649410 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 05:15:03.651045 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jul 15 05:15:03.654394 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jul 15 05:15:03.657190 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jul 15 05:15:03.657742 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jul 15 05:15:03.687752 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jul 15 05:15:03.688326 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jul 15 05:15:03.693375 kernel: loop0: detected capacity change from 0 to 114000
Jul 15 05:15:03.693435 systemd-journald[1217]: Time spent on flushing to /var/log/journal/a09ae2d4a11541aab386b396b27aa7cd is 22.786ms for 1168 entries.
Jul 15 05:15:03.693435 systemd-journald[1217]: System Journal (/var/log/journal/a09ae2d4a11541aab386b396b27aa7cd) is 8M, max 584.8M, 576.8M free.
Jul 15 05:15:03.730392 systemd-journald[1217]: Received client request to flush runtime journal.
Jul 15 05:15:03.737295 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jul 15 05:15:03.690957 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jul 15 05:15:03.709593 systemd-tmpfiles[1255]: ACLs are not supported, ignoring.
Jul 15 05:15:03.709602 systemd-tmpfiles[1255]: ACLs are not supported, ignoring.
Jul 15 05:15:03.714661 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 05:15:03.716148 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 05:15:03.723283 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jul 15 05:15:03.736390 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jul 15 05:15:03.759433 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jul 15 05:15:03.763887 kernel: loop1: detected capacity change from 0 to 8
Jul 15 05:15:03.774032 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jul 15 05:15:03.777774 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 05:15:03.787320 kernel: loop2: detected capacity change from 0 to 146488
Jul 15 05:15:03.785472 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 05:15:03.795046 systemd-tmpfiles[1274]: ACLs are not supported, ignoring.
Jul 15 05:15:03.795320 systemd-tmpfiles[1274]: ACLs are not supported, ignoring.
Jul 15 05:15:03.798845 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 05:15:03.826609 kernel: loop3: detected capacity change from 0 to 224512
Jul 15 05:15:03.860279 kernel: loop4: detected capacity change from 0 to 114000
Jul 15 05:15:03.880310 kernel: loop5: detected capacity change from 0 to 8
Jul 15 05:15:03.884306 kernel: loop6: detected capacity change from 0 to 146488
Jul 15 05:15:03.901342 kernel: loop7: detected capacity change from 0 to 224512
Jul 15 05:15:03.918343 (sd-merge)[1280]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Jul 15 05:15:03.919812 (sd-merge)[1280]: Merged extensions into '/usr'.
Jul 15 05:15:03.927354 systemd[1]: Reload requested from client PID 1254 ('systemd-sysext') (unit systemd-sysext.service)...
Jul 15 05:15:03.927368 systemd[1]: Reloading...
Jul 15 05:15:03.988279 zram_generator::config[1302]: No configuration found.
Jul 15 05:15:04.096674 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:15:04.135921 ldconfig[1249]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jul 15 05:15:04.171189 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jul 15 05:15:04.171636 systemd[1]: Reloading finished in 243 ms.
Jul 15 05:15:04.184953 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jul 15 05:15:04.187824 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jul 15 05:15:04.198407 systemd[1]: Starting ensure-sysext.service...
Jul 15 05:15:04.199838 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jul 15 05:15:04.220305 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jul 15 05:15:04.223404 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 05:15:04.226632 systemd[1]: Reload requested from client PID 1349 ('systemctl') (unit ensure-sysext.service)...
Jul 15 05:15:04.226647 systemd[1]: Reloading...
Jul 15 05:15:04.227085 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jul 15 05:15:04.227124 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jul 15 05:15:04.227735 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jul 15 05:15:04.228202 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jul 15 05:15:04.230153 systemd-tmpfiles[1350]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jul 15 05:15:04.231219 systemd-tmpfiles[1350]: ACLs are not supported, ignoring.
Jul 15 05:15:04.231388 systemd-tmpfiles[1350]: ACLs are not supported, ignoring.
Jul 15 05:15:04.235852 systemd-tmpfiles[1350]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 05:15:04.235955 systemd-tmpfiles[1350]: Skipping /boot
Jul 15 05:15:04.246213 systemd-tmpfiles[1350]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 05:15:04.246293 systemd-tmpfiles[1350]: Skipping /boot
Jul 15 05:15:04.268141 systemd-udevd[1353]: Using default interface naming scheme 'v255'.
Jul 15 05:15:04.285305 zram_generator::config[1374]: No configuration found.
Jul 15 05:15:04.425658 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 05:15:04.493714 systemd[1]: Reloading finished in 266 ms.
Jul 15 05:15:04.501793 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 05:15:04.502575 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 05:15:04.517156 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
Jul 15 05:15:04.521180 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jul 15 05:15:04.525296 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jul 15 05:15:04.528347 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jul 15 05:15:04.535306 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 05:15:04.539442 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 15 05:15:04.541028 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jul 15 05:15:04.553377 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:04.553520 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:15:04.555322 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:15:04.558584 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:15:04.566545 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:15:04.567414 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:15:04.567596 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:15:04.570264 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jul 15 05:15:04.570684 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:04.577500 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:04.577657 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:15:04.577807 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:15:04.577887 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:15:04.577976 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:04.581802 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jul 15 05:15:04.587658 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:04.587820 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:15:04.595364 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 05:15:04.596053 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:15:04.596384 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:15:04.596589 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:04.601519 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:15:04.601702 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:15:04.608603 systemd[1]: Finished ensure-sysext.service.
Jul 15 05:15:04.615049 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Jul 15 05:15:04.618850 kernel: mousedev: PS/2 mouse device common for all mice
Jul 15 05:15:04.617419 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:15:04.617609 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:15:04.620342 augenrules[1495]: No rules
Jul 15 05:15:04.622276 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4
Jul 15 05:15:04.635175 kernel: ACPI: button: Power Button [PWRF]
Jul 15 05:15:04.630531 systemd[1]: audit-rules.service: Deactivated successfully.
Jul 15 05:15:04.631130 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jul 15 05:15:04.633649 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:15:04.633798 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:15:04.644590 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:15:04.645443 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:15:04.655303 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jul 15 05:15:04.658755 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jul 15 05:15:04.662803 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 05:15:04.663009 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 05:15:04.668503 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jul 15 05:15:04.690246 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Jul 15 05:15:04.691955 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jul 15 05:15:04.700354 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jul 15 05:15:04.708183 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jul 15 05:15:04.709222 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jul 15 05:15:04.745913 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Jul 15 05:15:04.755130 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt
Jul 15 05:15:04.755380 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jul 15 05:15:04.787626 kernel: [drm] pci: virtio-vga detected at 0000:00:01.0
Jul 15 05:15:04.787649 kernel: virtio-pci 0000:00:01.0: vgaarb: deactivate vga console
Jul 15 05:15:04.787845 kernel: Console: switching to colour dummy device 80x25
Jul 15 05:15:04.787859 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jul 15 05:15:04.787871 kernel: [drm] features: -context_init
Jul 15 05:15:04.787884 kernel: [drm] number of scanouts: 1
Jul 15 05:15:04.787897 kernel: [drm] number of cap sets: 0
Jul 15 05:15:04.787906 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Jul 15 05:15:04.808081 systemd-networkd[1464]: lo: Link UP
Jul 15 05:15:04.808087 systemd-networkd[1464]: lo: Gained carrier
Jul 15 05:15:04.816422 systemd-networkd[1464]: Enumeration completed
Jul 15 05:15:04.816552 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 05:15:04.818619 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Jul 15 05:15:04.819070 systemd-networkd[1464]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:15:04.820783 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jul 15 05:15:04.820925 systemd-networkd[1464]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 05:15:04.821437 systemd-networkd[1464]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:15:04.821498 systemd-networkd[1464]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 05:15:04.821848 systemd-networkd[1464]: eth0: Link UP
Jul 15 05:15:04.822038 systemd-networkd[1464]: eth0: Gained carrier
Jul 15 05:15:04.822051 systemd-networkd[1464]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:15:04.824475 systemd-networkd[1464]: eth1: Link UP
Jul 15 05:15:04.824974 systemd-networkd[1464]: eth1: Gained carrier
Jul 15 05:15:04.825033 systemd-networkd[1464]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 05:15:04.840862 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Jul 15 05:15:04.840909 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:04.840995 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 05:15:04.841795 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 05:15:04.843743 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 05:15:04.845871 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 05:15:04.846365 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 05:15:04.846393 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 05:15:04.846413 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jul 15 05:15:04.846421 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
Jul 15 05:15:04.850317 systemd-networkd[1464]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Jul 15 05:15:04.851502 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jul 15 05:15:04.851631 systemd[1]: Reached target time-set.target - System Time Set.
Jul 15 05:15:04.855280 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Jul 15 05:15:04.860622 systemd-resolved[1466]: Positive Trust Anchors:
Jul 15 05:15:04.860632 systemd-resolved[1466]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 15 05:15:04.860653 systemd-resolved[1466]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 15 05:15:04.863821 systemd-resolved[1466]: Using system hostname 'ci-4396-0-0-n-e83c776e20'.
Jul 15 05:15:04.865064 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 15 05:15:04.865171 systemd[1]: Reached target network.target - Network.
Jul 15 05:15:04.865218 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 15 05:15:04.869773 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 05:15:04.869944 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 05:15:04.874810 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 05:15:04.874988 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 05:15:04.875140 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 05:15:04.875418 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 05:15:04.875572 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 05:15:04.875724 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 05:15:04.875849 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jul 15 05:15:04.875917 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jul 15 05:15:04.875965 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
Jul 15 05:15:04.876134 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jul 15 05:15:04.876298 systemd-networkd[1464]: eth0: DHCPv4 address 157.180.39.85/32, gateway 172.31.1.1 acquired from 172.31.1.1
Jul 15 05:15:04.876313 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jul 15 05:15:04.876366 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jul 15 05:15:04.876406 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jul 15 05:15:04.876422 systemd[1]: Reached target paths.target - Path Units.
Jul 15 05:15:04.876472 systemd[1]: Reached target timers.target - Timer Units.
Jul 15 05:15:04.876960 systemd-timesyncd[1497]: Network configuration changed, trying to establish connection.
Jul 15 05:15:04.877470 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jul 15 05:15:04.878719 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jul 15 05:15:04.882328 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Jul 15 05:15:04.882901 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Jul 15 05:15:04.882968 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Jul 15 05:15:04.884268 kernel: EDAC MC: Ver: 3.0.0
Jul 15 05:15:04.885724 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jul 15 05:15:04.886026 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Jul 15 05:15:04.886196 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 05:15:04.886616 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jul 15 05:15:04.887163 systemd[1]: Reached target sockets.target - Socket Units.
Jul 15 05:15:04.887211 systemd[1]: Reached target basic.target - Basic System.
Jul 15 05:15:04.887298 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jul 15 05:15:04.887316 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jul 15 05:15:04.888372 systemd[1]: Starting containerd.service - containerd container runtime...
Jul 15 05:15:04.889881 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jul 15 05:15:04.892695 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jul 15 05:15:04.894415 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jul 15 05:15:04.897798 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jul 15 05:15:04.902324 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jul 15 05:15:04.902391 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jul 15 05:15:04.904613 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... Jul 15 05:15:04.906967 jq[1551]: false Jul 15 05:15:04.907328 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jul 15 05:15:04.912073 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jul 15 05:15:04.917478 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jul 15 05:15:04.919277 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jul 15 05:15:04.923598 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jul 15 05:15:04.932594 systemd[1]: Starting systemd-logind.service - User Login Management... Jul 15 05:15:04.933219 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jul 15 05:15:04.934602 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jul 15 05:15:04.935308 systemd[1]: Starting update-engine.service - Update Engine... Jul 15 05:15:04.936600 google_oslogin_nss_cache[1553]: oslogin_cache_refresh[1553]: Refreshing passwd entry cache Jul 15 05:15:04.936605 oslogin_cache_refresh[1553]: Refreshing passwd entry cache Jul 15 05:15:04.938724 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... 
Jul 15 05:15:04.940790 google_oslogin_nss_cache[1553]: oslogin_cache_refresh[1553]: Failure getting users, quitting Jul 15 05:15:04.940790 google_oslogin_nss_cache[1553]: oslogin_cache_refresh[1553]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 15 05:15:04.940790 google_oslogin_nss_cache[1553]: oslogin_cache_refresh[1553]: Refreshing group entry cache Jul 15 05:15:04.940341 oslogin_cache_refresh[1553]: Failure getting users, quitting Jul 15 05:15:04.940354 oslogin_cache_refresh[1553]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. Jul 15 05:15:04.940385 oslogin_cache_refresh[1553]: Refreshing group entry cache Jul 15 05:15:04.942886 oslogin_cache_refresh[1553]: Failure getting groups, quitting Jul 15 05:15:04.946677 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jul 15 05:15:04.950366 google_oslogin_nss_cache[1553]: oslogin_cache_refresh[1553]: Failure getting groups, quitting Jul 15 05:15:04.950366 google_oslogin_nss_cache[1553]: oslogin_cache_refresh[1553]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 15 05:15:04.942894 oslogin_cache_refresh[1553]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. Jul 15 05:15:04.946953 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jul 15 05:15:04.947118 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jul 15 05:15:04.947362 systemd[1]: google-oslogin-cache.service: Deactivated successfully. Jul 15 05:15:04.947529 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. Jul 15 05:15:04.948559 systemd[1]: motdgen.service: Deactivated successfully. Jul 15 05:15:04.948721 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jul 15 05:15:04.953702 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. 
Jul 15 05:15:04.953882 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jul 15 05:15:04.959529 extend-filesystems[1552]: Found /dev/sda6 Jul 15 05:15:04.967703 extend-filesystems[1552]: Found /dev/sda9 Jul 15 05:15:04.967406 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:15:04.972328 coreos-metadata[1547]: Jul 15 05:15:04.971 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jul 15 05:15:04.972896 coreos-metadata[1547]: Jul 15 05:15:04.972 INFO Fetch successful Jul 15 05:15:04.973096 coreos-metadata[1547]: Jul 15 05:15:04.973 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jul 15 05:15:04.973853 coreos-metadata[1547]: Jul 15 05:15:04.973 INFO Fetch successful Jul 15 05:15:04.986313 extend-filesystems[1552]: Checking size of /dev/sda9 Jul 15 05:15:04.987751 jq[1570]: true Jul 15 05:15:04.993862 (ntainerd)[1591]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 15 05:15:05.009800 tar[1575]: linux-amd64/LICENSE Jul 15 05:15:05.010001 tar[1575]: linux-amd64/helm Jul 15 05:15:05.016473 dbus-daemon[1548]: [system] SELinux support is enabled Jul 15 05:15:05.016582 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 15 05:15:05.020901 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 15 05:15:05.020924 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jul 15 05:15:05.020999 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
Jul 15 05:15:05.021013 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 15 05:15:05.023184 update_engine[1569]: I20250715 05:15:05.023121 1569 main.cc:92] Flatcar Update Engine starting Jul 15 05:15:05.025592 jq[1596]: true Jul 15 05:15:05.027884 extend-filesystems[1552]: Resized partition /dev/sda9 Jul 15 05:15:05.033734 extend-filesystems[1606]: resize2fs 1.47.2 (1-Jan-2025) Jul 15 05:15:05.040952 systemd[1]: Started update-engine.service - Update Engine. Jul 15 05:15:05.044758 update_engine[1569]: I20250715 05:15:05.043267 1569 update_check_scheduler.cc:74] Next update check in 5m34s Jul 15 05:15:05.043478 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 15 05:15:05.063404 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Jul 15 05:15:05.125626 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jul 15 05:15:05.125834 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 15 05:15:05.183695 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 15 05:15:05.184279 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:15:05.185962 bash[1630]: Updated "/home/core/.ssh/authorized_keys" Jul 15 05:15:05.186578 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Jul 15 05:15:05.188191 sshd_keygen[1589]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 15 05:15:05.188264 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 15 05:15:05.191102 systemd[1]: Starting sshkeys.service... Jul 15 05:15:05.195461 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 05:15:05.222111 systemd-logind[1567]: New seat seat0. 
Jul 15 05:15:05.225236 systemd-logind[1567]: Watching system buttons on /dev/input/event3 (Power Button) Jul 15 05:15:05.225273 systemd-logind[1567]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Jul 15 05:15:05.227724 systemd[1]: Started systemd-logind.service - User Login Management. Jul 15 05:15:05.233080 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jul 15 05:15:05.234879 containerd[1591]: time="2025-07-15T05:15:05Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 15 05:15:05.236046 containerd[1591]: time="2025-07-15T05:15:05.236020559Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Jul 15 05:15:05.236239 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jul 15 05:15:05.242600 containerd[1591]: time="2025-07-15T05:15:05.242578829Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="5.44µs" Jul 15 05:15:05.242674 containerd[1591]: time="2025-07-15T05:15:05.242662709Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 15 05:15:05.242709 containerd[1591]: time="2025-07-15T05:15:05.242701249Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 15 05:15:05.242845 containerd[1591]: time="2025-07-15T05:15:05.242834099Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 15 05:15:05.242881 containerd[1591]: time="2025-07-15T05:15:05.242873319Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 15 05:15:05.242918 containerd[1591]: time="2025-07-15T05:15:05.242911019Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 05:15:05.242988 containerd[1591]: time="2025-07-15T05:15:05.242978109Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 05:15:05.243029 containerd[1591]: time="2025-07-15T05:15:05.243021399Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 05:15:05.246286 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jul 15 05:15:05.259729 containerd[1591]: time="2025-07-15T05:15:05.250822149Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 05:15:05.259729 containerd[1591]: 
time="2025-07-15T05:15:05.250846309Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 05:15:05.259729 containerd[1591]: time="2025-07-15T05:15:05.250859149Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 05:15:05.259729 containerd[1591]: time="2025-07-15T05:15:05.250866969Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 15 05:15:05.261533 containerd[1591]: time="2025-07-15T05:15:05.260531709Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 15 05:15:05.261533 containerd[1591]: time="2025-07-15T05:15:05.260788129Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 05:15:05.261533 containerd[1591]: time="2025-07-15T05:15:05.260814659Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 05:15:05.261533 containerd[1591]: time="2025-07-15T05:15:05.260823409Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 15 05:15:05.261533 containerd[1591]: time="2025-07-15T05:15:05.260845939Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 15 05:15:05.261533 containerd[1591]: time="2025-07-15T05:15:05.261006739Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 15 05:15:05.261533 containerd[1591]: time="2025-07-15T05:15:05.261048469Z" level=info msg="metadata content store policy set" policy=shared Jul 15 05:15:05.261575 systemd[1]: Finished sshd-keygen.service - Generate sshd host 
keys. Jul 15 05:15:05.264542 extend-filesystems[1606]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jul 15 05:15:05.264542 extend-filesystems[1606]: old_desc_blocks = 1, new_desc_blocks = 5 Jul 15 05:15:05.264542 extend-filesystems[1606]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jul 15 05:15:05.264418 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 15 05:15:05.264714 extend-filesystems[1552]: Resized filesystem in /dev/sda9 Jul 15 05:15:05.264760 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 15 05:15:05.264952 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 15 05:15:05.271062 containerd[1591]: time="2025-07-15T05:15:05.271020049Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271113919Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271128229Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271138729Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271147799Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271155939Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271193629Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 15 05:15:05.271585 containerd[1591]: 
time="2025-07-15T05:15:05.271202809Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271211449Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271219339Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271226899Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271243749Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271368639Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271382669Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 15 05:15:05.271585 containerd[1591]: time="2025-07-15T05:15:05.271392969Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271401489Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271410249Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271431969Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271440269Z" level=info 
msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271447889Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271463439Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271471799Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271479309Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271542979Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 15 05:15:05.271773 containerd[1591]: time="2025-07-15T05:15:05.271554739Z" level=info msg="Start snapshots syncer" Jul 15 05:15:05.271978 containerd[1591]: time="2025-07-15T05:15:05.271921279Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 15 05:15:05.276353 containerd[1591]: time="2025-07-15T05:15:05.275062739Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 15 05:15:05.276353 containerd[1591]: time="2025-07-15T05:15:05.275137109Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275215359Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275352059Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275369179Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275377539Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275384859Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275392809Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275400239Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275408189Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275436249Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275443629Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275459749Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275482279Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275490819Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 05:15:05.276484 containerd[1591]: time="2025-07-15T05:15:05.275515219Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 05:15:05.276724 containerd[1591]: time="2025-07-15T05:15:05.275521949Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 05:15:05.276724 containerd[1591]: time="2025-07-15T05:15:05.275528579Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 15 05:15:05.276724 containerd[1591]: time="2025-07-15T05:15:05.275535399Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 15 05:15:05.276724 containerd[1591]: time="2025-07-15T05:15:05.275543119Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 15 05:15:05.276724 containerd[1591]: time="2025-07-15T05:15:05.275555459Z" level=info msg="runtime interface created" Jul 15 05:15:05.276724 containerd[1591]: time="2025-07-15T05:15:05.275559249Z" level=info msg="created NRI interface" Jul 15 05:15:05.276724 containerd[1591]: time="2025-07-15T05:15:05.276304969Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 15 05:15:05.276724 containerd[1591]: time="2025-07-15T05:15:05.276321989Z" level=info msg="Connect containerd service" Jul 15 05:15:05.277319 containerd[1591]: time="2025-07-15T05:15:05.276837969Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 15 05:15:05.280628 
containerd[1591]: time="2025-07-15T05:15:05.279892859Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 15 05:15:05.285406 systemd[1]: issuegen.service: Deactivated successfully. Jul 15 05:15:05.285596 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 15 05:15:05.288595 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 05:15:05.291805 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 15 05:15:05.294350 coreos-metadata[1647]: Jul 15 05:15:05.292 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jul 15 05:15:05.293740 locksmithd[1607]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 15 05:15:05.295877 coreos-metadata[1647]: Jul 15 05:15:05.294 INFO Fetch successful Jul 15 05:15:05.297379 unknown[1647]: wrote ssh authorized keys file for user: core Jul 15 05:15:05.320938 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 15 05:15:05.323723 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 15 05:15:05.324949 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. Jul 15 05:15:05.325130 systemd[1]: Reached target getty.target - Login Prompts. Jul 15 05:15:05.343298 update-ssh-keys[1673]: Updated "/home/core/.ssh/authorized_keys" Jul 15 05:15:05.347267 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jul 15 05:15:05.360964 systemd[1]: Finished sshkeys.service. 
Jul 15 05:15:05.434466 containerd[1591]: time="2025-07-15T05:15:05.434417929Z" level=info msg="Start subscribing containerd event" Jul 15 05:15:05.434550 containerd[1591]: time="2025-07-15T05:15:05.434495879Z" level=info msg="Start recovering state" Jul 15 05:15:05.434568 containerd[1591]: time="2025-07-15T05:15:05.434563629Z" level=info msg="Start event monitor" Jul 15 05:15:05.434583 containerd[1591]: time="2025-07-15T05:15:05.434572739Z" level=info msg="Start cni network conf syncer for default" Jul 15 05:15:05.434583 containerd[1591]: time="2025-07-15T05:15:05.434579949Z" level=info msg="Start streaming server" Jul 15 05:15:05.434623 containerd[1591]: time="2025-07-15T05:15:05.434586759Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 15 05:15:05.434623 containerd[1591]: time="2025-07-15T05:15:05.434592759Z" level=info msg="runtime interface starting up..." Jul 15 05:15:05.434623 containerd[1591]: time="2025-07-15T05:15:05.434597119Z" level=info msg="starting plugins..." Jul 15 05:15:05.434623 containerd[1591]: time="2025-07-15T05:15:05.434607609Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 15 05:15:05.434903 containerd[1591]: time="2025-07-15T05:15:05.434879619Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 15 05:15:05.434939 containerd[1591]: time="2025-07-15T05:15:05.434923799Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 15 05:15:05.434993 containerd[1591]: time="2025-07-15T05:15:05.434974769Z" level=info msg="containerd successfully booted in 0.200607s" Jul 15 05:15:05.435373 systemd[1]: Started containerd.service - containerd container runtime. Jul 15 05:15:05.548345 tar[1575]: linux-amd64/README.md Jul 15 05:15:05.565739 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Jul 15 05:15:06.129591 systemd-networkd[1464]: eth1: Gained IPv6LL Jul 15 05:15:06.130679 systemd-timesyncd[1497]: Network configuration changed, trying to establish connection. Jul 15 05:15:06.134423 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 15 05:15:06.135526 systemd[1]: Reached target network-online.target - Network is Online. Jul 15 05:15:06.139839 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:15:06.143604 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 15 05:15:06.193598 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 15 05:15:06.193657 systemd-networkd[1464]: eth0: Gained IPv6LL Jul 15 05:15:06.195358 systemd-timesyncd[1497]: Network configuration changed, trying to establish connection. Jul 15 05:15:06.897333 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:06.898145 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 15 05:15:06.899347 systemd[1]: Startup finished in 2.743s (kernel) + 7.127s (initrd) + 4.199s (userspace) = 14.070s. Jul 15 05:15:06.900484 (kubelet)[1711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:15:07.367068 kubelet[1711]: E0715 05:15:07.366921 1711 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:15:07.369952 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:15:07.370222 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:15:07.370741 systemd[1]: kubelet.service: Consumed 775ms CPU time, 264.5M memory peak. 
Jul 15 05:15:17.257813 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 15 05:15:17.260114 systemd[1]: Started sshd@0-157.180.39.85:22-139.178.89.65:46186.service - OpenSSH per-connection server daemon (139.178.89.65:46186). Jul 15 05:15:17.620941 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 15 05:15:17.623882 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:15:17.782167 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:17.784867 (kubelet)[1734]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:15:17.815901 kubelet[1734]: E0715 05:15:17.815842 1734 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:15:17.819398 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:15:17.819725 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:15:17.820280 systemd[1]: kubelet.service: Consumed 151ms CPU time, 110.8M memory peak. Jul 15 05:15:18.256462 sshd[1723]: Accepted publickey for core from 139.178.89.65 port 46186 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:15:18.260168 sshd-session[1723]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:18.273523 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 15 05:15:18.276465 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 15 05:15:18.292088 systemd-logind[1567]: New session 1 of user core. 
Jul 15 05:15:18.316029 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 15 05:15:18.322430 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 15 05:15:18.340438 (systemd)[1743]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 15 05:15:18.342599 systemd-logind[1567]: New session c1 of user core. Jul 15 05:15:18.449957 systemd[1743]: Queued start job for default target default.target. Jul 15 05:15:18.456294 systemd[1743]: Created slice app.slice - User Application Slice. Jul 15 05:15:18.456313 systemd[1743]: Reached target paths.target - Paths. Jul 15 05:15:18.456346 systemd[1743]: Reached target timers.target - Timers. Jul 15 05:15:18.457443 systemd[1743]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 15 05:15:18.467243 systemd[1743]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 15 05:15:18.467302 systemd[1743]: Reached target sockets.target - Sockets. Jul 15 05:15:18.467331 systemd[1743]: Reached target basic.target - Basic System. Jul 15 05:15:18.467360 systemd[1743]: Reached target default.target - Main User Target. Jul 15 05:15:18.467385 systemd[1743]: Startup finished in 119ms. Jul 15 05:15:18.467707 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 15 05:15:18.486398 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 15 05:15:19.185584 systemd[1]: Started sshd@1-157.180.39.85:22-139.178.89.65:56226.service - OpenSSH per-connection server daemon (139.178.89.65:56226). Jul 15 05:15:20.192006 sshd[1754]: Accepted publickey for core from 139.178.89.65 port 56226 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:15:20.194950 sshd-session[1754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:20.205335 systemd-logind[1567]: New session 2 of user core. Jul 15 05:15:20.213590 systemd[1]: Started session-2.scope - Session 2 of User core. 
Jul 15 05:15:20.864618 sshd[1757]: Connection closed by 139.178.89.65 port 56226 Jul 15 05:15:20.865483 sshd-session[1754]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:20.870944 systemd[1]: sshd@1-157.180.39.85:22-139.178.89.65:56226.service: Deactivated successfully. Jul 15 05:15:20.873680 systemd[1]: session-2.scope: Deactivated successfully. Jul 15 05:15:20.876003 systemd-logind[1567]: Session 2 logged out. Waiting for processes to exit. Jul 15 05:15:20.878562 systemd-logind[1567]: Removed session 2. Jul 15 05:15:21.034718 systemd[1]: Started sshd@2-157.180.39.85:22-139.178.89.65:56232.service - OpenSSH per-connection server daemon (139.178.89.65:56232). Jul 15 05:15:22.038139 sshd[1763]: Accepted publickey for core from 139.178.89.65 port 56232 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:15:22.040598 sshd-session[1763]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:22.049342 systemd-logind[1567]: New session 3 of user core. Jul 15 05:15:22.061474 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 15 05:15:22.709432 sshd[1766]: Connection closed by 139.178.89.65 port 56232 Jul 15 05:15:22.710351 sshd-session[1763]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:22.716556 systemd[1]: sshd@2-157.180.39.85:22-139.178.89.65:56232.service: Deactivated successfully. Jul 15 05:15:22.719973 systemd[1]: session-3.scope: Deactivated successfully. Jul 15 05:15:22.721514 systemd-logind[1567]: Session 3 logged out. Waiting for processes to exit. Jul 15 05:15:22.724021 systemd-logind[1567]: Removed session 3. Jul 15 05:15:22.879875 systemd[1]: Started sshd@3-157.180.39.85:22-139.178.89.65:56240.service - OpenSSH per-connection server daemon (139.178.89.65:56240). 
Jul 15 05:15:23.885459 sshd[1772]: Accepted publickey for core from 139.178.89.65 port 56240 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:15:23.888017 sshd-session[1772]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:23.896331 systemd-logind[1567]: New session 4 of user core. Jul 15 05:15:23.903495 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 15 05:15:24.563474 sshd[1775]: Connection closed by 139.178.89.65 port 56240 Jul 15 05:15:24.564423 sshd-session[1772]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:24.571828 systemd-logind[1567]: Session 4 logged out. Waiting for processes to exit. Jul 15 05:15:24.573096 systemd[1]: sshd@3-157.180.39.85:22-139.178.89.65:56240.service: Deactivated successfully. Jul 15 05:15:24.576397 systemd[1]: session-4.scope: Deactivated successfully. Jul 15 05:15:24.579238 systemd-logind[1567]: Removed session 4. Jul 15 05:15:24.736859 systemd[1]: Started sshd@4-157.180.39.85:22-139.178.89.65:56244.service - OpenSSH per-connection server daemon (139.178.89.65:56244). Jul 15 05:15:25.748076 sshd[1781]: Accepted publickey for core from 139.178.89.65 port 56244 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:15:25.750506 sshd-session[1781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:25.759474 systemd-logind[1567]: New session 5 of user core. Jul 15 05:15:25.766482 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jul 15 05:15:26.285289 sudo[1785]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 15 05:15:26.285855 sudo[1785]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:15:26.308195 sudo[1785]: pam_unix(sudo:session): session closed for user root Jul 15 05:15:26.467699 sshd[1784]: Connection closed by 139.178.89.65 port 56244 Jul 15 05:15:26.468538 sshd-session[1781]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:26.473604 systemd[1]: sshd@4-157.180.39.85:22-139.178.89.65:56244.service: Deactivated successfully. Jul 15 05:15:26.477090 systemd[1]: session-5.scope: Deactivated successfully. Jul 15 05:15:26.478700 systemd-logind[1567]: Session 5 logged out. Waiting for processes to exit. Jul 15 05:15:26.480658 systemd-logind[1567]: Removed session 5. Jul 15 05:15:26.642567 systemd[1]: Started sshd@5-157.180.39.85:22-139.178.89.65:56258.service - OpenSSH per-connection server daemon (139.178.89.65:56258). Jul 15 05:15:27.655577 sshd[1791]: Accepted publickey for core from 139.178.89.65 port 56258 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:15:27.658084 sshd-session[1791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:27.667134 systemd-logind[1567]: New session 6 of user core. Jul 15 05:15:27.680544 systemd[1]: Started session-6.scope - Session 6 of User core. Jul 15 05:15:28.021375 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 15 05:15:28.023969 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jul 15 05:15:28.177912 sudo[1801]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 15 05:15:28.178148 sudo[1801]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:15:28.186289 sudo[1801]: pam_unix(sudo:session): session closed for user root Jul 15 05:15:28.190505 sudo[1798]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 15 05:15:28.190940 sudo[1798]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:15:28.205160 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 15 05:15:28.207439 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:28.213475 (kubelet)[1808]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:15:28.234819 augenrules[1831]: No rules Jul 15 05:15:28.236334 systemd[1]: audit-rules.service: Deactivated successfully. Jul 15 05:15:28.236613 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 15 05:15:28.237838 sudo[1798]: pam_unix(sudo:session): session closed for user root Jul 15 05:15:28.259683 kubelet[1808]: E0715 05:15:28.259644 1808 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:15:28.262088 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:15:28.262233 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:15:28.262507 systemd[1]: kubelet.service: Consumed 181ms CPU time, 110.2M memory peak. 
Jul 15 05:15:28.395898 sshd[1794]: Connection closed by 139.178.89.65 port 56258 Jul 15 05:15:28.397487 sshd-session[1791]: pam_unix(sshd:session): session closed for user core Jul 15 05:15:28.403100 systemd[1]: sshd@5-157.180.39.85:22-139.178.89.65:56258.service: Deactivated successfully. Jul 15 05:15:28.406559 systemd[1]: session-6.scope: Deactivated successfully. Jul 15 05:15:28.410288 systemd-logind[1567]: Session 6 logged out. Waiting for processes to exit. Jul 15 05:15:28.412180 systemd-logind[1567]: Removed session 6. Jul 15 05:15:28.580075 systemd[1]: Started sshd@6-157.180.39.85:22-139.178.89.65:50104.service - OpenSSH per-connection server daemon (139.178.89.65:50104). Jul 15 05:15:29.594839 sshd[1842]: Accepted publickey for core from 139.178.89.65 port 50104 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:15:29.597312 sshd-session[1842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:15:29.606797 systemd-logind[1567]: New session 7 of user core. Jul 15 05:15:29.613455 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 15 05:15:30.119777 sudo[1846]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 15 05:15:30.120499 sudo[1846]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 05:15:30.414472 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jul 15 05:15:30.431552 (dockerd)[1864]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 15 05:15:30.613704 dockerd[1864]: time="2025-07-15T05:15:30.611683727Z" level=info msg="Starting up" Jul 15 05:15:30.615193 dockerd[1864]: time="2025-07-15T05:15:30.615158657Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 15 05:15:30.632871 dockerd[1864]: time="2025-07-15T05:15:30.632818147Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jul 15 05:15:30.653139 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport872872204-merged.mount: Deactivated successfully. Jul 15 05:15:30.687825 dockerd[1864]: time="2025-07-15T05:15:30.687603137Z" level=info msg="Loading containers: start." Jul 15 05:15:30.695313 kernel: Initializing XFRM netlink socket Jul 15 05:15:30.865114 systemd-timesyncd[1497]: Network configuration changed, trying to establish connection. Jul 15 05:15:30.898054 systemd-networkd[1464]: docker0: Link UP Jul 15 05:15:30.901588 dockerd[1864]: time="2025-07-15T05:15:30.901556287Z" level=info msg="Loading containers: done." Jul 15 05:15:30.911321 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3632057435-merged.mount: Deactivated successfully. 
Jul 15 05:15:30.915743 dockerd[1864]: time="2025-07-15T05:15:30.915716287Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 15 05:15:30.915828 dockerd[1864]: time="2025-07-15T05:15:30.915765917Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jul 15 05:15:30.915828 dockerd[1864]: time="2025-07-15T05:15:30.915822427Z" level=info msg="Initializing buildkit" Jul 15 05:15:30.935073 dockerd[1864]: time="2025-07-15T05:15:30.935053757Z" level=info msg="Completed buildkit initialization" Jul 15 05:15:30.940878 dockerd[1864]: time="2025-07-15T05:15:30.940810397Z" level=info msg="Daemon has completed initialization" Jul 15 05:15:30.940987 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 15 05:15:30.941312 dockerd[1864]: time="2025-07-15T05:15:30.941016007Z" level=info msg="API listen on /run/docker.sock" Jul 15 05:15:31.732754 systemd-resolved[1466]: Clock change detected. Flushing caches. Jul 15 05:15:31.733421 systemd-timesyncd[1497]: Contacted time server 144.91.126.59:123 (2.flatcar.pool.ntp.org). Jul 15 05:15:31.733470 systemd-timesyncd[1497]: Initial clock synchronization to Tue 2025-07-15 05:15:31.732626 UTC. Jul 15 05:15:32.616668 containerd[1591]: time="2025-07-15T05:15:32.616570386Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.6\"" Jul 15 05:15:33.237065 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3154923200.mount: Deactivated successfully. 
Jul 15 05:15:34.218622 containerd[1591]: time="2025-07-15T05:15:34.218558086Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:34.219548 containerd[1591]: time="2025-07-15T05:15:34.219514316Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.6: active requests=0, bytes read=28799139" Jul 15 05:15:34.220688 containerd[1591]: time="2025-07-15T05:15:34.220647716Z" level=info msg="ImageCreate event name:\"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:34.223244 containerd[1591]: time="2025-07-15T05:15:34.223206966Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0f5764551d7de4ef70489ff8a70f32df7dea00701f5545af089b60bc5ede4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:34.223996 containerd[1591]: time="2025-07-15T05:15:34.223869726Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.6\" with image id \"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.6\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0f5764551d7de4ef70489ff8a70f32df7dea00701f5545af089b60bc5ede4f6f\", size \"28795845\" in 1.60720009s" Jul 15 05:15:34.223996 containerd[1591]: time="2025-07-15T05:15:34.223892486Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.6\" returns image reference \"sha256:8c5b95b1b5cb4a908fcbbbe81697c57019f9e9d89bfb5e0355235d440b7a6aa9\"" Jul 15 05:15:34.224360 containerd[1591]: time="2025-07-15T05:15:34.224337586Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.6\"" Jul 15 05:15:35.386081 containerd[1591]: time="2025-07-15T05:15:35.386033256Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:35.387163 containerd[1591]: time="2025-07-15T05:15:35.387134486Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.6: active requests=0, bytes read=24783934" Jul 15 05:15:35.388138 containerd[1591]: time="2025-07-15T05:15:35.388088816Z" level=info msg="ImageCreate event name:\"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:35.390180 containerd[1591]: time="2025-07-15T05:15:35.390147146Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3425f29c94a77d74cb89f38413e6274277dcf5e2bc7ab6ae953578a91e9e8356\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:35.390829 containerd[1591]: time="2025-07-15T05:15:35.390728756Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.6\" with image id \"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.6\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3425f29c94a77d74cb89f38413e6274277dcf5e2bc7ab6ae953578a91e9e8356\", size \"26385746\" in 1.16631208s" Jul 15 05:15:35.390829 containerd[1591]: time="2025-07-15T05:15:35.390756386Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.6\" returns image reference \"sha256:77d0e7de0c6b41e2331c3997698c3f917527cf7bbe462f5c813f514e788436de\"" Jul 15 05:15:35.391273 containerd[1591]: time="2025-07-15T05:15:35.391232636Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.6\"" Jul 15 05:15:36.424679 containerd[1591]: time="2025-07-15T05:15:36.424613526Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:36.425851 containerd[1591]: time="2025-07-15T05:15:36.425783596Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.6: active requests=0, bytes read=19176938" Jul 15 05:15:36.427333 containerd[1591]: time="2025-07-15T05:15:36.427292566Z" level=info msg="ImageCreate event name:\"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:36.431094 containerd[1591]: time="2025-07-15T05:15:36.429631766Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:130f633cbd1d70e2f4655350153cb3fc469f4d5a6310b4f0b49d93fb2ba2132b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:36.432127 containerd[1591]: time="2025-07-15T05:15:36.432078196Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.6\" with image id \"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.6\", repo digest \"registry.k8s.io/kube-scheduler@sha256:130f633cbd1d70e2f4655350153cb3fc469f4d5a6310b4f0b49d93fb2ba2132b\", size \"20778768\" in 1.04082092s" Jul 15 05:15:36.432164 containerd[1591]: time="2025-07-15T05:15:36.432131946Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.6\" returns image reference \"sha256:b34d1cd163151c2491919f315274d85bff904721213f2b19341b403a28a39ae2\"" Jul 15 05:15:36.433918 containerd[1591]: time="2025-07-15T05:15:36.433879386Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.6\"" Jul 15 05:15:37.472203 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1013723513.mount: Deactivated successfully. 
Jul 15 05:15:37.847838 containerd[1591]: time="2025-07-15T05:15:37.847631016Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:37.848306 containerd[1591]: time="2025-07-15T05:15:37.848284566Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.6: active requests=0, bytes read=30895391" Jul 15 05:15:37.848861 containerd[1591]: time="2025-07-15T05:15:37.848828746Z" level=info msg="ImageCreate event name:\"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:37.849989 containerd[1591]: time="2025-07-15T05:15:37.849961876Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:b13d9da413b983d130bf090b83fce12e1ccc704e95f366da743c18e964d9d7e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:37.850488 containerd[1591]: time="2025-07-15T05:15:37.850278346Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.6\" with image id \"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\", repo tag \"registry.k8s.io/kube-proxy:v1.32.6\", repo digest \"registry.k8s.io/kube-proxy@sha256:b13d9da413b983d130bf090b83fce12e1ccc704e95f366da743c18e964d9d7e9\", size \"30894382\" in 1.41629922s" Jul 15 05:15:37.850488 containerd[1591]: time="2025-07-15T05:15:37.850301356Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.6\" returns image reference \"sha256:63f0cbe3b7339c5d006efc9964228e48271bae73039320037c451b5e8f763e02\"" Jul 15 05:15:37.850645 containerd[1591]: time="2025-07-15T05:15:37.850627396Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 15 05:15:38.362792 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount827527317.mount: Deactivated successfully. Jul 15 05:15:38.933517 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. 
Jul 15 05:15:38.935678 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:15:39.120393 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:39.128337 (kubelet)[2204]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 05:15:39.160123 kubelet[2204]: E0715 05:15:39.160069 2204 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 05:15:39.162952 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 05:15:39.163137 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 05:15:39.163569 systemd[1]: kubelet.service: Consumed 131ms CPU time, 108.4M memory peak. 
Jul 15 05:15:39.196303 containerd[1591]: time="2025-07-15T05:15:39.196130986Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:39.197319 containerd[1591]: time="2025-07-15T05:15:39.197145856Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565335" Jul 15 05:15:39.198582 containerd[1591]: time="2025-07-15T05:15:39.198562716Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:39.200492 containerd[1591]: time="2025-07-15T05:15:39.200475136Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:39.200993 containerd[1591]: time="2025-07-15T05:15:39.200972466Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.35032816s" Jul 15 05:15:39.201044 containerd[1591]: time="2025-07-15T05:15:39.200993646Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" Jul 15 05:15:39.201617 containerd[1591]: time="2025-07-15T05:15:39.201361906Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 15 05:15:39.660954 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount572411744.mount: Deactivated successfully. 
Jul 15 05:15:39.667582 containerd[1591]: time="2025-07-15T05:15:39.667503166Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:15:39.668576 containerd[1591]: time="2025-07-15T05:15:39.668517726Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321160" Jul 15 05:15:39.669956 containerd[1591]: time="2025-07-15T05:15:39.669885406Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:15:39.672948 containerd[1591]: time="2025-07-15T05:15:39.672879596Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 05:15:39.674430 containerd[1591]: time="2025-07-15T05:15:39.673896406Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 472.50455ms" Jul 15 05:15:39.674430 containerd[1591]: time="2025-07-15T05:15:39.673938426Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" Jul 15 05:15:39.674766 containerd[1591]: time="2025-07-15T05:15:39.674696106Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jul 15 05:15:40.217960 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1057096636.mount: Deactivated successfully. Jul 15 05:15:41.616165 containerd[1591]: time="2025-07-15T05:15:41.616103996Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:41.617230 containerd[1591]: time="2025-07-15T05:15:41.617200616Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551430" Jul 15 05:15:41.618212 containerd[1591]: time="2025-07-15T05:15:41.618180076Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:41.622027 containerd[1591]: time="2025-07-15T05:15:41.620186416Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:41.622027 containerd[1591]: time="2025-07-15T05:15:41.621372266Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 1.94663749s" Jul 15 05:15:41.622027 containerd[1591]: time="2025-07-15T05:15:41.621393976Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" Jul 15 05:15:43.794551 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:43.795680 systemd[1]: kubelet.service: Consumed 131ms CPU time, 108.4M memory peak. Jul 15 05:15:43.800838 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jul 15 05:15:43.846295 systemd[1]: Reload requested from client PID 2296 ('systemctl') (unit session-7.scope)... Jul 15 05:15:43.846308 systemd[1]: Reloading... Jul 15 05:15:43.957045 zram_generator::config[2346]: No configuration found. Jul 15 05:15:44.031700 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:15:44.125700 systemd[1]: Reloading finished in 279 ms. Jul 15 05:15:44.186731 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:44.192739 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:15:44.195634 systemd[1]: kubelet.service: Deactivated successfully. Jul 15 05:15:44.195976 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:44.196054 systemd[1]: kubelet.service: Consumed 103ms CPU time, 98M memory peak. Jul 15 05:15:44.198201 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:15:44.323124 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:44.329259 (kubelet)[2396]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 05:15:44.367827 kubelet[2396]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:15:44.367827 kubelet[2396]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 15 05:15:44.367827 kubelet[2396]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:15:44.368394 kubelet[2396]: I0715 05:15:44.367861 2396 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 05:15:44.671694 kubelet[2396]: I0715 05:15:44.671646 2396 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jul 15 05:15:44.671694 kubelet[2396]: I0715 05:15:44.671668 2396 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 05:15:44.671922 kubelet[2396]: I0715 05:15:44.671852 2396 server.go:954] "Client rotation is on, will bootstrap in background" Jul 15 05:15:44.702102 kubelet[2396]: E0715 05:15:44.702037 2396 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://157.180.39.85:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 157.180.39.85:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:15:44.704243 kubelet[2396]: I0715 05:15:44.704182 2396 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 05:15:44.715282 kubelet[2396]: I0715 05:15:44.715263 2396 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 05:15:44.723252 kubelet[2396]: I0715 05:15:44.723230 2396 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jul 15 05:15:44.730566 kubelet[2396]: I0715 05:15:44.730495 2396 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 05:15:44.730798 kubelet[2396]: I0715 05:15:44.730563 2396 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4396-0-0-n-e83c776e20","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 15 05:15:44.730874 kubelet[2396]: I0715 05:15:44.730804 2396 topology_manager.go:138] "Creating topology manager with none policy" Jul 15 05:15:44.730874 kubelet[2396]: I0715 05:15:44.730819 2396 container_manager_linux.go:304] "Creating device plugin manager" Jul 15 05:15:44.732590 kubelet[2396]: I0715 05:15:44.732560 2396 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:15:44.740387 kubelet[2396]: I0715 05:15:44.740204 2396 kubelet.go:446] "Attempting to sync node with API server" Jul 15 05:15:44.740387 kubelet[2396]: I0715 05:15:44.740275 2396 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 05:15:44.742449 kubelet[2396]: I0715 05:15:44.742195 2396 kubelet.go:352] "Adding apiserver pod source" Jul 15 05:15:44.742449 kubelet[2396]: I0715 05:15:44.742223 2396 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 05:15:44.747312 kubelet[2396]: I0715 05:15:44.747259 2396 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 15 05:15:44.752083 kubelet[2396]: I0715 05:15:44.751989 2396 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 15 05:15:44.754063 kubelet[2396]: W0715 05:15:44.753136 2396 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Jul 15 05:15:44.754730 kubelet[2396]: W0715 05:15:44.754685 2396 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://157.180.39.85:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 157.180.39.85:6443: connect: connection refused Jul 15 05:15:44.754787 kubelet[2396]: E0715 05:15:44.754734 2396 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://157.180.39.85:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 157.180.39.85:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:15:44.754831 kubelet[2396]: W0715 05:15:44.754784 2396 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://157.180.39.85:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396-0-0-n-e83c776e20&limit=500&resourceVersion=0": dial tcp 157.180.39.85:6443: connect: connection refused Jul 15 05:15:44.754831 kubelet[2396]: E0715 05:15:44.754803 2396 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://157.180.39.85:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396-0-0-n-e83c776e20&limit=500&resourceVersion=0\": dial tcp 157.180.39.85:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:15:44.755239 kubelet[2396]: I0715 05:15:44.755225 2396 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 15 05:15:44.755304 kubelet[2396]: I0715 05:15:44.755248 2396 server.go:1287] "Started kubelet" Jul 15 05:15:44.758537 kubelet[2396]: I0715 05:15:44.758477 2396 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jul 15 05:15:44.762897 kubelet[2396]: I0715 05:15:44.762834 2396 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 
burstTokens=10 Jul 15 05:15:44.763117 kubelet[2396]: I0715 05:15:44.763096 2396 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 15 05:15:44.763754 kubelet[2396]: I0715 05:15:44.763733 2396 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 15 05:15:44.764137 kubelet[2396]: I0715 05:15:44.764117 2396 server.go:479] "Adding debug handlers to kubelet server" Jul 15 05:15:44.768106 kubelet[2396]: E0715 05:15:44.767114 2396 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://157.180.39.85:6443/api/v1/namespaces/default/events\": dial tcp 157.180.39.85:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4396-0-0-n-e83c776e20.185254e45aebf7af default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4396-0-0-n-e83c776e20,UID:ci-4396-0-0-n-e83c776e20,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4396-0-0-n-e83c776e20,},FirstTimestamp:2025-07-15 05:15:44.755234735 +0000 UTC m=+0.423206211,LastTimestamp:2025-07-15 05:15:44.755234735 +0000 UTC m=+0.423206211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396-0-0-n-e83c776e20,}" Jul 15 05:15:44.772046 kubelet[2396]: I0715 05:15:44.770912 2396 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 05:15:44.777578 kubelet[2396]: I0715 05:15:44.777211 2396 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 15 05:15:44.777578 kubelet[2396]: E0715 05:15:44.777348 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:44.777578 kubelet[2396]: I0715 05:15:44.777515 2396 
desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 15 05:15:44.777578 kubelet[2396]: I0715 05:15:44.777545 2396 reconciler.go:26] "Reconciler: start to sync state" Jul 15 05:15:44.779056 kubelet[2396]: I0715 05:15:44.779039 2396 factory.go:221] Registration of the systemd container factory successfully Jul 15 05:15:44.779269 kubelet[2396]: I0715 05:15:44.779252 2396 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 15 05:15:44.779735 kubelet[2396]: E0715 05:15:44.779706 2396 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.39.85:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-e83c776e20?timeout=10s\": dial tcp 157.180.39.85:6443: connect: connection refused" interval="200ms" Jul 15 05:15:44.779898 kubelet[2396]: W0715 05:15:44.779867 2396 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://157.180.39.85:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 157.180.39.85:6443: connect: connection refused Jul 15 05:15:44.779924 kubelet[2396]: E0715 05:15:44.779899 2396 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://157.180.39.85:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.39.85:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:15:44.781139 kubelet[2396]: E0715 05:15:44.781129 2396 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 15 05:15:44.781685 kubelet[2396]: I0715 05:15:44.781673 2396 factory.go:221] Registration of the containerd container factory successfully Jul 15 05:15:44.795967 kubelet[2396]: I0715 05:15:44.795932 2396 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 15 05:15:44.795967 kubelet[2396]: I0715 05:15:44.795944 2396 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 15 05:15:44.795967 kubelet[2396]: I0715 05:15:44.795956 2396 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:15:44.796868 kubelet[2396]: I0715 05:15:44.796825 2396 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 15 05:15:44.798433 kubelet[2396]: I0715 05:15:44.798420 2396 policy_none.go:49] "None policy: Start" Jul 15 05:15:44.798525 kubelet[2396]: I0715 05:15:44.798516 2396 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 15 05:15:44.798592 kubelet[2396]: I0715 05:15:44.798585 2396 state_mem.go:35] "Initializing new in-memory state store" Jul 15 05:15:44.798914 kubelet[2396]: I0715 05:15:44.798860 2396 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 15 05:15:44.798914 kubelet[2396]: I0715 05:15:44.798874 2396 status_manager.go:227] "Starting to sync pod status with apiserver" Jul 15 05:15:44.798914 kubelet[2396]: I0715 05:15:44.798886 2396 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jul 15 05:15:44.798914 kubelet[2396]: I0715 05:15:44.798892 2396 kubelet.go:2382] "Starting kubelet main sync loop" Jul 15 05:15:44.799038 kubelet[2396]: E0715 05:15:44.798921 2396 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 05:15:44.802159 kubelet[2396]: W0715 05:15:44.802125 2396 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://157.180.39.85:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 157.180.39.85:6443: connect: connection refused Jul 15 05:15:44.802201 kubelet[2396]: E0715 05:15:44.802159 2396 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://157.180.39.85:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 157.180.39.85:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:15:44.805341 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 15 05:15:44.818492 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 15 05:15:44.821191 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
Jul 15 05:15:44.827894 kubelet[2396]: I0715 05:15:44.827866 2396 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 15 05:15:44.828116 kubelet[2396]: I0715 05:15:44.828105 2396 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 05:15:44.828294 kubelet[2396]: I0715 05:15:44.828246 2396 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 05:15:44.829475 kubelet[2396]: I0715 05:15:44.829413 2396 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 05:15:44.830197 kubelet[2396]: E0715 05:15:44.830180 2396 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jul 15 05:15:44.830322 kubelet[2396]: E0715 05:15:44.830269 2396 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:44.922436 systemd[1]: Created slice kubepods-burstable-podecca8a58b66991dd69df1efad9a96552.slice - libcontainer container kubepods-burstable-podecca8a58b66991dd69df1efad9a96552.slice. 
Jul 15 05:15:44.932309 kubelet[2396]: I0715 05:15:44.932259 2396 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.932735 kubelet[2396]: E0715 05:15:44.932659 2396 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.39.85:6443/api/v1/nodes\": dial tcp 157.180.39.85:6443: connect: connection refused" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.943955 kubelet[2396]: E0715 05:15:44.942475 2396 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-e83c776e20\" not found" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.949338 systemd[1]: Created slice kubepods-burstable-podd760f2b2b4c0af0f5ce8b8192d801887.slice - libcontainer container kubepods-burstable-podd760f2b2b4c0af0f5ce8b8192d801887.slice. Jul 15 05:15:44.958329 kubelet[2396]: E0715 05:15:44.958081 2396 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-e83c776e20\" not found" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.961426 systemd[1]: Created slice kubepods-burstable-pod843de93c586da35ea89e9b7f902fcd04.slice - libcontainer container kubepods-burstable-pod843de93c586da35ea89e9b7f902fcd04.slice. 
Jul 15 05:15:44.963645 kubelet[2396]: E0715 05:15:44.963610 2396 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-e83c776e20\" not found" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.979255 kubelet[2396]: I0715 05:15:44.978964 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-ca-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.979255 kubelet[2396]: I0715 05:15:44.979005 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.979255 kubelet[2396]: I0715 05:15:44.979059 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d760f2b2b4c0af0f5ce8b8192d801887-kubeconfig\") pod \"kube-scheduler-ci-4396-0-0-n-e83c776e20\" (UID: \"d760f2b2b4c0af0f5ce8b8192d801887\") " pod="kube-system/kube-scheduler-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.979255 kubelet[2396]: I0715 05:15:44.979095 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/843de93c586da35ea89e9b7f902fcd04-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4396-0-0-n-e83c776e20\" (UID: \"843de93c586da35ea89e9b7f902fcd04\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 
05:15:44.979255 kubelet[2396]: I0715 05:15:44.979115 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-flexvolume-dir\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.979601 kubelet[2396]: I0715 05:15:44.979163 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-k8s-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.979601 kubelet[2396]: I0715 05:15:44.979216 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-kubeconfig\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.979601 kubelet[2396]: I0715 05:15:44.979262 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/843de93c586da35ea89e9b7f902fcd04-ca-certs\") pod \"kube-apiserver-ci-4396-0-0-n-e83c776e20\" (UID: \"843de93c586da35ea89e9b7f902fcd04\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.979601 kubelet[2396]: I0715 05:15:44.979291 2396 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/843de93c586da35ea89e9b7f902fcd04-k8s-certs\") pod 
\"kube-apiserver-ci-4396-0-0-n-e83c776e20\" (UID: \"843de93c586da35ea89e9b7f902fcd04\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:44.980469 kubelet[2396]: E0715 05:15:44.980417 2396 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.39.85:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-e83c776e20?timeout=10s\": dial tcp 157.180.39.85:6443: connect: connection refused" interval="400ms" Jul 15 05:15:45.135871 kubelet[2396]: I0715 05:15:45.135790 2396 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:45.136583 kubelet[2396]: E0715 05:15:45.136534 2396 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.39.85:6443/api/v1/nodes\": dial tcp 157.180.39.85:6443: connect: connection refused" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:45.244515 containerd[1591]: time="2025-07-15T05:15:45.244410895Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4396-0-0-n-e83c776e20,Uid:ecca8a58b66991dd69df1efad9a96552,Namespace:kube-system,Attempt:0,}" Jul 15 05:15:45.266052 containerd[1591]: time="2025-07-15T05:15:45.265888815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4396-0-0-n-e83c776e20,Uid:d760f2b2b4c0af0f5ce8b8192d801887,Namespace:kube-system,Attempt:0,}" Jul 15 05:15:45.268442 containerd[1591]: time="2025-07-15T05:15:45.268369235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4396-0-0-n-e83c776e20,Uid:843de93c586da35ea89e9b7f902fcd04,Namespace:kube-system,Attempt:0,}" Jul 15 05:15:45.363525 containerd[1591]: time="2025-07-15T05:15:45.363448535Z" level=info msg="connecting to shim b015deb82b1d6760cf7821189c4ef90bf2a7a4c669d261bca4522c7b14236b92" address="unix:///run/containerd/s/958acc45c65bc3792e52b65cffd4ce72f06fef4fceb824e5e03efd0c440a563b" namespace=k8s.io protocol=ttrpc 
version=3 Jul 15 05:15:45.375311 containerd[1591]: time="2025-07-15T05:15:45.375278375Z" level=info msg="connecting to shim f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0" address="unix:///run/containerd/s/502dd6cba8b8556aa3b0bd73d8174372fe66c5cba048a96c0d68fca7c52c4ae8" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:45.375955 containerd[1591]: time="2025-07-15T05:15:45.375925905Z" level=info msg="connecting to shim 411e6bfc3f73dfdf525b1d6d0d397a27e5051310b9a9996f79b01ac9e5e4216c" address="unix:///run/containerd/s/5a422a6e5de1fc52437713b7039eb6ec7c4360580e45f091b85fd1821631f7bb" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:45.381028 kubelet[2396]: E0715 05:15:45.380979 2396 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://157.180.39.85:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4396-0-0-n-e83c776e20?timeout=10s\": dial tcp 157.180.39.85:6443: connect: connection refused" interval="800ms" Jul 15 05:15:45.429132 systemd[1]: Started cri-containerd-411e6bfc3f73dfdf525b1d6d0d397a27e5051310b9a9996f79b01ac9e5e4216c.scope - libcontainer container 411e6bfc3f73dfdf525b1d6d0d397a27e5051310b9a9996f79b01ac9e5e4216c. Jul 15 05:15:45.430872 systemd[1]: Started cri-containerd-b015deb82b1d6760cf7821189c4ef90bf2a7a4c669d261bca4522c7b14236b92.scope - libcontainer container b015deb82b1d6760cf7821189c4ef90bf2a7a4c669d261bca4522c7b14236b92. Jul 15 05:15:45.432093 systemd[1]: Started cri-containerd-f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0.scope - libcontainer container f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0. 
Jul 15 05:15:45.478581 containerd[1591]: time="2025-07-15T05:15:45.478541715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4396-0-0-n-e83c776e20,Uid:d760f2b2b4c0af0f5ce8b8192d801887,Namespace:kube-system,Attempt:0,} returns sandbox id \"411e6bfc3f73dfdf525b1d6d0d397a27e5051310b9a9996f79b01ac9e5e4216c\"" Jul 15 05:15:45.482064 containerd[1591]: time="2025-07-15T05:15:45.481886705Z" level=info msg="CreateContainer within sandbox \"411e6bfc3f73dfdf525b1d6d0d397a27e5051310b9a9996f79b01ac9e5e4216c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 15 05:15:45.501322 containerd[1591]: time="2025-07-15T05:15:45.501252245Z" level=info msg="Container 426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:45.502162 containerd[1591]: time="2025-07-15T05:15:45.502139115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4396-0-0-n-e83c776e20,Uid:ecca8a58b66991dd69df1efad9a96552,Namespace:kube-system,Attempt:0,} returns sandbox id \"f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0\"" Jul 15 05:15:45.506397 containerd[1591]: time="2025-07-15T05:15:45.506284565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4396-0-0-n-e83c776e20,Uid:843de93c586da35ea89e9b7f902fcd04,Namespace:kube-system,Attempt:0,} returns sandbox id \"b015deb82b1d6760cf7821189c4ef90bf2a7a4c669d261bca4522c7b14236b92\"" Jul 15 05:15:45.508678 containerd[1591]: time="2025-07-15T05:15:45.508577085Z" level=info msg="CreateContainer within sandbox \"f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 15 05:15:45.508829 containerd[1591]: time="2025-07-15T05:15:45.508802885Z" level=info msg="CreateContainer within sandbox \"b015deb82b1d6760cf7821189c4ef90bf2a7a4c669d261bca4522c7b14236b92\" for container 
&ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 15 05:15:45.516929 containerd[1591]: time="2025-07-15T05:15:45.516899685Z" level=info msg="Container d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:45.517757 containerd[1591]: time="2025-07-15T05:15:45.517729695Z" level=info msg="CreateContainer within sandbox \"411e6bfc3f73dfdf525b1d6d0d397a27e5051310b9a9996f79b01ac9e5e4216c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d\"" Jul 15 05:15:45.522582 containerd[1591]: time="2025-07-15T05:15:45.522543025Z" level=info msg="StartContainer for \"426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d\"" Jul 15 05:15:45.524057 containerd[1591]: time="2025-07-15T05:15:45.523725095Z" level=info msg="CreateContainer within sandbox \"f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e\"" Jul 15 05:15:45.524423 containerd[1591]: time="2025-07-15T05:15:45.524404195Z" level=info msg="connecting to shim 426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d" address="unix:///run/containerd/s/5a422a6e5de1fc52437713b7039eb6ec7c4360580e45f091b85fd1821631f7bb" protocol=ttrpc version=3 Jul 15 05:15:45.524721 containerd[1591]: time="2025-07-15T05:15:45.524706425Z" level=info msg="StartContainer for \"d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e\"" Jul 15 05:15:45.525599 containerd[1591]: time="2025-07-15T05:15:45.525583605Z" level=info msg="connecting to shim d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e" address="unix:///run/containerd/s/502dd6cba8b8556aa3b0bd73d8174372fe66c5cba048a96c0d68fca7c52c4ae8" protocol=ttrpc version=3 Jul 15 05:15:45.532237 containerd[1591]: 
time="2025-07-15T05:15:45.532196545Z" level=info msg="Container 4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:45.541334 containerd[1591]: time="2025-07-15T05:15:45.541315105Z" level=info msg="CreateContainer within sandbox \"b015deb82b1d6760cf7821189c4ef90bf2a7a4c669d261bca4522c7b14236b92\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b\"" Jul 15 05:15:45.541396 kubelet[2396]: I0715 05:15:45.541352 2396 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:45.541959 containerd[1591]: time="2025-07-15T05:15:45.541937015Z" level=info msg="StartContainer for \"4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b\"" Jul 15 05:15:45.542544 kubelet[2396]: E0715 05:15:45.542521 2396 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://157.180.39.85:6443/api/v1/nodes\": dial tcp 157.180.39.85:6443: connect: connection refused" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:45.543894 containerd[1591]: time="2025-07-15T05:15:45.543877675Z" level=info msg="connecting to shim 4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b" address="unix:///run/containerd/s/958acc45c65bc3792e52b65cffd4ce72f06fef4fceb824e5e03efd0c440a563b" protocol=ttrpc version=3 Jul 15 05:15:45.544323 systemd[1]: Started cri-containerd-426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d.scope - libcontainer container 426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d. Jul 15 05:15:45.555118 systemd[1]: Started cri-containerd-d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e.scope - libcontainer container d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e. 
Jul 15 05:15:45.564481 systemd[1]: Started cri-containerd-4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b.scope - libcontainer container 4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b. Jul 15 05:15:45.604651 containerd[1591]: time="2025-07-15T05:15:45.604586725Z" level=info msg="StartContainer for \"d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e\" returns successfully" Jul 15 05:15:45.646819 containerd[1591]: time="2025-07-15T05:15:45.646782595Z" level=info msg="StartContainer for \"4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b\" returns successfully" Jul 15 05:15:45.651173 containerd[1591]: time="2025-07-15T05:15:45.651146315Z" level=info msg="StartContainer for \"426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d\" returns successfully" Jul 15 05:15:45.666806 kubelet[2396]: W0715 05:15:45.666754 2396 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://157.180.39.85:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 157.180.39.85:6443: connect: connection refused Jul 15 05:15:45.666806 kubelet[2396]: E0715 05:15:45.666808 2396 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://157.180.39.85:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 157.180.39.85:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:15:45.750032 kubelet[2396]: E0715 05:15:45.749672 2396 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://157.180.39.85:6443/api/v1/namespaces/default/events\": dial tcp 157.180.39.85:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4396-0-0-n-e83c776e20.185254e45aebf7af default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4396-0-0-n-e83c776e20,UID:ci-4396-0-0-n-e83c776e20,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4396-0-0-n-e83c776e20,},FirstTimestamp:2025-07-15 05:15:44.755234735 +0000 UTC m=+0.423206211,LastTimestamp:2025-07-15 05:15:44.755234735 +0000 UTC m=+0.423206211,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396-0-0-n-e83c776e20,}" Jul 15 05:15:45.791068 kubelet[2396]: W0715 05:15:45.790936 2396 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://157.180.39.85:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 157.180.39.85:6443: connect: connection refused Jul 15 05:15:45.792440 kubelet[2396]: W0715 05:15:45.791768 2396 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://157.180.39.85:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396-0-0-n-e83c776e20&limit=500&resourceVersion=0": dial tcp 157.180.39.85:6443: connect: connection refused Jul 15 05:15:45.792440 kubelet[2396]: E0715 05:15:45.791801 2396 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://157.180.39.85:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4396-0-0-n-e83c776e20&limit=500&resourceVersion=0\": dial tcp 157.180.39.85:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:15:45.792440 kubelet[2396]: E0715 05:15:45.791820 2396 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://157.180.39.85:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 
157.180.39.85:6443: connect: connection refused" logger="UnhandledError" Jul 15 05:15:45.808954 kubelet[2396]: E0715 05:15:45.808931 2396 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-e83c776e20\" not found" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:45.810566 kubelet[2396]: E0715 05:15:45.810547 2396 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-e83c776e20\" not found" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:45.811935 kubelet[2396]: E0715 05:15:45.811915 2396 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-e83c776e20\" not found" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:46.345463 kubelet[2396]: I0715 05:15:46.345422 2396 kubelet_node_status.go:75] "Attempting to register node" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:46.682384 kubelet[2396]: E0715 05:15:46.682283 2396 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4396-0-0-n-e83c776e20\" not found" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:46.753220 kubelet[2396]: I0715 05:15:46.753178 2396 kubelet_node_status.go:78] "Successfully registered node" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:46.753220 kubelet[2396]: E0715 05:15:46.753207 2396 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4396-0-0-n-e83c776e20\": node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:46.769313 kubelet[2396]: E0715 05:15:46.769283 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:46.815928 kubelet[2396]: E0715 05:15:46.815860 2396 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-e83c776e20\" not found" 
node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:46.816307 kubelet[2396]: E0715 05:15:46.816286 2396 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4396-0-0-n-e83c776e20\" not found" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:46.869673 kubelet[2396]: E0715 05:15:46.869640 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:46.970400 kubelet[2396]: E0715 05:15:46.970353 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.071303 kubelet[2396]: E0715 05:15:47.071241 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.172327 kubelet[2396]: E0715 05:15:47.172252 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.273041 kubelet[2396]: E0715 05:15:47.272846 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.373799 kubelet[2396]: E0715 05:15:47.373737 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.474785 kubelet[2396]: E0715 05:15:47.474721 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.575148 kubelet[2396]: E0715 05:15:47.574934 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.676074 kubelet[2396]: E0715 05:15:47.675963 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.777080 
kubelet[2396]: E0715 05:15:47.776923 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.877606 kubelet[2396]: E0715 05:15:47.877392 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:47.978588 kubelet[2396]: E0715 05:15:47.978537 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:48.079115 kubelet[2396]: E0715 05:15:48.079054 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:48.179846 kubelet[2396]: E0715 05:15:48.179671 2396 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:48.278464 kubelet[2396]: I0715 05:15:48.277958 2396 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:48.297049 kubelet[2396]: I0715 05:15:48.296940 2396 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:48.303064 kubelet[2396]: I0715 05:15:48.302931 2396 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:48.753998 kubelet[2396]: I0715 05:15:48.753937 2396 apiserver.go:52] "Watching apiserver" Jul 15 05:15:48.778098 kubelet[2396]: I0715 05:15:48.778039 2396 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 15 05:15:48.814744 systemd[1]: Reload requested from client PID 2666 ('systemctl') (unit session-7.scope)... Jul 15 05:15:48.814772 systemd[1]: Reloading... Jul 15 05:15:48.914089 zram_generator::config[2713]: No configuration found. 
Jul 15 05:15:48.987993 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 05:15:49.089441 systemd[1]: Reloading finished in 273 ms. Jul 15 05:15:49.125681 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:15:49.126981 kubelet[2396]: I0715 05:15:49.125845 2396 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 05:15:49.146900 systemd[1]: kubelet.service: Deactivated successfully. Jul 15 05:15:49.147202 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:49.147257 systemd[1]: kubelet.service: Consumed 754ms CPU time, 129.7M memory peak. Jul 15 05:15:49.149567 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 05:15:49.337174 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 05:15:49.344521 (kubelet)[2761]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 05:15:49.376613 kubelet[2761]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 05:15:49.376613 kubelet[2761]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 15 05:15:49.376613 kubelet[2761]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jul 15 05:15:49.376613 kubelet[2761]: I0715 05:15:49.376305 2761 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 05:15:49.384896 kubelet[2761]: I0715 05:15:49.384843 2761 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jul 15 05:15:49.384896 kubelet[2761]: I0715 05:15:49.384857 2761 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 05:15:49.385051 kubelet[2761]: I0715 05:15:49.384980 2761 server.go:954] "Client rotation is on, will bootstrap in background" Jul 15 05:15:49.385850 kubelet[2761]: I0715 05:15:49.385811 2761 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jul 15 05:15:49.390388 kubelet[2761]: I0715 05:15:49.390101 2761 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 05:15:49.396053 kubelet[2761]: I0715 05:15:49.395888 2761 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 05:15:49.400501 kubelet[2761]: I0715 05:15:49.400482 2761 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 15 05:15:49.400853 kubelet[2761]: I0715 05:15:49.400762 2761 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 05:15:49.400886 kubelet[2761]: I0715 05:15:49.400780 2761 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4396-0-0-n-e83c776e20","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 15 05:15:49.400886 kubelet[2761]: I0715 05:15:49.400884 2761 topology_manager.go:138] "Creating topology manager 
with none policy" Jul 15 05:15:49.400961 kubelet[2761]: I0715 05:15:49.400889 2761 container_manager_linux.go:304] "Creating device plugin manager" Jul 15 05:15:49.400961 kubelet[2761]: I0715 05:15:49.400922 2761 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:15:49.401058 kubelet[2761]: I0715 05:15:49.401042 2761 kubelet.go:446] "Attempting to sync node with API server" Jul 15 05:15:49.401100 kubelet[2761]: I0715 05:15:49.401059 2761 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 05:15:49.401100 kubelet[2761]: I0715 05:15:49.401073 2761 kubelet.go:352] "Adding apiserver pod source" Jul 15 05:15:49.401100 kubelet[2761]: I0715 05:15:49.401080 2761 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 05:15:49.402125 kubelet[2761]: I0715 05:15:49.402076 2761 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Jul 15 05:15:49.402467 kubelet[2761]: I0715 05:15:49.402457 2761 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 15 05:15:49.402964 kubelet[2761]: I0715 05:15:49.402907 2761 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 15 05:15:49.402964 kubelet[2761]: I0715 05:15:49.402926 2761 server.go:1287] "Started kubelet" Jul 15 05:15:49.404616 kubelet[2761]: I0715 05:15:49.404590 2761 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 15 05:15:49.405202 kubelet[2761]: I0715 05:15:49.405163 2761 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jul 15 05:15:49.405983 kubelet[2761]: I0715 05:15:49.405932 2761 server.go:479] "Adding debug handlers to kubelet server" Jul 15 05:15:49.407669 kubelet[2761]: I0715 05:15:49.407383 2761 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 15 05:15:49.408523 kubelet[2761]: I0715 05:15:49.408505 2761 dynamic_serving_content.go:135] "Starting 
controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 05:15:49.411444 kubelet[2761]: I0715 05:15:49.411317 2761 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 15 05:15:49.413760 kubelet[2761]: I0715 05:15:49.411513 2761 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 15 05:15:49.414903 kubelet[2761]: E0715 05:15:49.411604 2761 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4396-0-0-n-e83c776e20\" not found" Jul 15 05:15:49.414903 kubelet[2761]: I0715 05:15:49.412084 2761 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 15 05:15:49.417354 kubelet[2761]: I0715 05:15:49.417314 2761 reconciler.go:26] "Reconciler: start to sync state" Jul 15 05:15:49.426267 kubelet[2761]: I0715 05:15:49.426204 2761 factory.go:221] Registration of the systemd container factory successfully Jul 15 05:15:49.427218 kubelet[2761]: I0715 05:15:49.426269 2761 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 15 05:15:49.429263 kubelet[2761]: E0715 05:15:49.429250 2761 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 15 05:15:49.431036 kubelet[2761]: I0715 05:15:49.430757 2761 factory.go:221] Registration of the containerd container factory successfully Jul 15 05:15:49.435659 kubelet[2761]: I0715 05:15:49.435619 2761 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 15 05:15:49.438018 kubelet[2761]: I0715 05:15:49.437944 2761 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jul 15 05:15:49.438018 kubelet[2761]: I0715 05:15:49.437968 2761 status_manager.go:227] "Starting to sync pod status with apiserver" Jul 15 05:15:49.438018 kubelet[2761]: I0715 05:15:49.437996 2761 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jul 15 05:15:49.438018 kubelet[2761]: I0715 05:15:49.438001 2761 kubelet.go:2382] "Starting kubelet main sync loop" Jul 15 05:15:49.438118 kubelet[2761]: E0715 05:15:49.438051 2761 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 05:15:49.464505 kubelet[2761]: I0715 05:15:49.464488 2761 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 15 05:15:49.464601 kubelet[2761]: I0715 05:15:49.464499 2761 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 15 05:15:49.464601 kubelet[2761]: I0715 05:15:49.464581 2761 state_mem.go:36] "Initialized new in-memory state store" Jul 15 05:15:49.464694 kubelet[2761]: I0715 05:15:49.464676 2761 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 15 05:15:49.464729 kubelet[2761]: I0715 05:15:49.464687 2761 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 15 05:15:49.464729 kubelet[2761]: I0715 05:15:49.464699 2761 policy_none.go:49] "None policy: Start" Jul 15 05:15:49.464729 kubelet[2761]: I0715 05:15:49.464707 2761 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 15 05:15:49.464729 kubelet[2761]: I0715 05:15:49.464714 2761 state_mem.go:35] "Initializing new in-memory state store" Jul 15 05:15:49.464792 kubelet[2761]: I0715 05:15:49.464781 2761 state_mem.go:75] "Updated machine memory state" Jul 15 05:15:49.468602 kubelet[2761]: I0715 05:15:49.467759 2761 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 15 05:15:49.468602 kubelet[2761]: I0715 
05:15:49.467887 2761 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 05:15:49.468602 kubelet[2761]: I0715 05:15:49.467896 2761 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 05:15:49.468602 kubelet[2761]: I0715 05:15:49.468144 2761 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 05:15:49.468602 kubelet[2761]: E0715 05:15:49.468601 2761 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jul 15 05:15:49.539402 kubelet[2761]: I0715 05:15:49.539347 2761 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.542688 kubelet[2761]: I0715 05:15:49.542662 2761 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.543048 kubelet[2761]: I0715 05:15:49.542741 2761 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.552666 kubelet[2761]: E0715 05:15:49.552633 2761 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4396-0-0-n-e83c776e20\" already exists" pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.552867 kubelet[2761]: E0715 05:15:49.552821 2761 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4396-0-0-n-e83c776e20\" already exists" pod="kube-system/kube-scheduler-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.553756 kubelet[2761]: E0715 05:15:49.553693 2761 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" already exists" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.576860 kubelet[2761]: I0715 05:15:49.575887 2761 kubelet_node_status.go:75] "Attempting to 
register node" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.584364 kubelet[2761]: I0715 05:15:49.584311 2761 kubelet_node_status.go:124] "Node was previously registered" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.584471 kubelet[2761]: I0715 05:15:49.584439 2761 kubelet_node_status.go:78] "Successfully registered node" node="ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.619067 kubelet[2761]: I0715 05:15:49.618780 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/843de93c586da35ea89e9b7f902fcd04-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4396-0-0-n-e83c776e20\" (UID: \"843de93c586da35ea89e9b7f902fcd04\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.619661 kubelet[2761]: I0715 05:15:49.619501 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-flexvolume-dir\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.620642 kubelet[2761]: I0715 05:15:49.620088 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-k8s-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.620642 kubelet[2761]: I0715 05:15:49.620140 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-kubeconfig\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: 
\"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.620642 kubelet[2761]: I0715 05:15:49.620168 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.620642 kubelet[2761]: I0715 05:15:49.620195 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d760f2b2b4c0af0f5ce8b8192d801887-kubeconfig\") pod \"kube-scheduler-ci-4396-0-0-n-e83c776e20\" (UID: \"d760f2b2b4c0af0f5ce8b8192d801887\") " pod="kube-system/kube-scheduler-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.620642 kubelet[2761]: I0715 05:15:49.620218 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/843de93c586da35ea89e9b7f902fcd04-ca-certs\") pod \"kube-apiserver-ci-4396-0-0-n-e83c776e20\" (UID: \"843de93c586da35ea89e9b7f902fcd04\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.620893 kubelet[2761]: I0715 05:15:49.620239 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/843de93c586da35ea89e9b7f902fcd04-k8s-certs\") pod \"kube-apiserver-ci-4396-0-0-n-e83c776e20\" (UID: \"843de93c586da35ea89e9b7f902fcd04\") " pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:49.620893 kubelet[2761]: I0715 05:15:49.620260 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/ecca8a58b66991dd69df1efad9a96552-ca-certs\") pod \"kube-controller-manager-ci-4396-0-0-n-e83c776e20\" (UID: \"ecca8a58b66991dd69df1efad9a96552\") " pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:50.402064 kubelet[2761]: I0715 05:15:50.402001 2761 apiserver.go:52] "Watching apiserver" Jul 15 05:15:50.416146 kubelet[2761]: I0715 05:15:50.416119 2761 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 15 05:15:50.454990 kubelet[2761]: I0715 05:15:50.454217 2761 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:50.459616 kubelet[2761]: E0715 05:15:50.459561 2761 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4396-0-0-n-e83c776e20\" already exists" pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" Jul 15 05:15:50.492467 kubelet[2761]: I0715 05:15:50.492413 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4396-0-0-n-e83c776e20" podStartSLOduration=2.4923971050000002 podStartE2EDuration="2.492397105s" podCreationTimestamp="2025-07-15 05:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:15:50.473612075 +0000 UTC m=+1.124986041" watchObservedRunningTime="2025-07-15 05:15:50.492397105 +0000 UTC m=+1.143771091" Jul 15 05:15:50.493153 update_engine[1569]: I20250715 05:15:50.493108 1569 update_attempter.cc:509] Updating boot flags... 
Jul 15 05:15:50.515441 kubelet[2761]: I0715 05:15:50.515405 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4396-0-0-n-e83c776e20" podStartSLOduration=2.515390315 podStartE2EDuration="2.515390315s" podCreationTimestamp="2025-07-15 05:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:15:50.494082255 +0000 UTC m=+1.145456221" watchObservedRunningTime="2025-07-15 05:15:50.515390315 +0000 UTC m=+1.166764291" Jul 15 05:15:55.327086 kubelet[2761]: I0715 05:15:55.327054 2761 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 15 05:15:55.327785 kubelet[2761]: I0715 05:15:55.327448 2761 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 15 05:15:55.327843 containerd[1591]: time="2025-07-15T05:15:55.327332584Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 15 05:15:56.063366 kubelet[2761]: I0715 05:15:56.063219 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4396-0-0-n-e83c776e20" podStartSLOduration=8.063196034 podStartE2EDuration="8.063196034s" podCreationTimestamp="2025-07-15 05:15:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:15:50.516078865 +0000 UTC m=+1.167452841" watchObservedRunningTime="2025-07-15 05:15:56.063196034 +0000 UTC m=+6.714570050" Jul 15 05:15:56.087601 systemd[1]: Created slice kubepods-besteffort-pod8c86c5be_289c_4cf8_9f98_a33b43d897c5.slice - libcontainer container kubepods-besteffort-pod8c86c5be_289c_4cf8_9f98_a33b43d897c5.slice. 
Jul 15 05:15:56.159147 kubelet[2761]: I0715 05:15:56.159096 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/8c86c5be-289c-4cf8-9f98-a33b43d897c5-kube-proxy\") pod \"kube-proxy-t2fzd\" (UID: \"8c86c5be-289c-4cf8-9f98-a33b43d897c5\") " pod="kube-system/kube-proxy-t2fzd" Jul 15 05:15:56.159147 kubelet[2761]: I0715 05:15:56.159152 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b574\" (UniqueName: \"kubernetes.io/projected/8c86c5be-289c-4cf8-9f98-a33b43d897c5-kube-api-access-7b574\") pod \"kube-proxy-t2fzd\" (UID: \"8c86c5be-289c-4cf8-9f98-a33b43d897c5\") " pod="kube-system/kube-proxy-t2fzd" Jul 15 05:15:56.159374 kubelet[2761]: I0715 05:15:56.159182 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8c86c5be-289c-4cf8-9f98-a33b43d897c5-lib-modules\") pod \"kube-proxy-t2fzd\" (UID: \"8c86c5be-289c-4cf8-9f98-a33b43d897c5\") " pod="kube-system/kube-proxy-t2fzd" Jul 15 05:15:56.159374 kubelet[2761]: I0715 05:15:56.159206 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8c86c5be-289c-4cf8-9f98-a33b43d897c5-xtables-lock\") pod \"kube-proxy-t2fzd\" (UID: \"8c86c5be-289c-4cf8-9f98-a33b43d897c5\") " pod="kube-system/kube-proxy-t2fzd" Jul 15 05:15:56.396067 containerd[1591]: time="2025-07-15T05:15:56.395853414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t2fzd,Uid:8c86c5be-289c-4cf8-9f98-a33b43d897c5,Namespace:kube-system,Attempt:0,}" Jul 15 05:15:56.426792 containerd[1591]: time="2025-07-15T05:15:56.426402394Z" level=info msg="connecting to shim 89463c93fbdafa9e10fe215e1c60fc88eca8b817421379ab4a856baa98ba5c10" 
address="unix:///run/containerd/s/4dc2beddfd5090db66d0ccc350b11b0aade72108b3353029ad83e47a96c2ba9c" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:56.464146 systemd[1]: Started cri-containerd-89463c93fbdafa9e10fe215e1c60fc88eca8b817421379ab4a856baa98ba5c10.scope - libcontainer container 89463c93fbdafa9e10fe215e1c60fc88eca8b817421379ab4a856baa98ba5c10. Jul 15 05:15:56.491436 systemd[1]: Created slice kubepods-besteffort-podd58eb7e8_e5d8_4df0_9544_fa84bdf7af00.slice - libcontainer container kubepods-besteffort-podd58eb7e8_e5d8_4df0_9544_fa84bdf7af00.slice. Jul 15 05:15:56.497709 containerd[1591]: time="2025-07-15T05:15:56.497686314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-t2fzd,Uid:8c86c5be-289c-4cf8-9f98-a33b43d897c5,Namespace:kube-system,Attempt:0,} returns sandbox id \"89463c93fbdafa9e10fe215e1c60fc88eca8b817421379ab4a856baa98ba5c10\"" Jul 15 05:15:56.500098 containerd[1591]: time="2025-07-15T05:15:56.500059204Z" level=info msg="CreateContainer within sandbox \"89463c93fbdafa9e10fe215e1c60fc88eca8b817421379ab4a856baa98ba5c10\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 15 05:15:56.510661 containerd[1591]: time="2025-07-15T05:15:56.510643814Z" level=info msg="Container 22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:56.516959 containerd[1591]: time="2025-07-15T05:15:56.516940724Z" level=info msg="CreateContainer within sandbox \"89463c93fbdafa9e10fe215e1c60fc88eca8b817421379ab4a856baa98ba5c10\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15\"" Jul 15 05:15:56.517305 containerd[1591]: time="2025-07-15T05:15:56.517280064Z" level=info msg="StartContainer for \"22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15\"" Jul 15 05:15:56.518888 containerd[1591]: time="2025-07-15T05:15:56.518864044Z" level=info msg="connecting to shim 
22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15" address="unix:///run/containerd/s/4dc2beddfd5090db66d0ccc350b11b0aade72108b3353029ad83e47a96c2ba9c" protocol=ttrpc version=3 Jul 15 05:15:56.536121 systemd[1]: Started cri-containerd-22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15.scope - libcontainer container 22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15. Jul 15 05:15:56.561959 kubelet[2761]: I0715 05:15:56.561932 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d58eb7e8-e5d8-4df0-9544-fa84bdf7af00-var-lib-calico\") pod \"tigera-operator-747864d56d-pjvsh\" (UID: \"d58eb7e8-e5d8-4df0-9544-fa84bdf7af00\") " pod="tigera-operator/tigera-operator-747864d56d-pjvsh" Jul 15 05:15:56.561959 kubelet[2761]: I0715 05:15:56.561960 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm6h2\" (UniqueName: \"kubernetes.io/projected/d58eb7e8-e5d8-4df0-9544-fa84bdf7af00-kube-api-access-qm6h2\") pod \"tigera-operator-747864d56d-pjvsh\" (UID: \"d58eb7e8-e5d8-4df0-9544-fa84bdf7af00\") " pod="tigera-operator/tigera-operator-747864d56d-pjvsh" Jul 15 05:15:56.566347 containerd[1591]: time="2025-07-15T05:15:56.566317354Z" level=info msg="StartContainer for \"22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15\" returns successfully" Jul 15 05:15:56.801271 containerd[1591]: time="2025-07-15T05:15:56.800889244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-pjvsh,Uid:d58eb7e8-e5d8-4df0-9544-fa84bdf7af00,Namespace:tigera-operator,Attempt:0,}" Jul 15 05:15:56.824219 containerd[1591]: time="2025-07-15T05:15:56.824178154Z" level=info msg="connecting to shim 72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb" address="unix:///run/containerd/s/4b1896cbc46f96fde2305040beb0d4ffc669bfdd67c63d81008e428134084bcc" 
namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:15:56.842395 systemd[1]: Started cri-containerd-72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb.scope - libcontainer container 72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb. Jul 15 05:15:56.886299 containerd[1591]: time="2025-07-15T05:15:56.886162604Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-pjvsh,Uid:d58eb7e8-e5d8-4df0-9544-fa84bdf7af00,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb\"" Jul 15 05:15:56.888741 containerd[1591]: time="2025-07-15T05:15:56.888700714Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 15 05:15:57.291052 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3068660387.mount: Deactivated successfully. Jul 15 05:15:58.686117 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2771037397.mount: Deactivated successfully. Jul 15 05:15:58.697453 kubelet[2761]: I0715 05:15:58.697179 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-t2fzd" podStartSLOduration=2.697160924 podStartE2EDuration="2.697160924s" podCreationTimestamp="2025-07-15 05:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:15:57.496703644 +0000 UTC m=+8.148077650" watchObservedRunningTime="2025-07-15 05:15:58.697160924 +0000 UTC m=+9.348534900" Jul 15 05:15:59.035952 containerd[1591]: time="2025-07-15T05:15:59.035829984Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:59.037052 containerd[1591]: time="2025-07-15T05:15:59.037003164Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=25056543" Jul 15 05:15:59.037977 containerd[1591]: 
time="2025-07-15T05:15:59.037946054Z" level=info msg="ImageCreate event name:\"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:59.039546 containerd[1591]: time="2025-07-15T05:15:59.039517044Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:15:59.040031 containerd[1591]: time="2025-07-15T05:15:59.039840974Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"25052538\" in 2.1511187s" Jul 15 05:15:59.040031 containerd[1591]: time="2025-07-15T05:15:59.039890144Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:8bde16470b09d1963e19456806d73180c9778a6c2b3c1fda2335c67c1cd4ce93\"" Jul 15 05:15:59.042105 containerd[1591]: time="2025-07-15T05:15:59.042078234Z" level=info msg="CreateContainer within sandbox \"72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 15 05:15:59.050056 containerd[1591]: time="2025-07-15T05:15:59.049745634Z" level=info msg="Container 243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:15:59.052323 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount195011214.mount: Deactivated successfully. 
Jul 15 05:15:59.063729 containerd[1591]: time="2025-07-15T05:15:59.063694154Z" level=info msg="CreateContainer within sandbox \"72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307\"" Jul 15 05:15:59.064110 containerd[1591]: time="2025-07-15T05:15:59.064091704Z" level=info msg="StartContainer for \"243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307\"" Jul 15 05:15:59.064582 containerd[1591]: time="2025-07-15T05:15:59.064559174Z" level=info msg="connecting to shim 243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307" address="unix:///run/containerd/s/4b1896cbc46f96fde2305040beb0d4ffc669bfdd67c63d81008e428134084bcc" protocol=ttrpc version=3 Jul 15 05:15:59.081104 systemd[1]: Started cri-containerd-243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307.scope - libcontainer container 243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307. 
Jul 15 05:15:59.107320 containerd[1591]: time="2025-07-15T05:15:59.107244794Z" level=info msg="StartContainer for \"243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307\" returns successfully" Jul 15 05:15:59.516410 kubelet[2761]: I0715 05:15:59.516338 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-pjvsh" podStartSLOduration=1.3630461440000001 podStartE2EDuration="3.516190914s" podCreationTimestamp="2025-07-15 05:15:56 +0000 UTC" firstStartedPulling="2025-07-15 05:15:56.887337604 +0000 UTC m=+7.538711570" lastFinishedPulling="2025-07-15 05:15:59.040482364 +0000 UTC m=+9.691856340" observedRunningTime="2025-07-15 05:15:59.515197764 +0000 UTC m=+10.166571760" watchObservedRunningTime="2025-07-15 05:15:59.516190914 +0000 UTC m=+10.167564910" Jul 15 05:16:04.632314 sudo[1846]: pam_unix(sudo:session): session closed for user root Jul 15 05:16:04.790103 sshd[1845]: Connection closed by 139.178.89.65 port 50104 Jul 15 05:16:04.791696 sshd-session[1842]: pam_unix(sshd:session): session closed for user core Jul 15 05:16:04.797323 systemd-logind[1567]: Session 7 logged out. Waiting for processes to exit. Jul 15 05:16:04.799444 systemd[1]: sshd@6-157.180.39.85:22-139.178.89.65:50104.service: Deactivated successfully. Jul 15 05:16:04.805686 systemd[1]: session-7.scope: Deactivated successfully. Jul 15 05:16:04.806501 systemd[1]: session-7.scope: Consumed 3.571s CPU time, 154.8M memory peak. Jul 15 05:16:04.809905 systemd-logind[1567]: Removed session 7. Jul 15 05:16:07.501856 systemd[1]: Created slice kubepods-besteffort-pod2ff3cfc3_b4df_41a6_b068_5d204eafaf1c.slice - libcontainer container kubepods-besteffort-pod2ff3cfc3_b4df_41a6_b068_5d204eafaf1c.slice. 
Jul 15 05:16:07.533402 kubelet[2761]: I0715 05:16:07.533354 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ff3cfc3-b4df-41a6-b068-5d204eafaf1c-tigera-ca-bundle\") pod \"calico-typha-9584887c4-jwn7l\" (UID: \"2ff3cfc3-b4df-41a6-b068-5d204eafaf1c\") " pod="calico-system/calico-typha-9584887c4-jwn7l" Jul 15 05:16:07.534250 kubelet[2761]: I0715 05:16:07.533997 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkrmx\" (UniqueName: \"kubernetes.io/projected/2ff3cfc3-b4df-41a6-b068-5d204eafaf1c-kube-api-access-nkrmx\") pod \"calico-typha-9584887c4-jwn7l\" (UID: \"2ff3cfc3-b4df-41a6-b068-5d204eafaf1c\") " pod="calico-system/calico-typha-9584887c4-jwn7l" Jul 15 05:16:07.534250 kubelet[2761]: I0715 05:16:07.534057 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/2ff3cfc3-b4df-41a6-b068-5d204eafaf1c-typha-certs\") pod \"calico-typha-9584887c4-jwn7l\" (UID: \"2ff3cfc3-b4df-41a6-b068-5d204eafaf1c\") " pod="calico-system/calico-typha-9584887c4-jwn7l" Jul 15 05:16:07.718100 systemd[1]: Created slice kubepods-besteffort-podb27debea_9520_4583_8218_a5e677553bcb.slice - libcontainer container kubepods-besteffort-podb27debea_9520_4583_8218_a5e677553bcb.slice. 
Jul 15 05:16:07.735579 kubelet[2761]: I0715 05:16:07.735430 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-flexvol-driver-host\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.735852 kubelet[2761]: I0715 05:16:07.735656 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b27debea-9520-4583-8218-a5e677553bcb-node-certs\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.735852 kubelet[2761]: I0715 05:16:07.735675 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-cni-bin-dir\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.735852 kubelet[2761]: I0715 05:16:07.735686 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-cni-log-dir\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.735852 kubelet[2761]: I0715 05:16:07.735696 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-lib-modules\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.735852 kubelet[2761]: I0715 05:16:07.735707 2761 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b27debea-9520-4583-8218-a5e677553bcb-tigera-ca-bundle\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.736059 kubelet[2761]: I0715 05:16:07.735719 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-xtables-lock\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.736059 kubelet[2761]: I0715 05:16:07.735730 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-var-run-calico\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.736059 kubelet[2761]: I0715 05:16:07.735762 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flc6m\" (UniqueName: \"kubernetes.io/projected/b27debea-9520-4583-8218-a5e677553bcb-kube-api-access-flc6m\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.736059 kubelet[2761]: I0715 05:16:07.735777 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-cni-net-dir\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.736059 kubelet[2761]: I0715 05:16:07.735786 2761 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-var-lib-calico\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.736324 kubelet[2761]: I0715 05:16:07.735797 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b27debea-9520-4583-8218-a5e677553bcb-policysync\") pod \"calico-node-dwr67\" (UID: \"b27debea-9520-4583-8218-a5e677553bcb\") " pod="calico-system/calico-node-dwr67" Jul 15 05:16:07.810171 containerd[1591]: time="2025-07-15T05:16:07.810066224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9584887c4-jwn7l,Uid:2ff3cfc3-b4df-41a6-b068-5d204eafaf1c,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:07.836483 containerd[1591]: time="2025-07-15T05:16:07.836351952Z" level=info msg="connecting to shim 87e7a0a8b7d2212a064c36e139228499b6b73ee3a8ca257bc7a3c7dab15dda49" address="unix:///run/containerd/s/a8996967ff02da2856a62f83408dfb19775cdcbd4f4f99385baa014f837d18ef" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:07.844302 kubelet[2761]: E0715 05:16:07.842352 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.844302 kubelet[2761]: W0715 05:16:07.842372 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.844302 kubelet[2761]: E0715 05:16:07.844062 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:07.845097 kubelet[2761]: E0715 05:16:07.845082 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.845571 kubelet[2761]: W0715 05:16:07.845168 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.845571 kubelet[2761]: E0715 05:16:07.845183 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:07.846027 kubelet[2761]: E0715 05:16:07.845687 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.846106 kubelet[2761]: W0715 05:16:07.846092 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.846145 kubelet[2761]: E0715 05:16:07.846137 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:07.846495 kubelet[2761]: E0715 05:16:07.846484 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.846544 kubelet[2761]: W0715 05:16:07.846535 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.846584 kubelet[2761]: E0715 05:16:07.846576 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:07.847069 kubelet[2761]: E0715 05:16:07.847057 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.847203 kubelet[2761]: W0715 05:16:07.847122 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.847203 kubelet[2761]: E0715 05:16:07.847134 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:07.847402 kubelet[2761]: E0715 05:16:07.847336 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.847402 kubelet[2761]: W0715 05:16:07.847343 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.847402 kubelet[2761]: E0715 05:16:07.847351 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:07.849246 kubelet[2761]: E0715 05:16:07.849139 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.849246 kubelet[2761]: W0715 05:16:07.849149 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.849246 kubelet[2761]: E0715 05:16:07.849158 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:07.849753 kubelet[2761]: E0715 05:16:07.849742 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.851168 kubelet[2761]: W0715 05:16:07.851078 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.851168 kubelet[2761]: E0715 05:16:07.851094 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:07.851313 kubelet[2761]: E0715 05:16:07.851304 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.851461 kubelet[2761]: W0715 05:16:07.851366 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.851461 kubelet[2761]: E0715 05:16:07.851378 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:07.853354 kubelet[2761]: E0715 05:16:07.853216 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.854083 kubelet[2761]: W0715 05:16:07.853402 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.854194 kubelet[2761]: E0715 05:16:07.854123 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:07.855157 kubelet[2761]: E0715 05:16:07.855002 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.855157 kubelet[2761]: W0715 05:16:07.855038 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.855157 kubelet[2761]: E0715 05:16:07.855046 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:07.855288 kubelet[2761]: E0715 05:16:07.855279 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.855323 kubelet[2761]: W0715 05:16:07.855316 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.855355 kubelet[2761]: E0715 05:16:07.855348 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:07.855624 kubelet[2761]: E0715 05:16:07.855615 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.855702 kubelet[2761]: W0715 05:16:07.855661 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.855702 kubelet[2761]: E0715 05:16:07.855670 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:07.866579 kubelet[2761]: E0715 05:16:07.866520 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:07.866579 kubelet[2761]: W0715 05:16:07.866536 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:07.866579 kubelet[2761]: E0715 05:16:07.866549 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:07.875136 systemd[1]: Started cri-containerd-87e7a0a8b7d2212a064c36e139228499b6b73ee3a8ca257bc7a3c7dab15dda49.scope - libcontainer container 87e7a0a8b7d2212a064c36e139228499b6b73ee3a8ca257bc7a3c7dab15dda49. Jul 15 05:16:07.920445 containerd[1591]: time="2025-07-15T05:16:07.920391760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-9584887c4-jwn7l,Uid:2ff3cfc3-b4df-41a6-b068-5d204eafaf1c,Namespace:calico-system,Attempt:0,} returns sandbox id \"87e7a0a8b7d2212a064c36e139228499b6b73ee3a8ca257bc7a3c7dab15dda49\"" Jul 15 05:16:07.923191 containerd[1591]: time="2025-07-15T05:16:07.923157641Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 15 05:16:07.956817 kubelet[2761]: E0715 05:16:07.955929 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9xn7s" podUID="14bfd7a9-4124-403a-afcc-084c249af056" Jul 15 05:16:08.020956 containerd[1591]: time="2025-07-15T05:16:08.020899209Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-node-dwr67,Uid:b27debea-9520-4583-8218-a5e677553bcb,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:08.027774 kubelet[2761]: E0715 05:16:08.027747 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.027774 kubelet[2761]: W0715 05:16:08.027764 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.027902 kubelet[2761]: E0715 05:16:08.027780 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.027976 kubelet[2761]: E0715 05:16:08.027962 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.027976 kubelet[2761]: W0715 05:16:08.027972 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.028127 kubelet[2761]: E0715 05:16:08.027978 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.028159 kubelet[2761]: E0715 05:16:08.028148 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.028159 kubelet[2761]: W0715 05:16:08.028154 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.028191 kubelet[2761]: E0715 05:16:08.028160 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.028434 kubelet[2761]: E0715 05:16:08.028414 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.028434 kubelet[2761]: W0715 05:16:08.028430 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.028496 kubelet[2761]: E0715 05:16:08.028440 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.028670 kubelet[2761]: E0715 05:16:08.028663 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.028690 kubelet[2761]: W0715 05:16:08.028673 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.028690 kubelet[2761]: E0715 05:16:08.028685 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.028921 kubelet[2761]: E0715 05:16:08.028840 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.028921 kubelet[2761]: W0715 05:16:08.028852 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.028921 kubelet[2761]: E0715 05:16:08.028861 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.029046 kubelet[2761]: E0715 05:16:08.029033 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.029046 kubelet[2761]: W0715 05:16:08.029042 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.029128 kubelet[2761]: E0715 05:16:08.029049 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.029291 kubelet[2761]: E0715 05:16:08.029250 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.029291 kubelet[2761]: W0715 05:16:08.029268 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.029291 kubelet[2761]: E0715 05:16:08.029287 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.029604 kubelet[2761]: E0715 05:16:08.029582 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.029604 kubelet[2761]: W0715 05:16:08.029593 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.029604 kubelet[2761]: E0715 05:16:08.029602 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.029773 kubelet[2761]: E0715 05:16:08.029758 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.029773 kubelet[2761]: W0715 05:16:08.029770 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.029907 kubelet[2761]: E0715 05:16:08.029777 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.029951 kubelet[2761]: E0715 05:16:08.029934 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.029951 kubelet[2761]: W0715 05:16:08.029941 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.029951 kubelet[2761]: E0715 05:16:08.029947 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.030249 kubelet[2761]: E0715 05:16:08.030235 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.030249 kubelet[2761]: W0715 05:16:08.030244 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.030295 kubelet[2761]: E0715 05:16:08.030253 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.030414 kubelet[2761]: E0715 05:16:08.030402 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.030414 kubelet[2761]: W0715 05:16:08.030411 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.030451 kubelet[2761]: E0715 05:16:08.030418 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.030618 kubelet[2761]: E0715 05:16:08.030604 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.030705 kubelet[2761]: W0715 05:16:08.030614 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.030705 kubelet[2761]: E0715 05:16:08.030701 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.030919 kubelet[2761]: E0715 05:16:08.030865 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.030919 kubelet[2761]: W0715 05:16:08.030910 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.030919 kubelet[2761]: E0715 05:16:08.030916 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.031169 kubelet[2761]: E0715 05:16:08.031152 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.031169 kubelet[2761]: W0715 05:16:08.031165 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.031216 kubelet[2761]: E0715 05:16:08.031173 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.031365 kubelet[2761]: E0715 05:16:08.031347 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.031365 kubelet[2761]: W0715 05:16:08.031356 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.031365 kubelet[2761]: E0715 05:16:08.031362 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.031542 kubelet[2761]: E0715 05:16:08.031524 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.031542 kubelet[2761]: W0715 05:16:08.031534 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.031542 kubelet[2761]: E0715 05:16:08.031541 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.031733 kubelet[2761]: E0715 05:16:08.031713 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.031733 kubelet[2761]: W0715 05:16:08.031725 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.031733 kubelet[2761]: E0715 05:16:08.031732 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.031899 kubelet[2761]: E0715 05:16:08.031882 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.031899 kubelet[2761]: W0715 05:16:08.031892 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.031899 kubelet[2761]: E0715 05:16:08.031898 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.038201 kubelet[2761]: E0715 05:16:08.038175 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.038201 kubelet[2761]: W0715 05:16:08.038191 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.038201 kubelet[2761]: E0715 05:16:08.038202 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.038454 kubelet[2761]: I0715 05:16:08.038231 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/14bfd7a9-4124-403a-afcc-084c249af056-kubelet-dir\") pod \"csi-node-driver-9xn7s\" (UID: \"14bfd7a9-4124-403a-afcc-084c249af056\") " pod="calico-system/csi-node-driver-9xn7s" Jul 15 05:16:08.039437 kubelet[2761]: E0715 05:16:08.039394 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.039437 kubelet[2761]: W0715 05:16:08.039410 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.039606 kubelet[2761]: E0715 05:16:08.039499 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.039606 kubelet[2761]: I0715 05:16:08.039516 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/14bfd7a9-4124-403a-afcc-084c249af056-registration-dir\") pod \"csi-node-driver-9xn7s\" (UID: \"14bfd7a9-4124-403a-afcc-084c249af056\") " pod="calico-system/csi-node-driver-9xn7s" Jul 15 05:16:08.039735 kubelet[2761]: E0715 05:16:08.039728 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.039754 kubelet[2761]: W0715 05:16:08.039737 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.039842 kubelet[2761]: E0715 05:16:08.039794 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.039842 kubelet[2761]: I0715 05:16:08.039820 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/14bfd7a9-4124-403a-afcc-084c249af056-socket-dir\") pod \"csi-node-driver-9xn7s\" (UID: \"14bfd7a9-4124-403a-afcc-084c249af056\") " pod="calico-system/csi-node-driver-9xn7s" Jul 15 05:16:08.039911 kubelet[2761]: E0715 05:16:08.039898 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.039911 kubelet[2761]: W0715 05:16:08.039908 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.040026 kubelet[2761]: E0715 05:16:08.039988 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.040190 kubelet[2761]: E0715 05:16:08.040167 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.040190 kubelet[2761]: W0715 05:16:08.040178 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.040325 kubelet[2761]: E0715 05:16:08.040251 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.040573 kubelet[2761]: E0715 05:16:08.040520 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.040573 kubelet[2761]: W0715 05:16:08.040529 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.040573 kubelet[2761]: E0715 05:16:08.040541 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.040573 kubelet[2761]: I0715 05:16:08.040555 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/14bfd7a9-4124-403a-afcc-084c249af056-varrun\") pod \"csi-node-driver-9xn7s\" (UID: \"14bfd7a9-4124-403a-afcc-084c249af056\") " pod="calico-system/csi-node-driver-9xn7s" Jul 15 05:16:08.040968 kubelet[2761]: E0715 05:16:08.040955 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.040968 kubelet[2761]: W0715 05:16:08.040966 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.041071 kubelet[2761]: E0715 05:16:08.040982 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.041203 kubelet[2761]: E0715 05:16:08.041183 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.041203 kubelet[2761]: W0715 05:16:08.041194 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.041203 kubelet[2761]: E0715 05:16:08.041201 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.041493 kubelet[2761]: E0715 05:16:08.041464 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.041493 kubelet[2761]: W0715 05:16:08.041477 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.041556 kubelet[2761]: E0715 05:16:08.041500 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.041651 kubelet[2761]: E0715 05:16:08.041635 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.041651 kubelet[2761]: W0715 05:16:08.041646 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.043073 containerd[1591]: time="2025-07-15T05:16:08.043037754Z" level=info msg="connecting to shim 6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46" address="unix:///run/containerd/s/f589dd8d63c0284808d9a3bb9867cde47623a6790aa071d27c41a4e3aae18e20" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:08.045545 kubelet[2761]: E0715 05:16:08.045514 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.045758 kubelet[2761]: E0715 05:16:08.045727 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.045835 kubelet[2761]: W0715 05:16:08.045814 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.045857 kubelet[2761]: E0715 05:16:08.045837 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.046215 kubelet[2761]: E0715 05:16:08.046195 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.046215 kubelet[2761]: W0715 05:16:08.046205 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.046215 kubelet[2761]: E0715 05:16:08.046213 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.046510 kubelet[2761]: E0715 05:16:08.046471 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.046546 kubelet[2761]: W0715 05:16:08.046488 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.046546 kubelet[2761]: E0715 05:16:08.046531 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.046679 kubelet[2761]: I0715 05:16:08.046636 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hpbn\" (UniqueName: \"kubernetes.io/projected/14bfd7a9-4124-403a-afcc-084c249af056-kube-api-access-7hpbn\") pod \"csi-node-driver-9xn7s\" (UID: \"14bfd7a9-4124-403a-afcc-084c249af056\") " pod="calico-system/csi-node-driver-9xn7s" Jul 15 05:16:08.046973 kubelet[2761]: E0715 05:16:08.046922 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.046973 kubelet[2761]: W0715 05:16:08.046934 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.046973 kubelet[2761]: E0715 05:16:08.046943 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.047236 kubelet[2761]: E0715 05:16:08.047203 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.047236 kubelet[2761]: W0715 05:16:08.047213 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.047236 kubelet[2761]: E0715 05:16:08.047221 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.069153 systemd[1]: Started cri-containerd-6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46.scope - libcontainer container 6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46. Jul 15 05:16:08.114225 containerd[1591]: time="2025-07-15T05:16:08.114165242Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-dwr67,Uid:b27debea-9520-4583-8218-a5e677553bcb,Namespace:calico-system,Attempt:0,} returns sandbox id \"6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46\"" Jul 15 05:16:08.148064 kubelet[2761]: E0715 05:16:08.148002 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.148064 kubelet[2761]: W0715 05:16:08.148046 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.148064 kubelet[2761]: E0715 05:16:08.148063 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.148307 kubelet[2761]: E0715 05:16:08.148300 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.148334 kubelet[2761]: W0715 05:16:08.148309 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.148434 kubelet[2761]: E0715 05:16:08.148336 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.148622 kubelet[2761]: E0715 05:16:08.148590 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.148622 kubelet[2761]: W0715 05:16:08.148600 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.148622 kubelet[2761]: E0715 05:16:08.148612 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.148826 kubelet[2761]: E0715 05:16:08.148791 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.148826 kubelet[2761]: W0715 05:16:08.148816 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.148878 kubelet[2761]: E0715 05:16:08.148831 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.149004 kubelet[2761]: E0715 05:16:08.148992 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.149004 kubelet[2761]: W0715 05:16:08.149001 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.149077 kubelet[2761]: E0715 05:16:08.149045 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.149295 kubelet[2761]: E0715 05:16:08.149276 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.149295 kubelet[2761]: W0715 05:16:08.149286 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.149349 kubelet[2761]: E0715 05:16:08.149305 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.149566 kubelet[2761]: E0715 05:16:08.149545 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.149566 kubelet[2761]: W0715 05:16:08.149556 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.149566 kubelet[2761]: E0715 05:16:08.149566 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.149747 kubelet[2761]: E0715 05:16:08.149725 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.149747 kubelet[2761]: W0715 05:16:08.149734 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.149798 kubelet[2761]: E0715 05:16:08.149783 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.149994 kubelet[2761]: E0715 05:16:08.149973 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.149994 kubelet[2761]: W0715 05:16:08.149984 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.150071 kubelet[2761]: E0715 05:16:08.149997 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.150184 kubelet[2761]: E0715 05:16:08.150159 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.150184 kubelet[2761]: W0715 05:16:08.150168 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.150184 kubelet[2761]: E0715 05:16:08.150174 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.150334 kubelet[2761]: E0715 05:16:08.150316 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.150334 kubelet[2761]: W0715 05:16:08.150326 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.150381 kubelet[2761]: E0715 05:16:08.150347 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.150627 kubelet[2761]: E0715 05:16:08.150611 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.150627 kubelet[2761]: W0715 05:16:08.150623 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.150683 kubelet[2761]: E0715 05:16:08.150657 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.150883 kubelet[2761]: E0715 05:16:08.150859 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.150883 kubelet[2761]: W0715 05:16:08.150874 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.150958 kubelet[2761]: E0715 05:16:08.150889 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.151072 kubelet[2761]: E0715 05:16:08.151058 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.151072 kubelet[2761]: W0715 05:16:08.151068 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.151213 kubelet[2761]: E0715 05:16:08.151150 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.151263 kubelet[2761]: E0715 05:16:08.151240 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.151263 kubelet[2761]: W0715 05:16:08.151255 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.151361 kubelet[2761]: E0715 05:16:08.151336 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.151488 kubelet[2761]: E0715 05:16:08.151467 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.151488 kubelet[2761]: W0715 05:16:08.151478 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.151624 kubelet[2761]: E0715 05:16:08.151606 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.151624 kubelet[2761]: W0715 05:16:08.151617 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.151624 kubelet[2761]: E0715 05:16:08.151623 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.151727 kubelet[2761]: E0715 05:16:08.151714 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.151805 kubelet[2761]: E0715 05:16:08.151750 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.151877 kubelet[2761]: W0715 05:16:08.151864 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.152314 kubelet[2761]: E0715 05:16:08.151938 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.152686 kubelet[2761]: E0715 05:16:08.152673 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.152837 kubelet[2761]: W0715 05:16:08.152749 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.152837 kubelet[2761]: E0715 05:16:08.152768 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.153069 kubelet[2761]: E0715 05:16:08.153059 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.153187 kubelet[2761]: W0715 05:16:08.153110 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.153187 kubelet[2761]: E0715 05:16:08.153142 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.153362 kubelet[2761]: E0715 05:16:08.153353 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.153494 kubelet[2761]: W0715 05:16:08.153414 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.153494 kubelet[2761]: E0715 05:16:08.153429 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.153832 kubelet[2761]: E0715 05:16:08.153809 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.154070 kubelet[2761]: W0715 05:16:08.153932 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.154070 kubelet[2761]: E0715 05:16:08.153949 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.154294 kubelet[2761]: E0715 05:16:08.154274 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.154294 kubelet[2761]: W0715 05:16:08.154290 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.154348 kubelet[2761]: E0715 05:16:08.154301 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.157286 kubelet[2761]: E0715 05:16:08.157265 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.157286 kubelet[2761]: W0715 05:16:08.157279 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.157344 kubelet[2761]: E0715 05:16:08.157306 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:08.157682 kubelet[2761]: E0715 05:16:08.157610 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.157682 kubelet[2761]: W0715 05:16:08.157625 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.157682 kubelet[2761]: E0715 05:16:08.157634 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:08.165090 kubelet[2761]: E0715 05:16:08.165055 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:08.165090 kubelet[2761]: W0715 05:16:08.165073 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:08.165090 kubelet[2761]: E0715 05:16:08.165087 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:09.708600 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3875634857.mount: Deactivated successfully. Jul 15 05:16:10.440760 kubelet[2761]: E0715 05:16:10.440680 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9xn7s" podUID="14bfd7a9-4124-403a-afcc-084c249af056" Jul 15 05:16:10.918530 containerd[1591]: time="2025-07-15T05:16:10.918442505Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:10.919357 containerd[1591]: time="2025-07-15T05:16:10.919336968Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=35233364" Jul 15 05:16:10.920297 containerd[1591]: time="2025-07-15T05:16:10.920267567Z" level=info msg="ImageCreate event name:\"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:10.921955 containerd[1591]: time="2025-07-15T05:16:10.921927926Z" level=info msg="ImageCreate 
event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:10.922445 containerd[1591]: time="2025-07-15T05:16:10.922231701Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"35233218\" in 2.999006017s" Jul 15 05:16:10.922445 containerd[1591]: time="2025-07-15T05:16:10.922251164Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:b3baa600c7ff9cd50dc12f2529ef263aaa346dbeca13c77c6553d661fd216b54\"" Jul 15 05:16:10.923511 containerd[1591]: time="2025-07-15T05:16:10.923489299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 15 05:16:10.936063 containerd[1591]: time="2025-07-15T05:16:10.936040966Z" level=info msg="CreateContainer within sandbox \"87e7a0a8b7d2212a064c36e139228499b6b73ee3a8ca257bc7a3c7dab15dda49\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 15 05:16:10.942247 containerd[1591]: time="2025-07-15T05:16:10.942226810Z" level=info msg="Container 1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:10.946495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4273852128.mount: Deactivated successfully. 
Jul 15 05:16:10.959752 containerd[1591]: time="2025-07-15T05:16:10.959718515Z" level=info msg="CreateContainer within sandbox \"87e7a0a8b7d2212a064c36e139228499b6b73ee3a8ca257bc7a3c7dab15dda49\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0\"" Jul 15 05:16:10.960108 containerd[1591]: time="2025-07-15T05:16:10.960049975Z" level=info msg="StartContainer for \"1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0\"" Jul 15 05:16:10.961297 containerd[1591]: time="2025-07-15T05:16:10.961276928Z" level=info msg="connecting to shim 1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0" address="unix:///run/containerd/s/a8996967ff02da2856a62f83408dfb19775cdcbd4f4f99385baa014f837d18ef" protocol=ttrpc version=3 Jul 15 05:16:10.980110 systemd[1]: Started cri-containerd-1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0.scope - libcontainer container 1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0. Jul 15 05:16:11.022621 containerd[1591]: time="2025-07-15T05:16:11.022562829Z" level=info msg="StartContainer for \"1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0\" returns successfully" Jul 15 05:16:11.555704 kubelet[2761]: E0715 05:16:11.555552 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.555704 kubelet[2761]: W0715 05:16:11.555586 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.555704 kubelet[2761]: E0715 05:16:11.555615 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.558805 kubelet[2761]: E0715 05:16:11.557936 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.558805 kubelet[2761]: W0715 05:16:11.557958 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.558805 kubelet[2761]: E0715 05:16:11.557979 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.559572 kubelet[2761]: E0715 05:16:11.559004 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.559572 kubelet[2761]: W0715 05:16:11.559087 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.559572 kubelet[2761]: E0715 05:16:11.559107 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.560363 kubelet[2761]: E0715 05:16:11.560331 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.560363 kubelet[2761]: W0715 05:16:11.560350 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.560363 kubelet[2761]: E0715 05:16:11.560365 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.560686 kubelet[2761]: E0715 05:16:11.560650 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.560686 kubelet[2761]: W0715 05:16:11.560671 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.560686 kubelet[2761]: E0715 05:16:11.560687 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.560973 kubelet[2761]: E0715 05:16:11.560948 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.560973 kubelet[2761]: W0715 05:16:11.560966 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.561119 kubelet[2761]: E0715 05:16:11.560978 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.561323 kubelet[2761]: E0715 05:16:11.561298 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.561323 kubelet[2761]: W0715 05:16:11.561313 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.561405 kubelet[2761]: E0715 05:16:11.561324 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.561584 kubelet[2761]: E0715 05:16:11.561551 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.561584 kubelet[2761]: W0715 05:16:11.561572 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.561584 kubelet[2761]: E0715 05:16:11.561582 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.561862 kubelet[2761]: E0715 05:16:11.561839 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.561862 kubelet[2761]: W0715 05:16:11.561854 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.562176 kubelet[2761]: E0715 05:16:11.561864 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.562176 kubelet[2761]: E0715 05:16:11.562165 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.562239 kubelet[2761]: W0715 05:16:11.562176 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.562239 kubelet[2761]: E0715 05:16:11.562188 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.562457 kubelet[2761]: E0715 05:16:11.562428 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.562457 kubelet[2761]: W0715 05:16:11.562443 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.562457 kubelet[2761]: E0715 05:16:11.562453 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.562748 kubelet[2761]: E0715 05:16:11.562709 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.562748 kubelet[2761]: W0715 05:16:11.562733 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.562748 kubelet[2761]: E0715 05:16:11.562744 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.563367 kubelet[2761]: E0715 05:16:11.562995 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.563367 kubelet[2761]: W0715 05:16:11.563004 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.563367 kubelet[2761]: E0715 05:16:11.563016 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.563572 kubelet[2761]: E0715 05:16:11.563541 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.563572 kubelet[2761]: W0715 05:16:11.563560 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.563572 kubelet[2761]: E0715 05:16:11.563574 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.563878 kubelet[2761]: E0715 05:16:11.563849 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.563878 kubelet[2761]: W0715 05:16:11.563868 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.563878 kubelet[2761]: E0715 05:16:11.563879 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.577468 kubelet[2761]: E0715 05:16:11.577434 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.577468 kubelet[2761]: W0715 05:16:11.577454 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.577468 kubelet[2761]: E0715 05:16:11.577471 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.577889 kubelet[2761]: E0715 05:16:11.577862 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.577889 kubelet[2761]: W0715 05:16:11.577879 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.577975 kubelet[2761]: E0715 05:16:11.577915 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.578355 kubelet[2761]: E0715 05:16:11.578315 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.578355 kubelet[2761]: W0715 05:16:11.578335 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.578355 kubelet[2761]: E0715 05:16:11.578353 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.578766 kubelet[2761]: E0715 05:16:11.578740 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.579176 kubelet[2761]: W0715 05:16:11.578791 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.579176 kubelet[2761]: E0715 05:16:11.579051 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.579281 kubelet[2761]: E0715 05:16:11.579237 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.579281 kubelet[2761]: W0715 05:16:11.579249 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.579571 kubelet[2761]: E0715 05:16:11.579481 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.580184 kubelet[2761]: E0715 05:16:11.580087 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.580787 kubelet[2761]: W0715 05:16:11.580261 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.581290 kubelet[2761]: E0715 05:16:11.581220 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.581290 kubelet[2761]: W0715 05:16:11.581284 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.581953 kubelet[2761]: E0715 05:16:11.581905 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.581953 kubelet[2761]: W0715 05:16:11.581925 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.581953 kubelet[2761]: E0715 05:16:11.581939 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.582471 kubelet[2761]: E0715 05:16:11.582172 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.582471 kubelet[2761]: E0715 05:16:11.582206 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.583894 kubelet[2761]: E0715 05:16:11.583866 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.583894 kubelet[2761]: W0715 05:16:11.583888 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.585258 kubelet[2761]: E0715 05:16:11.585162 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.585525 kubelet[2761]: E0715 05:16:11.585472 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.585525 kubelet[2761]: W0715 05:16:11.585488 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.585525 kubelet[2761]: E0715 05:16:11.585503 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.586239 kubelet[2761]: E0715 05:16:11.586211 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.586239 kubelet[2761]: W0715 05:16:11.586231 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.586715 kubelet[2761]: E0715 05:16:11.586245 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.587102 kubelet[2761]: E0715 05:16:11.587000 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.587395 kubelet[2761]: W0715 05:16:11.587159 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.587395 kubelet[2761]: E0715 05:16:11.587174 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.589407 kubelet[2761]: E0715 05:16:11.589377 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.589407 kubelet[2761]: W0715 05:16:11.589400 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.589509 kubelet[2761]: E0715 05:16:11.589419 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.589966 kubelet[2761]: E0715 05:16:11.589901 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.590404 kubelet[2761]: W0715 05:16:11.590044 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.590404 kubelet[2761]: E0715 05:16:11.590058 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.591195 kubelet[2761]: E0715 05:16:11.591170 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.591608 kubelet[2761]: W0715 05:16:11.591286 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.591608 kubelet[2761]: E0715 05:16:11.591295 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.591709 kubelet[2761]: E0715 05:16:11.591641 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.591709 kubelet[2761]: W0715 05:16:11.591648 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.591709 kubelet[2761]: E0715 05:16:11.591655 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:11.592355 kubelet[2761]: E0715 05:16:11.592318 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.592355 kubelet[2761]: W0715 05:16:11.592330 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.592456 kubelet[2761]: E0715 05:16:11.592442 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:11.592650 kubelet[2761]: E0715 05:16:11.592568 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:11.592650 kubelet[2761]: W0715 05:16:11.592576 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:11.592650 kubelet[2761]: E0715 05:16:11.592583 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:12.439711 kubelet[2761]: E0715 05:16:12.439386 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9xn7s" podUID="14bfd7a9-4124-403a-afcc-084c249af056" Jul 15 05:16:12.541514 kubelet[2761]: I0715 05:16:12.541466 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:16:12.572421 kubelet[2761]: E0715 05:16:12.572340 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.572421 kubelet[2761]: W0715 05:16:12.572373 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.572421 kubelet[2761]: E0715 05:16:12.572401 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:12.573748 kubelet[2761]: E0715 05:16:12.572740 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.573748 kubelet[2761]: W0715 05:16:12.572752 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.573748 kubelet[2761]: E0715 05:16:12.572767 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:12.573748 kubelet[2761]: E0715 05:16:12.573591 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.573748 kubelet[2761]: W0715 05:16:12.573607 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.573748 kubelet[2761]: E0715 05:16:12.573626 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:12.574623 kubelet[2761]: E0715 05:16:12.574008 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.574623 kubelet[2761]: W0715 05:16:12.574048 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.574623 kubelet[2761]: E0715 05:16:12.574064 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:12.574623 kubelet[2761]: E0715 05:16:12.574383 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.574623 kubelet[2761]: W0715 05:16:12.574394 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.574623 kubelet[2761]: E0715 05:16:12.574408 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:12.575262 kubelet[2761]: E0715 05:16:12.574755 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.575262 kubelet[2761]: W0715 05:16:12.574815 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.575262 kubelet[2761]: E0715 05:16:12.574830 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:12.575262 kubelet[2761]: E0715 05:16:12.575148 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.575262 kubelet[2761]: W0715 05:16:12.575159 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.575262 kubelet[2761]: E0715 05:16:12.575172 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:12.575674 kubelet[2761]: E0715 05:16:12.575462 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.575674 kubelet[2761]: W0715 05:16:12.575473 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.575674 kubelet[2761]: E0715 05:16:12.575486 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 05:16:12.575784 kubelet[2761]: E0715 05:16:12.575769 2761 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 05:16:12.575784 kubelet[2761]: W0715 05:16:12.575779 2761 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 05:16:12.575862 kubelet[2761]: E0715 05:16:12.575791 2761 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 05:16:12.702664 containerd[1591]: time="2025-07-15T05:16:12.701978890Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:12.703457 containerd[1591]: time="2025-07-15T05:16:12.703391926Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4446956" Jul 15 05:16:12.709559 containerd[1591]: time="2025-07-15T05:16:12.709525672Z" level=info msg="ImageCreate event name:\"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:12.711247 containerd[1591]: time="2025-07-15T05:16:12.711181159Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:12.711579 containerd[1591]: time="2025-07-15T05:16:12.711460946Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5939619\" in 1.787848278s" Jul 15 05:16:12.711579 containerd[1591]: time="2025-07-15T05:16:12.711482099Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:639615519fa6f7bc4b4756066ba9780068fd291eacc36c120f6c555e62f2b00e\"" Jul 15 05:16:12.713307 containerd[1591]: time="2025-07-15T05:16:12.713285936Z" level=info msg="CreateContainer within sandbox \"6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 15 05:16:12.722292 containerd[1591]: time="2025-07-15T05:16:12.721659376Z" level=info msg="Container 0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:12.736929 containerd[1591]: time="2025-07-15T05:16:12.736900409Z" level=info msg="CreateContainer within sandbox \"6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908\"" Jul 15 05:16:12.737646 containerd[1591]: time="2025-07-15T05:16:12.737591890Z" level=info msg="StartContainer for \"0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908\"" Jul 15 05:16:12.738537 containerd[1591]: time="2025-07-15T05:16:12.738519882Z" level=info msg="connecting to shim 0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908" address="unix:///run/containerd/s/f589dd8d63c0284808d9a3bb9867cde47623a6790aa071d27c41a4e3aae18e20" protocol=ttrpc version=3 Jul 15 05:16:12.758128 systemd[1]: Started cri-containerd-0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908.scope - libcontainer container 
0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908. Jul 15 05:16:12.793891 containerd[1591]: time="2025-07-15T05:16:12.793862563Z" level=info msg="StartContainer for \"0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908\" returns successfully" Jul 15 05:16:12.805800 systemd[1]: cri-containerd-0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908.scope: Deactivated successfully. Jul 15 05:16:12.818724 containerd[1591]: time="2025-07-15T05:16:12.818689025Z" level=info msg="received exit event container_id:\"0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908\" id:\"0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908\" pid:3496 exited_at:{seconds:1752556572 nanos:807827518}" Jul 15 05:16:12.829774 containerd[1591]: time="2025-07-15T05:16:12.829743768Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908\" id:\"0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908\" pid:3496 exited_at:{seconds:1752556572 nanos:807827518}" Jul 15 05:16:12.840079 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908-rootfs.mount: Deactivated successfully. 
Jul 15 05:16:13.550660 containerd[1591]: time="2025-07-15T05:16:13.550594553Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 15 05:16:13.571545 kubelet[2761]: I0715 05:16:13.570848 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-9584887c4-jwn7l" podStartSLOduration=3.5702283059999997 podStartE2EDuration="6.570825075s" podCreationTimestamp="2025-07-15 05:16:07 +0000 UTC" firstStartedPulling="2025-07-15 05:16:07.922801827 +0000 UTC m=+18.574175803" lastFinishedPulling="2025-07-15 05:16:10.923398606 +0000 UTC m=+21.574772572" observedRunningTime="2025-07-15 05:16:11.55399059 +0000 UTC m=+22.205364596" watchObservedRunningTime="2025-07-15 05:16:13.570825075 +0000 UTC m=+24.222199081" Jul 15 05:16:14.439353 kubelet[2761]: E0715 05:16:14.439255 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9xn7s" podUID="14bfd7a9-4124-403a-afcc-084c249af056" Jul 15 05:16:16.438580 kubelet[2761]: E0715 05:16:16.438477 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9xn7s" podUID="14bfd7a9-4124-403a-afcc-084c249af056" Jul 15 05:16:17.602506 containerd[1591]: time="2025-07-15T05:16:17.602467846Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:17.603474 containerd[1591]: time="2025-07-15T05:16:17.603448559Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=70436221" Jul 15 05:16:17.604483 containerd[1591]: 
time="2025-07-15T05:16:17.604454905Z" level=info msg="ImageCreate event name:\"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:17.605920 containerd[1591]: time="2025-07-15T05:16:17.605890892Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:17.606246 containerd[1591]: time="2025-07-15T05:16:17.606218613Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"71928924\" in 4.055568823s" Jul 15 05:16:17.606246 containerd[1591]: time="2025-07-15T05:16:17.606237725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:77a357d0d33e3016e61153f7d2b7de72371579c4aaeb767fb7ef0af606fe1630\"" Jul 15 05:16:17.608417 containerd[1591]: time="2025-07-15T05:16:17.608401211Z" level=info msg="CreateContainer within sandbox \"6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 15 05:16:17.620799 containerd[1591]: time="2025-07-15T05:16:17.620782639Z" level=info msg="Container adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:17.623798 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2608862112.mount: Deactivated successfully. 
Jul 15 05:16:17.639287 containerd[1591]: time="2025-07-15T05:16:17.639262967Z" level=info msg="CreateContainer within sandbox \"6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075\"" Jul 15 05:16:17.639723 containerd[1591]: time="2025-07-15T05:16:17.639659245Z" level=info msg="StartContainer for \"adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075\"" Jul 15 05:16:17.640791 containerd[1591]: time="2025-07-15T05:16:17.640760710Z" level=info msg="connecting to shim adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075" address="unix:///run/containerd/s/f589dd8d63c0284808d9a3bb9867cde47623a6790aa071d27c41a4e3aae18e20" protocol=ttrpc version=3 Jul 15 05:16:17.663105 systemd[1]: Started cri-containerd-adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075.scope - libcontainer container adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075. Jul 15 05:16:17.701794 containerd[1591]: time="2025-07-15T05:16:17.701756204Z" level=info msg="StartContainer for \"adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075\" returns successfully" Jul 15 05:16:18.006959 systemd[1]: cri-containerd-adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075.scope: Deactivated successfully. Jul 15 05:16:18.007204 systemd[1]: cri-containerd-adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075.scope: Consumed 310ms CPU time, 164.3M memory peak, 18.3M read from disk, 171.2M written to disk. 
Jul 15 05:16:18.008993 containerd[1591]: time="2025-07-15T05:16:18.008948449Z" level=info msg="received exit event container_id:\"adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075\" id:\"adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075\" pid:3553 exited_at:{seconds:1752556578 nanos:8370698}" Jul 15 05:16:18.010437 containerd[1591]: time="2025-07-15T05:16:18.010416860Z" level=info msg="TaskExit event in podsandbox handler container_id:\"adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075\" id:\"adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075\" pid:3553 exited_at:{seconds:1752556578 nanos:8370698}" Jul 15 05:16:18.033971 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075-rootfs.mount: Deactivated successfully. Jul 15 05:16:18.101477 kubelet[2761]: I0715 05:16:18.101427 2761 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jul 15 05:16:18.148121 systemd[1]: Created slice kubepods-burstable-poda6375220_4b2a_45bf_9f5c_934a1ffc3abf.slice - libcontainer container kubepods-burstable-poda6375220_4b2a_45bf_9f5c_934a1ffc3abf.slice. Jul 15 05:16:18.161944 systemd[1]: Created slice kubepods-burstable-podf5754555_d845_4a94_affb_73038dd15a19.slice - libcontainer container kubepods-burstable-podf5754555_d845_4a94_affb_73038dd15a19.slice. Jul 15 05:16:18.169234 systemd[1]: Created slice kubepods-besteffort-podc691497d_2512_41d4_86a2_14832e0a4196.slice - libcontainer container kubepods-besteffort-podc691497d_2512_41d4_86a2_14832e0a4196.slice. Jul 15 05:16:18.178758 systemd[1]: Created slice kubepods-besteffort-pod272d51a9_e3b2_48a9_b04e_5a452ea134b1.slice - libcontainer container kubepods-besteffort-pod272d51a9_e3b2_48a9_b04e_5a452ea134b1.slice. 
Jul 15 05:16:18.186115 systemd[1]: Created slice kubepods-besteffort-pod9c02e846_0cfc_42de_9a5f_fa54428989e8.slice - libcontainer container kubepods-besteffort-pod9c02e846_0cfc_42de_9a5f_fa54428989e8.slice. Jul 15 05:16:18.191890 systemd[1]: Created slice kubepods-besteffort-pod33523259_a708_46f2_aa96_0ef7a77d199a.slice - libcontainer container kubepods-besteffort-pod33523259_a708_46f2_aa96_0ef7a77d199a.slice. Jul 15 05:16:18.197603 systemd[1]: Created slice kubepods-besteffort-pod84c72182_29d8_44ef_b493_115789db3f13.slice - libcontainer container kubepods-besteffort-pod84c72182_29d8_44ef_b493_115789db3f13.slice. Jul 15 05:16:18.230033 kubelet[2761]: I0715 05:16:18.229991 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f5754555-d845-4a94-affb-73038dd15a19-config-volume\") pod \"coredns-668d6bf9bc-vc4xs\" (UID: \"f5754555-d845-4a94-affb-73038dd15a19\") " pod="kube-system/coredns-668d6bf9bc-vc4xs" Jul 15 05:16:18.230162 kubelet[2761]: I0715 05:16:18.230150 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/272d51a9-e3b2-48a9-b04e-5a452ea134b1-calico-apiserver-certs\") pod \"calico-apiserver-6494bf7c78-b7q6m\" (UID: \"272d51a9-e3b2-48a9-b04e-5a452ea134b1\") " pod="calico-apiserver/calico-apiserver-6494bf7c78-b7q6m" Jul 15 05:16:18.230210 kubelet[2761]: I0715 05:16:18.230202 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/33523259-a708-46f2-aa96-0ef7a77d199a-whisker-backend-key-pair\") pod \"whisker-85cc8dcb94-t9k8t\" (UID: \"33523259-a708-46f2-aa96-0ef7a77d199a\") " pod="calico-system/whisker-85cc8dcb94-t9k8t" Jul 15 05:16:18.230253 kubelet[2761]: I0715 05:16:18.230246 2761 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c691497d-2512-41d4-86a2-14832e0a4196-tigera-ca-bundle\") pod \"calico-kube-controllers-694c859895-vr2gt\" (UID: \"c691497d-2512-41d4-86a2-14832e0a4196\") " pod="calico-system/calico-kube-controllers-694c859895-vr2gt" Jul 15 05:16:18.230296 kubelet[2761]: I0715 05:16:18.230288 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a6375220-4b2a-45bf-9f5c-934a1ffc3abf-config-volume\") pod \"coredns-668d6bf9bc-m56c8\" (UID: \"a6375220-4b2a-45bf-9f5c-934a1ffc3abf\") " pod="kube-system/coredns-668d6bf9bc-m56c8" Jul 15 05:16:18.230341 kubelet[2761]: I0715 05:16:18.230334 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pgqq\" (UniqueName: \"kubernetes.io/projected/f5754555-d845-4a94-affb-73038dd15a19-kube-api-access-7pgqq\") pod \"coredns-668d6bf9bc-vc4xs\" (UID: \"f5754555-d845-4a94-affb-73038dd15a19\") " pod="kube-system/coredns-668d6bf9bc-vc4xs" Jul 15 05:16:18.230383 kubelet[2761]: I0715 05:16:18.230376 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33523259-a708-46f2-aa96-0ef7a77d199a-whisker-ca-bundle\") pod \"whisker-85cc8dcb94-t9k8t\" (UID: \"33523259-a708-46f2-aa96-0ef7a77d199a\") " pod="calico-system/whisker-85cc8dcb94-t9k8t" Jul 15 05:16:18.230423 kubelet[2761]: I0715 05:16:18.230415 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/84c72182-29d8-44ef-b493-115789db3f13-calico-apiserver-certs\") pod \"calico-apiserver-6494bf7c78-znpph\" (UID: \"84c72182-29d8-44ef-b493-115789db3f13\") " pod="calico-apiserver/calico-apiserver-6494bf7c78-znpph" 
Jul 15 05:16:18.230463 kubelet[2761]: I0715 05:16:18.230455 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz4qb\" (UniqueName: \"kubernetes.io/projected/84c72182-29d8-44ef-b493-115789db3f13-kube-api-access-bz4qb\") pod \"calico-apiserver-6494bf7c78-znpph\" (UID: \"84c72182-29d8-44ef-b493-115789db3f13\") " pod="calico-apiserver/calico-apiserver-6494bf7c78-znpph" Jul 15 05:16:18.230501 kubelet[2761]: I0715 05:16:18.230494 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c02e846-0cfc-42de-9a5f-fa54428989e8-config\") pod \"goldmane-768f4c5c69-6hlgx\" (UID: \"9c02e846-0cfc-42de-9a5f-fa54428989e8\") " pod="calico-system/goldmane-768f4c5c69-6hlgx" Jul 15 05:16:18.230546 kubelet[2761]: I0715 05:16:18.230538 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/9c02e846-0cfc-42de-9a5f-fa54428989e8-goldmane-key-pair\") pod \"goldmane-768f4c5c69-6hlgx\" (UID: \"9c02e846-0cfc-42de-9a5f-fa54428989e8\") " pod="calico-system/goldmane-768f4c5c69-6hlgx" Jul 15 05:16:18.230593 kubelet[2761]: I0715 05:16:18.230579 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p2pm\" (UniqueName: \"kubernetes.io/projected/c691497d-2512-41d4-86a2-14832e0a4196-kube-api-access-4p2pm\") pod \"calico-kube-controllers-694c859895-vr2gt\" (UID: \"c691497d-2512-41d4-86a2-14832e0a4196\") " pod="calico-system/calico-kube-controllers-694c859895-vr2gt" Jul 15 05:16:18.230637 kubelet[2761]: I0715 05:16:18.230630 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s7g2\" (UniqueName: \"kubernetes.io/projected/9c02e846-0cfc-42de-9a5f-fa54428989e8-kube-api-access-9s7g2\") pod \"goldmane-768f4c5c69-6hlgx\" 
(UID: \"9c02e846-0cfc-42de-9a5f-fa54428989e8\") " pod="calico-system/goldmane-768f4c5c69-6hlgx" Jul 15 05:16:18.230680 kubelet[2761]: I0715 05:16:18.230673 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcqdm\" (UniqueName: \"kubernetes.io/projected/33523259-a708-46f2-aa96-0ef7a77d199a-kube-api-access-xcqdm\") pod \"whisker-85cc8dcb94-t9k8t\" (UID: \"33523259-a708-46f2-aa96-0ef7a77d199a\") " pod="calico-system/whisker-85cc8dcb94-t9k8t" Jul 15 05:16:18.230725 kubelet[2761]: I0715 05:16:18.230718 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9c02e846-0cfc-42de-9a5f-fa54428989e8-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-6hlgx\" (UID: \"9c02e846-0cfc-42de-9a5f-fa54428989e8\") " pod="calico-system/goldmane-768f4c5c69-6hlgx" Jul 15 05:16:18.230776 kubelet[2761]: I0715 05:16:18.230765 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wf98\" (UniqueName: \"kubernetes.io/projected/a6375220-4b2a-45bf-9f5c-934a1ffc3abf-kube-api-access-9wf98\") pod \"coredns-668d6bf9bc-m56c8\" (UID: \"a6375220-4b2a-45bf-9f5c-934a1ffc3abf\") " pod="kube-system/coredns-668d6bf9bc-m56c8" Jul 15 05:16:18.231232 kubelet[2761]: I0715 05:16:18.230857 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgj9m\" (UniqueName: \"kubernetes.io/projected/272d51a9-e3b2-48a9-b04e-5a452ea134b1-kube-api-access-tgj9m\") pod \"calico-apiserver-6494bf7c78-b7q6m\" (UID: \"272d51a9-e3b2-48a9-b04e-5a452ea134b1\") " pod="calico-apiserver/calico-apiserver-6494bf7c78-b7q6m" Jul 15 05:16:18.449389 systemd[1]: Created slice kubepods-besteffort-pod14bfd7a9_4124_403a_afcc_084c249af056.slice - libcontainer container kubepods-besteffort-pod14bfd7a9_4124_403a_afcc_084c249af056.slice. 
Jul 15 05:16:18.454194 containerd[1591]: time="2025-07-15T05:16:18.454137126Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9xn7s,Uid:14bfd7a9-4124-403a-afcc-084c249af056,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:18.457633 containerd[1591]: time="2025-07-15T05:16:18.457593554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-m56c8,Uid:a6375220-4b2a-45bf-9f5c-934a1ffc3abf,Namespace:kube-system,Attempt:0,}" Jul 15 05:16:18.466223 containerd[1591]: time="2025-07-15T05:16:18.466161948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-vc4xs,Uid:f5754555-d845-4a94-affb-73038dd15a19,Namespace:kube-system,Attempt:0,}" Jul 15 05:16:18.479463 containerd[1591]: time="2025-07-15T05:16:18.478869262Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-694c859895-vr2gt,Uid:c691497d-2512-41d4-86a2-14832e0a4196,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:18.485059 containerd[1591]: time="2025-07-15T05:16:18.484386464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6494bf7c78-b7q6m,Uid:272d51a9-e3b2-48a9-b04e-5a452ea134b1,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:16:18.498575 containerd[1591]: time="2025-07-15T05:16:18.498545868Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85cc8dcb94-t9k8t,Uid:33523259-a708-46f2-aa96-0ef7a77d199a,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:18.499659 containerd[1591]: time="2025-07-15T05:16:18.499627044Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-6hlgx,Uid:9c02e846-0cfc-42de-9a5f-fa54428989e8,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:18.501477 containerd[1591]: time="2025-07-15T05:16:18.501337357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6494bf7c78-znpph,Uid:84c72182-29d8-44ef-b493-115789db3f13,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:16:18.578608 
containerd[1591]: time="2025-07-15T05:16:18.578443325Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 15 05:16:18.667312 containerd[1591]: time="2025-07-15T05:16:18.667270420Z" level=error msg="Failed to destroy network for sandbox \"c63a6a041059afe866c4d05d3e9d28dbd9bb33f58243b56cd381395f72aedf9b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.671530 systemd[1]: run-netns-cni\x2dc05e015f\x2d8680\x2dd5a6\x2d50b9\x2d00fdf52cf796.mount: Deactivated successfully. Jul 15 05:16:18.711288 containerd[1591]: time="2025-07-15T05:16:18.674536878Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6494bf7c78-b7q6m,Uid:272d51a9-e3b2-48a9-b04e-5a452ea134b1,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c63a6a041059afe866c4d05d3e9d28dbd9bb33f58243b56cd381395f72aedf9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.713002 containerd[1591]: time="2025-07-15T05:16:18.691922309Z" level=error msg="Failed to destroy network for sandbox \"9e24e67283d6c258425e3573d49207b52c3c1556fd5df956f7ff7f9825d0728b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.716463 systemd[1]: run-netns-cni\x2d428703ff\x2d30ba\x2dbaa9\x2ddd36\x2d724ab7ce61e9.mount: Deactivated successfully. 
Jul 15 05:16:18.719521 kubelet[2761]: E0715 05:16:18.719490 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c63a6a041059afe866c4d05d3e9d28dbd9bb33f58243b56cd381395f72aedf9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.719628 containerd[1591]: time="2025-07-15T05:16:18.701921841Z" level=error msg="Failed to destroy network for sandbox \"0bad518e420c4c0a6d8c4c519934f1607c0eda476cfa8ce288d9b42968e8eb83\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.719976 kubelet[2761]: E0715 05:16:18.719725 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c63a6a041059afe866c4d05d3e9d28dbd9bb33f58243b56cd381395f72aedf9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6494bf7c78-b7q6m" Jul 15 05:16:18.719976 kubelet[2761]: E0715 05:16:18.719743 2761 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c63a6a041059afe866c4d05d3e9d28dbd9bb33f58243b56cd381395f72aedf9b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6494bf7c78-b7q6m" Jul 15 05:16:18.719976 kubelet[2761]: E0715 05:16:18.719794 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for 
\"calico-apiserver-6494bf7c78-b7q6m_calico-apiserver(272d51a9-e3b2-48a9-b04e-5a452ea134b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6494bf7c78-b7q6m_calico-apiserver(272d51a9-e3b2-48a9-b04e-5a452ea134b1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c63a6a041059afe866c4d05d3e9d28dbd9bb33f58243b56cd381395f72aedf9b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6494bf7c78-b7q6m" podUID="272d51a9-e3b2-48a9-b04e-5a452ea134b1" Jul 15 05:16:18.721273 containerd[1591]: time="2025-07-15T05:16:18.721255466Z" level=error msg="Failed to destroy network for sandbox \"373472dfbf2d8d898a974c41a57e050217311b7b39dbd58d02251c7e0c2bbeb4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.722332 containerd[1591]: time="2025-07-15T05:16:18.722230403Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-6hlgx,Uid:9c02e846-0cfc-42de-9a5f-fa54428989e8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e24e67283d6c258425e3573d49207b52c3c1556fd5df956f7ff7f9825d0728b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.722486 kubelet[2761]: E0715 05:16:18.722468 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e24e67283d6c258425e3573d49207b52c3c1556fd5df956f7ff7f9825d0728b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file 
or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.722607 kubelet[2761]: E0715 05:16:18.722577 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e24e67283d6c258425e3573d49207b52c3c1556fd5df956f7ff7f9825d0728b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-6hlgx" Jul 15 05:16:18.722705 kubelet[2761]: E0715 05:16:18.722675 2761 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9e24e67283d6c258425e3573d49207b52c3c1556fd5df956f7ff7f9825d0728b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-6hlgx" Jul 15 05:16:18.722753 systemd[1]: run-netns-cni\x2d14302fc7\x2d017b\x2d39c2\x2dd101\x2d642180554c10.mount: Deactivated successfully. 
Jul 15 05:16:18.723448 kubelet[2761]: E0715 05:16:18.722767 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-6hlgx_calico-system(9c02e846-0cfc-42de-9a5f-fa54428989e8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-6hlgx_calico-system(9c02e846-0cfc-42de-9a5f-fa54428989e8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9e24e67283d6c258425e3573d49207b52c3c1556fd5df956f7ff7f9825d0728b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-6hlgx" podUID="9c02e846-0cfc-42de-9a5f-fa54428989e8" Jul 15 05:16:18.722823 systemd[1]: run-netns-cni\x2da66eb0f4\x2da4d0\x2de29b\x2d1c16\x2d5723146fea2c.mount: Deactivated successfully. Jul 15 05:16:18.723567 containerd[1591]: time="2025-07-15T05:16:18.723549591Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-vc4xs,Uid:f5754555-d845-4a94-affb-73038dd15a19,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"373472dfbf2d8d898a974c41a57e050217311b7b39dbd58d02251c7e0c2bbeb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.723925 kubelet[2761]: E0715 05:16:18.723845 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"373472dfbf2d8d898a974c41a57e050217311b7b39dbd58d02251c7e0c2bbeb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.723925 kubelet[2761]: 
E0715 05:16:18.723868 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"373472dfbf2d8d898a974c41a57e050217311b7b39dbd58d02251c7e0c2bbeb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-vc4xs" Jul 15 05:16:18.723925 kubelet[2761]: E0715 05:16:18.723900 2761 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"373472dfbf2d8d898a974c41a57e050217311b7b39dbd58d02251c7e0c2bbeb4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-vc4xs" Jul 15 05:16:18.724172 kubelet[2761]: E0715 05:16:18.724097 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-vc4xs_kube-system(f5754555-d845-4a94-affb-73038dd15a19)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-vc4xs_kube-system(f5754555-d845-4a94-affb-73038dd15a19)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"373472dfbf2d8d898a974c41a57e050217311b7b39dbd58d02251c7e0c2bbeb4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-vc4xs" podUID="f5754555-d845-4a94-affb-73038dd15a19" Jul 15 05:16:18.724792 containerd[1591]: time="2025-07-15T05:16:18.724750498Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9xn7s,Uid:14bfd7a9-4124-403a-afcc-084c249af056,Namespace:calico-system,Attempt:0,} failed, error" 
error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bad518e420c4c0a6d8c4c519934f1607c0eda476cfa8ce288d9b42968e8eb83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.725277 kubelet[2761]: E0715 05:16:18.724928 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bad518e420c4c0a6d8c4c519934f1607c0eda476cfa8ce288d9b42968e8eb83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.725401 kubelet[2761]: E0715 05:16:18.725003 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bad518e420c4c0a6d8c4c519934f1607c0eda476cfa8ce288d9b42968e8eb83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9xn7s" Jul 15 05:16:18.725401 kubelet[2761]: E0715 05:16:18.725370 2761 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bad518e420c4c0a6d8c4c519934f1607c0eda476cfa8ce288d9b42968e8eb83\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9xn7s" Jul 15 05:16:18.725624 kubelet[2761]: E0715 05:16:18.725587 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9xn7s_calico-system(14bfd7a9-4124-403a-afcc-084c249af056)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9xn7s_calico-system(14bfd7a9-4124-403a-afcc-084c249af056)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0bad518e420c4c0a6d8c4c519934f1607c0eda476cfa8ce288d9b42968e8eb83\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9xn7s" podUID="14bfd7a9-4124-403a-afcc-084c249af056" Jul 15 05:16:18.728891 containerd[1591]: time="2025-07-15T05:16:18.728175123Z" level=error msg="Failed to destroy network for sandbox \"c1b9f5d4d8f6b1e5adba36b5159152994af78d2abcc5881560a70e08a28bb560\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.730450 containerd[1591]: time="2025-07-15T05:16:18.730426474Z" level=error msg="Failed to destroy network for sandbox \"5d2f2ecbc742d4e7256d110b777a71d4bd178d6d1ca12adfdd46b935ae245e4e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.731174 containerd[1591]: time="2025-07-15T05:16:18.731122386Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6494bf7c78-znpph,Uid:84c72182-29d8-44ef-b493-115789db3f13,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1b9f5d4d8f6b1e5adba36b5159152994af78d2abcc5881560a70e08a28bb560\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.731269 systemd[1]: 
run-netns-cni\x2d015a44ab\x2d7c60\x2d698a\x2d153f\x2d1e5fa0d3c27e.mount: Deactivated successfully. Jul 15 05:16:18.731709 kubelet[2761]: E0715 05:16:18.731602 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1b9f5d4d8f6b1e5adba36b5159152994af78d2abcc5881560a70e08a28bb560\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.732210 kubelet[2761]: E0715 05:16:18.731769 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1b9f5d4d8f6b1e5adba36b5159152994af78d2abcc5881560a70e08a28bb560\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6494bf7c78-znpph" Jul 15 05:16:18.732210 kubelet[2761]: E0715 05:16:18.732144 2761 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c1b9f5d4d8f6b1e5adba36b5159152994af78d2abcc5881560a70e08a28bb560\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6494bf7c78-znpph" Jul 15 05:16:18.732210 kubelet[2761]: E0715 05:16:18.732176 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6494bf7c78-znpph_calico-apiserver(84c72182-29d8-44ef-b493-115789db3f13)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6494bf7c78-znpph_calico-apiserver(84c72182-29d8-44ef-b493-115789db3f13)\\\": rpc error: code = Unknown desc = 
failed to setup network for sandbox \\\"c1b9f5d4d8f6b1e5adba36b5159152994af78d2abcc5881560a70e08a28bb560\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6494bf7c78-znpph" podUID="84c72182-29d8-44ef-b493-115789db3f13" Jul 15 05:16:18.732814 containerd[1591]: time="2025-07-15T05:16:18.732789005Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-694c859895-vr2gt,Uid:c691497d-2512-41d4-86a2-14832e0a4196,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d2f2ecbc742d4e7256d110b777a71d4bd178d6d1ca12adfdd46b935ae245e4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.732976 kubelet[2761]: E0715 05:16:18.732958 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d2f2ecbc742d4e7256d110b777a71d4bd178d6d1ca12adfdd46b935ae245e4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.733063 kubelet[2761]: E0715 05:16:18.732996 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d2f2ecbc742d4e7256d110b777a71d4bd178d6d1ca12adfdd46b935ae245e4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-694c859895-vr2gt" Jul 15 05:16:18.733063 kubelet[2761]: E0715 
05:16:18.733017 2761 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5d2f2ecbc742d4e7256d110b777a71d4bd178d6d1ca12adfdd46b935ae245e4e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-694c859895-vr2gt" Jul 15 05:16:18.733063 kubelet[2761]: E0715 05:16:18.733041 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-694c859895-vr2gt_calico-system(c691497d-2512-41d4-86a2-14832e0a4196)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-694c859895-vr2gt_calico-system(c691497d-2512-41d4-86a2-14832e0a4196)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5d2f2ecbc742d4e7256d110b777a71d4bd178d6d1ca12adfdd46b935ae245e4e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-694c859895-vr2gt" podUID="c691497d-2512-41d4-86a2-14832e0a4196" Jul 15 05:16:18.733219 containerd[1591]: time="2025-07-15T05:16:18.732962730Z" level=error msg="Failed to destroy network for sandbox \"3188f009f68ae432f787b378d4698014317d61246798e47692b9ef01ebd26d1b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.734347 containerd[1591]: time="2025-07-15T05:16:18.734273887Z" level=error msg="Failed to destroy network for sandbox \"096a28388e8dd84341c1008269c1a495f6e2f532f392fa08b02fe174836ee49c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.734885 containerd[1591]: time="2025-07-15T05:16:18.734821956Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-m56c8,Uid:a6375220-4b2a-45bf-9f5c-934a1ffc3abf,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3188f009f68ae432f787b378d4698014317d61246798e47692b9ef01ebd26d1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.735075 kubelet[2761]: E0715 05:16:18.735052 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3188f009f68ae432f787b378d4698014317d61246798e47692b9ef01ebd26d1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.735135 kubelet[2761]: E0715 05:16:18.735076 2761 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3188f009f68ae432f787b378d4698014317d61246798e47692b9ef01ebd26d1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-m56c8" Jul 15 05:16:18.735135 kubelet[2761]: E0715 05:16:18.735087 2761 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3188f009f68ae432f787b378d4698014317d61246798e47692b9ef01ebd26d1b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-m56c8" Jul 15 05:16:18.735135 kubelet[2761]: E0715 05:16:18.735105 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-m56c8_kube-system(a6375220-4b2a-45bf-9f5c-934a1ffc3abf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-m56c8_kube-system(a6375220-4b2a-45bf-9f5c-934a1ffc3abf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3188f009f68ae432f787b378d4698014317d61246798e47692b9ef01ebd26d1b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-m56c8" podUID="a6375220-4b2a-45bf-9f5c-934a1ffc3abf" Jul 15 05:16:18.735763 containerd[1591]: time="2025-07-15T05:16:18.735735168Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85cc8dcb94-t9k8t,Uid:33523259-a708-46f2-aa96-0ef7a77d199a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"096a28388e8dd84341c1008269c1a495f6e2f532f392fa08b02fe174836ee49c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.735864 kubelet[2761]: E0715 05:16:18.735842 2761 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"096a28388e8dd84341c1008269c1a495f6e2f532f392fa08b02fe174836ee49c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 05:16:18.735889 kubelet[2761]: E0715 05:16:18.735864 2761 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"096a28388e8dd84341c1008269c1a495f6e2f532f392fa08b02fe174836ee49c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-85cc8dcb94-t9k8t" Jul 15 05:16:18.735953 kubelet[2761]: E0715 05:16:18.735875 2761 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"096a28388e8dd84341c1008269c1a495f6e2f532f392fa08b02fe174836ee49c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-85cc8dcb94-t9k8t" Jul 15 05:16:18.736048 kubelet[2761]: E0715 05:16:18.735954 2761 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-85cc8dcb94-t9k8t_calico-system(33523259-a708-46f2-aa96-0ef7a77d199a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-85cc8dcb94-t9k8t_calico-system(33523259-a708-46f2-aa96-0ef7a77d199a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"096a28388e8dd84341c1008269c1a495f6e2f532f392fa08b02fe174836ee49c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-85cc8dcb94-t9k8t" podUID="33523259-a708-46f2-aa96-0ef7a77d199a" Jul 15 05:16:19.623984 systemd[1]: run-netns-cni\x2d40862b2a\x2d23b1\x2d2b4f\x2d007f\x2d404e554ef36b.mount: Deactivated successfully. Jul 15 05:16:19.624486 systemd[1]: run-netns-cni\x2d07b877b3\x2dac64\x2df2d8\x2dc777\x2d9cf6f1d3140b.mount: Deactivated successfully. 
Jul 15 05:16:19.624638 systemd[1]: run-netns-cni\x2d46a76c90\x2df182\x2d3705\x2d05eb\x2df7870619d327.mount: Deactivated successfully. Jul 15 05:16:22.497318 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4074150826.mount: Deactivated successfully. Jul 15 05:16:22.532357 containerd[1591]: time="2025-07-15T05:16:22.532321910Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:22.533598 containerd[1591]: time="2025-07-15T05:16:22.533557865Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=158500163" Jul 15 05:16:22.534084 containerd[1591]: time="2025-07-15T05:16:22.534051499Z" level=info msg="ImageCreate event name:\"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:22.535534 containerd[1591]: time="2025-07-15T05:16:22.535505519Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:22.536100 containerd[1591]: time="2025-07-15T05:16:22.535789619Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"158500025\" in 3.957317161s" Jul 15 05:16:22.536100 containerd[1591]: time="2025-07-15T05:16:22.535808070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:cc52550d767f73458fee2ee68db9db5de30d175e8fa4569ebdb43610127b6d20\"" Jul 15 05:16:22.562060 containerd[1591]: time="2025-07-15T05:16:22.562028897Z" level=info 
msg="CreateContainer within sandbox \"6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 15 05:16:22.573037 containerd[1591]: time="2025-07-15T05:16:22.571772789Z" level=info msg="Container 87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:22.572868 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount862537020.mount: Deactivated successfully. Jul 15 05:16:22.598841 containerd[1591]: time="2025-07-15T05:16:22.598797971Z" level=info msg="CreateContainer within sandbox \"6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\"" Jul 15 05:16:22.599491 containerd[1591]: time="2025-07-15T05:16:22.599398902Z" level=info msg="StartContainer for \"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\"" Jul 15 05:16:22.605220 containerd[1591]: time="2025-07-15T05:16:22.605195642Z" level=info msg="connecting to shim 87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f" address="unix:///run/containerd/s/f589dd8d63c0284808d9a3bb9867cde47623a6790aa071d27c41a4e3aae18e20" protocol=ttrpc version=3 Jul 15 05:16:22.674123 systemd[1]: Started cri-containerd-87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f.scope - libcontainer container 87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f. Jul 15 05:16:22.708942 containerd[1591]: time="2025-07-15T05:16:22.708918260Z" level=info msg="StartContainer for \"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" returns successfully" Jul 15 05:16:22.778985 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 15 05:16:22.779441 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jul 15 05:16:22.969001 kubelet[2761]: I0715 05:16:22.968958 2761 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33523259-a708-46f2-aa96-0ef7a77d199a-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "33523259-a708-46f2-aa96-0ef7a77d199a" (UID: "33523259-a708-46f2-aa96-0ef7a77d199a"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jul 15 05:16:22.970343 kubelet[2761]: I0715 05:16:22.970297 2761 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33523259-a708-46f2-aa96-0ef7a77d199a-whisker-ca-bundle\") pod \"33523259-a708-46f2-aa96-0ef7a77d199a\" (UID: \"33523259-a708-46f2-aa96-0ef7a77d199a\") " Jul 15 05:16:22.970808 kubelet[2761]: I0715 05:16:22.970471 2761 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/33523259-a708-46f2-aa96-0ef7a77d199a-whisker-backend-key-pair\") pod \"33523259-a708-46f2-aa96-0ef7a77d199a\" (UID: \"33523259-a708-46f2-aa96-0ef7a77d199a\") " Jul 15 05:16:22.970808 kubelet[2761]: I0715 05:16:22.970489 2761 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcqdm\" (UniqueName: \"kubernetes.io/projected/33523259-a708-46f2-aa96-0ef7a77d199a-kube-api-access-xcqdm\") pod \"33523259-a708-46f2-aa96-0ef7a77d199a\" (UID: \"33523259-a708-46f2-aa96-0ef7a77d199a\") " Jul 15 05:16:22.971027 kubelet[2761]: I0715 05:16:22.970894 2761 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/33523259-a708-46f2-aa96-0ef7a77d199a-whisker-ca-bundle\") on node \"ci-4396-0-0-n-e83c776e20\" DevicePath \"\"" Jul 15 05:16:22.974431 kubelet[2761]: I0715 05:16:22.974409 2761 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/33523259-a708-46f2-aa96-0ef7a77d199a-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "33523259-a708-46f2-aa96-0ef7a77d199a" (UID: "33523259-a708-46f2-aa96-0ef7a77d199a"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 15 05:16:22.975098 kubelet[2761]: I0715 05:16:22.975073 2761 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33523259-a708-46f2-aa96-0ef7a77d199a-kube-api-access-xcqdm" (OuterVolumeSpecName: "kube-api-access-xcqdm") pod "33523259-a708-46f2-aa96-0ef7a77d199a" (UID: "33523259-a708-46f2-aa96-0ef7a77d199a"). InnerVolumeSpecName "kube-api-access-xcqdm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 15 05:16:23.072105 kubelet[2761]: I0715 05:16:23.072045 2761 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/33523259-a708-46f2-aa96-0ef7a77d199a-whisker-backend-key-pair\") on node \"ci-4396-0-0-n-e83c776e20\" DevicePath \"\"" Jul 15 05:16:23.072105 kubelet[2761]: I0715 05:16:23.072079 2761 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xcqdm\" (UniqueName: \"kubernetes.io/projected/33523259-a708-46f2-aa96-0ef7a77d199a-kube-api-access-xcqdm\") on node \"ci-4396-0-0-n-e83c776e20\" DevicePath \"\"" Jul 15 05:16:23.451370 systemd[1]: Removed slice kubepods-besteffort-pod33523259_a708_46f2_aa96_0ef7a77d199a.slice - libcontainer container kubepods-besteffort-pod33523259_a708_46f2_aa96_0ef7a77d199a.slice. Jul 15 05:16:23.500764 systemd[1]: var-lib-kubelet-pods-33523259\x2da708\x2d46f2\x2daa96\x2d0ef7a77d199a-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dxcqdm.mount: Deactivated successfully. 
Jul 15 05:16:23.500935 systemd[1]: var-lib-kubelet-pods-33523259\x2da708\x2d46f2\x2daa96\x2d0ef7a77d199a-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 15 05:16:23.652148 kubelet[2761]: I0715 05:16:23.651999 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-dwr67" podStartSLOduration=2.23073303 podStartE2EDuration="16.651962877s" podCreationTimestamp="2025-07-15 05:16:07 +0000 UTC" firstStartedPulling="2025-07-15 05:16:08.115867952 +0000 UTC m=+18.767241928" lastFinishedPulling="2025-07-15 05:16:22.537097809 +0000 UTC m=+33.188471775" observedRunningTime="2025-07-15 05:16:23.629204037 +0000 UTC m=+34.280578053" watchObservedRunningTime="2025-07-15 05:16:23.651962877 +0000 UTC m=+34.303336883" Jul 15 05:16:23.696620 systemd[1]: Created slice kubepods-besteffort-pod2033dfa0_693e_4a8a_9342_4de54e705015.slice - libcontainer container kubepods-besteffort-pod2033dfa0_693e_4a8a_9342_4de54e705015.slice. 
Jul 15 05:16:23.778078 kubelet[2761]: I0715 05:16:23.777720 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2033dfa0-693e-4a8a-9342-4de54e705015-whisker-ca-bundle\") pod \"whisker-6c848fd45d-g24rg\" (UID: \"2033dfa0-693e-4a8a-9342-4de54e705015\") " pod="calico-system/whisker-6c848fd45d-g24rg" Jul 15 05:16:23.778078 kubelet[2761]: I0715 05:16:23.777808 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpbbk\" (UniqueName: \"kubernetes.io/projected/2033dfa0-693e-4a8a-9342-4de54e705015-kube-api-access-tpbbk\") pod \"whisker-6c848fd45d-g24rg\" (UID: \"2033dfa0-693e-4a8a-9342-4de54e705015\") " pod="calico-system/whisker-6c848fd45d-g24rg" Jul 15 05:16:23.778078 kubelet[2761]: I0715 05:16:23.777881 2761 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/2033dfa0-693e-4a8a-9342-4de54e705015-whisker-backend-key-pair\") pod \"whisker-6c848fd45d-g24rg\" (UID: \"2033dfa0-693e-4a8a-9342-4de54e705015\") " pod="calico-system/whisker-6c848fd45d-g24rg" Jul 15 05:16:24.001238 containerd[1591]: time="2025-07-15T05:16:24.001122392Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c848fd45d-g24rg,Uid:2033dfa0-693e-4a8a-9342-4de54e705015,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:24.441202 systemd-networkd[1464]: caliaa569aff2e9: Link UP Jul 15 05:16:24.442657 systemd-networkd[1464]: caliaa569aff2e9: Gained carrier Jul 15 05:16:24.471815 containerd[1591]: time="2025-07-15T05:16:24.470094508Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"cfe2f8a89b5bcd806e44f79bbed252faeca8c6c8128cfafcf5d797a333b7d429\" pid:4009 exit_status:1 exited_at:{seconds:1752556584 nanos:434604209}" Jul 15 
05:16:24.472507 containerd[1591]: 2025-07-15 05:16:24.148 [INFO][3935] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:16:24.472507 containerd[1591]: 2025-07-15 05:16:24.203 [INFO][3935] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0 whisker-6c848fd45d- calico-system 2033dfa0-693e-4a8a-9342-4de54e705015 845 0 2025-07-15 05:16:23 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6c848fd45d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4396-0-0-n-e83c776e20 whisker-6c848fd45d-g24rg eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliaa569aff2e9 [] [] }} ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Namespace="calico-system" Pod="whisker-6c848fd45d-g24rg" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-" Jul 15 05:16:24.472507 containerd[1591]: 2025-07-15 05:16:24.203 [INFO][3935] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Namespace="calico-system" Pod="whisker-6c848fd45d-g24rg" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" Jul 15 05:16:24.472507 containerd[1591]: 2025-07-15 05:16:24.387 [INFO][3988] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" HandleID="k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Workload="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.389 [INFO][3988] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" HandleID="k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Workload="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00004f540), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-e83c776e20", "pod":"whisker-6c848fd45d-g24rg", "timestamp":"2025-07-15 05:16:24.38723598 +0000 UTC"}, Hostname:"ci-4396-0-0-n-e83c776e20", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.389 [INFO][3988] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.389 [INFO][3988] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.390 [INFO][3988] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-e83c776e20' Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.402 [INFO][3988] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.409 [INFO][3988] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.414 [INFO][3988] ipam/ipam.go 511: Trying affinity for 192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.415 [INFO][3988] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.472641 containerd[1591]: 2025-07-15 05:16:24.417 [INFO][3988] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.473697 containerd[1591]: 2025-07-15 05:16:24.418 [INFO][3988] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.64/26 handle="k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.473697 containerd[1591]: 2025-07-15 05:16:24.419 [INFO][3988] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112 Jul 15 05:16:24.473697 containerd[1591]: 2025-07-15 05:16:24.424 [INFO][3988] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.64/26 handle="k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.473697 containerd[1591]: 2025-07-15 05:16:24.427 [INFO][3988] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.65/26] block=192.168.61.64/26 handle="k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.473697 containerd[1591]: 2025-07-15 05:16:24.427 [INFO][3988] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.65/26] handle="k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:24.473697 containerd[1591]: 2025-07-15 05:16:24.427 [INFO][3988] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:16:24.473697 containerd[1591]: 2025-07-15 05:16:24.427 [INFO][3988] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.65/26] IPv6=[] ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" HandleID="k8s-pod-network.d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Workload="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" Jul 15 05:16:24.474189 containerd[1591]: 2025-07-15 05:16:24.430 [INFO][3935] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Namespace="calico-system" Pod="whisker-6c848fd45d-g24rg" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0", GenerateName:"whisker-6c848fd45d-", Namespace:"calico-system", SelfLink:"", UID:"2033dfa0-693e-4a8a-9342-4de54e705015", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6c848fd45d", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"", Pod:"whisker-6c848fd45d-g24rg", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaa569aff2e9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:24.474189 containerd[1591]: 2025-07-15 05:16:24.430 [INFO][3935] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.65/32] ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Namespace="calico-system" Pod="whisker-6c848fd45d-g24rg" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" Jul 15 05:16:24.474258 containerd[1591]: 2025-07-15 05:16:24.430 [INFO][3935] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaa569aff2e9 ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Namespace="calico-system" Pod="whisker-6c848fd45d-g24rg" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" Jul 15 05:16:24.474258 containerd[1591]: 2025-07-15 05:16:24.443 [INFO][3935] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Namespace="calico-system" Pod="whisker-6c848fd45d-g24rg" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" Jul 15 05:16:24.474289 containerd[1591]: 2025-07-15 05:16:24.443 [INFO][3935] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Namespace="calico-system" Pod="whisker-6c848fd45d-g24rg" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0", GenerateName:"whisker-6c848fd45d-", Namespace:"calico-system", SelfLink:"", UID:"2033dfa0-693e-4a8a-9342-4de54e705015", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6c848fd45d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112", Pod:"whisker-6c848fd45d-g24rg", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.61.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliaa569aff2e9", MAC:"9a:33:3c:6a:b8:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:24.474328 containerd[1591]: 2025-07-15 05:16:24.458 [INFO][3935] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" Namespace="calico-system" Pod="whisker-6c848fd45d-g24rg" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-whisker--6c848fd45d--g24rg-eth0" Jul 15 05:16:24.544626 containerd[1591]: time="2025-07-15T05:16:24.544574210Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"2b87c93a645b2daddc43b428fbde23a99b535db8d774f09e55e13690929ceea6\" pid:4043 exit_status:1 exited_at:{seconds:1752556584 nanos:543811054}" Jul 15 05:16:24.558660 containerd[1591]: time="2025-07-15T05:16:24.558630811Z" level=info msg="connecting to shim d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112" address="unix:///run/containerd/s/b87ef7b13dc488e7933abe9526aa4ddd0a5d644c7c7f6ce834acf7c29f9c3319" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:24.583129 systemd[1]: Started cri-containerd-d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112.scope - libcontainer container d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112. 
Jul 15 05:16:24.633367 containerd[1591]: time="2025-07-15T05:16:24.633332426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c848fd45d-g24rg,Uid:2033dfa0-693e-4a8a-9342-4de54e705015,Namespace:calico-system,Attempt:0,} returns sandbox id \"d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112\"" Jul 15 05:16:24.641119 containerd[1591]: time="2025-07-15T05:16:24.641000890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 15 05:16:24.681231 containerd[1591]: time="2025-07-15T05:16:24.681179634Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"181ffc9042a70e9400a813a7909a603e5b56c6e193b273217509150177c23c04\" pid:4115 exit_status:1 exited_at:{seconds:1752556584 nanos:680578298}" Jul 15 05:16:25.441782 kubelet[2761]: I0715 05:16:25.441752 2761 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33523259-a708-46f2-aa96-0ef7a77d199a" path="/var/lib/kubelet/pods/33523259-a708-46f2-aa96-0ef7a77d199a/volumes" Jul 15 05:16:25.512325 systemd-networkd[1464]: caliaa569aff2e9: Gained IPv6LL Jul 15 05:16:25.723435 containerd[1591]: time="2025-07-15T05:16:25.723347235Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"16a6309f1ed254f0190d5fa65992f8edfab350f5e22c217eb1ba2e6aba03f1e7\" pid:4162 exit_status:1 exited_at:{seconds:1752556585 nanos:722859427}" Jul 15 05:16:26.335808 containerd[1591]: time="2025-07-15T05:16:26.335766503Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:26.336545 containerd[1591]: time="2025-07-15T05:16:26.336522104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4661207" Jul 15 05:16:26.337479 containerd[1591]: 
time="2025-07-15T05:16:26.337442313Z" level=info msg="ImageCreate event name:\"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:26.339116 containerd[1591]: time="2025-07-15T05:16:26.339078040Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:26.339676 containerd[1591]: time="2025-07-15T05:16:26.339412307Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"6153902\" in 1.698307611s" Jul 15 05:16:26.339676 containerd[1591]: time="2025-07-15T05:16:26.339433009Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:eb8f512acf9402730da120a7b0d47d3d9d451b56e6e5eb8bad53ab24f926f954\"" Jul 15 05:16:26.341905 containerd[1591]: time="2025-07-15T05:16:26.341884209Z" level=info msg="CreateContainer within sandbox \"d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 15 05:16:26.349154 containerd[1591]: time="2025-07-15T05:16:26.349138595Z" level=info msg="Container 0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:26.354751 containerd[1591]: time="2025-07-15T05:16:26.354720742Z" level=info msg="CreateContainer within sandbox \"d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620\"" 
Jul 15 05:16:26.355648 containerd[1591]: time="2025-07-15T05:16:26.355080492Z" level=info msg="StartContainer for \"0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620\"" Jul 15 05:16:26.355928 containerd[1591]: time="2025-07-15T05:16:26.355905966Z" level=info msg="connecting to shim 0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620" address="unix:///run/containerd/s/b87ef7b13dc488e7933abe9526aa4ddd0a5d644c7c7f6ce834acf7c29f9c3319" protocol=ttrpc version=3 Jul 15 05:16:26.373141 systemd[1]: Started cri-containerd-0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620.scope - libcontainer container 0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620. Jul 15 05:16:26.411384 containerd[1591]: time="2025-07-15T05:16:26.411342477Z" level=info msg="StartContainer for \"0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620\" returns successfully" Jul 15 05:16:26.412578 containerd[1591]: time="2025-07-15T05:16:26.412521920Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 15 05:16:28.349628 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3857406482.mount: Deactivated successfully. 
Jul 15 05:16:28.365570 containerd[1591]: time="2025-07-15T05:16:28.365522181Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:28.366397 containerd[1591]: time="2025-07-15T05:16:28.366260066Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=33083477" Jul 15 05:16:28.367219 containerd[1591]: time="2025-07-15T05:16:28.367200790Z" level=info msg="ImageCreate event name:\"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:28.370525 containerd[1591]: time="2025-07-15T05:16:28.370506134Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:28.371855 containerd[1591]: time="2025-07-15T05:16:28.371829376Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"33083307\" in 1.959149498s" Jul 15 05:16:28.371962 containerd[1591]: time="2025-07-15T05:16:28.371932721Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:6ba7e39edcd8be6d32dfccbfdb65533a727b14a19173515e91607d4259f8ee7f\"" Jul 15 05:16:28.374610 containerd[1591]: time="2025-07-15T05:16:28.374573275Z" level=info msg="CreateContainer within sandbox \"d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 15 05:16:28.382297 
containerd[1591]: time="2025-07-15T05:16:28.382272315Z" level=info msg="Container fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:28.388825 containerd[1591]: time="2025-07-15T05:16:28.388797820Z" level=info msg="CreateContainer within sandbox \"d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4\"" Jul 15 05:16:28.389214 containerd[1591]: time="2025-07-15T05:16:28.389194059Z" level=info msg="StartContainer for \"fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4\"" Jul 15 05:16:28.389898 containerd[1591]: time="2025-07-15T05:16:28.389875521Z" level=info msg="connecting to shim fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4" address="unix:///run/containerd/s/b87ef7b13dc488e7933abe9526aa4ddd0a5d644c7c7f6ce834acf7c29f9c3319" protocol=ttrpc version=3 Jul 15 05:16:28.412258 systemd[1]: Started cri-containerd-fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4.scope - libcontainer container fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4. 
Jul 15 05:16:28.453182 containerd[1591]: time="2025-07-15T05:16:28.453149101Z" level=info msg="StartContainer for \"fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4\" returns successfully" Jul 15 05:16:28.660622 kubelet[2761]: I0715 05:16:28.659424 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6c848fd45d-g24rg" podStartSLOduration=1.927372275 podStartE2EDuration="5.659399662s" podCreationTimestamp="2025-07-15 05:16:23 +0000 UTC" firstStartedPulling="2025-07-15 05:16:24.640626958 +0000 UTC m=+35.292000924" lastFinishedPulling="2025-07-15 05:16:28.372654335 +0000 UTC m=+39.024028311" observedRunningTime="2025-07-15 05:16:28.658093761 +0000 UTC m=+39.309467767" watchObservedRunningTime="2025-07-15 05:16:28.659399662 +0000 UTC m=+39.310773668" Jul 15 05:16:29.441104 containerd[1591]: time="2025-07-15T05:16:29.440712780Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-694c859895-vr2gt,Uid:c691497d-2512-41d4-86a2-14832e0a4196,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:29.441735 containerd[1591]: time="2025-07-15T05:16:29.440721111Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6494bf7c78-b7q6m,Uid:272d51a9-e3b2-48a9-b04e-5a452ea134b1,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:16:29.564527 systemd-networkd[1464]: calif7cd30c7c1b: Link UP Jul 15 05:16:29.564700 systemd-networkd[1464]: calif7cd30c7c1b: Gained carrier Jul 15 05:16:29.578511 containerd[1591]: 2025-07-15 05:16:29.490 [INFO][4330] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:16:29.578511 containerd[1591]: 2025-07-15 05:16:29.499 [INFO][4330] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0 calico-apiserver-6494bf7c78- calico-apiserver 272d51a9-e3b2-48a9-b04e-5a452ea134b1 784 0 2025-07-15 05:16:05 +0000 
UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6494bf7c78 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396-0-0-n-e83c776e20 calico-apiserver-6494bf7c78-b7q6m eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif7cd30c7c1b [] [] }} ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-b7q6m" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-" Jul 15 05:16:29.578511 containerd[1591]: 2025-07-15 05:16:29.499 [INFO][4330] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-b7q6m" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" Jul 15 05:16:29.578511 containerd[1591]: 2025-07-15 05:16:29.533 [INFO][4349] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" HandleID="k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.534 [INFO][4349] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" HandleID="k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d5710), Attrs:map[string]string{"namespace":"calico-apiserver", 
"node":"ci-4396-0-0-n-e83c776e20", "pod":"calico-apiserver-6494bf7c78-b7q6m", "timestamp":"2025-07-15 05:16:29.533896388 +0000 UTC"}, Hostname:"ci-4396-0-0-n-e83c776e20", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.534 [INFO][4349] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.534 [INFO][4349] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.534 [INFO][4349] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-e83c776e20' Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.539 [INFO][4349] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.542 [INFO][4349] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.546 [INFO][4349] ipam/ipam.go 511: Trying affinity for 192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.547 [INFO][4349] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578677 containerd[1591]: 2025-07-15 05:16:29.548 [INFO][4349] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578825 containerd[1591]: 2025-07-15 05:16:29.548 [INFO][4349] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.64/26 
handle="k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578825 containerd[1591]: 2025-07-15 05:16:29.550 [INFO][4349] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979 Jul 15 05:16:29.578825 containerd[1591]: 2025-07-15 05:16:29.554 [INFO][4349] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.64/26 handle="k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578825 containerd[1591]: 2025-07-15 05:16:29.558 [INFO][4349] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.66/26] block=192.168.61.64/26 handle="k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578825 containerd[1591]: 2025-07-15 05:16:29.558 [INFO][4349] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.66/26] handle="k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.578825 containerd[1591]: 2025-07-15 05:16:29.558 [INFO][4349] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 05:16:29.578825 containerd[1591]: 2025-07-15 05:16:29.558 [INFO][4349] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.66/26] IPv6=[] ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" HandleID="k8s-pod-network.6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" Jul 15 05:16:29.578989 containerd[1591]: 2025-07-15 05:16:29.562 [INFO][4330] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-b7q6m" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0", GenerateName:"calico-apiserver-6494bf7c78-", Namespace:"calico-apiserver", SelfLink:"", UID:"272d51a9-e3b2-48a9-b04e-5a452ea134b1", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6494bf7c78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"", Pod:"calico-apiserver-6494bf7c78-b7q6m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.61.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif7cd30c7c1b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:29.579050 containerd[1591]: 2025-07-15 05:16:29.562 [INFO][4330] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.66/32] ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-b7q6m" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" Jul 15 05:16:29.579050 containerd[1591]: 2025-07-15 05:16:29.562 [INFO][4330] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif7cd30c7c1b ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-b7q6m" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" Jul 15 05:16:29.579050 containerd[1591]: 2025-07-15 05:16:29.565 [INFO][4330] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-b7q6m" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" Jul 15 05:16:29.579108 containerd[1591]: 2025-07-15 05:16:29.565 [INFO][4330] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-b7q6m" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0", GenerateName:"calico-apiserver-6494bf7c78-", Namespace:"calico-apiserver", SelfLink:"", UID:"272d51a9-e3b2-48a9-b04e-5a452ea134b1", ResourceVersion:"784", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6494bf7c78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979", Pod:"calico-apiserver-6494bf7c78-b7q6m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif7cd30c7c1b", MAC:"3a:34:54:e0:2e:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:29.579145 containerd[1591]: 2025-07-15 05:16:29.573 [INFO][4330] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-b7q6m" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--b7q6m-eth0" Jul 15 05:16:29.603155 containerd[1591]: time="2025-07-15T05:16:29.603114514Z" level=info 
msg="connecting to shim 6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979" address="unix:///run/containerd/s/4ad236785feefb9a4b373b51b89b0e108e9ba66f88dd6755d6b423a2fac7da75" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:29.631186 systemd[1]: Started cri-containerd-6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979.scope - libcontainer container 6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979. Jul 15 05:16:29.669568 systemd-networkd[1464]: cali75308fc8b2d: Link UP Jul 15 05:16:29.670544 systemd-networkd[1464]: cali75308fc8b2d: Gained carrier Jul 15 05:16:29.686566 containerd[1591]: 2025-07-15 05:16:29.499 [INFO][4326] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:16:29.686566 containerd[1591]: 2025-07-15 05:16:29.512 [INFO][4326] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0 calico-kube-controllers-694c859895- calico-system c691497d-2512-41d4-86a2-14832e0a4196 783 0 2025-07-15 05:16:08 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:694c859895 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4396-0-0-n-e83c776e20 calico-kube-controllers-694c859895-vr2gt eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali75308fc8b2d [] [] }} ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Namespace="calico-system" Pod="calico-kube-controllers-694c859895-vr2gt" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-" Jul 15 05:16:29.686566 containerd[1591]: 2025-07-15 05:16:29.512 [INFO][4326] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Namespace="calico-system" Pod="calico-kube-controllers-694c859895-vr2gt" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" Jul 15 05:16:29.686566 containerd[1591]: 2025-07-15 05:16:29.536 [INFO][4354] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" HandleID="k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.536 [INFO][4354] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" HandleID="k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002b74a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-e83c776e20", "pod":"calico-kube-controllers-694c859895-vr2gt", "timestamp":"2025-07-15 05:16:29.536429569 +0000 UTC"}, Hostname:"ci-4396-0-0-n-e83c776e20", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.536 [INFO][4354] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.559 [INFO][4354] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.559 [INFO][4354] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-e83c776e20' Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.640 [INFO][4354] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.645 [INFO][4354] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.648 [INFO][4354] ipam/ipam.go 511: Trying affinity for 192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.649 [INFO][4354] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.686951 containerd[1591]: 2025-07-15 05:16:29.651 [INFO][4354] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.687127 containerd[1591]: 2025-07-15 05:16:29.651 [INFO][4354] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.64/26 handle="k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.687127 containerd[1591]: 2025-07-15 05:16:29.652 [INFO][4354] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c Jul 15 05:16:29.687127 containerd[1591]: 2025-07-15 05:16:29.656 [INFO][4354] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.64/26 handle="k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.687127 containerd[1591]: 2025-07-15 05:16:29.663 [INFO][4354] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.67/26] block=192.168.61.64/26 handle="k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.687127 containerd[1591]: 2025-07-15 05:16:29.663 [INFO][4354] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.67/26] handle="k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:29.687127 containerd[1591]: 2025-07-15 05:16:29.663 [INFO][4354] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:16:29.687127 containerd[1591]: 2025-07-15 05:16:29.663 [INFO][4354] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.67/26] IPv6=[] ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" HandleID="k8s-pod-network.74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" Jul 15 05:16:29.687234 containerd[1591]: 2025-07-15 05:16:29.666 [INFO][4326] cni-plugin/k8s.go 418: Populated endpoint ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Namespace="calico-system" Pod="calico-kube-controllers-694c859895-vr2gt" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0", GenerateName:"calico-kube-controllers-694c859895-", Namespace:"calico-system", SelfLink:"", UID:"c691497d-2512-41d4-86a2-14832e0a4196", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"694c859895", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"", Pod:"calico-kube-controllers-694c859895-vr2gt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali75308fc8b2d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:29.687271 containerd[1591]: 2025-07-15 05:16:29.666 [INFO][4326] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.67/32] ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Namespace="calico-system" Pod="calico-kube-controllers-694c859895-vr2gt" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" Jul 15 05:16:29.687271 containerd[1591]: 2025-07-15 05:16:29.666 [INFO][4326] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali75308fc8b2d ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Namespace="calico-system" Pod="calico-kube-controllers-694c859895-vr2gt" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" Jul 15 05:16:29.687271 containerd[1591]: 2025-07-15 05:16:29.671 [INFO][4326] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Namespace="calico-system" Pod="calico-kube-controllers-694c859895-vr2gt" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" Jul 15 05:16:29.687326 containerd[1591]: 2025-07-15 05:16:29.671 [INFO][4326] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Namespace="calico-system" Pod="calico-kube-controllers-694c859895-vr2gt" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0", GenerateName:"calico-kube-controllers-694c859895-", Namespace:"calico-system", SelfLink:"", UID:"c691497d-2512-41d4-86a2-14832e0a4196", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"694c859895", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c", Pod:"calico-kube-controllers-694c859895-vr2gt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.61.67/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali75308fc8b2d", MAC:"16:60:db:d1:33:f6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:29.687372 containerd[1591]: 2025-07-15 05:16:29.683 [INFO][4326] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" Namespace="calico-system" Pod="calico-kube-controllers-694c859895-vr2gt" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--kube--controllers--694c859895--vr2gt-eth0" Jul 15 05:16:29.690342 containerd[1591]: time="2025-07-15T05:16:29.690322020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6494bf7c78-b7q6m,Uid:272d51a9-e3b2-48a9-b04e-5a452ea134b1,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979\"" Jul 15 05:16:29.691944 containerd[1591]: time="2025-07-15T05:16:29.691889368Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 05:16:29.708234 containerd[1591]: time="2025-07-15T05:16:29.708209544Z" level=info msg="connecting to shim 74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c" address="unix:///run/containerd/s/c0945f1fc33942b0b665a43400484ebd95eb75dc5f3281e90424bed186532366" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:29.730124 systemd[1]: Started cri-containerd-74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c.scope - libcontainer container 74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c. 
Jul 15 05:16:29.765620 containerd[1591]: time="2025-07-15T05:16:29.765591731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-694c859895-vr2gt,Uid:c691497d-2512-41d4-86a2-14832e0a4196,Namespace:calico-system,Attempt:0,} returns sandbox id \"74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c\"" Jul 15 05:16:30.439918 containerd[1591]: time="2025-07-15T05:16:30.439790323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-6hlgx,Uid:9c02e846-0cfc-42de-9a5f-fa54428989e8,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:30.440511 containerd[1591]: time="2025-07-15T05:16:30.439790283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-m56c8,Uid:a6375220-4b2a-45bf-9f5c-934a1ffc3abf,Namespace:kube-system,Attempt:0,}" Jul 15 05:16:30.689618 systemd-networkd[1464]: cali79a585bde21: Link UP Jul 15 05:16:30.693497 systemd-networkd[1464]: cali79a585bde21: Gained carrier Jul 15 05:16:30.707146 containerd[1591]: 2025-07-15 05:16:30.499 [INFO][4491] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:16:30.707146 containerd[1591]: 2025-07-15 05:16:30.508 [INFO][4491] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0 coredns-668d6bf9bc- kube-system a6375220-4b2a-45bf-9f5c-934a1ffc3abf 775 0 2025-07-15 05:15:56 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4396-0-0-n-e83c776e20 coredns-668d6bf9bc-m56c8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali79a585bde21 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Namespace="kube-system" Pod="coredns-668d6bf9bc-m56c8" 
WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-" Jul 15 05:16:30.707146 containerd[1591]: 2025-07-15 05:16:30.508 [INFO][4491] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Namespace="kube-system" Pod="coredns-668d6bf9bc-m56c8" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" Jul 15 05:16:30.707146 containerd[1591]: 2025-07-15 05:16:30.537 [INFO][4515] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" HandleID="k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Workload="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.537 [INFO][4515] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" HandleID="k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Workload="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58f0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4396-0-0-n-e83c776e20", "pod":"coredns-668d6bf9bc-m56c8", "timestamp":"2025-07-15 05:16:30.537537973 +0000 UTC"}, Hostname:"ci-4396-0-0-n-e83c776e20", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.537 [INFO][4515] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.537 [INFO][4515] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.537 [INFO][4515] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-e83c776e20' Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.542 [INFO][4515] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.637 [INFO][4515] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.643 [INFO][4515] ipam/ipam.go 511: Trying affinity for 192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.646 [INFO][4515] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707475 containerd[1591]: 2025-07-15 05:16:30.649 [INFO][4515] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707619 containerd[1591]: 2025-07-15 05:16:30.650 [INFO][4515] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.64/26 handle="k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707619 containerd[1591]: 2025-07-15 05:16:30.654 [INFO][4515] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3 Jul 15 05:16:30.707619 containerd[1591]: 2025-07-15 05:16:30.662 [INFO][4515] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.64/26 handle="k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707619 containerd[1591]: 2025-07-15 05:16:30.669 [INFO][4515] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.68/26] block=192.168.61.64/26 handle="k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707619 containerd[1591]: 2025-07-15 05:16:30.669 [INFO][4515] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.68/26] handle="k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.707619 containerd[1591]: 2025-07-15 05:16:30.669 [INFO][4515] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:16:30.707619 containerd[1591]: 2025-07-15 05:16:30.670 [INFO][4515] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.68/26] IPv6=[] ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" HandleID="k8s-pod-network.06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Workload="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" Jul 15 05:16:30.707726 containerd[1591]: 2025-07-15 05:16:30.676 [INFO][4491] cni-plugin/k8s.go 418: Populated endpoint ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Namespace="kube-system" Pod="coredns-668d6bf9bc-m56c8" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a6375220-4b2a-45bf-9f5c-934a1ffc3abf", ResourceVersion:"775", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 15, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"", Pod:"coredns-668d6bf9bc-m56c8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali79a585bde21", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:30.707726 containerd[1591]: 2025-07-15 05:16:30.677 [INFO][4491] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.68/32] ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Namespace="kube-system" Pod="coredns-668d6bf9bc-m56c8" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" Jul 15 05:16:30.707726 containerd[1591]: 2025-07-15 05:16:30.677 [INFO][4491] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali79a585bde21 ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Namespace="kube-system" Pod="coredns-668d6bf9bc-m56c8" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" Jul 15 05:16:30.707726 containerd[1591]: 2025-07-15 05:16:30.694 [INFO][4491] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Namespace="kube-system" Pod="coredns-668d6bf9bc-m56c8" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" Jul 15 05:16:30.707726 containerd[1591]: 2025-07-15 05:16:30.694 [INFO][4491] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Namespace="kube-system" Pod="coredns-668d6bf9bc-m56c8" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a6375220-4b2a-45bf-9f5c-934a1ffc3abf", ResourceVersion:"775", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 15, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3", Pod:"coredns-668d6bf9bc-m56c8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali79a585bde21", 
MAC:"e6:87:83:4b:0e:f5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:30.707726 containerd[1591]: 2025-07-15 05:16:30.703 [INFO][4491] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" Namespace="kube-system" Pod="coredns-668d6bf9bc-m56c8" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--m56c8-eth0" Jul 15 05:16:30.731907 containerd[1591]: time="2025-07-15T05:16:30.731883985Z" level=info msg="connecting to shim 06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3" address="unix:///run/containerd/s/9aa800a14def36b19b8d2b42c2896b8880d66ef7511d843ede3f2b8b8e41a7c1" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:30.758120 systemd[1]: Started cri-containerd-06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3.scope - libcontainer container 06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3. 
Jul 15 05:16:30.773360 systemd-networkd[1464]: cali7e0e89ce4df: Link UP Jul 15 05:16:30.774858 systemd-networkd[1464]: cali7e0e89ce4df: Gained carrier Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.501 [INFO][4493] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.510 [INFO][4493] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0 goldmane-768f4c5c69- calico-system 9c02e846-0cfc-42de-9a5f-fa54428989e8 781 0 2025-07-15 05:16:06 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4396-0-0-n-e83c776e20 goldmane-768f4c5c69-6hlgx eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7e0e89ce4df [] [] }} ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Namespace="calico-system" Pod="goldmane-768f4c5c69-6hlgx" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.510 [INFO][4493] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Namespace="calico-system" Pod="goldmane-768f4c5c69-6hlgx" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.537 [INFO][4520] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" HandleID="k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Workload="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" Jul 15 
05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.537 [INFO][4520] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" HandleID="k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Workload="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cfab0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-e83c776e20", "pod":"goldmane-768f4c5c69-6hlgx", "timestamp":"2025-07-15 05:16:30.537404727 +0000 UTC"}, Hostname:"ci-4396-0-0-n-e83c776e20", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.537 [INFO][4520] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.669 [INFO][4520] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.670 [INFO][4520] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-e83c776e20' Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.688 [INFO][4520] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.737 [INFO][4520] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.748 [INFO][4520] ipam/ipam.go 511: Trying affinity for 192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.752 [INFO][4520] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.754 [INFO][4520] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.754 [INFO][4520] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.64/26 handle="k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.755 [INFO][4520] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69 Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.759 [INFO][4520] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.64/26 handle="k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.767 [INFO][4520] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.69/26] block=192.168.61.64/26 handle="k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.767 [INFO][4520] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.69/26] handle="k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.768 [INFO][4520] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:16:30.791390 containerd[1591]: 2025-07-15 05:16:30.768 [INFO][4520] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.69/26] IPv6=[] ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" HandleID="k8s-pod-network.572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Workload="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" Jul 15 05:16:30.792125 containerd[1591]: 2025-07-15 05:16:30.770 [INFO][4493] cni-plugin/k8s.go 418: Populated endpoint ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Namespace="calico-system" Pod="goldmane-768f4c5c69-6hlgx" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"9c02e846-0cfc-42de-9a5f-fa54428989e8", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"", Pod:"goldmane-768f4c5c69-6hlgx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7e0e89ce4df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:30.792125 containerd[1591]: 2025-07-15 05:16:30.770 [INFO][4493] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.69/32] ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Namespace="calico-system" Pod="goldmane-768f4c5c69-6hlgx" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" Jul 15 05:16:30.792125 containerd[1591]: 2025-07-15 05:16:30.770 [INFO][4493] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7e0e89ce4df ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Namespace="calico-system" Pod="goldmane-768f4c5c69-6hlgx" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" Jul 15 05:16:30.792125 containerd[1591]: 2025-07-15 05:16:30.772 [INFO][4493] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Namespace="calico-system" Pod="goldmane-768f4c5c69-6hlgx" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" Jul 15 05:16:30.792125 containerd[1591]: 2025-07-15 05:16:30.772 [INFO][4493] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Namespace="calico-system" Pod="goldmane-768f4c5c69-6hlgx" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"9c02e846-0cfc-42de-9a5f-fa54428989e8", ResourceVersion:"781", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69", Pod:"goldmane-768f4c5c69-6hlgx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.61.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7e0e89ce4df", MAC:"5e:b1:a4:ee:7a:07", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:30.792125 containerd[1591]: 2025-07-15 05:16:30.785 [INFO][4493] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" Namespace="calico-system" Pod="goldmane-768f4c5c69-6hlgx" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-goldmane--768f4c5c69--6hlgx-eth0" Jul 15 05:16:30.810578 containerd[1591]: time="2025-07-15T05:16:30.810521929Z" level=info msg="connecting to shim 572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69" address="unix:///run/containerd/s/5911e1bfd0bae4b049b949e610238e5d19dd2205f1ff09f56b29767f3459b7f8" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:30.824211 systemd-networkd[1464]: cali75308fc8b2d: Gained IPv6LL Jul 15 05:16:30.825092 containerd[1591]: time="2025-07-15T05:16:30.824958803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-m56c8,Uid:a6375220-4b2a-45bf-9f5c-934a1ffc3abf,Namespace:kube-system,Attempt:0,} returns sandbox id \"06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3\"" Jul 15 05:16:30.827718 containerd[1591]: time="2025-07-15T05:16:30.827705026Z" level=info msg="CreateContainer within sandbox \"06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:16:30.837123 systemd[1]: Started cri-containerd-572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69.scope - libcontainer container 572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69. 
Jul 15 05:16:30.842578 containerd[1591]: time="2025-07-15T05:16:30.842555146Z" level=info msg="Container fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:30.848074 containerd[1591]: time="2025-07-15T05:16:30.848051802Z" level=info msg="CreateContainer within sandbox \"06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4\"" Jul 15 05:16:30.848611 containerd[1591]: time="2025-07-15T05:16:30.848573544Z" level=info msg="StartContainer for \"fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4\"" Jul 15 05:16:30.850564 containerd[1591]: time="2025-07-15T05:16:30.850526604Z" level=info msg="connecting to shim fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4" address="unix:///run/containerd/s/9aa800a14def36b19b8d2b42c2896b8880d66ef7511d843ede3f2b8b8e41a7c1" protocol=ttrpc version=3 Jul 15 05:16:30.871138 systemd[1]: Started cri-containerd-fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4.scope - libcontainer container fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4. 
Jul 15 05:16:30.894683 containerd[1591]: time="2025-07-15T05:16:30.894651549Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-6hlgx,Uid:9c02e846-0cfc-42de-9a5f-fa54428989e8,Namespace:calico-system,Attempt:0,} returns sandbox id \"572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69\"" Jul 15 05:16:30.925354 containerd[1591]: time="2025-07-15T05:16:30.925212786Z" level=info msg="StartContainer for \"fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4\" returns successfully" Jul 15 05:16:31.016197 systemd-networkd[1464]: calif7cd30c7c1b: Gained IPv6LL Jul 15 05:16:31.439443 containerd[1591]: time="2025-07-15T05:16:31.439248889Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9xn7s,Uid:14bfd7a9-4124-403a-afcc-084c249af056,Namespace:calico-system,Attempt:0,}" Jul 15 05:16:31.557243 systemd-networkd[1464]: cali31f83e6bc4a: Link UP Jul 15 05:16:31.558245 systemd-networkd[1464]: cali31f83e6bc4a: Gained carrier Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.469 [INFO][4687] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.480 [INFO][4687] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0 csi-node-driver- calico-system 14bfd7a9-4124-403a-afcc-084c249af056 679 0 2025-07-15 05:16:07 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4396-0-0-n-e83c776e20 csi-node-driver-9xn7s eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali31f83e6bc4a [] [] }} 
ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Namespace="calico-system" Pod="csi-node-driver-9xn7s" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.480 [INFO][4687] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Namespace="calico-system" Pod="csi-node-driver-9xn7s" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.505 [INFO][4700] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" HandleID="k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Workload="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.505 [INFO][4700] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" HandleID="k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Workload="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d58a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4396-0-0-n-e83c776e20", "pod":"csi-node-driver-9xn7s", "timestamp":"2025-07-15 05:16:31.505330787 +0000 UTC"}, Hostname:"ci-4396-0-0-n-e83c776e20", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.505 [INFO][4700] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.505 [INFO][4700] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.505 [INFO][4700] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-e83c776e20' Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.514 [INFO][4700] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.519 [INFO][4700] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.524 [INFO][4700] ipam/ipam.go 511: Trying affinity for 192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.526 [INFO][4700] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.530 [INFO][4700] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.530 [INFO][4700] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.64/26 handle="k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.531 [INFO][4700] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5 Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.535 [INFO][4700] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.64/26 handle="k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" 
host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.545 [INFO][4700] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.70/26] block=192.168.61.64/26 handle="k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.545 [INFO][4700] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.70/26] handle="k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.545 [INFO][4700] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:16:31.572382 containerd[1591]: 2025-07-15 05:16:31.545 [INFO][4700] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.70/26] IPv6=[] ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" HandleID="k8s-pod-network.83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Workload="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" Jul 15 05:16:31.572786 containerd[1591]: 2025-07-15 05:16:31.550 [INFO][4687] cni-plugin/k8s.go 418: Populated endpoint ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Namespace="calico-system" Pod="csi-node-driver-9xn7s" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"14bfd7a9-4124-403a-afcc-084c249af056", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"", Pod:"csi-node-driver-9xn7s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali31f83e6bc4a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:31.572786 containerd[1591]: 2025-07-15 05:16:31.550 [INFO][4687] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.70/32] ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Namespace="calico-system" Pod="csi-node-driver-9xn7s" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" Jul 15 05:16:31.572786 containerd[1591]: 2025-07-15 05:16:31.550 [INFO][4687] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali31f83e6bc4a ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Namespace="calico-system" Pod="csi-node-driver-9xn7s" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" Jul 15 05:16:31.572786 containerd[1591]: 2025-07-15 05:16:31.559 [INFO][4687] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Namespace="calico-system" 
Pod="csi-node-driver-9xn7s" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" Jul 15 05:16:31.572786 containerd[1591]: 2025-07-15 05:16:31.560 [INFO][4687] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Namespace="calico-system" Pod="csi-node-driver-9xn7s" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"14bfd7a9-4124-403a-afcc-084c249af056", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 7, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5", Pod:"csi-node-driver-9xn7s", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.61.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali31f83e6bc4a", MAC:"c2:d4:ac:cb:ea:83", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:31.572786 containerd[1591]: 2025-07-15 05:16:31.569 [INFO][4687] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" Namespace="calico-system" Pod="csi-node-driver-9xn7s" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-csi--node--driver--9xn7s-eth0" Jul 15 05:16:31.602383 containerd[1591]: time="2025-07-15T05:16:31.602346017Z" level=info msg="connecting to shim 83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5" address="unix:///run/containerd/s/7519cc700310ca7ebe19921fe1775b9d5fba1f082384e2088eb3be1aff600e07" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:31.641195 systemd[1]: Started cri-containerd-83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5.scope - libcontainer container 83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5. 
Jul 15 05:16:31.673071 containerd[1591]: time="2025-07-15T05:16:31.673045493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9xn7s,Uid:14bfd7a9-4124-403a-afcc-084c249af056,Namespace:calico-system,Attempt:0,} returns sandbox id \"83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5\"" Jul 15 05:16:31.680998 kubelet[2761]: I0715 05:16:31.680961 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-m56c8" podStartSLOduration=35.680947537 podStartE2EDuration="35.680947537s" podCreationTimestamp="2025-07-15 05:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:16:31.678792114 +0000 UTC m=+42.330166090" watchObservedRunningTime="2025-07-15 05:16:31.680947537 +0000 UTC m=+42.332321503" Jul 15 05:16:31.976167 systemd-networkd[1464]: cali79a585bde21: Gained IPv6LL Jul 15 05:16:32.137673 containerd[1591]: time="2025-07-15T05:16:32.137636145Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:32.138594 containerd[1591]: time="2025-07-15T05:16:32.138571979Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=47317977" Jul 15 05:16:32.139746 containerd[1591]: time="2025-07-15T05:16:32.139713761Z" level=info msg="ImageCreate event name:\"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:32.141404 containerd[1591]: time="2025-07-15T05:16:32.141362810Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:32.141942 containerd[1591]: 
time="2025-07-15T05:16:32.141653081Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"48810696\" in 2.449743441s" Jul 15 05:16:32.141942 containerd[1591]: time="2025-07-15T05:16:32.141674261Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:5509118eed617ef04ca00f5a095bfd0a4cd1cf69edcfcf9bedf0edb641be51dd\"" Jul 15 05:16:32.142275 containerd[1591]: time="2025-07-15T05:16:32.142262783Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 15 05:16:32.143593 containerd[1591]: time="2025-07-15T05:16:32.143574380Z" level=info msg="CreateContainer within sandbox \"6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:16:32.150184 containerd[1591]: time="2025-07-15T05:16:32.150167218Z" level=info msg="Container 9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:32.167035 containerd[1591]: time="2025-07-15T05:16:32.166449087Z" level=info msg="CreateContainer within sandbox \"6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28\"" Jul 15 05:16:32.167781 containerd[1591]: time="2025-07-15T05:16:32.167761524Z" level=info msg="StartContainer for \"9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28\"" Jul 15 05:16:32.169470 containerd[1591]: time="2025-07-15T05:16:32.169455136Z" level=info msg="connecting to shim 
9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28" address="unix:///run/containerd/s/4ad236785feefb9a4b373b51b89b0e108e9ba66f88dd6755d6b423a2fac7da75" protocol=ttrpc version=3 Jul 15 05:16:32.190127 systemd[1]: Started cri-containerd-9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28.scope - libcontainer container 9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28. Jul 15 05:16:32.230512 containerd[1591]: time="2025-07-15T05:16:32.230435830Z" level=info msg="StartContainer for \"9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28\" returns successfully" Jul 15 05:16:32.424183 systemd-networkd[1464]: cali7e0e89ce4df: Gained IPv6LL Jul 15 05:16:32.439441 containerd[1591]: time="2025-07-15T05:16:32.439413623Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6494bf7c78-znpph,Uid:84c72182-29d8-44ef-b493-115789db3f13,Namespace:calico-apiserver,Attempt:0,}" Jul 15 05:16:32.535768 systemd-networkd[1464]: cali4ae90361957: Link UP Jul 15 05:16:32.537384 systemd-networkd[1464]: cali4ae90361957: Gained carrier Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.472 [INFO][4825] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.479 [INFO][4825] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0 calico-apiserver-6494bf7c78- calico-apiserver 84c72182-29d8-44ef-b493-115789db3f13 785 0 2025-07-15 05:16:05 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6494bf7c78 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4396-0-0-n-e83c776e20 calico-apiserver-6494bf7c78-znpph eth0 calico-apiserver [] [] [kns.calico-apiserver 
ksa.calico-apiserver.calico-apiserver] cali4ae90361957 [] [] }} ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-znpph" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.479 [INFO][4825] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-znpph" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.503 [INFO][4837] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" HandleID="k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.503 [INFO][4837] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" HandleID="k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d4f70), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4396-0-0-n-e83c776e20", "pod":"calico-apiserver-6494bf7c78-znpph", "timestamp":"2025-07-15 05:16:32.503781 +0000 UTC"}, Hostname:"ci-4396-0-0-n-e83c776e20", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:16:32.555684 
containerd[1591]: 2025-07-15 05:16:32.503 [INFO][4837] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.504 [INFO][4837] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.504 [INFO][4837] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-e83c776e20' Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.508 [INFO][4837] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.512 [INFO][4837] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.515 [INFO][4837] ipam/ipam.go 511: Trying affinity for 192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.517 [INFO][4837] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.519 [INFO][4837] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.519 [INFO][4837] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.64/26 handle="k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.520 [INFO][4837] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.524 [INFO][4837] ipam/ipam.go 1243: Writing block in order to claim IPs 
block=192.168.61.64/26 handle="k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.529 [INFO][4837] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.61.71/26] block=192.168.61.64/26 handle="k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.529 [INFO][4837] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.71/26] handle="k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.529 [INFO][4837] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:16:32.555684 containerd[1591]: 2025-07-15 05:16:32.529 [INFO][4837] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.71/26] IPv6=[] ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" HandleID="k8s-pod-network.e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Workload="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" Jul 15 05:16:32.557047 containerd[1591]: 2025-07-15 05:16:32.532 [INFO][4825] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-znpph" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0", GenerateName:"calico-apiserver-6494bf7c78-", Namespace:"calico-apiserver", SelfLink:"", UID:"84c72182-29d8-44ef-b493-115789db3f13", 
ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6494bf7c78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"", Pod:"calico-apiserver-6494bf7c78-znpph", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4ae90361957", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:32.557047 containerd[1591]: 2025-07-15 05:16:32.532 [INFO][4825] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.71/32] ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-znpph" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" Jul 15 05:16:32.557047 containerd[1591]: 2025-07-15 05:16:32.532 [INFO][4825] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4ae90361957 ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-znpph" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" Jul 15 05:16:32.557047 
containerd[1591]: 2025-07-15 05:16:32.538 [INFO][4825] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-znpph" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" Jul 15 05:16:32.557047 containerd[1591]: 2025-07-15 05:16:32.539 [INFO][4825] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-znpph" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0", GenerateName:"calico-apiserver-6494bf7c78-", Namespace:"calico-apiserver", SelfLink:"", UID:"84c72182-29d8-44ef-b493-115789db3f13", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 16, 5, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6494bf7c78", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec", Pod:"calico-apiserver-6494bf7c78-znpph", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.61.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4ae90361957", MAC:"66:01:df:8c:5d:89", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:32.557047 containerd[1591]: 2025-07-15 05:16:32.551 [INFO][4825] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" Namespace="calico-apiserver" Pod="calico-apiserver-6494bf7c78-znpph" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-calico--apiserver--6494bf7c78--znpph-eth0" Jul 15 05:16:32.587069 containerd[1591]: time="2025-07-15T05:16:32.587035779Z" level=info msg="connecting to shim e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec" address="unix:///run/containerd/s/690ca7eab03680b021fad49df2b9a9d48a837b947f2f6b03e36d80cd7ad5afe4" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:32.611138 systemd[1]: Started cri-containerd-e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec.scope - libcontainer container e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec. 
Jul 15 05:16:32.617127 systemd-networkd[1464]: cali31f83e6bc4a: Gained IPv6LL Jul 15 05:16:32.677074 containerd[1591]: time="2025-07-15T05:16:32.676542274Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6494bf7c78-znpph,Uid:84c72182-29d8-44ef-b493-115789db3f13,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec\"" Jul 15 05:16:32.680653 containerd[1591]: time="2025-07-15T05:16:32.680633742Z" level=info msg="CreateContainer within sandbox \"e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 05:16:32.685866 kubelet[2761]: I0715 05:16:32.685707 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6494bf7c78-b7q6m" podStartSLOduration=25.235255767 podStartE2EDuration="27.685692125s" podCreationTimestamp="2025-07-15 05:16:05 +0000 UTC" firstStartedPulling="2025-07-15 05:16:29.691748032 +0000 UTC m=+40.343122008" lastFinishedPulling="2025-07-15 05:16:32.14218439 +0000 UTC m=+42.793558366" observedRunningTime="2025-07-15 05:16:32.685072812 +0000 UTC m=+43.336446778" watchObservedRunningTime="2025-07-15 05:16:32.685692125 +0000 UTC m=+43.337066101" Jul 15 05:16:32.693064 containerd[1591]: time="2025-07-15T05:16:32.689590476Z" level=info msg="Container fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:32.698183 containerd[1591]: time="2025-07-15T05:16:32.698121184Z" level=info msg="CreateContainer within sandbox \"e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14\"" Jul 15 05:16:32.699140 containerd[1591]: time="2025-07-15T05:16:32.698639433Z" level=info msg="StartContainer for 
\"fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14\"" Jul 15 05:16:32.700425 containerd[1591]: time="2025-07-15T05:16:32.700392396Z" level=info msg="connecting to shim fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14" address="unix:///run/containerd/s/690ca7eab03680b021fad49df2b9a9d48a837b947f2f6b03e36d80cd7ad5afe4" protocol=ttrpc version=3 Jul 15 05:16:32.725117 systemd[1]: Started cri-containerd-fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14.scope - libcontainer container fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14. Jul 15 05:16:32.780704 containerd[1591]: time="2025-07-15T05:16:32.780564574Z" level=info msg="StartContainer for \"fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14\" returns successfully" Jul 15 05:16:33.692832 kubelet[2761]: I0715 05:16:33.692747 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:16:33.704325 kubelet[2761]: I0715 05:16:33.704121 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6494bf7c78-znpph" podStartSLOduration=28.704087866 podStartE2EDuration="28.704087866s" podCreationTimestamp="2025-07-15 05:16:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:16:33.701744687 +0000 UTC m=+44.353118653" watchObservedRunningTime="2025-07-15 05:16:33.704087866 +0000 UTC m=+44.355461842" Jul 15 05:16:33.705182 systemd-networkd[1464]: cali4ae90361957: Gained IPv6LL Jul 15 05:16:34.384554 containerd[1591]: time="2025-07-15T05:16:34.384440858Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:34.386097 containerd[1591]: time="2025-07-15T05:16:34.386061160Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active 
requests=0, bytes read=51276688" Jul 15 05:16:34.387557 containerd[1591]: time="2025-07-15T05:16:34.387514866Z" level=info msg="ImageCreate event name:\"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:34.389312 containerd[1591]: time="2025-07-15T05:16:34.389279512Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:34.389732 containerd[1591]: time="2025-07-15T05:16:34.389595892Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"52769359\" in 2.24707689s" Jul 15 05:16:34.389732 containerd[1591]: time="2025-07-15T05:16:34.389619963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:761b294e26556b58aabc85094a3d465389e6b141b7400aee732bd13400a6124a\"" Jul 15 05:16:34.390717 containerd[1591]: time="2025-07-15T05:16:34.390705107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 15 05:16:34.433670 containerd[1591]: time="2025-07-15T05:16:34.433628261Z" level=info msg="CreateContainer within sandbox \"74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 15 05:16:34.438875 containerd[1591]: time="2025-07-15T05:16:34.438838616Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-vc4xs,Uid:f5754555-d845-4a94-affb-73038dd15a19,Namespace:kube-system,Attempt:0,}" Jul 15 05:16:34.463439 
containerd[1591]: time="2025-07-15T05:16:34.463391616Z" level=info msg="Container ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:34.467048 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3102224980.mount: Deactivated successfully. Jul 15 05:16:34.475200 containerd[1591]: time="2025-07-15T05:16:34.475163110Z" level=info msg="CreateContainer within sandbox \"74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\"" Jul 15 05:16:34.477509 containerd[1591]: time="2025-07-15T05:16:34.477471694Z" level=info msg="StartContainer for \"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\"" Jul 15 05:16:34.482218 containerd[1591]: time="2025-07-15T05:16:34.482196754Z" level=info msg="connecting to shim ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970" address="unix:///run/containerd/s/c0945f1fc33942b0b665a43400484ebd95eb75dc5f3281e90424bed186532366" protocol=ttrpc version=3 Jul 15 05:16:34.504207 systemd[1]: Started cri-containerd-ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970.scope - libcontainer container ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970. 
Jul 15 05:16:34.578328 containerd[1591]: time="2025-07-15T05:16:34.578287416Z" level=info msg="StartContainer for \"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" returns successfully" Jul 15 05:16:34.610993 systemd-networkd[1464]: calia9ee13892a1: Link UP Jul 15 05:16:34.611715 systemd-networkd[1464]: calia9ee13892a1: Gained carrier Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.516 [INFO][4986] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.534 [INFO][4986] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0 coredns-668d6bf9bc- kube-system f5754555-d845-4a94-affb-73038dd15a19 779 0 2025-07-15 05:15:56 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4396-0-0-n-e83c776e20 coredns-668d6bf9bc-vc4xs eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia9ee13892a1 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Namespace="kube-system" Pod="coredns-668d6bf9bc-vc4xs" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.535 [INFO][4986] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Namespace="kube-system" Pod="coredns-668d6bf9bc-vc4xs" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.573 [INFO][5018] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" HandleID="k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Workload="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.573 [INFO][5018] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" HandleID="k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Workload="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002cf860), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4396-0-0-n-e83c776e20", "pod":"coredns-668d6bf9bc-vc4xs", "timestamp":"2025-07-15 05:16:34.573222975 +0000 UTC"}, Hostname:"ci-4396-0-0-n-e83c776e20", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.573 [INFO][5018] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.573 [INFO][5018] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.573 [INFO][5018] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4396-0-0-n-e83c776e20' Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.581 [INFO][5018] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.586 [INFO][5018] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.591 [INFO][5018] ipam/ipam.go 511: Trying affinity for 192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.592 [INFO][5018] ipam/ipam.go 158: Attempting to load block cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.594 [INFO][5018] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.61.64/26 host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.594 [INFO][5018] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.61.64/26 handle="k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.595 [INFO][5018] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40 Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.599 [INFO][5018] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.61.64/26 handle="k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.605 [INFO][5018] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.61.72/26] block=192.168.61.64/26 handle="k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.605 [INFO][5018] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.61.72/26] handle="k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" host="ci-4396-0-0-n-e83c776e20" Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.605 [INFO][5018] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 05:16:34.628257 containerd[1591]: 2025-07-15 05:16:34.605 [INFO][5018] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.61.72/26] IPv6=[] ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" HandleID="k8s-pod-network.47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Workload="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" Jul 15 05:16:34.631248 containerd[1591]: 2025-07-15 05:16:34.608 [INFO][4986] cni-plugin/k8s.go 418: Populated endpoint ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Namespace="kube-system" Pod="coredns-668d6bf9bc-vc4xs" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f5754555-d845-4a94-affb-73038dd15a19", ResourceVersion:"779", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 15, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"", Pod:"coredns-668d6bf9bc-vc4xs", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia9ee13892a1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:34.631248 containerd[1591]: 2025-07-15 05:16:34.608 [INFO][4986] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.61.72/32] ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Namespace="kube-system" Pod="coredns-668d6bf9bc-vc4xs" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" Jul 15 05:16:34.631248 containerd[1591]: 2025-07-15 05:16:34.608 [INFO][4986] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia9ee13892a1 ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Namespace="kube-system" Pod="coredns-668d6bf9bc-vc4xs" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" Jul 15 05:16:34.631248 containerd[1591]: 2025-07-15 05:16:34.612 [INFO][4986] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Namespace="kube-system" Pod="coredns-668d6bf9bc-vc4xs" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" Jul 15 05:16:34.631248 containerd[1591]: 2025-07-15 05:16:34.612 [INFO][4986] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Namespace="kube-system" Pod="coredns-668d6bf9bc-vc4xs" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f5754555-d845-4a94-affb-73038dd15a19", ResourceVersion:"779", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 5, 15, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4396-0-0-n-e83c776e20", ContainerID:"47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40", Pod:"coredns-668d6bf9bc-vc4xs", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.61.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia9ee13892a1", 
MAC:"3e:e0:07:13:56:29", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 05:16:34.631248 containerd[1591]: 2025-07-15 05:16:34.623 [INFO][4986] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" Namespace="kube-system" Pod="coredns-668d6bf9bc-vc4xs" WorkloadEndpoint="ci--4396--0--0--n--e83c776e20-k8s-coredns--668d6bf9bc--vc4xs-eth0" Jul 15 05:16:34.653709 containerd[1591]: time="2025-07-15T05:16:34.653131034Z" level=info msg="connecting to shim 47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40" address="unix:///run/containerd/s/32504bce778b73443f3905334197940e1092c1b7ddc28d88c51ccfd4d2d2d46b" namespace=k8s.io protocol=ttrpc version=3 Jul 15 05:16:34.683226 systemd[1]: Started cri-containerd-47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40.scope - libcontainer container 47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40. 
Jul 15 05:16:34.695874 kubelet[2761]: I0715 05:16:34.695795 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:16:34.713292 kubelet[2761]: I0715 05:16:34.713262 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-694c859895-vr2gt" podStartSLOduration=22.089465331 podStartE2EDuration="26.713250154s" podCreationTimestamp="2025-07-15 05:16:08 +0000 UTC" firstStartedPulling="2025-07-15 05:16:29.766440629 +0000 UTC m=+40.417814595" lastFinishedPulling="2025-07-15 05:16:34.390225452 +0000 UTC m=+45.041599418" observedRunningTime="2025-07-15 05:16:34.713112549 +0000 UTC m=+45.364486525" watchObservedRunningTime="2025-07-15 05:16:34.713250154 +0000 UTC m=+45.364624120" Jul 15 05:16:34.754980 containerd[1591]: time="2025-07-15T05:16:34.754878096Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-vc4xs,Uid:f5754555-d845-4a94-affb-73038dd15a19,Namespace:kube-system,Attempt:0,} returns sandbox id \"47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40\"" Jul 15 05:16:34.766662 containerd[1591]: time="2025-07-15T05:16:34.766613749Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"4a007f96df542e7966451fb430218c8b17037345cfafb11421e9763225288200\" pid:5104 exited_at:{seconds:1752556594 nanos:766424793}" Jul 15 05:16:34.769672 containerd[1591]: time="2025-07-15T05:16:34.769330015Z" level=info msg="CreateContainer within sandbox \"47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 05:16:34.778485 containerd[1591]: time="2025-07-15T05:16:34.778467886Z" level=info msg="Container 7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:34.785612 containerd[1591]: time="2025-07-15T05:16:34.785563591Z" 
level=info msg="CreateContainer within sandbox \"47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c\"" Jul 15 05:16:34.792272 containerd[1591]: time="2025-07-15T05:16:34.792220913Z" level=info msg="StartContainer for \"7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c\"" Jul 15 05:16:34.794845 containerd[1591]: time="2025-07-15T05:16:34.794816025Z" level=info msg="connecting to shim 7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c" address="unix:///run/containerd/s/32504bce778b73443f3905334197940e1092c1b7ddc28d88c51ccfd4d2d2d46b" protocol=ttrpc version=3 Jul 15 05:16:34.813127 systemd[1]: Started cri-containerd-7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c.scope - libcontainer container 7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c. Jul 15 05:16:34.856449 containerd[1591]: time="2025-07-15T05:16:34.856413612Z" level=info msg="StartContainer for \"7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c\" returns successfully" Jul 15 05:16:35.741479 kubelet[2761]: I0715 05:16:35.741367 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-vc4xs" podStartSLOduration=39.741347324 podStartE2EDuration="39.741347324s" podCreationTimestamp="2025-07-15 05:15:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 05:16:35.732232763 +0000 UTC m=+46.383606739" watchObservedRunningTime="2025-07-15 05:16:35.741347324 +0000 UTC m=+46.392721290" Jul 15 05:16:35.868667 kubelet[2761]: I0715 05:16:35.868398 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:16:36.264164 systemd-networkd[1464]: calia9ee13892a1: Gained IPv6LL Jul 15 05:16:37.021870 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount3200644246.mount: Deactivated successfully. Jul 15 05:16:37.246179 systemd-networkd[1464]: vxlan.calico: Link UP Jul 15 05:16:37.246186 systemd-networkd[1464]: vxlan.calico: Gained carrier Jul 15 05:16:38.407079 containerd[1591]: time="2025-07-15T05:16:38.407032229Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:38.408379 containerd[1591]: time="2025-07-15T05:16:38.408349831Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=66352308" Jul 15 05:16:38.409173 containerd[1591]: time="2025-07-15T05:16:38.409136210Z" level=info msg="ImageCreate event name:\"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:38.410773 containerd[1591]: time="2025-07-15T05:16:38.410743730Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:38.411409 containerd[1591]: time="2025-07-15T05:16:38.411101698Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"66352154\" in 4.020306768s" Jul 15 05:16:38.411409 containerd[1591]: time="2025-07-15T05:16:38.411132679Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:dc4ea8b409b85d2f118bb4677ad3d34b57e7b01d488c9f019f7073bb58b2162b\"" Jul 15 05:16:38.417186 containerd[1591]: time="2025-07-15T05:16:38.417127076Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 15 05:16:38.417621 containerd[1591]: time="2025-07-15T05:16:38.417604968Z" level=info msg="CreateContainer within sandbox \"572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 15 05:16:38.426337 containerd[1591]: time="2025-07-15T05:16:38.426313942Z" level=info msg="Container 9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:38.430536 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount959672020.mount: Deactivated successfully. Jul 15 05:16:38.452727 containerd[1591]: time="2025-07-15T05:16:38.452685889Z" level=info msg="CreateContainer within sandbox \"572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\"" Jul 15 05:16:38.453211 containerd[1591]: time="2025-07-15T05:16:38.453127160Z" level=info msg="StartContainer for \"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\"" Jul 15 05:16:38.454794 containerd[1591]: time="2025-07-15T05:16:38.454772010Z" level=info msg="connecting to shim 9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe" address="unix:///run/containerd/s/5911e1bfd0bae4b049b949e610238e5d19dd2205f1ff09f56b29767f3459b7f8" protocol=ttrpc version=3 Jul 15 05:16:38.495246 systemd[1]: Started cri-containerd-9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe.scope - libcontainer container 9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe. 
Jul 15 05:16:38.540993 containerd[1591]: time="2025-07-15T05:16:38.540925984Z" level=info msg="StartContainer for \"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" returns successfully" Jul 15 05:16:38.853891 containerd[1591]: time="2025-07-15T05:16:38.853857594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"ec28d379260e36ff1f22a9e512a54989a8d7aba4dfbd9ffb6ee1311c00b45a5d\" pid:5380 exit_status:1 exited_at:{seconds:1752556598 nanos:853411103}" Jul 15 05:16:38.889560 systemd-networkd[1464]: vxlan.calico: Gained IPv6LL Jul 15 05:16:39.381892 kubelet[2761]: I0715 05:16:39.381331 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:16:39.458875 kubelet[2761]: I0715 05:16:39.457792 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-6hlgx" podStartSLOduration=25.943269375 podStartE2EDuration="33.457769453s" podCreationTimestamp="2025-07-15 05:16:06 +0000 UTC" firstStartedPulling="2025-07-15 05:16:30.897252746 +0000 UTC m=+41.548626712" lastFinishedPulling="2025-07-15 05:16:38.411752814 +0000 UTC m=+49.063126790" observedRunningTime="2025-07-15 05:16:38.744786487 +0000 UTC m=+49.396160463" watchObservedRunningTime="2025-07-15 05:16:39.457769453 +0000 UTC m=+50.109143449" Jul 15 05:16:39.802032 containerd[1591]: time="2025-07-15T05:16:39.801918881Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"9d69bac6a35888d596b077b5c7ba7d93730f661d6a9befd09b0006a8353e19b5\" pid:5407 exit_status:1 exited_at:{seconds:1752556599 nanos:801440490}" Jul 15 05:16:40.832176 containerd[1591]: time="2025-07-15T05:16:40.832100807Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" 
id:\"484e09e014cf6223667783322c6a9e9fbd2e52c53de74ea1c7ba7516c98ea0c0\" pid:5429 exit_status:1 exited_at:{seconds:1752556600 nanos:831793341}" Jul 15 05:16:41.221804 containerd[1591]: time="2025-07-15T05:16:41.221761754Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:41.222791 containerd[1591]: time="2025-07-15T05:16:41.222735384Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8759190" Jul 15 05:16:41.223640 containerd[1591]: time="2025-07-15T05:16:41.223597571Z" level=info msg="ImageCreate event name:\"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:41.226546 containerd[1591]: time="2025-07-15T05:16:41.226508940Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:41.227597 containerd[1591]: time="2025-07-15T05:16:41.227492350Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"10251893\" in 2.810342343s" Jul 15 05:16:41.227597 containerd[1591]: time="2025-07-15T05:16:41.227545901Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:c7fd1cc652979d89a51bbcc125e28e90c9815c0bd8f922a5bd36eed4e1927c6d\"" Jul 15 05:16:41.230691 containerd[1591]: time="2025-07-15T05:16:41.230667554Z" level=info msg="CreateContainer within sandbox \"83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5\" for container 
&ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 15 05:16:41.265881 containerd[1591]: time="2025-07-15T05:16:41.265297864Z" level=info msg="Container 103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:41.275477 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1404843739.mount: Deactivated successfully. Jul 15 05:16:41.294887 containerd[1591]: time="2025-07-15T05:16:41.294852772Z" level=info msg="CreateContainer within sandbox \"83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6\"" Jul 15 05:16:41.295915 containerd[1591]: time="2025-07-15T05:16:41.295898173Z" level=info msg="StartContainer for \"103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6\"" Jul 15 05:16:41.297304 containerd[1591]: time="2025-07-15T05:16:41.297287141Z" level=info msg="connecting to shim 103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6" address="unix:///run/containerd/s/7519cc700310ca7ebe19921fe1775b9d5fba1f082384e2088eb3be1aff600e07" protocol=ttrpc version=3 Jul 15 05:16:41.317195 systemd[1]: Started cri-containerd-103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6.scope - libcontainer container 103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6. 
Jul 15 05:16:41.356353 containerd[1591]: time="2025-07-15T05:16:41.356327805Z" level=info msg="StartContainer for \"103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6\" returns successfully" Jul 15 05:16:41.357582 containerd[1591]: time="2025-07-15T05:16:41.357559410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 15 05:16:42.466469 containerd[1591]: time="2025-07-15T05:16:42.466336712Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"54b9f407fc0bef6bd261d4436b13e6e112352b4a17fc56f0201b0a24dd106e74\" pid:5486 exited_at:{seconds:1752556602 nanos:465909774}" Jul 15 05:16:43.059090 containerd[1591]: time="2025-07-15T05:16:43.059043369Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:43.060182 containerd[1591]: time="2025-07-15T05:16:43.060144698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=14703784" Jul 15 05:16:43.061810 containerd[1591]: time="2025-07-15T05:16:43.061769447Z" level=info msg="ImageCreate event name:\"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:43.063504 containerd[1591]: time="2025-07-15T05:16:43.063465667Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 05:16:43.063939 containerd[1591]: time="2025-07-15T05:16:43.063817103Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\", repo tag 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"16196439\" in 1.70612739s" Jul 15 05:16:43.063939 containerd[1591]: time="2025-07-15T05:16:43.063844364Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:9e48822a4fe26f4ed9231b361fdd1357ea3567f1fc0a8db4d616622fe570a866\"" Jul 15 05:16:43.066030 containerd[1591]: time="2025-07-15T05:16:43.065430702Z" level=info msg="CreateContainer within sandbox \"83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 15 05:16:43.074245 containerd[1591]: time="2025-07-15T05:16:43.074227558Z" level=info msg="Container f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f: CDI devices from CRI Config.CDIDevices: []" Jul 15 05:16:43.093304 containerd[1591]: time="2025-07-15T05:16:43.093284987Z" level=info msg="CreateContainer within sandbox \"83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f\"" Jul 15 05:16:43.094182 containerd[1591]: time="2025-07-15T05:16:43.094146472Z" level=info msg="StartContainer for \"f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f\"" Jul 15 05:16:43.095235 containerd[1591]: time="2025-07-15T05:16:43.095210611Z" level=info msg="connecting to shim f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f" address="unix:///run/containerd/s/7519cc700310ca7ebe19921fe1775b9d5fba1f082384e2088eb3be1aff600e07" protocol=ttrpc version=3 Jul 15 05:16:43.114119 systemd[1]: Started cri-containerd-f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f.scope - libcontainer container 
f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f. Jul 15 05:16:43.162215 containerd[1591]: time="2025-07-15T05:16:43.162189022Z" level=info msg="StartContainer for \"f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f\" returns successfully" Jul 15 05:16:43.680768 kubelet[2761]: I0715 05:16:43.679801 2761 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 15 05:16:43.682241 kubelet[2761]: I0715 05:16:43.682206 2761 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 15 05:16:51.585908 containerd[1591]: time="2025-07-15T05:16:51.585873054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"33e486c1c2ff0b6aa56a0ad2a19a903d9e671d70bbf33ecceea153a8a292328b\" pid:5563 exited_at:{seconds:1752556611 nanos:585626202}" Jul 15 05:16:55.782350 containerd[1591]: time="2025-07-15T05:16:55.782313707Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"adcc82488396c6f656309d61eeee845f187dd80985832a0abb1f0c96d8a2f125\" pid:5585 exited_at:{seconds:1752556615 nanos:782080875}" Jul 15 05:16:55.817662 kubelet[2761]: I0715 05:16:55.817621 2761 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9xn7s" podStartSLOduration=37.424702306 podStartE2EDuration="48.814671782s" podCreationTimestamp="2025-07-15 05:16:07 +0000 UTC" firstStartedPulling="2025-07-15 05:16:31.674513109 +0000 UTC m=+42.325887085" lastFinishedPulling="2025-07-15 05:16:43.064482595 +0000 UTC m=+53.715856561" observedRunningTime="2025-07-15 05:16:43.783893711 +0000 UTC m=+54.435267687" watchObservedRunningTime="2025-07-15 05:16:55.814671782 +0000 
UTC m=+66.466045748" Jul 15 05:16:56.014641 kubelet[2761]: I0715 05:16:56.014337 2761 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 05:17:04.855545 containerd[1591]: time="2025-07-15T05:17:04.855487476Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"3e29862239d1d7a5c45eb8dff442b539a2af6adc98b00a33a8f53f18b64de95d\" pid:5620 exited_at:{seconds:1752556624 nanos:855150834}" Jul 15 05:17:10.887409 containerd[1591]: time="2025-07-15T05:17:10.887145616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"70192894820b45c57c82f4dfa6cec46ef9b645bd5ec1f280bb4c8f356f96fe4c\" pid:5647 exited_at:{seconds:1752556630 nanos:885824596}" Jul 15 05:17:25.718252 containerd[1591]: time="2025-07-15T05:17:25.718216162Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"d0d75fb9f8b3b9dcda1a60b2a88149179fa847952ce65b521a1ae4a456d0a71f\" pid:5680 exited_at:{seconds:1752556645 nanos:717935406}" Jul 15 05:17:26.921818 systemd[1]: Started sshd@7-157.180.39.85:22-139.178.89.65:38316.service - OpenSSH per-connection server daemon (139.178.89.65:38316). Jul 15 05:17:27.941301 sshd[5698]: Accepted publickey for core from 139.178.89.65 port 38316 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:17:27.946590 sshd-session[5698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:17:27.962547 systemd-logind[1567]: New session 8 of user core. Jul 15 05:17:27.971653 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jul 15 05:17:29.112809 sshd[5702]: Connection closed by 139.178.89.65 port 38316 Jul 15 05:17:29.113188 sshd-session[5698]: pam_unix(sshd:session): session closed for user core Jul 15 05:17:29.120287 systemd-logind[1567]: Session 8 logged out. Waiting for processes to exit. Jul 15 05:17:29.121147 systemd[1]: sshd@7-157.180.39.85:22-139.178.89.65:38316.service: Deactivated successfully. Jul 15 05:17:29.123838 systemd[1]: session-8.scope: Deactivated successfully. Jul 15 05:17:29.127728 systemd-logind[1567]: Removed session 8. Jul 15 05:17:34.285625 systemd[1]: Started sshd@8-157.180.39.85:22-139.178.89.65:47574.service - OpenSSH per-connection server daemon (139.178.89.65:47574). Jul 15 05:17:34.803056 containerd[1591]: time="2025-07-15T05:17:34.803003247Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"69803123dd7e34cdcf225245fe46767127052e14907cd7b57d81daf08a01188b\" pid:5735 exited_at:{seconds:1752556654 nanos:802809375}" Jul 15 05:17:35.310557 sshd[5720]: Accepted publickey for core from 139.178.89.65 port 47574 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:17:35.313343 sshd-session[5720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:17:35.318861 systemd-logind[1567]: New session 9 of user core. Jul 15 05:17:35.325168 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 15 05:17:36.177160 sshd[5744]: Connection closed by 139.178.89.65 port 47574 Jul 15 05:17:36.178418 sshd-session[5720]: pam_unix(sshd:session): session closed for user core Jul 15 05:17:36.187369 systemd[1]: sshd@8-157.180.39.85:22-139.178.89.65:47574.service: Deactivated successfully. Jul 15 05:17:36.191951 systemd[1]: session-9.scope: Deactivated successfully. Jul 15 05:17:36.193649 systemd-logind[1567]: Session 9 logged out. Waiting for processes to exit. Jul 15 05:17:36.196628 systemd-logind[1567]: Removed session 9. 
Jul 15 05:17:40.850699 containerd[1591]: time="2025-07-15T05:17:40.850627983Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"28799b80bdbfdf017e7e28b94609f214148fc48234488f3336245c40d5eb6879\" pid:5769 exited_at:{seconds:1752556660 nanos:850155109}" Jul 15 05:17:41.344898 systemd[1]: Started sshd@9-157.180.39.85:22-139.178.89.65:37846.service - OpenSSH per-connection server daemon (139.178.89.65:37846). Jul 15 05:17:42.350501 sshd[5780]: Accepted publickey for core from 139.178.89.65 port 37846 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:17:42.353207 sshd-session[5780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:17:42.362977 systemd-logind[1567]: New session 10 of user core. Jul 15 05:17:42.370508 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 15 05:17:42.491125 containerd[1591]: time="2025-07-15T05:17:42.480874651Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"1aac46544c1f3d358630ef45c0e0f3339ef32d89724001b52517275c476744a9\" pid:5797 exited_at:{seconds:1752556662 nanos:480638229}" Jul 15 05:17:43.090441 sshd[5783]: Connection closed by 139.178.89.65 port 37846 Jul 15 05:17:43.092022 sshd-session[5780]: pam_unix(sshd:session): session closed for user core Jul 15 05:17:43.100140 systemd[1]: sshd@9-157.180.39.85:22-139.178.89.65:37846.service: Deactivated successfully. Jul 15 05:17:43.101188 systemd-logind[1567]: Session 10 logged out. Waiting for processes to exit. Jul 15 05:17:43.105501 systemd[1]: session-10.scope: Deactivated successfully. Jul 15 05:17:43.107862 systemd-logind[1567]: Removed session 10. Jul 15 05:17:48.267860 systemd[1]: Started sshd@10-157.180.39.85:22-139.178.89.65:37860.service - OpenSSH per-connection server daemon (139.178.89.65:37860). 
Jul 15 05:17:49.273676 sshd[5819]: Accepted publickey for core from 139.178.89.65 port 37860 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:17:49.276216 sshd-session[5819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:17:49.284422 systemd-logind[1567]: New session 11 of user core. Jul 15 05:17:49.289252 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 15 05:17:50.087514 sshd[5822]: Connection closed by 139.178.89.65 port 37860 Jul 15 05:17:50.092953 sshd-session[5819]: pam_unix(sshd:session): session closed for user core Jul 15 05:17:50.099682 systemd[1]: sshd@10-157.180.39.85:22-139.178.89.65:37860.service: Deactivated successfully. Jul 15 05:17:50.102426 systemd[1]: session-11.scope: Deactivated successfully. Jul 15 05:17:50.104483 systemd-logind[1567]: Session 11 logged out. Waiting for processes to exit. Jul 15 05:17:50.107712 systemd-logind[1567]: Removed session 11. Jul 15 05:17:51.583709 containerd[1591]: time="2025-07-15T05:17:51.583509451Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"4f5ad2773bef8ab3331ae38c5e58eb4eb9127192f954823127fb539ce5a3cf08\" pid:5850 exited_at:{seconds:1752556671 nanos:582523687}" Jul 15 05:17:55.263446 systemd[1]: Started sshd@11-157.180.39.85:22-139.178.89.65:47106.service - OpenSSH per-connection server daemon (139.178.89.65:47106). 
Jul 15 05:17:55.711558 containerd[1591]: time="2025-07-15T05:17:55.711525567Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"6ab03b676441750f87b95ac746d600b03ce55c974a579bc0163e1d1aa6eddb8d\" pid:5876 exited_at:{seconds:1752556675 nanos:711079448}" Jul 15 05:17:56.301725 sshd[5860]: Accepted publickey for core from 139.178.89.65 port 47106 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:17:56.305184 sshd-session[5860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:17:56.314370 systemd-logind[1567]: New session 12 of user core. Jul 15 05:17:56.323127 systemd[1]: Started session-12.scope - Session 12 of User core. Jul 15 05:17:57.213005 sshd[5887]: Connection closed by 139.178.89.65 port 47106 Jul 15 05:17:57.214065 sshd-session[5860]: pam_unix(sshd:session): session closed for user core Jul 15 05:17:57.221974 systemd[1]: sshd@11-157.180.39.85:22-139.178.89.65:47106.service: Deactivated successfully. Jul 15 05:17:57.225689 systemd[1]: session-12.scope: Deactivated successfully. Jul 15 05:17:57.228542 systemd-logind[1567]: Session 12 logged out. Waiting for processes to exit. Jul 15 05:17:57.231787 systemd-logind[1567]: Removed session 12. Jul 15 05:18:02.381139 systemd[1]: Started sshd@12-157.180.39.85:22-139.178.89.65:58394.service - OpenSSH per-connection server daemon (139.178.89.65:58394). Jul 15 05:18:03.370034 sshd[5910]: Accepted publickey for core from 139.178.89.65 port 58394 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:18:03.373083 sshd-session[5910]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:03.382096 systemd-logind[1567]: New session 13 of user core. Jul 15 05:18:03.391284 systemd[1]: Started session-13.scope - Session 13 of User core. 
Jul 15 05:18:04.119700 sshd[5913]: Connection closed by 139.178.89.65 port 58394 Jul 15 05:18:04.120223 sshd-session[5910]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:04.124685 systemd[1]: sshd@12-157.180.39.85:22-139.178.89.65:58394.service: Deactivated successfully. Jul 15 05:18:04.126699 systemd[1]: session-13.scope: Deactivated successfully. Jul 15 05:18:04.128732 systemd-logind[1567]: Session 13 logged out. Waiting for processes to exit. Jul 15 05:18:04.130613 systemd-logind[1567]: Removed session 13. Jul 15 05:18:04.764375 containerd[1591]: time="2025-07-15T05:18:04.764308749Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"d978e19b67ca76449e15e41a43af753e60756d430f168f824e14f3e51fd6ad53\" pid:5937 exited_at:{seconds:1752556684 nanos:763176483}" Jul 15 05:18:09.287461 systemd[1]: Started sshd@13-157.180.39.85:22-139.178.89.65:34454.service - OpenSSH per-connection server daemon (139.178.89.65:34454). Jul 15 05:18:10.282076 sshd[5948]: Accepted publickey for core from 139.178.89.65 port 34454 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:18:10.284064 sshd-session[5948]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:10.288433 systemd-logind[1567]: New session 14 of user core. Jul 15 05:18:10.295103 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jul 15 05:18:10.801403 containerd[1591]: time="2025-07-15T05:18:10.801005426Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"9a72e75abbdd30cb7006fd3f95ddc28e96e2d75c31e72bb83c75a427f6dcb755\" pid:5971 exited_at:{seconds:1752556690 nanos:799563573}" Jul 15 05:18:11.007457 sshd[5958]: Connection closed by 139.178.89.65 port 34454 Jul 15 05:18:11.008516 sshd-session[5948]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:11.020856 systemd[1]: sshd@13-157.180.39.85:22-139.178.89.65:34454.service: Deactivated successfully. Jul 15 05:18:11.026271 systemd[1]: session-14.scope: Deactivated successfully. Jul 15 05:18:11.029598 systemd-logind[1567]: Session 14 logged out. Waiting for processes to exit. Jul 15 05:18:11.033482 systemd-logind[1567]: Removed session 14. Jul 15 05:18:16.185685 systemd[1]: Started sshd@14-157.180.39.85:22-139.178.89.65:34456.service - OpenSSH per-connection server daemon (139.178.89.65:34456). Jul 15 05:18:17.185243 sshd[6009]: Accepted publickey for core from 139.178.89.65 port 34456 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:18:17.188292 sshd-session[6009]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:17.197437 systemd-logind[1567]: New session 15 of user core. Jul 15 05:18:17.202235 systemd[1]: Started session-15.scope - Session 15 of User core. Jul 15 05:18:17.969948 sshd[6012]: Connection closed by 139.178.89.65 port 34456 Jul 15 05:18:17.970918 sshd-session[6009]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:17.977888 systemd-logind[1567]: Session 15 logged out. Waiting for processes to exit. Jul 15 05:18:17.979465 systemd[1]: sshd@14-157.180.39.85:22-139.178.89.65:34456.service: Deactivated successfully. Jul 15 05:18:17.983872 systemd[1]: session-15.scope: Deactivated successfully. 
Jul 15 05:18:17.988175 systemd-logind[1567]: Removed session 15. Jul 15 05:18:23.143538 systemd[1]: Started sshd@15-157.180.39.85:22-139.178.89.65:49884.service - OpenSSH per-connection server daemon (139.178.89.65:49884). Jul 15 05:18:24.159611 sshd[6024]: Accepted publickey for core from 139.178.89.65 port 49884 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:18:24.162401 sshd-session[6024]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:18:24.172101 systemd-logind[1567]: New session 16 of user core. Jul 15 05:18:24.179269 systemd[1]: Started session-16.scope - Session 16 of User core. Jul 15 05:18:24.971789 sshd[6027]: Connection closed by 139.178.89.65 port 49884 Jul 15 05:18:24.975632 sshd-session[6024]: pam_unix(sshd:session): session closed for user core Jul 15 05:18:24.987751 systemd[1]: sshd@15-157.180.39.85:22-139.178.89.65:49884.service: Deactivated successfully. Jul 15 05:18:24.991906 systemd[1]: session-16.scope: Deactivated successfully. Jul 15 05:18:24.994594 systemd-logind[1567]: Session 16 logged out. Waiting for processes to exit. Jul 15 05:18:24.997778 systemd-logind[1567]: Removed session 16. Jul 15 05:18:25.736226 containerd[1591]: time="2025-07-15T05:18:25.736130987Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"0a11b41376fbfa37b5d402ed7b1b66896be4bf3889dafef43205925a5f3edf11\" pid:6052 exited_at:{seconds:1752556705 nanos:735660504}" Jul 15 05:18:30.137751 systemd[1]: Started sshd@16-157.180.39.85:22-139.178.89.65:58428.service - OpenSSH per-connection server daemon (139.178.89.65:58428). 
Jul 15 05:18:31.147597 sshd[6067]: Accepted publickey for core from 139.178.89.65 port 58428 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:18:31.151225 sshd-session[6067]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:18:31.168303 systemd-logind[1567]: New session 17 of user core.
Jul 15 05:18:31.180270 systemd[1]: Started session-17.scope - Session 17 of User core.
Jul 15 05:18:32.027708 sshd[6070]: Connection closed by 139.178.89.65 port 58428
Jul 15 05:18:32.033053 sshd-session[6067]: pam_unix(sshd:session): session closed for user core
Jul 15 05:18:32.043387 systemd[1]: sshd@16-157.180.39.85:22-139.178.89.65:58428.service: Deactivated successfully.
Jul 15 05:18:32.045885 systemd[1]: session-17.scope: Deactivated successfully.
Jul 15 05:18:32.047061 systemd-logind[1567]: Session 17 logged out. Waiting for processes to exit.
Jul 15 05:18:32.048746 systemd-logind[1567]: Removed session 17.
Jul 15 05:18:34.750629 containerd[1591]: time="2025-07-15T05:18:34.750595020Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"6e6e18eb9bee9e3ba4a3089aa150ecc846750f92bd3ab45f4597eb4b02e7ea20\" pid:6094 exited_at:{seconds:1752556714 nanos:750408343}"
Jul 15 05:18:37.201677 systemd[1]: Started sshd@17-157.180.39.85:22-139.178.89.65:58430.service - OpenSSH per-connection server daemon (139.178.89.65:58430).
Jul 15 05:18:38.220525 sshd[6104]: Accepted publickey for core from 139.178.89.65 port 58430 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:18:38.221832 sshd-session[6104]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:18:38.229221 systemd-logind[1567]: New session 18 of user core.
Jul 15 05:18:38.234114 systemd[1]: Started session-18.scope - Session 18 of User core.
Jul 15 05:18:38.991618 sshd[6107]: Connection closed by 139.178.89.65 port 58430
Jul 15 05:18:38.992591 sshd-session[6104]: pam_unix(sshd:session): session closed for user core
Jul 15 05:18:39.000879 systemd[1]: sshd@17-157.180.39.85:22-139.178.89.65:58430.service: Deactivated successfully.
Jul 15 05:18:39.004767 systemd[1]: session-18.scope: Deactivated successfully.
Jul 15 05:18:39.007729 systemd-logind[1567]: Session 18 logged out. Waiting for processes to exit.
Jul 15 05:18:39.010376 systemd-logind[1567]: Removed session 18.
Jul 15 05:18:40.844830 containerd[1591]: time="2025-07-15T05:18:40.844792299Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"03cc7bd6f43f102cd94870e5726dab8681e5713f203a8c6967f62ac131d3cde1\" pid:6131 exited_at:{seconds:1752556720 nanos:844518603}"
Jul 15 05:18:42.468734 containerd[1591]: time="2025-07-15T05:18:42.468698641Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"ae4836fe087297e2aa3b38e6655d8a00842d9702cff8f6894bb5740598b142eb\" pid:6154 exited_at:{seconds:1752556722 nanos:468329515}"
Jul 15 05:18:44.160118 systemd[1]: Started sshd@18-157.180.39.85:22-139.178.89.65:57638.service - OpenSSH per-connection server daemon (139.178.89.65:57638).
Jul 15 05:18:45.145229 sshd[6165]: Accepted publickey for core from 139.178.89.65 port 57638 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:18:45.147917 sshd-session[6165]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:18:45.158066 systemd-logind[1567]: New session 19 of user core.
Jul 15 05:18:45.163303 systemd[1]: Started session-19.scope - Session 19 of User core.
Jul 15 05:18:45.995468 sshd[6168]: Connection closed by 139.178.89.65 port 57638
Jul 15 05:18:45.996752 sshd-session[6165]: pam_unix(sshd:session): session closed for user core
Jul 15 05:18:46.003712 systemd[1]: sshd@18-157.180.39.85:22-139.178.89.65:57638.service: Deactivated successfully.
Jul 15 05:18:46.008516 systemd[1]: session-19.scope: Deactivated successfully.
Jul 15 05:18:46.012885 systemd-logind[1567]: Session 19 logged out. Waiting for processes to exit.
Jul 15 05:18:46.015646 systemd-logind[1567]: Removed session 19.
Jul 15 05:18:51.171194 systemd[1]: Started sshd@19-157.180.39.85:22-139.178.89.65:36220.service - OpenSSH per-connection server daemon (139.178.89.65:36220).
Jul 15 05:18:51.579797 containerd[1591]: time="2025-07-15T05:18:51.579758601Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"8e94e2eee2aa161e61729fd3531a941da5fa16f5571fc2ec0e2cde8da33a865c\" pid:6199 exited_at:{seconds:1752556731 nanos:576701826}"
Jul 15 05:18:52.172257 sshd[6184]: Accepted publickey for core from 139.178.89.65 port 36220 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:18:52.174780 sshd-session[6184]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:18:52.182657 systemd-logind[1567]: New session 20 of user core.
Jul 15 05:18:52.187211 systemd[1]: Started session-20.scope - Session 20 of User core.
Jul 15 05:18:52.965788 sshd[6208]: Connection closed by 139.178.89.65 port 36220
Jul 15 05:18:52.966802 sshd-session[6184]: pam_unix(sshd:session): session closed for user core
Jul 15 05:18:52.974284 systemd[1]: sshd@19-157.180.39.85:22-139.178.89.65:36220.service: Deactivated successfully.
Jul 15 05:18:52.977824 systemd[1]: session-20.scope: Deactivated successfully.
Jul 15 05:18:52.981449 systemd-logind[1567]: Session 20 logged out. Waiting for processes to exit.
Jul 15 05:18:52.984236 systemd-logind[1567]: Removed session 20.
Jul 15 05:18:55.738044 containerd[1591]: time="2025-07-15T05:18:55.737971164Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"6f33817b9087474ca81323516afbf23a78fcee8008d3416e91595861effb8bf5\" pid:6233 exited_at:{seconds:1752556735 nanos:737110094}"
Jul 15 05:18:58.143661 systemd[1]: Started sshd@20-157.180.39.85:22-139.178.89.65:36226.service - OpenSSH per-connection server daemon (139.178.89.65:36226).
Jul 15 05:18:59.175261 sshd[6247]: Accepted publickey for core from 139.178.89.65 port 36226 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:18:59.177367 sshd-session[6247]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:18:59.184336 systemd-logind[1567]: New session 21 of user core.
Jul 15 05:18:59.188115 systemd[1]: Started session-21.scope - Session 21 of User core.
Jul 15 05:18:59.923522 sshd[6250]: Connection closed by 139.178.89.65 port 36226
Jul 15 05:18:59.924816 sshd-session[6247]: pam_unix(sshd:session): session closed for user core
Jul 15 05:18:59.932491 systemd[1]: sshd@20-157.180.39.85:22-139.178.89.65:36226.service: Deactivated successfully.
Jul 15 05:18:59.935921 systemd[1]: session-21.scope: Deactivated successfully.
Jul 15 05:18:59.938278 systemd-logind[1567]: Session 21 logged out. Waiting for processes to exit.
Jul 15 05:18:59.941628 systemd-logind[1567]: Removed session 21.
Jul 15 05:19:04.764916 containerd[1591]: time="2025-07-15T05:19:04.764834794Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"dfca8b1f31eb6a123cd2fef18902503b0a6d365a8d8d6e4dc7e9ac661957c8dd\" pid:6273 exited_at:{seconds:1752556744 nanos:764395588}"
Jul 15 05:19:05.098748 systemd[1]: Started sshd@21-157.180.39.85:22-139.178.89.65:40168.service - OpenSSH per-connection server daemon (139.178.89.65:40168).
Jul 15 05:19:06.104815 sshd[6284]: Accepted publickey for core from 139.178.89.65 port 40168 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:19:06.106160 sshd-session[6284]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:19:06.110991 systemd-logind[1567]: New session 22 of user core.
Jul 15 05:19:06.120230 systemd[1]: Started session-22.scope - Session 22 of User core.
Jul 15 05:19:06.878064 sshd[6287]: Connection closed by 139.178.89.65 port 40168
Jul 15 05:19:06.877355 sshd-session[6284]: pam_unix(sshd:session): session closed for user core
Jul 15 05:19:06.884730 systemd-logind[1567]: Session 22 logged out. Waiting for processes to exit.
Jul 15 05:19:06.885679 systemd[1]: sshd@21-157.180.39.85:22-139.178.89.65:40168.service: Deactivated successfully.
Jul 15 05:19:06.889801 systemd[1]: session-22.scope: Deactivated successfully.
Jul 15 05:19:06.894749 systemd-logind[1567]: Removed session 22.
Jul 15 05:19:10.896323 containerd[1591]: time="2025-07-15T05:19:10.896271939Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"793c28240c9c256f8a0e614da90f93792da112ada1396ce5202c5cc210efe57e\" pid:6313 exited_at:{seconds:1752556750 nanos:895796664}"
Jul 15 05:19:12.046117 systemd[1]: Started sshd@22-157.180.39.85:22-139.178.89.65:36106.service - OpenSSH per-connection server daemon (139.178.89.65:36106).
Jul 15 05:19:13.038402 sshd[6324]: Accepted publickey for core from 139.178.89.65 port 36106 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:19:13.041563 sshd-session[6324]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:19:13.049346 systemd-logind[1567]: New session 23 of user core.
Jul 15 05:19:13.054224 systemd[1]: Started session-23.scope - Session 23 of User core.
Jul 15 05:19:13.805984 sshd[6327]: Connection closed by 139.178.89.65 port 36106
Jul 15 05:19:13.806490 sshd-session[6324]: pam_unix(sshd:session): session closed for user core
Jul 15 05:19:13.809839 systemd-logind[1567]: Session 23 logged out. Waiting for processes to exit.
Jul 15 05:19:13.810544 systemd[1]: sshd@22-157.180.39.85:22-139.178.89.65:36106.service: Deactivated successfully.
Jul 15 05:19:13.813361 systemd[1]: session-23.scope: Deactivated successfully.
Jul 15 05:19:13.816404 systemd-logind[1567]: Removed session 23.
Jul 15 05:19:18.978976 systemd[1]: Started sshd@23-157.180.39.85:22-139.178.89.65:36110.service - OpenSSH per-connection server daemon (139.178.89.65:36110).
Jul 15 05:19:19.987081 sshd[6348]: Accepted publickey for core from 139.178.89.65 port 36110 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:19:19.989591 sshd-session[6348]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:19:19.998091 systemd-logind[1567]: New session 24 of user core.
Jul 15 05:19:20.003255 systemd[1]: Started session-24.scope - Session 24 of User core.
Jul 15 05:19:20.876866 sshd[6351]: Connection closed by 139.178.89.65 port 36110
Jul 15 05:19:20.878485 sshd-session[6348]: pam_unix(sshd:session): session closed for user core
Jul 15 05:19:20.885838 systemd-logind[1567]: Session 24 logged out. Waiting for processes to exit.
Jul 15 05:19:20.887364 systemd[1]: sshd@23-157.180.39.85:22-139.178.89.65:36110.service: Deactivated successfully.
Jul 15 05:19:20.891083 systemd[1]: session-24.scope: Deactivated successfully.
Jul 15 05:19:20.895186 systemd-logind[1567]: Removed session 24.
Jul 15 05:19:25.733902 containerd[1591]: time="2025-07-15T05:19:25.733840535Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"4850bd6d3af0196cef255c3e818b1fa2befeb103c517f8df31b336902b0731f4\" pid:6375 exited_at:{seconds:1752556765 nanos:733500948}"
Jul 15 05:19:26.048499 systemd[1]: Started sshd@24-157.180.39.85:22-139.178.89.65:52932.service - OpenSSH per-connection server daemon (139.178.89.65:52932).
Jul 15 05:19:27.093466 sshd[6389]: Accepted publickey for core from 139.178.89.65 port 52932 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:19:27.097647 sshd-session[6389]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:19:27.106842 systemd-logind[1567]: New session 25 of user core.
Jul 15 05:19:27.113237 systemd[1]: Started session-25.scope - Session 25 of User core.
Jul 15 05:19:28.049054 sshd[6394]: Connection closed by 139.178.89.65 port 52932
Jul 15 05:19:28.051476 sshd-session[6389]: pam_unix(sshd:session): session closed for user core
Jul 15 05:19:28.058639 systemd-logind[1567]: Session 25 logged out. Waiting for processes to exit.
Jul 15 05:19:28.058940 systemd[1]: sshd@24-157.180.39.85:22-139.178.89.65:52932.service: Deactivated successfully.
Jul 15 05:19:28.063243 systemd[1]: session-25.scope: Deactivated successfully.
Jul 15 05:19:28.066118 systemd-logind[1567]: Removed session 25.
Jul 15 05:19:33.216220 systemd[1]: Started sshd@25-157.180.39.85:22-139.178.89.65:44648.service - OpenSSH per-connection server daemon (139.178.89.65:44648).
Jul 15 05:19:34.194708 sshd[6407]: Accepted publickey for core from 139.178.89.65 port 44648 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:19:34.196540 sshd-session[6407]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:19:34.201919 systemd-logind[1567]: New session 26 of user core.
Jul 15 05:19:34.208233 systemd[1]: Started session-26.scope - Session 26 of User core.
Jul 15 05:19:34.740639 containerd[1591]: time="2025-07-15T05:19:34.740107223Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"b65c59bcf744376d7ceb7ed5ae155583a1e54758173ba856e35fb70e3e9a1a79\" pid:6428 exited_at:{seconds:1752556774 nanos:739634965}"
Jul 15 05:19:34.968399 sshd[6410]: Connection closed by 139.178.89.65 port 44648
Jul 15 05:19:34.969855 sshd-session[6407]: pam_unix(sshd:session): session closed for user core
Jul 15 05:19:34.973448 systemd-logind[1567]: Session 26 logged out. Waiting for processes to exit.
Jul 15 05:19:34.974158 systemd[1]: sshd@25-157.180.39.85:22-139.178.89.65:44648.service: Deactivated successfully.
Jul 15 05:19:34.976241 systemd[1]: session-26.scope: Deactivated successfully.
Jul 15 05:19:34.979032 systemd-logind[1567]: Removed session 26.
Jul 15 05:19:40.140400 systemd[1]: Started sshd@26-157.180.39.85:22-139.178.89.65:44710.service - OpenSSH per-connection server daemon (139.178.89.65:44710).
Jul 15 05:19:40.819417 containerd[1591]: time="2025-07-15T05:19:40.819361380Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"f23def4f5ad7e336256db77d723248618e929a347bfb5eca6dd2664a8dfe5c2f\" pid:6460 exited_at:{seconds:1752556780 nanos:819110997}"
Jul 15 05:19:41.132583 sshd[6445]: Accepted publickey for core from 139.178.89.65 port 44710 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:19:41.134906 sshd-session[6445]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:19:41.142143 systemd-logind[1567]: New session 27 of user core.
Jul 15 05:19:41.146140 systemd[1]: Started session-27.scope - Session 27 of User core.
Jul 15 05:19:41.863156 sshd[6472]: Connection closed by 139.178.89.65 port 44710
Jul 15 05:19:41.863731 sshd-session[6445]: pam_unix(sshd:session): session closed for user core
Jul 15 05:19:41.869170 systemd-logind[1567]: Session 27 logged out. Waiting for processes to exit.
Jul 15 05:19:41.869583 systemd[1]: sshd@26-157.180.39.85:22-139.178.89.65:44710.service: Deactivated successfully.
Jul 15 05:19:41.871612 systemd[1]: session-27.scope: Deactivated successfully.
Jul 15 05:19:41.874101 systemd-logind[1567]: Removed session 27.
Jul 15 05:19:42.487758 containerd[1591]: time="2025-07-15T05:19:42.487670935Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"80a41c7516f8f67cd145135cb12b86f36ec643ab384645787f187267a65fc439\" pid:6496 exited_at:{seconds:1752556782 nanos:486979705}"
Jul 15 05:19:47.044930 systemd[1]: Started sshd@27-157.180.39.85:22-139.178.89.65:44718.service - OpenSSH per-connection server daemon (139.178.89.65:44718).
Jul 15 05:19:48.052882 sshd[6507]: Accepted publickey for core from 139.178.89.65 port 44718 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:19:48.055529 sshd-session[6507]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:19:48.063346 systemd-logind[1567]: New session 28 of user core.
Jul 15 05:19:48.071431 systemd[1]: Started session-28.scope - Session 28 of User core.
Jul 15 05:19:48.797223 sshd[6517]: Connection closed by 139.178.89.65 port 44718
Jul 15 05:19:48.798258 sshd-session[6507]: pam_unix(sshd:session): session closed for user core
Jul 15 05:19:48.805493 systemd[1]: sshd@27-157.180.39.85:22-139.178.89.65:44718.service: Deactivated successfully.
Jul 15 05:19:48.809662 systemd[1]: session-28.scope: Deactivated successfully.
Jul 15 05:19:48.811356 systemd-logind[1567]: Session 28 logged out. Waiting for processes to exit.
Jul 15 05:19:48.814274 systemd-logind[1567]: Removed session 28.
Jul 15 05:19:51.588278 containerd[1591]: time="2025-07-15T05:19:51.581810197Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"12cd566d802b969e2b7059d894ba4a1732a431e4ea6a0628dd5b7dafd264811a\" pid:6544 exited_at:{seconds:1752556791 nanos:581447612}"
Jul 15 05:19:53.969830 systemd[1]: Started sshd@28-157.180.39.85:22-139.178.89.65:47620.service - OpenSSH per-connection server daemon (139.178.89.65:47620).
Jul 15 05:19:54.970826 sshd[6568]: Accepted publickey for core from 139.178.89.65 port 47620 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:19:54.973440 sshd-session[6568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:19:54.982189 systemd-logind[1567]: New session 29 of user core.
Jul 15 05:19:54.988371 systemd[1]: Started session-29.scope - Session 29 of User core.
Jul 15 05:19:55.697089 containerd[1591]: time="2025-07-15T05:19:55.696986048Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"f22d94d4a8bc9c9d2d2670a6a8c9606e766515d1dd2cfd6724830328e4a24eb8\" pid:6594 exited_at:{seconds:1752556795 nanos:696597164}"
Jul 15 05:19:55.765924 sshd[6571]: Connection closed by 139.178.89.65 port 47620
Jul 15 05:19:55.766937 sshd-session[6568]: pam_unix(sshd:session): session closed for user core
Jul 15 05:19:55.776592 systemd-logind[1567]: Session 29 logged out. Waiting for processes to exit.
Jul 15 05:19:55.778072 systemd[1]: sshd@28-157.180.39.85:22-139.178.89.65:47620.service: Deactivated successfully.
Jul 15 05:19:55.785797 systemd[1]: session-29.scope: Deactivated successfully.
Jul 15 05:19:55.788679 systemd-logind[1567]: Removed session 29.
Jul 15 05:20:00.938389 systemd[1]: Started sshd@29-157.180.39.85:22-139.178.89.65:35358.service - OpenSSH per-connection server daemon (139.178.89.65:35358).
Jul 15 05:20:01.916559 sshd[6611]: Accepted publickey for core from 139.178.89.65 port 35358 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:01.919173 sshd-session[6611]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:01.926820 systemd-logind[1567]: New session 30 of user core.
Jul 15 05:20:01.937221 systemd[1]: Started session-30.scope - Session 30 of User core.
Jul 15 05:20:02.656212 sshd[6616]: Connection closed by 139.178.89.65 port 35358
Jul 15 05:20:02.658237 sshd-session[6611]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:02.662210 systemd-logind[1567]: Session 30 logged out. Waiting for processes to exit.
Jul 15 05:20:02.662680 systemd[1]: sshd@29-157.180.39.85:22-139.178.89.65:35358.service: Deactivated successfully.
Jul 15 05:20:02.665937 systemd[1]: session-30.scope: Deactivated successfully.
Jul 15 05:20:02.670624 systemd-logind[1567]: Removed session 30.
Jul 15 05:20:04.767039 containerd[1591]: time="2025-07-15T05:20:04.766983567Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"1ddf19fe98d6f33d92582826a562e42e2a2150795d0173a0724ffeac7a8943d4\" pid:6641 exited_at:{seconds:1752556804 nanos:766626074}"
Jul 15 05:20:07.827579 systemd[1]: Started sshd@30-157.180.39.85:22-139.178.89.65:35364.service - OpenSSH per-connection server daemon (139.178.89.65:35364).
Jul 15 05:20:08.822609 sshd[6651]: Accepted publickey for core from 139.178.89.65 port 35364 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:08.825050 sshd-session[6651]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:08.833719 systemd-logind[1567]: New session 31 of user core.
Jul 15 05:20:08.840159 systemd[1]: Started session-31.scope - Session 31 of User core.
Jul 15 05:20:09.547197 sshd[6655]: Connection closed by 139.178.89.65 port 35364
Jul 15 05:20:09.548230 sshd-session[6651]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:09.554240 systemd[1]: sshd@30-157.180.39.85:22-139.178.89.65:35364.service: Deactivated successfully.
Jul 15 05:20:09.555933 systemd[1]: session-31.scope: Deactivated successfully.
Jul 15 05:20:09.556742 systemd-logind[1567]: Session 31 logged out. Waiting for processes to exit.
Jul 15 05:20:09.558722 systemd-logind[1567]: Removed session 31.
Jul 15 05:20:10.799601 containerd[1591]: time="2025-07-15T05:20:10.799568095Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"d94a23def255c049465576a95d96c55266cb187c4f12e272ac0c8a67fc59e29e\" pid:6679 exited_at:{seconds:1752556810 nanos:798998821}"
Jul 15 05:20:14.719278 systemd[1]: Started sshd@31-157.180.39.85:22-139.178.89.65:43410.service - OpenSSH per-connection server daemon (139.178.89.65:43410).
Jul 15 05:20:15.717297 sshd[6690]: Accepted publickey for core from 139.178.89.65 port 43410 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:15.719959 sshd-session[6690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:15.728818 systemd-logind[1567]: New session 32 of user core.
Jul 15 05:20:15.734310 systemd[1]: Started session-32.scope - Session 32 of User core.
Jul 15 05:20:16.492111 sshd[6693]: Connection closed by 139.178.89.65 port 43410
Jul 15 05:20:16.493236 sshd-session[6690]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:16.498304 systemd[1]: sshd@31-157.180.39.85:22-139.178.89.65:43410.service: Deactivated successfully.
Jul 15 05:20:16.500857 systemd[1]: session-32.scope: Deactivated successfully.
Jul 15 05:20:16.502567 systemd-logind[1567]: Session 32 logged out. Waiting for processes to exit.
Jul 15 05:20:16.504660 systemd-logind[1567]: Removed session 32.
Jul 15 05:20:21.663003 systemd[1]: Started sshd@32-157.180.39.85:22-139.178.89.65:57488.service - OpenSSH per-connection server daemon (139.178.89.65:57488).
Jul 15 05:20:22.693522 sshd[6706]: Accepted publickey for core from 139.178.89.65 port 57488 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:22.699207 sshd-session[6706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:22.709677 systemd-logind[1567]: New session 33 of user core.
Jul 15 05:20:22.718231 systemd[1]: Started session-33.scope - Session 33 of User core.
Jul 15 05:20:23.719980 sshd[6709]: Connection closed by 139.178.89.65 port 57488
Jul 15 05:20:23.723270 sshd-session[6706]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:23.729430 systemd[1]: sshd@32-157.180.39.85:22-139.178.89.65:57488.service: Deactivated successfully.
Jul 15 05:20:23.733526 systemd[1]: session-33.scope: Deactivated successfully.
Jul 15 05:20:23.734955 systemd-logind[1567]: Session 33 logged out. Waiting for processes to exit.
Jul 15 05:20:23.737929 systemd-logind[1567]: Removed session 33.
Jul 15 05:20:25.726711 containerd[1591]: time="2025-07-15T05:20:25.726640487Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"7010c6eb7e71f69e1c5f5bf8bd8bb3070ad175dd769ffbb9873c4cfac481714e\" pid:6734 exited_at:{seconds:1752556825 nanos:726260266}"
Jul 15 05:20:28.893149 systemd[1]: Started sshd@33-157.180.39.85:22-139.178.89.65:57504.service - OpenSSH per-connection server daemon (139.178.89.65:57504).
Jul 15 05:20:29.955134 sshd[6750]: Accepted publickey for core from 139.178.89.65 port 57504 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:29.959555 sshd-session[6750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:29.968570 systemd-logind[1567]: New session 34 of user core.
Jul 15 05:20:29.976289 systemd[1]: Started session-34.scope - Session 34 of User core.
Jul 15 05:20:30.974278 sshd[6753]: Connection closed by 139.178.89.65 port 57504
Jul 15 05:20:30.974508 sshd-session[6750]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:30.983540 systemd[1]: sshd@33-157.180.39.85:22-139.178.89.65:57504.service: Deactivated successfully.
Jul 15 05:20:30.987664 systemd[1]: session-34.scope: Deactivated successfully.
Jul 15 05:20:30.988609 systemd-logind[1567]: Session 34 logged out. Waiting for processes to exit.
Jul 15 05:20:30.991268 systemd-logind[1567]: Removed session 34.
Jul 15 05:20:34.794655 containerd[1591]: time="2025-07-15T05:20:34.793790972Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"7a0f11ffb70c794f76dd7790bea9f5c95047d2b685bcfa95dfbdec974c215067\" pid:6777 exited_at:{seconds:1752556834 nanos:793642141}"
Jul 15 05:20:36.145598 systemd[1]: Started sshd@34-157.180.39.85:22-139.178.89.65:52496.service - OpenSSH per-connection server daemon (139.178.89.65:52496).
Jul 15 05:20:37.156356 sshd[6787]: Accepted publickey for core from 139.178.89.65 port 52496 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:37.158833 sshd-session[6787]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:37.166527 systemd-logind[1567]: New session 35 of user core.
Jul 15 05:20:37.172251 systemd[1]: Started session-35.scope - Session 35 of User core.
Jul 15 05:20:38.003853 sshd[6790]: Connection closed by 139.178.89.65 port 52496
Jul 15 05:20:38.004180 sshd-session[6787]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:38.008000 systemd-logind[1567]: Session 35 logged out. Waiting for processes to exit.
Jul 15 05:20:38.008871 systemd[1]: sshd@34-157.180.39.85:22-139.178.89.65:52496.service: Deactivated successfully.
Jul 15 05:20:38.011661 systemd[1]: session-35.scope: Deactivated successfully.
Jul 15 05:20:38.012760 systemd-logind[1567]: Removed session 35.
Jul 15 05:20:39.531764 update_engine[1569]: I20250715 05:20:39.531651 1569 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Jul 15 05:20:39.531764 update_engine[1569]: I20250715 05:20:39.531708 1569 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Jul 15 05:20:39.533178 update_engine[1569]: I20250715 05:20:39.533094 1569 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Jul 15 05:20:39.534365 update_engine[1569]: I20250715 05:20:39.534317 1569 omaha_request_params.cc:62] Current group set to developer
Jul 15 05:20:39.535113 update_engine[1569]: I20250715 05:20:39.534516 1569 update_attempter.cc:499] Already updated boot flags. Skipping.
Jul 15 05:20:39.535113 update_engine[1569]: I20250715 05:20:39.534526 1569 update_attempter.cc:643] Scheduling an action processor start.
Jul 15 05:20:39.535113 update_engine[1569]: I20250715 05:20:39.534545 1569 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Jul 15 05:20:39.535113 update_engine[1569]: I20250715 05:20:39.534574 1569 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
Jul 15 05:20:39.535113 update_engine[1569]: I20250715 05:20:39.534618 1569 omaha_request_action.cc:271] Posting an Omaha request to disabled
Jul 15 05:20:39.535113 update_engine[1569]: I20250715 05:20:39.534623 1569 omaha_request_action.cc:272] Request:
Jul 15 05:20:39.535113 update_engine[1569]:
Jul 15 05:20:39.535113 update_engine[1569]:
Jul 15 05:20:39.535113 update_engine[1569]:
Jul 15 05:20:39.535113 update_engine[1569]:
Jul 15 05:20:39.535113 update_engine[1569]:
Jul 15 05:20:39.535113 update_engine[1569]:
Jul 15 05:20:39.535113 update_engine[1569]:
Jul 15 05:20:39.535113 update_engine[1569]:
Jul 15 05:20:39.535113 update_engine[1569]: I20250715 05:20:39.534629 1569 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jul 15 05:20:39.550722 update_engine[1569]: I20250715 05:20:39.550606 1569 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jul 15 05:20:39.551064 update_engine[1569]: I20250715 05:20:39.550904 1569 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jul 15 05:20:39.551660 update_engine[1569]: E20250715 05:20:39.551564 1569 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jul 15 05:20:39.551660 update_engine[1569]: I20250715 05:20:39.551613 1569 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
Jul 15 05:20:39.555233 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
Jul 15 05:20:40.839956 containerd[1591]: time="2025-07-15T05:20:40.839909624Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"0ebae0d3aeb460e4031a89fb3848e274b8a0cd444ccbddc1a41988535300cf5c\" pid:6815 exited_at:{seconds:1752556840 nanos:839503223}"
Jul 15 05:20:42.489758 containerd[1591]: time="2025-07-15T05:20:42.489714725Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"57eab7e8aaa8ad4794710317ae39d1bfa65ba2be0e8c1337f5f01cc4677fabf0\" pid:6837 exited_at:{seconds:1752556842 nanos:489216334}"
Jul 15 05:20:43.169204 systemd[1]: Started sshd@35-157.180.39.85:22-139.178.89.65:37874.service - OpenSSH per-connection server daemon (139.178.89.65:37874).
Jul 15 05:20:44.165023 sshd[6849]: Accepted publickey for core from 139.178.89.65 port 37874 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:44.167752 sshd-session[6849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:44.173230 systemd-logind[1567]: New session 36 of user core.
Jul 15 05:20:44.180174 systemd[1]: Started session-36.scope - Session 36 of User core.
Jul 15 05:20:45.073966 sshd[6852]: Connection closed by 139.178.89.65 port 37874
Jul 15 05:20:45.075359 sshd-session[6849]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:45.082295 systemd-logind[1567]: Session 36 logged out. Waiting for processes to exit.
Jul 15 05:20:45.083539 systemd[1]: sshd@35-157.180.39.85:22-139.178.89.65:37874.service: Deactivated successfully.
Jul 15 05:20:45.086587 systemd[1]: session-36.scope: Deactivated successfully.
Jul 15 05:20:45.090214 systemd-logind[1567]: Removed session 36.
Jul 15 05:20:45.506041 containerd[1591]: time="2025-07-15T05:20:45.478759550Z" level=warning msg="container event discarded" container=411e6bfc3f73dfdf525b1d6d0d397a27e5051310b9a9996f79b01ac9e5e4216c type=CONTAINER_CREATED_EVENT
Jul 15 05:20:45.544831 containerd[1591]: time="2025-07-15T05:20:45.544263973Z" level=warning msg="container event discarded" container=411e6bfc3f73dfdf525b1d6d0d397a27e5051310b9a9996f79b01ac9e5e4216c type=CONTAINER_STARTED_EVENT
Jul 15 05:20:45.545336 containerd[1591]: time="2025-07-15T05:20:45.545189605Z" level=warning msg="container event discarded" container=f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0 type=CONTAINER_CREATED_EVENT
Jul 15 05:20:45.545336 containerd[1591]: time="2025-07-15T05:20:45.545226205Z" level=warning msg="container event discarded" container=f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0 type=CONTAINER_STARTED_EVENT
Jul 15 05:20:45.545336 containerd[1591]: time="2025-07-15T05:20:45.545243995Z" level=warning msg="container event discarded" container=b015deb82b1d6760cf7821189c4ef90bf2a7a4c669d261bca4522c7b14236b92 type=CONTAINER_CREATED_EVENT
Jul 15 05:20:45.545336 containerd[1591]: time="2025-07-15T05:20:45.545258205Z" level=warning msg="container event discarded" container=b015deb82b1d6760cf7821189c4ef90bf2a7a4c669d261bca4522c7b14236b92 type=CONTAINER_STARTED_EVENT
Jul 15 05:20:45.545336 containerd[1591]: time="2025-07-15T05:20:45.545269545Z" level=warning msg="container event discarded" container=426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d type=CONTAINER_CREATED_EVENT
Jul 15 05:20:45.545336 containerd[1591]: time="2025-07-15T05:20:45.545280995Z" level=warning msg="container event discarded" container=d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e type=CONTAINER_CREATED_EVENT
Jul 15 05:20:45.545336 containerd[1591]: time="2025-07-15T05:20:45.545292925Z" level=warning msg="container event discarded" container=4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b type=CONTAINER_CREATED_EVENT
Jul 15 05:20:45.613435 containerd[1591]: time="2025-07-15T05:20:45.613382574Z" level=warning msg="container event discarded" container=d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e type=CONTAINER_STARTED_EVENT
Jul 15 05:20:45.655669 containerd[1591]: time="2025-07-15T05:20:45.655617877Z" level=warning msg="container event discarded" container=4e43aa790ae7a29875522602098a4d746b85e6552bbcec796b6e8f9780aabf2b type=CONTAINER_STARTED_EVENT
Jul 15 05:20:45.655669 containerd[1591]: time="2025-07-15T05:20:45.655643047Z" level=warning msg="container event discarded" container=426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d type=CONTAINER_STARTED_EVENT
Jul 15 05:20:49.494042 update_engine[1569]: I20250715 05:20:49.493927 1569 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jul 15 05:20:49.494730 update_engine[1569]: I20250715 05:20:49.494305 1569 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jul 15 05:20:49.494730 update_engine[1569]: I20250715 05:20:49.494712 1569 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jul 15 05:20:49.495317 update_engine[1569]: E20250715 05:20:49.495139 1569 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jul 15 05:20:49.495317 update_engine[1569]: I20250715 05:20:49.495265 1569 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Jul 15 05:20:50.248437 systemd[1]: Started sshd@36-157.180.39.85:22-139.178.89.65:37644.service - OpenSSH per-connection server daemon (139.178.89.65:37644).
Jul 15 05:20:51.252894 sshd[6866]: Accepted publickey for core from 139.178.89.65 port 37644 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:51.255606 sshd-session[6866]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:51.263787 systemd-logind[1567]: New session 37 of user core.
Jul 15 05:20:51.277320 systemd[1]: Started session-37.scope - Session 37 of User core.
Jul 15 05:20:51.591467 containerd[1591]: time="2025-07-15T05:20:51.591226472Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"b84451ae7dcd535054755e2fff0685ac775e23b3c2e9e7d7eeec0869bb21f236\" pid:6882 exited_at:{seconds:1752556851 nanos:590923681}"
Jul 15 05:20:52.036393 sshd[6869]: Connection closed by 139.178.89.65 port 37644
Jul 15 05:20:52.037325 sshd-session[6866]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:52.048361 systemd[1]: sshd@36-157.180.39.85:22-139.178.89.65:37644.service: Deactivated successfully.
Jul 15 05:20:52.052581 systemd[1]: session-37.scope: Deactivated successfully.
Jul 15 05:20:52.055723 systemd-logind[1567]: Session 37 logged out. Waiting for processes to exit.
Jul 15 05:20:52.058550 systemd-logind[1567]: Removed session 37.
Jul 15 05:20:55.727064 containerd[1591]: time="2025-07-15T05:20:55.725999846Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"79a1f99ff0e48091adf1d85bf1891b8788a30151782c4e22b92fedd99dd8510c\" pid:6915 exited_at:{seconds:1752556855 nanos:724888175}"
Jul 15 05:20:56.508121 containerd[1591]: time="2025-07-15T05:20:56.507911287Z" level=warning msg="container event discarded" container=89463c93fbdafa9e10fe215e1c60fc88eca8b817421379ab4a856baa98ba5c10 type=CONTAINER_CREATED_EVENT
Jul 15 05:20:56.508121 containerd[1591]: time="2025-07-15T05:20:56.508067427Z" level=warning msg="container event discarded" container=89463c93fbdafa9e10fe215e1c60fc88eca8b817421379ab4a856baa98ba5c10 type=CONTAINER_STARTED_EVENT
Jul 15 05:20:56.527689 containerd[1591]: time="2025-07-15T05:20:56.527559797Z" level=warning msg="container event discarded" container=22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15 type=CONTAINER_CREATED_EVENT
Jul 15 05:20:56.576069 containerd[1591]: time="2025-07-15T05:20:56.575952677Z" level=warning msg="container event discarded" container=22839e025bb2cd58503c342076bb6c5698ce604d231974ad723f6935e9f25a15 type=CONTAINER_STARTED_EVENT
Jul 15 05:20:56.897097 containerd[1591]: time="2025-07-15T05:20:56.896839821Z" level=warning msg="container event discarded" container=72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb type=CONTAINER_CREATED_EVENT
Jul 15 05:20:56.897097 containerd[1591]: time="2025-07-15T05:20:56.896901301Z" level=warning msg="container event discarded" container=72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb type=CONTAINER_STARTED_EVENT
Jul 15 05:20:57.213493 systemd[1]: Started sshd@37-157.180.39.85:22-139.178.89.65:37658.service - OpenSSH per-connection server daemon (139.178.89.65:37658).
Jul 15 05:20:58.233979 sshd[6931]: Accepted publickey for core from 139.178.89.65 port 37658 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:20:58.236590 sshd-session[6931]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:20:58.245147 systemd-logind[1567]: New session 38 of user core.
Jul 15 05:20:58.251464 systemd[1]: Started session-38.scope - Session 38 of User core.
Jul 15 05:20:59.074341 sshd[6934]: Connection closed by 139.178.89.65 port 37658
Jul 15 05:20:59.074757 containerd[1591]: time="2025-07-15T05:20:59.074287615Z" level=warning msg="container event discarded" container=243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307 type=CONTAINER_CREATED_EVENT
Jul 15 05:20:59.075503 sshd-session[6931]: pam_unix(sshd:session): session closed for user core
Jul 15 05:20:59.080331 systemd-logind[1567]: Session 38 logged out. Waiting for processes to exit.
Jul 15 05:20:59.081068 systemd[1]: sshd@37-157.180.39.85:22-139.178.89.65:37658.service: Deactivated successfully.
Jul 15 05:20:59.083326 systemd[1]: session-38.scope: Deactivated successfully.
Jul 15 05:20:59.085551 systemd-logind[1567]: Removed session 38.
Jul 15 05:20:59.117804 containerd[1591]: time="2025-07-15T05:20:59.117740368Z" level=warning msg="container event discarded" container=243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307 type=CONTAINER_STARTED_EVENT
Jul 15 05:20:59.492580 update_engine[1569]: I20250715 05:20:59.492489 1569 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jul 15 05:20:59.493939 update_engine[1569]: I20250715 05:20:59.492838 1569 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jul 15 05:20:59.493939 update_engine[1569]: I20250715 05:20:59.493281 1569 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jul 15 05:20:59.493939 update_engine[1569]: E20250715 05:20:59.493778 1569 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jul 15 05:20:59.493939 update_engine[1569]: I20250715 05:20:59.493865 1569 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Jul 15 05:21:04.241946 systemd[1]: Started sshd@38-157.180.39.85:22-139.178.89.65:44448.service - OpenSSH per-connection server daemon (139.178.89.65:44448).
Jul 15 05:21:04.761188 containerd[1591]: time="2025-07-15T05:21:04.761109297Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"0f35dcba40e75cb5b9884d39a7443d7785f01598c6f638981131e4f81bc004d0\" pid:6962 exited_at:{seconds:1752556864 nanos:760800287}"
Jul 15 05:21:05.235033 sshd[6947]: Accepted publickey for core from 139.178.89.65 port 44448 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:05.233767 sshd-session[6947]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:05.239966 systemd-logind[1567]: New session 39 of user core.
Jul 15 05:21:05.244159 systemd[1]: Started session-39.scope - Session 39 of User core.
Jul 15 05:21:05.968311 sshd[6970]: Connection closed by 139.178.89.65 port 44448
Jul 15 05:21:05.969257 sshd-session[6947]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:05.977479 systemd[1]: sshd@38-157.180.39.85:22-139.178.89.65:44448.service: Deactivated successfully.
Jul 15 05:21:05.983078 systemd[1]: session-39.scope: Deactivated successfully.
Jul 15 05:21:05.986083 systemd-logind[1567]: Session 39 logged out. Waiting for processes to exit.
Jul 15 05:21:05.990893 systemd-logind[1567]: Removed session 39.
Jul 15 05:21:07.931041 containerd[1591]: time="2025-07-15T05:21:07.930888915Z" level=warning msg="container event discarded" container=87e7a0a8b7d2212a064c36e139228499b6b73ee3a8ca257bc7a3c7dab15dda49 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:07.931041 containerd[1591]: time="2025-07-15T05:21:07.930953415Z" level=warning msg="container event discarded" container=87e7a0a8b7d2212a064c36e139228499b6b73ee3a8ca257bc7a3c7dab15dda49 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:08.124821 containerd[1591]: time="2025-07-15T05:21:08.124724650Z" level=warning msg="container event discarded" container=6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:08.124821 containerd[1591]: time="2025-07-15T05:21:08.124781100Z" level=warning msg="container event discarded" container=6a74f534e54b59aef4d23b08a455ebf83004fefc0e97cb2b40432c6a9b30eb46 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:09.493736 update_engine[1569]: I20250715 05:21:09.493624 1569 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jul 15 05:21:09.494510 update_engine[1569]: I20250715 05:21:09.494074 1569 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jul 15 05:21:09.494657 update_engine[1569]: I20250715 05:21:09.494585 1569 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jul 15 05:21:09.495077 update_engine[1569]: E20250715 05:21:09.494971 1569 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jul 15 05:21:09.495184 update_engine[1569]: I20250715 05:21:09.495149 1569 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Jul 15 05:21:09.495184 update_engine[1569]: I20250715 05:21:09.495168 1569 omaha_request_action.cc:617] Omaha request response:
Jul 15 05:21:09.495352 update_engine[1569]: E20250715 05:21:09.495304 1569 omaha_request_action.cc:636] Omaha request network transfer failed.
Jul 15 05:21:09.495352 update_engine[1569]: I20250715 05:21:09.495344 1569 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Jul 15 05:21:09.495437 update_engine[1569]: I20250715 05:21:09.495359 1569 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jul 15 05:21:09.495437 update_engine[1569]: I20250715 05:21:09.495371 1569 update_attempter.cc:306] Processing Done.
Jul 15 05:21:09.495437 update_engine[1569]: E20250715 05:21:09.495398 1569 update_attempter.cc:619] Update failed.
Jul 15 05:21:09.495437 update_engine[1569]: I20250715 05:21:09.495416 1569 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Jul 15 05:21:09.495587 update_engine[1569]: I20250715 05:21:09.495433 1569 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Jul 15 05:21:09.495587 update_engine[1569]: I20250715 05:21:09.495455 1569 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Jul 15 05:21:09.495587 update_engine[1569]: I20250715 05:21:09.495577 1569 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Jul 15 05:21:09.495699 update_engine[1569]: I20250715 05:21:09.495615 1569 omaha_request_action.cc:271] Posting an Omaha request to disabled
Jul 15 05:21:09.495699 update_engine[1569]: I20250715 05:21:09.495628 1569 omaha_request_action.cc:272] Request:
Jul 15 05:21:09.495699 update_engine[1569]:
Jul 15 05:21:09.495699 update_engine[1569]:
Jul 15 05:21:09.495699 update_engine[1569]:
Jul 15 05:21:09.495699 update_engine[1569]:
Jul 15 05:21:09.495699 update_engine[1569]:
Jul 15 05:21:09.495699 update_engine[1569]:
Jul 15 05:21:09.495699 update_engine[1569]: I20250715 05:21:09.495642 1569 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jul 15 05:21:09.495995 update_engine[1569]: I20250715 05:21:09.495934 1569 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jul 15 05:21:09.496604 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Jul 15 05:21:09.497268 update_engine[1569]: I20250715 05:21:09.496711 1569 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jul 15 05:21:09.497268 update_engine[1569]: E20250715 05:21:09.497203 1569 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jul 15 05:21:09.497373 update_engine[1569]: I20250715 05:21:09.497298 1569 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Jul 15 05:21:09.497373 update_engine[1569]: I20250715 05:21:09.497313 1569 omaha_request_action.cc:617] Omaha request response:
Jul 15 05:21:09.497373 update_engine[1569]: I20250715 05:21:09.497328 1569 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jul 15 05:21:09.497373 update_engine[1569]: I20250715 05:21:09.497341 1569 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jul 15 05:21:09.497373 update_engine[1569]: I20250715 05:21:09.497354 1569 update_attempter.cc:306] Processing Done.
Jul 15 05:21:09.497373 update_engine[1569]: I20250715 05:21:09.497366 1569 update_attempter.cc:310] Error event sent.
Jul 15 05:21:09.498514 update_engine[1569]: I20250715 05:21:09.498293 1569 update_check_scheduler.cc:74] Next update check in 49m16s
Jul 15 05:21:09.499139 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Jul 15 05:21:10.844991 containerd[1591]: time="2025-07-15T05:21:10.844957705Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"30424b639473cc450872e585dc77141bc41cc085c97c10c64c71ccc497cf3a8e\" pid:6994 exited_at:{seconds:1752556870 nanos:844687505}"
Jul 15 05:21:10.959364 containerd[1591]: time="2025-07-15T05:21:10.959205356Z" level=warning msg="container event discarded" container=1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:11.048255 containerd[1591]: time="2025-07-15T05:21:11.048162229Z" level=warning msg="container event discarded" container=1d4f89ac76277b79fe4cd7b194e68774290e945b2f38ebe272ede7e19fc59bf0 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:11.141849 systemd[1]: Started sshd@39-157.180.39.85:22-139.178.89.65:33984.service - OpenSSH per-connection server daemon (139.178.89.65:33984).
Jul 15 05:21:12.144037 sshd[7006]: Accepted publickey for core from 139.178.89.65 port 33984 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:12.146086 sshd-session[7006]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:12.152022 systemd-logind[1567]: New session 40 of user core.
Jul 15 05:21:12.155234 systemd[1]: Started session-40.scope - Session 40 of User core.
Jul 15 05:21:12.747609 containerd[1591]: time="2025-07-15T05:21:12.747546679Z" level=warning msg="container event discarded" container=0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:12.803927 containerd[1591]: time="2025-07-15T05:21:12.803823801Z" level=warning msg="container event discarded" container=0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:12.909737 sshd[7009]: Connection closed by 139.178.89.65 port 33984
Jul 15 05:21:12.910706 sshd-session[7006]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:12.917583 systemd[1]: sshd@39-157.180.39.85:22-139.178.89.65:33984.service: Deactivated successfully.
Jul 15 05:21:12.921103 systemd[1]: session-40.scope: Deactivated successfully.
Jul 15 05:21:12.923986 systemd-logind[1567]: Session 40 logged out. Waiting for processes to exit.
Jul 15 05:21:12.927606 systemd-logind[1567]: Removed session 40.
Jul 15 05:21:12.940488 containerd[1591]: time="2025-07-15T05:21:12.940379508Z" level=warning msg="container event discarded" container=0c13ad9bc59f832f9eaed6635c854bbd76014a914b4835030715175f68dce908 type=CONTAINER_STOPPED_EVENT
Jul 15 05:21:17.649356 containerd[1591]: time="2025-07-15T05:21:17.649221256Z" level=warning msg="container event discarded" container=adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:17.711704 containerd[1591]: time="2025-07-15T05:21:17.711606254Z" level=warning msg="container event discarded" container=adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:18.076266 systemd[1]: Started sshd@40-157.180.39.85:22-139.178.89.65:33990.service - OpenSSH per-connection server daemon (139.178.89.65:33990).
Jul 15 05:21:18.133821 containerd[1591]: time="2025-07-15T05:21:18.133760772Z" level=warning msg="container event discarded" container=adf3b25478997e296845cfa031b41da74a8213c8c159735c83642de02ff13075 type=CONTAINER_STOPPED_EVENT
Jul 15 05:21:19.054929 sshd[7021]: Accepted publickey for core from 139.178.89.65 port 33990 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:19.057438 sshd-session[7021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:19.066899 systemd-logind[1567]: New session 41 of user core.
Jul 15 05:21:19.072304 systemd[1]: Started session-41.scope - Session 41 of User core.
Jul 15 05:21:19.856135 sshd[7024]: Connection closed by 139.178.89.65 port 33990
Jul 15 05:21:19.860347 sshd-session[7021]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:19.868933 systemd[1]: sshd@40-157.180.39.85:22-139.178.89.65:33990.service: Deactivated successfully.
Jul 15 05:21:19.871771 systemd[1]: session-41.scope: Deactivated successfully.
Jul 15 05:21:19.874191 systemd-logind[1567]: Session 41 logged out. Waiting for processes to exit.
Jul 15 05:21:19.878153 systemd-logind[1567]: Removed session 41.
Jul 15 05:21:22.608559 containerd[1591]: time="2025-07-15T05:21:22.608450118Z" level=warning msg="container event discarded" container=87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f type=CONTAINER_CREATED_EVENT
Jul 15 05:21:22.719197 containerd[1591]: time="2025-07-15T05:21:22.719004167Z" level=warning msg="container event discarded" container=87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f type=CONTAINER_STARTED_EVENT
Jul 15 05:21:24.644283 containerd[1591]: time="2025-07-15T05:21:24.644138567Z" level=warning msg="container event discarded" container=d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:24.644283 containerd[1591]: time="2025-07-15T05:21:24.644269877Z" level=warning msg="container event discarded" container=d3238ff54ba2e0acbd2e87e1898434492a93c1c2ac97a684f50c1f4ceadcf112 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:25.032434 systemd[1]: Started sshd@41-157.180.39.85:22-139.178.89.65:39392.service - OpenSSH per-connection server daemon (139.178.89.65:39392).
Jul 15 05:21:25.729879 containerd[1591]: time="2025-07-15T05:21:25.729834850Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"a25e6fa8395cb3e4e4b9583c4208bad9a1a92a3c0e291d96a9e696a7f5b5ac50\" pid:7061 exited_at:{seconds:1752556885 nanos:729406371}"
Jul 15 05:21:26.067046 sshd[7045]: Accepted publickey for core from 139.178.89.65 port 39392 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:26.069217 sshd-session[7045]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:26.076489 systemd-logind[1567]: New session 42 of user core.
Jul 15 05:21:26.082247 systemd[1]: Started session-42.scope - Session 42 of User core.
Jul 15 05:21:26.365241 containerd[1591]: time="2025-07-15T05:21:26.364988734Z" level=warning msg="container event discarded" container=0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:26.420719 containerd[1591]: time="2025-07-15T05:21:26.420657805Z" level=warning msg="container event discarded" container=0aa3f1f3e96091915cd17b7e156c572f4da4c11bdcf13af53b690f7f8a9ca620 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:26.916118 sshd[7073]: Connection closed by 139.178.89.65 port 39392
Jul 15 05:21:26.917090 sshd-session[7045]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:26.924121 systemd[1]: sshd@41-157.180.39.85:22-139.178.89.65:39392.service: Deactivated successfully.
Jul 15 05:21:26.927986 systemd[1]: session-42.scope: Deactivated successfully.
Jul 15 05:21:26.930273 systemd-logind[1567]: Session 42 logged out. Waiting for processes to exit.
Jul 15 05:21:26.932653 systemd-logind[1567]: Removed session 42.
Jul 15 05:21:28.398646 containerd[1591]: time="2025-07-15T05:21:28.398578430Z" level=warning msg="container event discarded" container=fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:28.463394 containerd[1591]: time="2025-07-15T05:21:28.463282532Z" level=warning msg="container event discarded" container=fa72f435bdb9d3cbc69d3de3012aa6654c72dfac5999f96dd5116629a7e37fd4 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:29.700795 containerd[1591]: time="2025-07-15T05:21:29.700704773Z" level=warning msg="container event discarded" container=6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:29.700795 containerd[1591]: time="2025-07-15T05:21:29.700756103Z" level=warning msg="container event discarded" container=6d92382dcc5d3a20c83b67ca71441cb951f6446062ff797d0f56361303e04979 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:29.776266 containerd[1591]: time="2025-07-15T05:21:29.776176935Z" level=warning msg="container event discarded" container=74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c type=CONTAINER_CREATED_EVENT
Jul 15 05:21:29.776266 containerd[1591]: time="2025-07-15T05:21:29.776243055Z" level=warning msg="container event discarded" container=74ebcdace3be4cf3401f9e191e7151142de9950ea8f7e152e76abc2c55aca44c type=CONTAINER_STARTED_EVENT
Jul 15 05:21:30.835174 containerd[1591]: time="2025-07-15T05:21:30.835000932Z" level=warning msg="container event discarded" container=06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:30.835174 containerd[1591]: time="2025-07-15T05:21:30.835158982Z" level=warning msg="container event discarded" container=06b7fdb37cbdc7225031a84bde5341901c49a1a58473085f17b2472cf1f3fbe3 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:30.858555 containerd[1591]: time="2025-07-15T05:21:30.858476378Z" level=warning msg="container event discarded" container=fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:30.904799 containerd[1591]: time="2025-07-15T05:21:30.904721289Z" level=warning msg="container event discarded" container=572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:30.904799 containerd[1591]: time="2025-07-15T05:21:30.904784589Z" level=warning msg="container event discarded" container=572b11424fc0bf9108932bf833d1aac031175336fc98a9cbc83cbb40213ffc69 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:30.934163 containerd[1591]: time="2025-07-15T05:21:30.934068366Z" level=warning msg="container event discarded" container=fa1912829682e25e6d6e91aec36820c017779edefcd53294ec9541d95fc69eb4 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:31.682765 containerd[1591]: time="2025-07-15T05:21:31.682674400Z" level=warning msg="container event discarded" container=83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:31.682765 containerd[1591]: time="2025-07-15T05:21:31.682748350Z" level=warning msg="container event discarded" container=83181a6f59f44f2314ef7f2c42d6e9b889f25add0df4a95a6bcfefd364e64db5 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:32.089273 systemd[1]: Started sshd@42-157.180.39.85:22-139.178.89.65:39758.service - OpenSSH per-connection server daemon (139.178.89.65:39758).
Jul 15 05:21:32.176527 containerd[1591]: time="2025-07-15T05:21:32.176420213Z" level=warning msg="container event discarded" container=9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:32.239882 containerd[1591]: time="2025-07-15T05:21:32.239794732Z" level=warning msg="container event discarded" container=9356dc6e06636181351f4e36acba0d3bbf9f5b2dce9469df4bd328897e004c28 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:32.686776 containerd[1591]: time="2025-07-15T05:21:32.686671991Z" level=warning msg="container event discarded" container=e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec type=CONTAINER_CREATED_EVENT
Jul 15 05:21:32.686776 containerd[1591]: time="2025-07-15T05:21:32.686740561Z" level=warning msg="container event discarded" container=e283644be1bcd531ffacb2d1e6c1da45867cf45e86cd2dae78dd860284c02eec type=CONTAINER_STARTED_EVENT
Jul 15 05:21:32.707120 containerd[1591]: time="2025-07-15T05:21:32.706992689Z" level=warning msg="container event discarded" container=fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:32.790533 containerd[1591]: time="2025-07-15T05:21:32.790447007Z" level=warning msg="container event discarded" container=fd441f72c1869ddfa5557e9efc05e67ca422282a83ad16e5accaf07c4edf6b14 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:33.090682 sshd[7103]: Accepted publickey for core from 139.178.89.65 port 39758 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:33.091334 sshd-session[7103]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:33.095907 systemd-logind[1567]: New session 43 of user core.
Jul 15 05:21:33.101136 systemd[1]: Started session-43.scope - Session 43 of User core.
Jul 15 05:21:33.820719 sshd[7106]: Connection closed by 139.178.89.65 port 39758
Jul 15 05:21:33.821795 sshd-session[7103]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:33.829211 systemd[1]: sshd@42-157.180.39.85:22-139.178.89.65:39758.service: Deactivated successfully.
Jul 15 05:21:33.833629 systemd[1]: session-43.scope: Deactivated successfully.
Jul 15 05:21:33.835966 systemd-logind[1567]: Session 43 logged out. Waiting for processes to exit.
Jul 15 05:21:33.839226 systemd-logind[1567]: Removed session 43.
Jul 15 05:21:33.994075 systemd[1]: Started sshd@43-157.180.39.85:22-139.178.89.65:39766.service - OpenSSH per-connection server daemon (139.178.89.65:39766).
Jul 15 05:21:34.485276 containerd[1591]: time="2025-07-15T05:21:34.485205847Z" level=warning msg="container event discarded" container=ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:34.576638 containerd[1591]: time="2025-07-15T05:21:34.576574362Z" level=warning msg="container event discarded" container=ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:34.733980 containerd[1591]: time="2025-07-15T05:21:34.733952155Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"ca972484328709ec023d1dda7fde4fe8c13b0b6c6f3170eef7e6a2f1d7a12a60\" pid:7136 exited_at:{seconds:1752556894 nanos:733346206}"
Jul 15 05:21:34.765928 containerd[1591]: time="2025-07-15T05:21:34.765813581Z" level=warning msg="container event discarded" container=47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:34.766084 containerd[1591]: time="2025-07-15T05:21:34.766057420Z" level=warning msg="container event discarded" container=47bee240f99e3e58177cede01d56192cdf7cae15f13bc6cc3f1808a29777ea40 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:34.795294 containerd[1591]: time="2025-07-15T05:21:34.795248331Z" level=warning msg="container event discarded" container=7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c type=CONTAINER_CREATED_EVENT
Jul 15 05:21:34.866032 containerd[1591]: time="2025-07-15T05:21:34.865974051Z" level=warning msg="container event discarded" container=7d3eeaa81fb754f77cfc8f6805914025d9cf6105a2a32fdfa8165d01d6bce31c type=CONTAINER_STARTED_EVENT
Jul 15 05:21:34.986630 sshd[7120]: Accepted publickey for core from 139.178.89.65 port 39766 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:34.989540 sshd-session[7120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:34.997882 systemd-logind[1567]: New session 44 of user core.
Jul 15 05:21:35.004241 systemd[1]: Started session-44.scope - Session 44 of User core.
Jul 15 05:21:35.848645 sshd[7145]: Connection closed by 139.178.89.65 port 39766
Jul 15 05:21:35.851362 sshd-session[7120]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:35.856039 systemd-logind[1567]: Session 44 logged out. Waiting for processes to exit.
Jul 15 05:21:35.858352 systemd[1]: sshd@43-157.180.39.85:22-139.178.89.65:39766.service: Deactivated successfully.
Jul 15 05:21:35.860694 systemd[1]: session-44.scope: Deactivated successfully.
Jul 15 05:21:35.864665 systemd-logind[1567]: Removed session 44.
Jul 15 05:21:36.015517 systemd[1]: Started sshd@44-157.180.39.85:22-139.178.89.65:39772.service - OpenSSH per-connection server daemon (139.178.89.65:39772).
Jul 15 05:21:37.010352 sshd[7155]: Accepted publickey for core from 139.178.89.65 port 39772 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:37.012308 sshd-session[7155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:37.017731 systemd-logind[1567]: New session 45 of user core.
Jul 15 05:21:37.024248 systemd[1]: Started session-45.scope - Session 45 of User core.
Jul 15 05:21:37.832574 sshd[7158]: Connection closed by 139.178.89.65 port 39772
Jul 15 05:21:37.834301 sshd-session[7155]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:37.838528 systemd[1]: sshd@44-157.180.39.85:22-139.178.89.65:39772.service: Deactivated successfully.
Jul 15 05:21:37.841494 systemd[1]: session-45.scope: Deactivated successfully.
Jul 15 05:21:37.843214 systemd-logind[1567]: Session 45 logged out. Waiting for processes to exit.
Jul 15 05:21:37.846147 systemd-logind[1567]: Removed session 45.
Jul 15 05:21:38.462507 containerd[1591]: time="2025-07-15T05:21:38.462273387Z" level=warning msg="container event discarded" container=9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe type=CONTAINER_CREATED_EVENT
Jul 15 05:21:38.550762 containerd[1591]: time="2025-07-15T05:21:38.550681389Z" level=warning msg="container event discarded" container=9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe type=CONTAINER_STARTED_EVENT
Jul 15 05:21:40.800682 containerd[1591]: time="2025-07-15T05:21:40.800517797Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"5de93e62773f281ab40b6fc86dcf8b7239485d8a894c9a310c44c13ee28743fc\" pid:7181 exited_at:{seconds:1752556900 nanos:799993408}"
Jul 15 05:21:41.304931 containerd[1591]: time="2025-07-15T05:21:41.304874975Z" level=warning msg="container event discarded" container=103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6 type=CONTAINER_CREATED_EVENT
Jul 15 05:21:41.366557 containerd[1591]: time="2025-07-15T05:21:41.366499399Z" level=warning msg="container event discarded" container=103e4e4ee4f96fd017a7236e348f9f9489b35407fdef1df17648174cd48f9eb6 type=CONTAINER_STARTED_EVENT
Jul 15 05:21:42.459419 containerd[1591]: time="2025-07-15T05:21:42.459376232Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"dcf34a45320e65d0b3ee2d9fac96dd4f35bc638f2cd4069344a290a31de1e7c8\" pid:7203 exited_at:{seconds:1752556902 nanos:457260297}"
Jul 15 05:21:42.999364 systemd[1]: Started sshd@45-157.180.39.85:22-139.178.89.65:36844.service - OpenSSH per-connection server daemon (139.178.89.65:36844).
Jul 15 05:21:43.103506 containerd[1591]: time="2025-07-15T05:21:43.103358191Z" level=warning msg="container event discarded" container=f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f type=CONTAINER_CREATED_EVENT
Jul 15 05:21:43.171115 containerd[1591]: time="2025-07-15T05:21:43.170979996Z" level=warning msg="container event discarded" container=f68518477b42a56d95d12da022007db0b7af983fba968f6c8868cf89cba2a62f type=CONTAINER_STARTED_EVENT
Jul 15 05:21:43.981507 sshd[7214]: Accepted publickey for core from 139.178.89.65 port 36844 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:43.982218 sshd-session[7214]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:43.986253 systemd-logind[1567]: New session 46 of user core.
Jul 15 05:21:43.991132 systemd[1]: Started session-46.scope - Session 46 of User core.
Jul 15 05:21:44.705525 sshd[7217]: Connection closed by 139.178.89.65 port 36844
Jul 15 05:21:44.706104 sshd-session[7214]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:44.712360 systemd-logind[1567]: Session 46 logged out. Waiting for processes to exit.
Jul 15 05:21:44.712931 systemd[1]: sshd@45-157.180.39.85:22-139.178.89.65:36844.service: Deactivated successfully.
Jul 15 05:21:44.716944 systemd[1]: session-46.scope: Deactivated successfully.
Jul 15 05:21:44.720614 systemd-logind[1567]: Removed session 46.
Jul 15 05:21:49.878094 systemd[1]: Started sshd@46-157.180.39.85:22-139.178.89.65:58012.service - OpenSSH per-connection server daemon (139.178.89.65:58012).
Jul 15 05:21:50.874972 sshd[7230]: Accepted publickey for core from 139.178.89.65 port 58012 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:50.877116 sshd-session[7230]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:50.882104 systemd-logind[1567]: New session 47 of user core.
Jul 15 05:21:50.888159 systemd[1]: Started session-47.scope - Session 47 of User core.
Jul 15 05:21:51.579160 containerd[1591]: time="2025-07-15T05:21:51.579071891Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"02d88817250d0c98c79561b6362515e0cf1fb7da64b4cd932960c411ad7caa72\" pid:7253 exited_at:{seconds:1752556911 nanos:578835012}"
Jul 15 05:21:51.616045 sshd[7233]: Connection closed by 139.178.89.65 port 58012
Jul 15 05:21:51.616825 sshd-session[7230]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:51.623716 systemd[1]: sshd@46-157.180.39.85:22-139.178.89.65:58012.service: Deactivated successfully.
Jul 15 05:21:51.627854 systemd[1]: session-47.scope: Deactivated successfully.
Jul 15 05:21:51.630635 systemd-logind[1567]: Session 47 logged out. Waiting for processes to exit.
Jul 15 05:21:51.632911 systemd-logind[1567]: Removed session 47.
Jul 15 05:21:55.725812 containerd[1591]: time="2025-07-15T05:21:55.725770601Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"ef9bd3f328aa4553f274f53da0107400e2570ecbe770fea7142903158d60b7d4\" pid:7278 exited_at:{seconds:1752556915 nanos:725466332}"
Jul 15 05:21:56.789649 systemd[1]: Started sshd@47-157.180.39.85:22-139.178.89.65:58016.service - OpenSSH per-connection server daemon (139.178.89.65:58016).
Jul 15 05:21:57.820726 sshd[7293]: Accepted publickey for core from 139.178.89.65 port 58016 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:21:57.825707 sshd-session[7293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:21:57.835137 systemd-logind[1567]: New session 48 of user core.
Jul 15 05:21:57.842281 systemd[1]: Started session-48.scope - Session 48 of User core.
Jul 15 05:21:58.744992 sshd[7304]: Connection closed by 139.178.89.65 port 58016
Jul 15 05:21:58.745963 sshd-session[7293]: pam_unix(sshd:session): session closed for user core
Jul 15 05:21:58.753532 systemd[1]: sshd@47-157.180.39.85:22-139.178.89.65:58016.service: Deactivated successfully.
Jul 15 05:21:58.757903 systemd[1]: session-48.scope: Deactivated successfully.
Jul 15 05:21:58.760177 systemd-logind[1567]: Session 48 logged out. Waiting for processes to exit.
Jul 15 05:21:58.763300 systemd-logind[1567]: Removed session 48.
Jul 15 05:22:03.911656 systemd[1]: Started sshd@48-157.180.39.85:22-139.178.89.65:48242.service - OpenSSH per-connection server daemon (139.178.89.65:48242).
Jul 15 05:22:04.768074 containerd[1591]: time="2025-07-15T05:22:04.767904019Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"d15c25497e79e694fed2a18cc131c170d350f71a0abdfca4d07929e2f74d78f4\" pid:7332 exited_at:{seconds:1752556924 nanos:767370630}"
Jul 15 05:22:04.894136 sshd[7316]: Accepted publickey for core from 139.178.89.65 port 48242 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:22:04.896837 sshd-session[7316]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:04.905557 systemd-logind[1567]: New session 49 of user core.
Jul 15 05:22:04.916270 systemd[1]: Started session-49.scope - Session 49 of User core.
Jul 15 05:22:05.676811 sshd[7341]: Connection closed by 139.178.89.65 port 48242
Jul 15 05:22:05.679725 sshd-session[7316]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:05.686790 systemd[1]: sshd@48-157.180.39.85:22-139.178.89.65:48242.service: Deactivated successfully.
Jul 15 05:22:05.689363 systemd[1]: session-49.scope: Deactivated successfully.
Jul 15 05:22:05.690530 systemd-logind[1567]: Session 49 logged out. Waiting for processes to exit.
Jul 15 05:22:05.692368 systemd-logind[1567]: Removed session 49.
Jul 15 05:22:10.821871 containerd[1591]: time="2025-07-15T05:22:10.821817939Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"8ac6304df14a831bda224525cbfabb50fc29125b0c40556110231e5e8f5df04f\" pid:7364 exited_at:{seconds:1752556930 nanos:821295821}"
Jul 15 05:22:10.848886 systemd[1]: Started sshd@49-157.180.39.85:22-139.178.89.65:41992.service - OpenSSH per-connection server daemon (139.178.89.65:41992).
Jul 15 05:22:11.880860 sshd[7376]: Accepted publickey for core from 139.178.89.65 port 41992 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:22:11.883950 sshd-session[7376]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:11.889487 systemd-logind[1567]: New session 50 of user core.
Jul 15 05:22:11.894250 systemd[1]: Started session-50.scope - Session 50 of User core.
Jul 15 05:22:13.008820 sshd[7379]: Connection closed by 139.178.89.65 port 41992
Jul 15 05:22:13.017083 sshd-session[7376]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:13.029653 systemd[1]: sshd@49-157.180.39.85:22-139.178.89.65:41992.service: Deactivated successfully.
Jul 15 05:22:13.032595 systemd[1]: session-50.scope: Deactivated successfully.
Jul 15 05:22:13.036488 systemd-logind[1567]: Session 50 logged out. Waiting for processes to exit.
Jul 15 05:22:13.037819 systemd-logind[1567]: Removed session 50.
Jul 15 05:22:18.179771 systemd[1]: Started sshd@50-157.180.39.85:22-139.178.89.65:42000.service - OpenSSH per-connection server daemon (139.178.89.65:42000).
Jul 15 05:22:19.187833 sshd[7392]: Accepted publickey for core from 139.178.89.65 port 42000 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:22:19.191437 sshd-session[7392]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:19.202186 systemd-logind[1567]: New session 51 of user core.
Jul 15 05:22:19.209268 systemd[1]: Started session-51.scope - Session 51 of User core.
Jul 15 05:22:19.999729 sshd[7395]: Connection closed by 139.178.89.65 port 42000
Jul 15 05:22:20.000443 sshd-session[7392]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:20.003767 systemd[1]: sshd@50-157.180.39.85:22-139.178.89.65:42000.service: Deactivated successfully.
Jul 15 05:22:20.005711 systemd[1]: session-51.scope: Deactivated successfully.
Jul 15 05:22:20.006619 systemd-logind[1567]: Session 51 logged out. Waiting for processes to exit.
Jul 15 05:22:20.008439 systemd-logind[1567]: Removed session 51.
Jul 15 05:22:25.173350 systemd[1]: Started sshd@51-157.180.39.85:22-139.178.89.65:54212.service - OpenSSH per-connection server daemon (139.178.89.65:54212).
Jul 15 05:22:25.736843 containerd[1591]: time="2025-07-15T05:22:25.736773076Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"159bd7891d196db3482069a5ef5b696e33da0ac2efc040becac3b64e472902e1\" pid:7424 exited_at:{seconds:1752556945 nanos:736495587}"
Jul 15 05:22:26.167283 sshd[7407]: Accepted publickey for core from 139.178.89.65 port 54212 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:22:26.170467 sshd-session[7407]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:26.179141 systemd-logind[1567]: New session 52 of user core.
Jul 15 05:22:26.187158 systemd[1]: Started session-52.scope - Session 52 of User core.
Jul 15 05:22:27.196724 sshd[7436]: Connection closed by 139.178.89.65 port 54212
Jul 15 05:22:27.197916 sshd-session[7407]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:27.204197 systemd[1]: sshd@51-157.180.39.85:22-139.178.89.65:54212.service: Deactivated successfully.
Jul 15 05:22:27.208345 systemd[1]: session-52.scope: Deactivated successfully.
Jul 15 05:22:27.213890 systemd-logind[1567]: Session 52 logged out. Waiting for processes to exit.
Jul 15 05:22:27.215842 systemd-logind[1567]: Removed session 52.
Jul 15 05:22:32.373759 systemd[1]: Started sshd@52-157.180.39.85:22-139.178.89.65:43528.service - OpenSSH per-connection server daemon (139.178.89.65:43528).
Jul 15 05:22:33.374433 sshd[7450]: Accepted publickey for core from 139.178.89.65 port 43528 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:22:33.377574 sshd-session[7450]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:33.387123 systemd-logind[1567]: New session 53 of user core.
Jul 15 05:22:33.395288 systemd[1]: Started session-53.scope - Session 53 of User core.
Jul 15 05:22:34.158715 sshd[7453]: Connection closed by 139.178.89.65 port 43528
Jul 15 05:22:34.159679 sshd-session[7450]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:34.165595 systemd[1]: sshd@52-157.180.39.85:22-139.178.89.65:43528.service: Deactivated successfully.
Jul 15 05:22:34.169850 systemd[1]: session-53.scope: Deactivated successfully.
Jul 15 05:22:34.172891 systemd-logind[1567]: Session 53 logged out. Waiting for processes to exit.
Jul 15 05:22:34.175983 systemd-logind[1567]: Removed session 53.
Jul 15 05:22:34.747419 containerd[1591]: time="2025-07-15T05:22:34.747215405Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"8906a7c59b6b262726dd78e8db60913db951b7416bc0d91b7d2882916cd7cc25\" pid:7476 exited_at:{seconds:1752556954 nanos:746829936}"
Jul 15 05:22:39.327250 systemd[1]: Started sshd@53-157.180.39.85:22-139.178.89.65:55922.service - OpenSSH per-connection server daemon (139.178.89.65:55922).
Jul 15 05:22:40.299894 sshd[7486]: Accepted publickey for core from 139.178.89.65 port 55922 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:22:40.302641 sshd-session[7486]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:40.311088 systemd-logind[1567]: New session 54 of user core.
Jul 15 05:22:40.319256 systemd[1]: Started session-54.scope - Session 54 of User core.
Jul 15 05:22:40.911849 containerd[1591]: time="2025-07-15T05:22:40.911798381Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"70654428b0f00f2410d3818098f34a8844e8a84076650c7f539175216143fb1a\" pid:7505 exited_at:{seconds:1752556960 nanos:911500662}"
Jul 15 05:22:41.040127 sshd[7490]: Connection closed by 139.178.89.65 port 55922
Jul 15 05:22:41.040630 sshd-session[7486]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:41.044661 systemd[1]: sshd@53-157.180.39.85:22-139.178.89.65:55922.service: Deactivated successfully.
Jul 15 05:22:41.045243 systemd-logind[1567]: Session 54 logged out. Waiting for processes to exit.
Jul 15 05:22:41.046853 systemd[1]: session-54.scope: Deactivated successfully.
Jul 15 05:22:41.052866 systemd-logind[1567]: Removed session 54.
Jul 15 05:22:42.508432 containerd[1591]: time="2025-07-15T05:22:42.507826679Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"f4dde418ed6341082879971f936cbffdf40757e5c297cc9bad7da2d5f7d8d8e0\" pid:7539 exited_at:{seconds:1752556962 nanos:507462281}"
Jul 15 05:22:46.221686 systemd[1]: Started sshd@54-157.180.39.85:22-139.178.89.65:55928.service - OpenSSH per-connection server daemon (139.178.89.65:55928).
Jul 15 05:22:47.234037 sshd[7550]: Accepted publickey for core from 139.178.89.65 port 55928 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:22:47.235496 sshd-session[7550]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:47.241506 systemd-logind[1567]: New session 55 of user core.
Jul 15 05:22:47.248119 systemd[1]: Started session-55.scope - Session 55 of User core.
Jul 15 05:22:47.973647 sshd[7553]: Connection closed by 139.178.89.65 port 55928
Jul 15 05:22:47.974471 sshd-session[7550]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:47.980189 systemd-logind[1567]: Session 55 logged out. Waiting for processes to exit.
Jul 15 05:22:47.981045 systemd[1]: sshd@54-157.180.39.85:22-139.178.89.65:55928.service: Deactivated successfully.
Jul 15 05:22:47.983938 systemd[1]: session-55.scope: Deactivated successfully.
Jul 15 05:22:47.985915 systemd-logind[1567]: Removed session 55.
Jul 15 05:22:51.582595 containerd[1591]: time="2025-07-15T05:22:51.582559055Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"664c5f07e9bb645e39b5a6fab500eab5a54a9d2e0de5a2708825048b9e415f5b\" pid:7578 exited_at:{seconds:1752556971 nanos:581095161}"
Jul 15 05:22:53.148081 systemd[1]: Started sshd@55-157.180.39.85:22-139.178.89.65:36982.service - OpenSSH per-connection server daemon (139.178.89.65:36982).
Jul 15 05:22:54.149770 sshd[7588]: Accepted publickey for core from 139.178.89.65 port 36982 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:22:54.152509 sshd-session[7588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:22:54.162168 systemd-logind[1567]: New session 56 of user core.
Jul 15 05:22:54.168292 systemd[1]: Started session-56.scope - Session 56 of User core.
Jul 15 05:22:54.927846 sshd[7591]: Connection closed by 139.178.89.65 port 36982
Jul 15 05:22:54.928521 sshd-session[7588]: pam_unix(sshd:session): session closed for user core
Jul 15 05:22:54.935505 systemd-logind[1567]: Session 56 logged out. Waiting for processes to exit.
Jul 15 05:22:54.936854 systemd[1]: sshd@55-157.180.39.85:22-139.178.89.65:36982.service: Deactivated successfully.
Jul 15 05:22:54.941731 systemd[1]: session-56.scope: Deactivated successfully.
Jul 15 05:22:54.945313 systemd-logind[1567]: Removed session 56.
Jul 15 05:22:55.722735 containerd[1591]: time="2025-07-15T05:22:55.722652913Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"5c8343747f1202c64bfa5e3749be288bf01fd49035141b8618f5632e7a302ac4\" pid:7614 exited_at:{seconds:1752556975 nanos:722153155}"
Jul 15 05:23:00.100982 systemd[1]: Started sshd@56-157.180.39.85:22-139.178.89.65:33548.service - OpenSSH per-connection server daemon (139.178.89.65:33548).
Jul 15 05:23:01.101746 sshd[7649]: Accepted publickey for core from 139.178.89.65 port 33548 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:01.104700 sshd-session[7649]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:01.112166 systemd-logind[1567]: New session 57 of user core.
Jul 15 05:23:01.117246 systemd[1]: Started session-57.scope - Session 57 of User core.
Jul 15 05:23:01.890395 sshd[7652]: Connection closed by 139.178.89.65 port 33548
Jul 15 05:23:01.891985 sshd-session[7649]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:01.900626 systemd[1]: sshd@56-157.180.39.85:22-139.178.89.65:33548.service: Deactivated successfully.
Jul 15 05:23:01.905570 systemd[1]: session-57.scope: Deactivated successfully.
Jul 15 05:23:01.907404 systemd-logind[1567]: Session 57 logged out. Waiting for processes to exit.
Jul 15 05:23:01.912887 systemd-logind[1567]: Removed session 57.
Jul 15 05:23:04.761307 containerd[1591]: time="2025-07-15T05:23:04.761237349Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"3d96452153324f8d29787e0a9bc35224fc4ca8f2b764c99d510841316a02fa36\" pid:7676 exited_at:{seconds:1752556984 nanos:760797181}"
Jul 15 05:23:07.055903 systemd[1]: Started sshd@57-157.180.39.85:22-139.178.89.65:33554.service - OpenSSH per-connection server daemon (139.178.89.65:33554).
Jul 15 05:23:08.043986 sshd[7686]: Accepted publickey for core from 139.178.89.65 port 33554 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:08.046683 sshd-session[7686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:08.055594 systemd-logind[1567]: New session 58 of user core.
Jul 15 05:23:08.064227 systemd[1]: Started session-58.scope - Session 58 of User core.
Jul 15 05:23:08.818757 sshd[7689]: Connection closed by 139.178.89.65 port 33554
Jul 15 05:23:08.820844 sshd-session[7686]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:08.825042 systemd-logind[1567]: Session 58 logged out. Waiting for processes to exit.
Jul 15 05:23:08.826531 systemd[1]: sshd@57-157.180.39.85:22-139.178.89.65:33554.service: Deactivated successfully.
Jul 15 05:23:08.828803 systemd[1]: session-58.scope: Deactivated successfully.
Jul 15 05:23:08.832410 systemd-logind[1567]: Removed session 58.
Jul 15 05:23:10.827419 containerd[1591]: time="2025-07-15T05:23:10.827374105Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"a03656ebc789ac2f464b5bd82b6c5a7f49bdcd00ce1e68a8c6eb103d745dd2a5\" pid:7713 exited_at:{seconds:1752556990 nanos:827047486}"
Jul 15 05:23:13.984905 systemd[1]: Started sshd@58-157.180.39.85:22-139.178.89.65:46020.service - OpenSSH per-connection server daemon (139.178.89.65:46020).
Jul 15 05:23:14.966302 sshd[7726]: Accepted publickey for core from 139.178.89.65 port 46020 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:14.968749 sshd-session[7726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:14.977340 systemd-logind[1567]: New session 59 of user core.
Jul 15 05:23:14.985657 systemd[1]: Started session-59.scope - Session 59 of User core.
Jul 15 05:23:15.947955 sshd[7729]: Connection closed by 139.178.89.65 port 46020
Jul 15 05:23:15.950177 sshd-session[7726]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:15.957971 systemd[1]: sshd@58-157.180.39.85:22-139.178.89.65:46020.service: Deactivated successfully.
Jul 15 05:23:15.962917 systemd[1]: session-59.scope: Deactivated successfully.
Jul 15 05:23:15.965182 systemd-logind[1567]: Session 59 logged out. Waiting for processes to exit.
Jul 15 05:23:15.968067 systemd-logind[1567]: Removed session 59.
Jul 15 05:23:21.114978 systemd[1]: Started sshd@59-157.180.39.85:22-139.178.89.65:38870.service - OpenSSH per-connection server daemon (139.178.89.65:38870).
Jul 15 05:23:22.102400 sshd[7741]: Accepted publickey for core from 139.178.89.65 port 38870 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:22.105197 sshd-session[7741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:22.113006 systemd-logind[1567]: New session 60 of user core.
Jul 15 05:23:22.119328 systemd[1]: Started session-60.scope - Session 60 of User core.
Jul 15 05:23:22.881123 sshd[7744]: Connection closed by 139.178.89.65 port 38870
Jul 15 05:23:22.881646 sshd-session[7741]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:22.886111 systemd-logind[1567]: Session 60 logged out. Waiting for processes to exit.
Jul 15 05:23:22.886715 systemd[1]: sshd@59-157.180.39.85:22-139.178.89.65:38870.service: Deactivated successfully.
Jul 15 05:23:22.889218 systemd[1]: session-60.scope: Deactivated successfully.
Jul 15 05:23:22.891571 systemd-logind[1567]: Removed session 60.
Jul 15 05:23:25.737161 containerd[1591]: time="2025-07-15T05:23:25.737087127Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"398ea87ec64512e82bd4122ab2c4c33ab040e90e70dd4209aebc903cc7bf0a34\" pid:7766 exited_at:{seconds:1752557005 nanos:736835338}"
Jul 15 05:23:28.052983 systemd[1]: Started sshd@60-157.180.39.85:22-139.178.89.65:38882.service - OpenSSH per-connection server daemon (139.178.89.65:38882).
Jul 15 05:23:29.069557 sshd[7781]: Accepted publickey for core from 139.178.89.65 port 38882 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:29.073648 sshd-session[7781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:29.083126 systemd-logind[1567]: New session 61 of user core.
Jul 15 05:23:29.090238 systemd[1]: Started session-61.scope - Session 61 of User core.
Jul 15 05:23:30.073151 sshd[7784]: Connection closed by 139.178.89.65 port 38882
Jul 15 05:23:30.074077 sshd-session[7781]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:30.080913 systemd[1]: sshd@60-157.180.39.85:22-139.178.89.65:38882.service: Deactivated successfully.
Jul 15 05:23:30.084953 systemd[1]: session-61.scope: Deactivated successfully.
Jul 15 05:23:30.088328 systemd-logind[1567]: Session 61 logged out. Waiting for processes to exit.
Jul 15 05:23:30.092145 systemd-logind[1567]: Removed session 61.
Jul 15 05:23:34.765605 containerd[1591]: time="2025-07-15T05:23:34.765551544Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"0262730f5ca3dcae92ebb6c6f68005c2448259877f07cfec3d7996161dad77ec\" pid:7808 exited_at:{seconds:1752557014 nanos:765104707}"
Jul 15 05:23:35.246349 systemd[1]: Started sshd@61-157.180.39.85:22-139.178.89.65:43428.service - OpenSSH per-connection server daemon (139.178.89.65:43428).
Jul 15 05:23:36.253395 sshd[7819]: Accepted publickey for core from 139.178.89.65 port 43428 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:36.254080 sshd-session[7819]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:36.258230 systemd-logind[1567]: New session 62 of user core.
Jul 15 05:23:36.265133 systemd[1]: Started session-62.scope - Session 62 of User core.
Jul 15 05:23:37.060610 sshd[7822]: Connection closed by 139.178.89.65 port 43428
Jul 15 05:23:37.061519 sshd-session[7819]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:37.068820 systemd[1]: sshd@61-157.180.39.85:22-139.178.89.65:43428.service: Deactivated successfully.
Jul 15 05:23:37.072704 systemd[1]: session-62.scope: Deactivated successfully.
Jul 15 05:23:37.075410 systemd-logind[1567]: Session 62 logged out. Waiting for processes to exit.
Jul 15 05:23:37.078862 systemd-logind[1567]: Removed session 62.
Jul 15 05:23:40.812546 containerd[1591]: time="2025-07-15T05:23:40.812509167Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"dad7029b12b3c5697fc6c9b8bad67611b2b5ba53d950377e6a3ac7d8cf3afa23\" pid:7846 exited_at:{seconds:1752557020 nanos:811950802}"
Jul 15 05:23:42.230531 systemd[1]: Started sshd@62-157.180.39.85:22-139.178.89.65:48404.service - OpenSSH per-connection server daemon (139.178.89.65:48404).
Jul 15 05:23:42.454993 containerd[1591]: time="2025-07-15T05:23:42.454786220Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"afb1ee81305dc7aad5f0b0326cebe22053d11b03d48a1f063635d92eced88a10\" pid:7872 exited_at:{seconds:1752557022 nanos:454349842}"
Jul 15 05:23:43.211090 sshd[7857]: Accepted publickey for core from 139.178.89.65 port 48404 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:43.213872 sshd-session[7857]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:43.223199 systemd-logind[1567]: New session 63 of user core.
Jul 15 05:23:43.230235 systemd[1]: Started session-63.scope - Session 63 of User core.
Jul 15 05:23:44.029640 sshd[7883]: Connection closed by 139.178.89.65 port 48404
Jul 15 05:23:44.031985 sshd-session[7857]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:44.036191 systemd[1]: sshd@62-157.180.39.85:22-139.178.89.65:48404.service: Deactivated successfully.
Jul 15 05:23:44.039136 systemd[1]: session-63.scope: Deactivated successfully.
Jul 15 05:23:44.041656 systemd-logind[1567]: Session 63 logged out. Waiting for processes to exit.
Jul 15 05:23:44.042650 systemd-logind[1567]: Removed session 63.
Jul 15 05:23:49.195296 systemd[1]: Started sshd@63-157.180.39.85:22-139.178.89.65:41434.service - OpenSSH per-connection server daemon (139.178.89.65:41434).
Jul 15 05:23:50.214070 sshd[7895]: Accepted publickey for core from 139.178.89.65 port 41434 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:50.217595 sshd-session[7895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:50.229185 systemd-logind[1567]: New session 64 of user core.
Jul 15 05:23:50.236262 systemd[1]: Started session-64.scope - Session 64 of User core.
Jul 15 05:23:51.001023 sshd[7900]: Connection closed by 139.178.89.65 port 41434
Jul 15 05:23:51.002326 sshd-session[7895]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:51.009244 systemd[1]: sshd@63-157.180.39.85:22-139.178.89.65:41434.service: Deactivated successfully.
Jul 15 05:23:51.013444 systemd[1]: session-64.scope: Deactivated successfully.
Jul 15 05:23:51.015288 systemd-logind[1567]: Session 64 logged out. Waiting for processes to exit.
Jul 15 05:23:51.018114 systemd-logind[1567]: Removed session 64.
Jul 15 05:23:51.580905 containerd[1591]: time="2025-07-15T05:23:51.580848557Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"bdbfe014e13b52c29571c8d8949489c4fbdeb8dae514180b6e431094d35c2124\" pid:7922 exited_at:{seconds:1752557031 nanos:580248933}"
Jul 15 05:23:55.756001 containerd[1591]: time="2025-07-15T05:23:55.755875307Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"79a7964e26bf201c4517552c6e5e634e9a65bb4e9ecfd46e16698e35b4122f53\" pid:7943 exited_at:{seconds:1752557035 nanos:755209605}"
Jul 15 05:23:56.180136 systemd[1]: Started sshd@64-157.180.39.85:22-139.178.89.65:41448.service - OpenSSH per-connection server daemon (139.178.89.65:41448).
Jul 15 05:23:57.215375 sshd[7955]: Accepted publickey for core from 139.178.89.65 port 41448 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:23:57.218360 sshd-session[7955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:23:57.227631 systemd-logind[1567]: New session 65 of user core.
Jul 15 05:23:57.233362 systemd[1]: Started session-65.scope - Session 65 of User core.
Jul 15 05:23:58.207862 sshd[7960]: Connection closed by 139.178.89.65 port 41448
Jul 15 05:23:58.208994 sshd-session[7955]: pam_unix(sshd:session): session closed for user core
Jul 15 05:23:58.216277 systemd[1]: sshd@64-157.180.39.85:22-139.178.89.65:41448.service: Deactivated successfully.
Jul 15 05:23:58.221261 systemd[1]: session-65.scope: Deactivated successfully.
Jul 15 05:23:58.223729 systemd-logind[1567]: Session 65 logged out. Waiting for processes to exit.
Jul 15 05:23:58.227453 systemd-logind[1567]: Removed session 65.
Jul 15 05:24:03.377952 systemd[1]: Started sshd@65-157.180.39.85:22-139.178.89.65:58912.service - OpenSSH per-connection server daemon (139.178.89.65:58912).
Jul 15 05:24:04.377767 sshd[7972]: Accepted publickey for core from 139.178.89.65 port 58912 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:24:04.380724 sshd-session[7972]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:24:04.388832 systemd-logind[1567]: New session 66 of user core.
Jul 15 05:24:04.402379 systemd[1]: Started session-66.scope - Session 66 of User core.
Jul 15 05:24:04.764937 containerd[1591]: time="2025-07-15T05:24:04.764906361Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"203cadcd1974c8e6a0cb622fb27859cc8e7982a3d6764ee861e77de2db37234f\" pid:7989 exited_at:{seconds:1752557044 nanos:764546980}"
Jul 15 05:24:05.140968 sshd[7975]: Connection closed by 139.178.89.65 port 58912
Jul 15 05:24:05.142186 sshd-session[7972]: pam_unix(sshd:session): session closed for user core
Jul 15 05:24:05.149211 systemd[1]: sshd@65-157.180.39.85:22-139.178.89.65:58912.service: Deactivated successfully.
Jul 15 05:24:05.153116 systemd[1]: session-66.scope: Deactivated successfully.
Jul 15 05:24:05.155137 systemd-logind[1567]: Session 66 logged out. Waiting for processes to exit.
Jul 15 05:24:05.158555 systemd-logind[1567]: Removed session 66.
Jul 15 05:24:10.306171 systemd[1]: Started sshd@66-157.180.39.85:22-139.178.89.65:59512.service - OpenSSH per-connection server daemon (139.178.89.65:59512).
Jul 15 05:24:10.819423 containerd[1591]: time="2025-07-15T05:24:10.819354616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"76832d63e70030b14bd0e9d0df61383dbd76ba0a02ee1c2a3f1402edaf985a7d\" pid:8025 exited_at:{seconds:1752557050 nanos:819089062}"
Jul 15 05:24:11.285092 sshd[8010]: Accepted publickey for core from 139.178.89.65 port 59512 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:24:11.290128 sshd-session[8010]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:24:11.302317 systemd-logind[1567]: New session 67 of user core.
Jul 15 05:24:11.309251 systemd[1]: Started session-67.scope - Session 67 of User core.
Jul 15 05:24:12.045134 sshd[8035]: Connection closed by 139.178.89.65 port 59512
Jul 15 05:24:12.045645 sshd-session[8010]: pam_unix(sshd:session): session closed for user core
Jul 15 05:24:12.048819 systemd-logind[1567]: Session 67 logged out. Waiting for processes to exit.
Jul 15 05:24:12.049511 systemd[1]: sshd@66-157.180.39.85:22-139.178.89.65:59512.service: Deactivated successfully.
Jul 15 05:24:12.051410 systemd[1]: session-67.scope: Deactivated successfully.
Jul 15 05:24:12.051975 systemd-logind[1567]: Removed session 67.
Jul 15 05:24:17.222854 systemd[1]: Started sshd@67-157.180.39.85:22-139.178.89.65:59520.service - OpenSSH per-connection server daemon (139.178.89.65:59520).
Jul 15 05:24:18.255078 sshd[8047]: Accepted publickey for core from 139.178.89.65 port 59520 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:24:18.259643 sshd-session[8047]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:24:18.271109 systemd-logind[1567]: New session 68 of user core.
Jul 15 05:24:18.282336 systemd[1]: Started session-68.scope - Session 68 of User core.
Jul 15 05:24:19.273237 sshd[8050]: Connection closed by 139.178.89.65 port 59520
Jul 15 05:24:19.274289 sshd-session[8047]: pam_unix(sshd:session): session closed for user core
Jul 15 05:24:19.281198 systemd[1]: sshd@67-157.180.39.85:22-139.178.89.65:59520.service: Deactivated successfully.
Jul 15 05:24:19.285067 systemd[1]: session-68.scope: Deactivated successfully.
Jul 15 05:24:19.287780 systemd-logind[1567]: Session 68 logged out. Waiting for processes to exit.
Jul 15 05:24:19.291349 systemd-logind[1567]: Removed session 68.
Jul 15 05:24:24.442203 systemd[1]: Started sshd@68-157.180.39.85:22-139.178.89.65:49272.service - OpenSSH per-connection server daemon (139.178.89.65:49272).
Jul 15 05:24:25.414818 sshd[8061]: Accepted publickey for core from 139.178.89.65 port 49272 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:24:25.417666 sshd-session[8061]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:24:25.428205 systemd-logind[1567]: New session 69 of user core.
Jul 15 05:24:25.438276 systemd[1]: Started session-69.scope - Session 69 of User core.
Jul 15 05:24:25.743043 containerd[1591]: time="2025-07-15T05:24:25.742992815Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"043c6f5db9fd699570dd177c515a8c0304a7185851d6f1f23574f58143e60fe2\" pid:8078 exited_at:{seconds:1752557065 nanos:742631023}"
Jul 15 05:24:26.282183 sshd[8064]: Connection closed by 139.178.89.65 port 49272
Jul 15 05:24:26.283877 sshd-session[8061]: pam_unix(sshd:session): session closed for user core
Jul 15 05:24:26.291337 systemd[1]: sshd@68-157.180.39.85:22-139.178.89.65:49272.service: Deactivated successfully.
Jul 15 05:24:26.295091 systemd[1]: session-69.scope: Deactivated successfully.
Jul 15 05:24:26.296836 systemd-logind[1567]: Session 69 logged out. Waiting for processes to exit.
Jul 15 05:24:26.299921 systemd-logind[1567]: Removed session 69.
Jul 15 05:24:31.458797 systemd[1]: Started sshd@69-157.180.39.85:22-139.178.89.65:47334.service - OpenSSH per-connection server daemon (139.178.89.65:47334).
Jul 15 05:24:32.469373 sshd[8124]: Accepted publickey for core from 139.178.89.65 port 47334 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:24:32.472309 sshd-session[8124]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:24:32.481132 systemd-logind[1567]: New session 70 of user core.
Jul 15 05:24:32.487364 systemd[1]: Started session-70.scope - Session 70 of User core.
Jul 15 05:24:33.238183 sshd[8127]: Connection closed by 139.178.89.65 port 47334
Jul 15 05:24:33.239165 sshd-session[8124]: pam_unix(sshd:session): session closed for user core
Jul 15 05:24:33.245605 systemd-logind[1567]: Session 70 logged out. Waiting for processes to exit.
Jul 15 05:24:33.246849 systemd[1]: sshd@69-157.180.39.85:22-139.178.89.65:47334.service: Deactivated successfully.
Jul 15 05:24:33.250264 systemd[1]: session-70.scope: Deactivated successfully.
Jul 15 05:24:33.254308 systemd-logind[1567]: Removed session 70.
Jul 15 05:24:34.764756 containerd[1591]: time="2025-07-15T05:24:34.764692000Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"3557fef8148574c435787b9f0d29cc8582f31067e45141fdde1942fcb08973a7\" pid:8150 exited_at:{seconds:1752557074 nanos:764465355}"
Jul 15 05:24:38.412291 systemd[1]: Started sshd@70-157.180.39.85:22-139.178.89.65:47336.service - OpenSSH per-connection server daemon (139.178.89.65:47336).
Jul 15 05:24:39.409924 sshd[8160]: Accepted publickey for core from 139.178.89.65 port 47336 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:24:39.411235 sshd-session[8160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:24:39.415498 systemd-logind[1567]: New session 71 of user core.
Jul 15 05:24:39.421119 systemd[1]: Started session-71.scope - Session 71 of User core.
Jul 15 05:24:40.367069 sshd[8163]: Connection closed by 139.178.89.65 port 47336
Jul 15 05:24:40.369101 sshd-session[8160]: pam_unix(sshd:session): session closed for user core
Jul 15 05:24:40.383868 systemd[1]: sshd@70-157.180.39.85:22-139.178.89.65:47336.service: Deactivated successfully.
Jul 15 05:24:40.392617 systemd[1]: session-71.scope: Deactivated successfully.
Jul 15 05:24:40.395305 systemd-logind[1567]: Session 71 logged out. Waiting for processes to exit.
Jul 15 05:24:40.399888 systemd-logind[1567]: Removed session 71.
Jul 15 05:24:40.811366 containerd[1591]: time="2025-07-15T05:24:40.811297278Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"1eb114fcab7f963c0ac2ebc77872e503649a1355c54b68140a113ed27a2fd8e1\" pid:8187 exited_at:{seconds:1752557080 nanos:810694011}"
Jul 15 05:24:42.468351 containerd[1591]: time="2025-07-15T05:24:42.468276283Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"031c91858c9cb5c131afb8fe7d6ca238356f49ff758809a3ecb6580851ee63fe\" pid:8209 exited_at:{seconds:1752557082 nanos:467752243}"
Jul 15 05:24:45.531635 systemd[1]: Started sshd@71-157.180.39.85:22-139.178.89.65:50986.service - OpenSSH per-connection server daemon (139.178.89.65:50986).
Jul 15 05:24:46.513662 sshd[8219]: Accepted publickey for core from 139.178.89.65 port 50986 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:24:46.516282 sshd-session[8219]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:24:46.525597 systemd-logind[1567]: New session 72 of user core.
Jul 15 05:24:46.530233 systemd[1]: Started session-72.scope - Session 72 of User core.
Jul 15 05:24:47.247325 sshd[8222]: Connection closed by 139.178.89.65 port 50986
Jul 15 05:24:47.248335 sshd-session[8219]: pam_unix(sshd:session): session closed for user core
Jul 15 05:24:47.253551 systemd-logind[1567]: Session 72 logged out. Waiting for processes to exit.
Jul 15 05:24:47.254290 systemd[1]: sshd@71-157.180.39.85:22-139.178.89.65:50986.service: Deactivated successfully.
Jul 15 05:24:47.256739 systemd[1]: session-72.scope: Deactivated successfully.
Jul 15 05:24:47.259695 systemd-logind[1567]: Removed session 72.
Jul 15 05:24:51.607641 containerd[1591]: time="2025-07-15T05:24:51.607606410Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"f4070e2ef87bc6d279b9cbb9e8577b846fa5939154710675924853453682a4af\" pid:8248 exited_at:{seconds:1752557091 nanos:607440754}"
Jul 15 05:24:52.422147 systemd[1]: Started sshd@72-157.180.39.85:22-139.178.89.65:32784.service - OpenSSH per-connection server daemon (139.178.89.65:32784).
Jul 15 05:24:53.422042 sshd[8258]: Accepted publickey for core from 139.178.89.65 port 32784 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:24:53.423953 sshd-session[8258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:24:53.432576 systemd-logind[1567]: New session 73 of user core.
Jul 15 05:24:53.438122 systemd[1]: Started session-73.scope - Session 73 of User core.
Jul 15 05:24:54.160676 sshd[8261]: Connection closed by 139.178.89.65 port 32784
Jul 15 05:24:54.161664 sshd-session[8258]: pam_unix(sshd:session): session closed for user core
Jul 15 05:24:54.169333 systemd[1]: sshd@72-157.180.39.85:22-139.178.89.65:32784.service: Deactivated successfully.
Jul 15 05:24:54.172930 systemd[1]: session-73.scope: Deactivated successfully.
Jul 15 05:24:54.175554 systemd-logind[1567]: Session 73 logged out. Waiting for processes to exit.
Jul 15 05:24:54.178737 systemd-logind[1567]: Removed session 73.
Jul 15 05:24:55.694845 containerd[1591]: time="2025-07-15T05:24:55.694794352Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"e0e2976e92a55ac0b085ba35d7a187718948f144167960e46edeb15dcf1609ae\" pid:8284 exited_at:{seconds:1752557095 nanos:694300561}"
Jul 15 05:24:59.340486 systemd[1]: Started sshd@73-157.180.39.85:22-139.178.89.65:58698.service - OpenSSH per-connection server daemon (139.178.89.65:58698).
Jul 15 05:25:00.344677 sshd[8306]: Accepted publickey for core from 139.178.89.65 port 58698 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:00.347067 sshd-session[8306]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:00.355241 systemd-logind[1567]: New session 74 of user core.
Jul 15 05:25:00.362231 systemd[1]: Started session-74.scope - Session 74 of User core.
Jul 15 05:25:01.116867 sshd[8309]: Connection closed by 139.178.89.65 port 58698
Jul 15 05:25:01.117785 sshd-session[8306]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:01.124618 systemd[1]: sshd@73-157.180.39.85:22-139.178.89.65:58698.service: Deactivated successfully.
Jul 15 05:25:01.128522 systemd[1]: session-74.scope: Deactivated successfully.
Jul 15 05:25:01.130671 systemd-logind[1567]: Session 74 logged out. Waiting for processes to exit.
Jul 15 05:25:01.133489 systemd-logind[1567]: Removed session 74.
Jul 15 05:25:04.763621 containerd[1591]: time="2025-07-15T05:25:04.763568870Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"5280a2dc7d40973307de4d3ae72abc2f4655a32c5b74941c39ed2e5bdb365411\" pid:8334 exited_at:{seconds:1752557104 nanos:763133248}"
Jul 15 05:25:06.288674 systemd[1]: Started sshd@74-157.180.39.85:22-139.178.89.65:58714.service - OpenSSH per-connection server daemon (139.178.89.65:58714).
Jul 15 05:25:07.287958 sshd[8344]: Accepted publickey for core from 139.178.89.65 port 58714 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:07.290377 sshd-session[8344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:07.296835 systemd-logind[1567]: New session 75 of user core.
Jul 15 05:25:07.302278 systemd[1]: Started session-75.scope - Session 75 of User core.
Jul 15 05:25:08.060903 sshd[8347]: Connection closed by 139.178.89.65 port 58714
Jul 15 05:25:08.061825 sshd-session[8344]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:08.068961 systemd[1]: sshd@74-157.180.39.85:22-139.178.89.65:58714.service: Deactivated successfully.
Jul 15 05:25:08.072476 systemd[1]: session-75.scope: Deactivated successfully.
Jul 15 05:25:08.074545 systemd-logind[1567]: Session 75 logged out. Waiting for processes to exit.
Jul 15 05:25:08.077663 systemd-logind[1567]: Removed session 75.
Jul 15 05:25:10.833343 containerd[1591]: time="2025-07-15T05:25:10.833281543Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"356487aebb88050a8e5fc2e94a925b27679b834e78d3bd496986f33ba9ba39bc\" pid:8370 exited_at:{seconds:1752557110 nanos:832497947}"
Jul 15 05:25:13.226168 systemd[1]: Started sshd@75-157.180.39.85:22-139.178.89.65:35168.service - OpenSSH per-connection server daemon (139.178.89.65:35168).
Jul 15 05:25:14.201333 sshd[8381]: Accepted publickey for core from 139.178.89.65 port 35168 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:14.203882 sshd-session[8381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:14.212134 systemd-logind[1567]: New session 76 of user core.
Jul 15 05:25:14.219329 systemd[1]: Started session-76.scope - Session 76 of User core.
Jul 15 05:25:14.937423 sshd[8384]: Connection closed by 139.178.89.65 port 35168
Jul 15 05:25:14.938218 sshd-session[8381]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:14.941972 systemd-logind[1567]: Session 76 logged out. Waiting for processes to exit.
Jul 15 05:25:14.942463 systemd[1]: sshd@75-157.180.39.85:22-139.178.89.65:35168.service: Deactivated successfully.
Jul 15 05:25:14.944789 systemd[1]: session-76.scope: Deactivated successfully.
Jul 15 05:25:14.947457 systemd-logind[1567]: Removed session 76.
Jul 15 05:25:20.111482 systemd[1]: Started sshd@76-157.180.39.85:22-139.178.89.65:48694.service - OpenSSH per-connection server daemon (139.178.89.65:48694).
Jul 15 05:25:21.106123 sshd[8397]: Accepted publickey for core from 139.178.89.65 port 48694 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:21.107971 sshd-session[8397]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:21.114773 systemd-logind[1567]: New session 77 of user core.
Jul 15 05:25:21.117551 systemd[1]: Started session-77.scope - Session 77 of User core.
Jul 15 05:25:21.835742 sshd[8400]: Connection closed by 139.178.89.65 port 48694
Jul 15 05:25:21.837865 sshd-session[8397]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:21.841483 systemd[1]: sshd@76-157.180.39.85:22-139.178.89.65:48694.service: Deactivated successfully.
Jul 15 05:25:21.843835 systemd[1]: session-77.scope: Deactivated successfully.
Jul 15 05:25:21.846090 systemd-logind[1567]: Session 77 logged out. Waiting for processes to exit.
Jul 15 05:25:21.847981 systemd-logind[1567]: Removed session 77.
Jul 15 05:25:25.719958 containerd[1591]: time="2025-07-15T05:25:25.719921028Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"fe6b47021211bbd9f4c2ae8dc3ba3f5b4694a08ead422a8b30e1d6b2a9a4013a\" pid:8424 exited_at:{seconds:1752557125 nanos:719547744}"
Jul 15 05:25:27.010755 systemd[1]: Started sshd@77-157.180.39.85:22-139.178.89.65:48704.service - OpenSSH per-connection server daemon (139.178.89.65:48704).
Jul 15 05:25:28.011760 sshd[8439]: Accepted publickey for core from 139.178.89.65 port 48704 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:28.015120 sshd-session[8439]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:28.026616 systemd-logind[1567]: New session 78 of user core.
Jul 15 05:25:28.033337 systemd[1]: Started session-78.scope - Session 78 of User core.
Jul 15 05:25:28.809628 sshd[8442]: Connection closed by 139.178.89.65 port 48704
Jul 15 05:25:28.810673 sshd-session[8439]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:28.818194 systemd[1]: sshd@77-157.180.39.85:22-139.178.89.65:48704.service: Deactivated successfully.
Jul 15 05:25:28.821862 systemd[1]: session-78.scope: Deactivated successfully.
Jul 15 05:25:28.824582 systemd-logind[1567]: Session 78 logged out. Waiting for processes to exit.
Jul 15 05:25:28.827219 systemd-logind[1567]: Removed session 78.
Jul 15 05:25:33.990476 systemd[1]: Started sshd@78-157.180.39.85:22-139.178.89.65:41112.service - OpenSSH per-connection server daemon (139.178.89.65:41112).
Jul 15 05:25:34.754004 containerd[1591]: time="2025-07-15T05:25:34.753898337Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"cd73bcc4d2e7003f5dc4396eb55285393ff080899932dad1988d195c60082e48\" pid:8470 exited_at:{seconds:1752557134 nanos:753252528}"
Jul 15 05:25:34.992497 sshd[8455]: Accepted publickey for core from 139.178.89.65 port 41112 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:34.995267 sshd-session[8455]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:35.003086 systemd-logind[1567]: New session 79 of user core.
Jul 15 05:25:35.009249 systemd[1]: Started session-79.scope - Session 79 of User core.
Jul 15 05:25:35.766335 sshd[8479]: Connection closed by 139.178.89.65 port 41112
Jul 15 05:25:35.767513 sshd-session[8455]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:35.774809 systemd-logind[1567]: Session 79 logged out. Waiting for processes to exit.
Jul 15 05:25:35.776310 systemd[1]: sshd@78-157.180.39.85:22-139.178.89.65:41112.service: Deactivated successfully.
Jul 15 05:25:35.780407 systemd[1]: session-79.scope: Deactivated successfully.
Jul 15 05:25:35.783456 systemd-logind[1567]: Removed session 79.
Jul 15 05:25:40.810371 containerd[1591]: time="2025-07-15T05:25:40.810207251Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"0269b4ee68681535e86d41d3a848e5a25da2aea480b1492daa691e956c0177a1\" pid:8501 exited_at:{seconds:1752557140 nanos:809853056}"
Jul 15 05:25:40.935344 systemd[1]: Started sshd@79-157.180.39.85:22-139.178.89.65:44838.service - OpenSSH per-connection server daemon (139.178.89.65:44838).
Jul 15 05:25:41.914534 sshd[8512]: Accepted publickey for core from 139.178.89.65 port 44838 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:41.918713 sshd-session[8512]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:41.928841 systemd-logind[1567]: New session 80 of user core.
Jul 15 05:25:41.935192 systemd[1]: Started session-80.scope - Session 80 of User core.
Jul 15 05:25:42.492587 containerd[1591]: time="2025-07-15T05:25:42.492548854Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"32d76359bc314a0e363a285f911609b3598e139a260d47bec47d76379b220151\" pid:8528 exited_at:{seconds:1752557142 nanos:492027083}"
Jul 15 05:25:42.654875 sshd[8515]: Connection closed by 139.178.89.65 port 44838
Jul 15 05:25:42.653900 sshd-session[8512]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:42.658146 systemd-logind[1567]: Session 80 logged out. Waiting for processes to exit.
Jul 15 05:25:42.659332 systemd[1]: sshd@79-157.180.39.85:22-139.178.89.65:44838.service: Deactivated successfully.
Jul 15 05:25:42.662227 systemd[1]: session-80.scope: Deactivated successfully.
Jul 15 05:25:42.665733 systemd-logind[1567]: Removed session 80.
Jul 15 05:25:42.831281 systemd[1]: Started sshd@80-157.180.39.85:22-139.178.89.65:44840.service - OpenSSH per-connection server daemon (139.178.89.65:44840).
Jul 15 05:25:43.824507 sshd[8551]: Accepted publickey for core from 139.178.89.65 port 44840 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:43.827511 sshd-session[8551]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:43.834249 systemd-logind[1567]: New session 81 of user core.
Jul 15 05:25:43.839040 systemd[1]: Started session-81.scope - Session 81 of User core.
Jul 15 05:25:44.905149 sshd[8554]: Connection closed by 139.178.89.65 port 44840
Jul 15 05:25:44.914503 sshd-session[8551]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:44.930493 systemd-logind[1567]: Session 81 logged out. Waiting for processes to exit.
Jul 15 05:25:44.930803 systemd[1]: sshd@80-157.180.39.85:22-139.178.89.65:44840.service: Deactivated successfully.
Jul 15 05:25:44.936338 systemd[1]: session-81.scope: Deactivated successfully.
Jul 15 05:25:44.940776 systemd-logind[1567]: Removed session 81.
Jul 15 05:25:45.076062 systemd[1]: Started sshd@81-157.180.39.85:22-139.178.89.65:44842.service - OpenSSH per-connection server daemon (139.178.89.65:44842).
Jul 15 05:25:46.078539 sshd[8564]: Accepted publickey for core from 139.178.89.65 port 44842 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:46.081266 sshd-session[8564]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:46.087809 systemd-logind[1567]: New session 82 of user core.
Jul 15 05:25:46.103246 systemd[1]: Started session-82.scope - Session 82 of User core.
Jul 15 05:25:47.429143 sshd[8567]: Connection closed by 139.178.89.65 port 44842
Jul 15 05:25:47.432155 sshd-session[8564]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:47.434894 systemd-logind[1567]: Session 82 logged out. Waiting for processes to exit.
Jul 15 05:25:47.435839 systemd[1]: sshd@81-157.180.39.85:22-139.178.89.65:44842.service: Deactivated successfully.
Jul 15 05:25:47.440730 systemd[1]: session-82.scope: Deactivated successfully.
Jul 15 05:25:47.445804 systemd-logind[1567]: Removed session 82.
Jul 15 05:25:47.593808 systemd[1]: Started sshd@82-157.180.39.85:22-139.178.89.65:44848.service - OpenSSH per-connection server daemon (139.178.89.65:44848).
Jul 15 05:25:48.571466 sshd[8584]: Accepted publickey for core from 139.178.89.65 port 44848 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:48.574282 sshd-session[8584]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:48.584962 systemd-logind[1567]: New session 83 of user core.
Jul 15 05:25:48.589431 systemd[1]: Started session-83.scope - Session 83 of User core.
Jul 15 05:25:50.028939 sshd[8587]: Connection closed by 139.178.89.65 port 44848
Jul 15 05:25:50.039959 sshd-session[8584]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:50.050931 systemd[1]: sshd@82-157.180.39.85:22-139.178.89.65:44848.service: Deactivated successfully.
Jul 15 05:25:50.055490 systemd[1]: session-83.scope: Deactivated successfully.
Jul 15 05:25:50.059074 systemd-logind[1567]: Session 83 logged out. Waiting for processes to exit.
Jul 15 05:25:50.062256 systemd-logind[1567]: Removed session 83.
Jul 15 05:25:50.195618 systemd[1]: Started sshd@83-157.180.39.85:22-139.178.89.65:57802.service - OpenSSH per-connection server daemon (139.178.89.65:57802).
Jul 15 05:25:51.191989 sshd[8601]: Accepted publickey for core from 139.178.89.65 port 57802 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:51.194948 sshd-session[8601]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:51.204123 systemd-logind[1567]: New session 84 of user core.
Jul 15 05:25:51.210244 systemd[1]: Started session-84.scope - Session 84 of User core.
Jul 15 05:25:51.619912 containerd[1591]: time="2025-07-15T05:25:51.619725112Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"3db7cdda6e0348caafd30fecac12dd0d274718d807dd776ad96d25b5632ce388\" pid:8617 exited_at:{seconds:1752557151 nanos:618950354}"
Jul 15 05:25:52.082002 sshd[8604]: Connection closed by 139.178.89.65 port 57802
Jul 15 05:25:52.085161 sshd-session[8601]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:52.088912 systemd[1]: sshd@83-157.180.39.85:22-139.178.89.65:57802.service: Deactivated successfully.
Jul 15 05:25:52.092081 systemd[1]: session-84.scope: Deactivated successfully.
Jul 15 05:25:52.094502 systemd-logind[1567]: Session 84 logged out. Waiting for processes to exit.
Jul 15 05:25:52.095546 systemd-logind[1567]: Removed session 84.
Jul 15 05:25:55.686306 containerd[1591]: time="2025-07-15T05:25:55.686253625Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"270a0b87548c2065e8457241edaaf50d72b32d3b99dd2e949d7668f87a3b75c1\" pid:8648 exited_at:{seconds:1752557155 nanos:685962170}"
Jul 15 05:25:57.261601 systemd[1]: Started sshd@84-157.180.39.85:22-139.178.89.65:57812.service - OpenSSH per-connection server daemon (139.178.89.65:57812).
Jul 15 05:25:58.299343 sshd[8664]: Accepted publickey for core from 139.178.89.65 port 57812 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:25:58.301882 sshd-session[8664]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:25:58.311652 systemd-logind[1567]: New session 85 of user core.
Jul 15 05:25:58.326274 systemd[1]: Started session-85.scope - Session 85 of User core.
Jul 15 05:25:59.310313 sshd[8673]: Connection closed by 139.178.89.65 port 57812
Jul 15 05:25:59.311424 sshd-session[8664]: pam_unix(sshd:session): session closed for user core
Jul 15 05:25:59.318898 systemd-logind[1567]: Session 85 logged out. Waiting for processes to exit.
Jul 15 05:25:59.319224 systemd[1]: sshd@84-157.180.39.85:22-139.178.89.65:57812.service: Deactivated successfully.
Jul 15 05:25:59.323513 systemd[1]: session-85.scope: Deactivated successfully.
Jul 15 05:25:59.326610 systemd-logind[1567]: Removed session 85.
Jul 15 05:26:04.483666 systemd[1]: Started sshd@85-157.180.39.85:22-139.178.89.65:51764.service - OpenSSH per-connection server daemon (139.178.89.65:51764).
Jul 15 05:26:04.739851 containerd[1591]: time="2025-07-15T05:26:04.739542135Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"433006b2261793df753b056c57061250511655ecbaa3d153e6648a57b7d1b6cc\" pid:8724 exited_at:{seconds:1752557164 nanos:738784376}"
Jul 15 05:26:05.500046 sshd[8708]: Accepted publickey for core from 139.178.89.65 port 51764 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:26:05.502760 sshd-session[8708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:26:05.511201 systemd-logind[1567]: New session 86 of user core.
Jul 15 05:26:05.515293 systemd[1]: Started session-86.scope - Session 86 of User core.
Jul 15 05:26:06.283204 sshd[8732]: Connection closed by 139.178.89.65 port 51764
Jul 15 05:26:06.284170 sshd-session[8708]: pam_unix(sshd:session): session closed for user core
Jul 15 05:26:06.290684 systemd[1]: sshd@85-157.180.39.85:22-139.178.89.65:51764.service: Deactivated successfully.
Jul 15 05:26:06.294782 systemd[1]: session-86.scope: Deactivated successfully.
Jul 15 05:26:06.298607 systemd-logind[1567]: Session 86 logged out. Waiting for processes to exit.
Jul 15 05:26:06.301328 systemd-logind[1567]: Removed session 86.
Jul 15 05:26:10.834838 containerd[1591]: time="2025-07-15T05:26:10.834779061Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"a20cce231ea3ce79f422ba559965db9797e4a336d0a8b8155243b6be04aac819\" pid:8756 exited_at:{seconds:1752557170 nanos:834436956}"
Jul 15 05:26:11.452187 systemd[1]: Started sshd@86-157.180.39.85:22-139.178.89.65:47276.service - OpenSSH per-connection server daemon (139.178.89.65:47276).
Jul 15 05:26:12.433643 sshd[8767]: Accepted publickey for core from 139.178.89.65 port 47276 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:26:12.435771 sshd-session[8767]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:26:12.442612 systemd-logind[1567]: New session 87 of user core.
Jul 15 05:26:12.446148 systemd[1]: Started session-87.scope - Session 87 of User core.
Jul 15 05:26:13.195776 sshd[8770]: Connection closed by 139.178.89.65 port 47276
Jul 15 05:26:13.197258 sshd-session[8767]: pam_unix(sshd:session): session closed for user core
Jul 15 05:26:13.200775 systemd-logind[1567]: Session 87 logged out. Waiting for processes to exit.
Jul 15 05:26:13.201662 systemd[1]: sshd@86-157.180.39.85:22-139.178.89.65:47276.service: Deactivated successfully.
Jul 15 05:26:13.203918 systemd[1]: session-87.scope: Deactivated successfully.
Jul 15 05:26:13.206637 systemd-logind[1567]: Removed session 87.
Jul 15 05:26:18.365160 systemd[1]: Started sshd@87-157.180.39.85:22-139.178.89.65:47288.service - OpenSSH per-connection server daemon (139.178.89.65:47288).
Jul 15 05:26:19.364480 sshd[8782]: Accepted publickey for core from 139.178.89.65 port 47288 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:26:19.366194 sshd-session[8782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:26:19.373450 systemd-logind[1567]: New session 88 of user core.
Jul 15 05:26:19.379120 systemd[1]: Started session-88.scope - Session 88 of User core.
Jul 15 05:26:20.152441 sshd[8785]: Connection closed by 139.178.89.65 port 47288
Jul 15 05:26:20.153426 sshd-session[8782]: pam_unix(sshd:session): session closed for user core
Jul 15 05:26:20.159933 systemd[1]: sshd@87-157.180.39.85:22-139.178.89.65:47288.service: Deactivated successfully.
Jul 15 05:26:20.163749 systemd[1]: session-88.scope: Deactivated successfully.
Jul 15 05:26:20.166700 systemd-logind[1567]: Session 88 logged out. Waiting for processes to exit.
Jul 15 05:26:20.169942 systemd-logind[1567]: Removed session 88.
Jul 15 05:26:25.330675 systemd[1]: Started sshd@88-157.180.39.85:22-139.178.89.65:44720.service - OpenSSH per-connection server daemon (139.178.89.65:44720).
Jul 15 05:26:25.705123 containerd[1591]: time="2025-07-15T05:26:25.705091343Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"27422c1a5ac7a0153870c837c5c068ab677a9ca6301cd84f2867dcc64f348cfa\" pid:8811 exited_at:{seconds:1752557185 nanos:704804387}"
Jul 15 05:26:26.325386 sshd[8797]: Accepted publickey for core from 139.178.89.65 port 44720 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:26:26.327240 sshd-session[8797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:26:26.330650 systemd-logind[1567]: New session 89 of user core.
Jul 15 05:26:26.336154 systemd[1]: Started session-89.scope - Session 89 of User core.
Jul 15 05:26:27.086787 sshd[8822]: Connection closed by 139.178.89.65 port 44720
Jul 15 05:26:27.088101 sshd-session[8797]: pam_unix(sshd:session): session closed for user core
Jul 15 05:26:27.095899 systemd[1]: sshd@88-157.180.39.85:22-139.178.89.65:44720.service: Deactivated successfully.
Jul 15 05:26:27.102960 systemd[1]: session-89.scope: Deactivated successfully.
Jul 15 05:26:27.107117 systemd-logind[1567]: Session 89 logged out. Waiting for processes to exit.
Jul 15 05:26:27.112769 systemd-logind[1567]: Removed session 89.
Jul 15 05:26:32.260602 systemd[1]: Started sshd@89-157.180.39.85:22-139.178.89.65:39300.service - OpenSSH per-connection server daemon (139.178.89.65:39300).
Jul 15 05:26:33.261124 sshd[8836]: Accepted publickey for core from 139.178.89.65 port 39300 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:26:33.263821 sshd-session[8836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:26:33.273694 systemd-logind[1567]: New session 90 of user core.
Jul 15 05:26:33.282239 systemd[1]: Started session-90.scope - Session 90 of User core.
Jul 15 05:26:34.024454 sshd[8839]: Connection closed by 139.178.89.65 port 39300
Jul 15 05:26:34.025457 sshd-session[8836]: pam_unix(sshd:session): session closed for user core
Jul 15 05:26:34.031702 systemd[1]: sshd@89-157.180.39.85:22-139.178.89.65:39300.service: Deactivated successfully.
Jul 15 05:26:34.035705 systemd[1]: session-90.scope: Deactivated successfully.
Jul 15 05:26:34.038313 systemd-logind[1567]: Session 90 logged out. Waiting for processes to exit.
Jul 15 05:26:34.042418 systemd-logind[1567]: Removed session 90.
Jul 15 05:26:34.756776 containerd[1591]: time="2025-07-15T05:26:34.756718808Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"c1807e65d97fbe654bb4e3858ec560e2af0a881e32da04c4a823c711120ce6e1\" pid:8862 exited_at:{seconds:1752557194 nanos:756441512}"
Jul 15 05:26:39.195810 systemd[1]: Started sshd@90-157.180.39.85:22-139.178.89.65:47410.service - OpenSSH per-connection server daemon (139.178.89.65:47410).
Jul 15 05:26:40.185581 sshd[8872]: Accepted publickey for core from 139.178.89.65 port 47410 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:26:40.188297 sshd-session[8872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:26:40.196776 systemd-logind[1567]: New session 91 of user core.
Jul 15 05:26:40.209259 systemd[1]: Started session-91.scope - Session 91 of User core.
Jul 15 05:26:40.841671 containerd[1591]: time="2025-07-15T05:26:40.841637836Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"7933872e576fe9c7c4e7e85788aca351b9cf242bc54444ea5c91f8c09ba4a116\" pid:8895 exited_at:{seconds:1752557200 nanos:841371339}" Jul 15 05:26:40.997780 sshd[8875]: Connection closed by 139.178.89.65 port 47410 Jul 15 05:26:40.998620 sshd-session[8872]: pam_unix(sshd:session): session closed for user core Jul 15 05:26:41.005262 systemd-logind[1567]: Session 91 logged out. Waiting for processes to exit. Jul 15 05:26:41.007175 systemd[1]: sshd@90-157.180.39.85:22-139.178.89.65:47410.service: Deactivated successfully. Jul 15 05:26:41.009510 systemd[1]: session-91.scope: Deactivated successfully. Jul 15 05:26:41.011542 systemd-logind[1567]: Removed session 91. Jul 15 05:26:42.474815 containerd[1591]: time="2025-07-15T05:26:42.474748984Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"64c10de3f64daae8c1b31de7182d2f1e957d0743565ee3ae3934cae265b8b199\" pid:8921 exited_at:{seconds:1752557202 nanos:474351009}" Jul 15 05:26:46.170616 systemd[1]: Started sshd@91-157.180.39.85:22-139.178.89.65:47416.service - OpenSSH per-connection server daemon (139.178.89.65:47416). Jul 15 05:26:47.192106 sshd[8932]: Accepted publickey for core from 139.178.89.65 port 47416 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:26:47.194098 sshd-session[8932]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:26:47.197766 systemd-logind[1567]: New session 92 of user core. Jul 15 05:26:47.206503 systemd[1]: Started session-92.scope - Session 92 of User core. 
Jul 15 05:26:47.993554 sshd[8935]: Connection closed by 139.178.89.65 port 47416 Jul 15 05:26:47.994781 sshd-session[8932]: pam_unix(sshd:session): session closed for user core Jul 15 05:26:48.002263 systemd[1]: sshd@91-157.180.39.85:22-139.178.89.65:47416.service: Deactivated successfully. Jul 15 05:26:48.006051 systemd[1]: session-92.scope: Deactivated successfully. Jul 15 05:26:48.007857 systemd-logind[1567]: Session 92 logged out. Waiting for processes to exit. Jul 15 05:26:48.010993 systemd-logind[1567]: Removed session 92. Jul 15 05:26:51.591336 containerd[1591]: time="2025-07-15T05:26:51.591288547Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"9ac78bab6b9a155e35a956ec478b3bf1adbcef1f0ac809edb9380c2ab9b1466b\" pid:8962 exited_at:{seconds:1752557211 nanos:590231380}" Jul 15 05:26:53.173571 systemd[1]: Started sshd@92-157.180.39.85:22-139.178.89.65:55994.service - OpenSSH per-connection server daemon (139.178.89.65:55994). Jul 15 05:26:54.172708 sshd[8972]: Accepted publickey for core from 139.178.89.65 port 55994 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:26:54.174749 sshd-session[8972]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:26:54.179926 systemd-logind[1567]: New session 93 of user core. Jul 15 05:26:54.186312 systemd[1]: Started session-93.scope - Session 93 of User core. Jul 15 05:26:54.949413 sshd[8975]: Connection closed by 139.178.89.65 port 55994 Jul 15 05:26:54.949723 sshd-session[8972]: pam_unix(sshd:session): session closed for user core Jul 15 05:26:54.954518 systemd[1]: sshd@92-157.180.39.85:22-139.178.89.65:55994.service: Deactivated successfully. Jul 15 05:26:54.954756 systemd-logind[1567]: Session 93 logged out. Waiting for processes to exit. Jul 15 05:26:54.957108 systemd[1]: session-93.scope: Deactivated successfully. 
Jul 15 05:26:54.959397 systemd-logind[1567]: Removed session 93. Jul 15 05:26:55.729230 containerd[1591]: time="2025-07-15T05:26:55.729195075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"cd0bdb8909f14ce57de4ddf68bf852a3967125f07a6b7cd268f64c25b88369ff\" pid:8999 exited_at:{seconds:1752557215 nanos:728841439}" Jul 15 05:27:00.126181 systemd[1]: Started sshd@93-157.180.39.85:22-139.178.89.65:57858.service - OpenSSH per-connection server daemon (139.178.89.65:57858). Jul 15 05:27:01.126967 sshd[9015]: Accepted publickey for core from 139.178.89.65 port 57858 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:01.129643 sshd-session[9015]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:01.139142 systemd-logind[1567]: New session 94 of user core. Jul 15 05:27:01.143284 systemd[1]: Started session-94.scope - Session 94 of User core. Jul 15 05:27:01.895125 sshd[9019]: Connection closed by 139.178.89.65 port 57858 Jul 15 05:27:01.896204 sshd-session[9015]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:01.903539 systemd[1]: sshd@93-157.180.39.85:22-139.178.89.65:57858.service: Deactivated successfully. Jul 15 05:27:01.908176 systemd[1]: session-94.scope: Deactivated successfully. Jul 15 05:27:01.911764 systemd-logind[1567]: Session 94 logged out. Waiting for processes to exit. Jul 15 05:27:01.914260 systemd-logind[1567]: Removed session 94. 
Jul 15 05:27:04.764582 containerd[1591]: time="2025-07-15T05:27:04.764475522Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"1609507e41eb42222f54f2154c943118520521f42bdf5875ef630aec06a6298c\" pid:9041 exited_at:{seconds:1752557224 nanos:764310234}" Jul 15 05:27:07.073332 systemd[1]: Started sshd@94-157.180.39.85:22-139.178.89.65:57874.service - OpenSSH per-connection server daemon (139.178.89.65:57874). Jul 15 05:27:08.068855 sshd[9051]: Accepted publickey for core from 139.178.89.65 port 57874 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:08.072200 sshd-session[9051]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:08.081135 systemd-logind[1567]: New session 95 of user core. Jul 15 05:27:08.090281 systemd[1]: Started session-95.scope - Session 95 of User core. Jul 15 05:27:08.846996 sshd[9054]: Connection closed by 139.178.89.65 port 57874 Jul 15 05:27:08.847941 sshd-session[9051]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:08.856370 systemd[1]: sshd@94-157.180.39.85:22-139.178.89.65:57874.service: Deactivated successfully. Jul 15 05:27:08.862116 systemd[1]: session-95.scope: Deactivated successfully. Jul 15 05:27:08.864249 systemd-logind[1567]: Session 95 logged out. Waiting for processes to exit. Jul 15 05:27:08.866580 systemd-logind[1567]: Removed session 95. Jul 15 05:27:10.858272 containerd[1591]: time="2025-07-15T05:27:10.858205075Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"51562bd37faa4d056ae12f4ab23815e7a1764258a8b83bd95e706be98881bbce\" pid:9078 exited_at:{seconds:1752557230 nanos:857680162}" Jul 15 05:27:14.022284 systemd[1]: Started sshd@95-157.180.39.85:22-139.178.89.65:35594.service - OpenSSH per-connection server daemon (139.178.89.65:35594). 
Jul 15 05:27:15.015631 sshd[9090]: Accepted publickey for core from 139.178.89.65 port 35594 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:15.017315 sshd-session[9090]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:15.024421 systemd-logind[1567]: New session 96 of user core. Jul 15 05:27:15.030110 systemd[1]: Started session-96.scope - Session 96 of User core. Jul 15 05:27:15.797573 sshd[9094]: Connection closed by 139.178.89.65 port 35594 Jul 15 05:27:15.798696 sshd-session[9090]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:15.806308 systemd[1]: sshd@95-157.180.39.85:22-139.178.89.65:35594.service: Deactivated successfully. Jul 15 05:27:15.806381 systemd-logind[1567]: Session 96 logged out. Waiting for processes to exit. Jul 15 05:27:15.810501 systemd[1]: session-96.scope: Deactivated successfully. Jul 15 05:27:15.813983 systemd-logind[1567]: Removed session 96. Jul 15 05:27:20.974130 systemd[1]: Started sshd@96-157.180.39.85:22-139.178.89.65:34482.service - OpenSSH per-connection server daemon (139.178.89.65:34482). Jul 15 05:27:21.955389 sshd[9114]: Accepted publickey for core from 139.178.89.65 port 34482 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:21.958169 sshd-session[9114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:21.966972 systemd-logind[1567]: New session 97 of user core. Jul 15 05:27:21.974270 systemd[1]: Started session-97.scope - Session 97 of User core. Jul 15 05:27:22.835038 sshd[9117]: Connection closed by 139.178.89.65 port 34482 Jul 15 05:27:22.835042 sshd-session[9114]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:22.839079 systemd-logind[1567]: Session 97 logged out. Waiting for processes to exit. Jul 15 05:27:22.839725 systemd[1]: sshd@96-157.180.39.85:22-139.178.89.65:34482.service: Deactivated successfully. 
Jul 15 05:27:22.842831 systemd[1]: session-97.scope: Deactivated successfully. Jul 15 05:27:22.846522 systemd-logind[1567]: Removed session 97. Jul 15 05:27:25.726910 containerd[1591]: time="2025-07-15T05:27:25.726849205Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"99ebccb8d6972c370b124436de4799e309f54829072a5053a304bab86c7525f8\" pid:9141 exited_at:{seconds:1752557245 nanos:726392020}" Jul 15 05:27:28.009665 systemd[1]: Started sshd@97-157.180.39.85:22-139.178.89.65:34490.service - OpenSSH per-connection server daemon (139.178.89.65:34490). Jul 15 05:27:29.041564 sshd[9155]: Accepted publickey for core from 139.178.89.65 port 34490 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:29.042768 sshd-session[9155]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:29.051274 systemd-logind[1567]: New session 98 of user core. Jul 15 05:27:29.056111 systemd[1]: Started session-98.scope - Session 98 of User core. Jul 15 05:27:29.951697 sshd[9158]: Connection closed by 139.178.89.65 port 34490 Jul 15 05:27:29.952406 sshd-session[9155]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:29.964212 systemd[1]: sshd@97-157.180.39.85:22-139.178.89.65:34490.service: Deactivated successfully. Jul 15 05:27:29.968716 systemd[1]: session-98.scope: Deactivated successfully. Jul 15 05:27:29.970318 systemd-logind[1567]: Session 98 logged out. Waiting for processes to exit. Jul 15 05:27:29.974202 systemd-logind[1567]: Removed session 98. 
Jul 15 05:27:34.760654 containerd[1591]: time="2025-07-15T05:27:34.760598668Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"08cb4a66815b531c43750a96b71a036965c21bb0a0054f078b7f3aa71e1dddb3\" pid:9182 exited_at:{seconds:1752557254 nanos:759898486}" Jul 15 05:27:35.125204 systemd[1]: Started sshd@98-157.180.39.85:22-139.178.89.65:55632.service - OpenSSH per-connection server daemon (139.178.89.65:55632). Jul 15 05:27:36.151924 sshd[9193]: Accepted publickey for core from 139.178.89.65 port 55632 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:36.154966 sshd-session[9193]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:36.163866 systemd-logind[1567]: New session 99 of user core. Jul 15 05:27:36.173222 systemd[1]: Started session-99.scope - Session 99 of User core. Jul 15 05:27:36.990878 sshd[9217]: Connection closed by 139.178.89.65 port 55632 Jul 15 05:27:36.992107 sshd-session[9193]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:36.999348 systemd[1]: sshd@98-157.180.39.85:22-139.178.89.65:55632.service: Deactivated successfully. Jul 15 05:27:37.004399 systemd[1]: session-99.scope: Deactivated successfully. Jul 15 05:27:37.006835 systemd-logind[1567]: Session 99 logged out. Waiting for processes to exit. Jul 15 05:27:37.009494 systemd-logind[1567]: Removed session 99. Jul 15 05:27:40.828891 containerd[1591]: time="2025-07-15T05:27:40.828837817Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"aa116e98f7c65df0727de9043d371e3d719aabc1741d40f0e8fe98266c225054\" pid:9240 exited_at:{seconds:1752557260 nanos:828555350}" Jul 15 05:27:42.175859 systemd[1]: Started sshd@99-157.180.39.85:22-139.178.89.65:54044.service - OpenSSH per-connection server daemon (139.178.89.65:54044). 
Jul 15 05:27:42.505789 containerd[1591]: time="2025-07-15T05:27:42.505753963Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"bcd86c0a24ee4158408832d02fe2dbc3a88a56468058be863f933fd14db95d1e\" pid:9267 exited_at:{seconds:1752557262 nanos:505380567}" Jul 15 05:27:43.154894 sshd[9251]: Accepted publickey for core from 139.178.89.65 port 54044 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:43.158422 sshd-session[9251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:43.166848 systemd-logind[1567]: New session 100 of user core. Jul 15 05:27:43.176246 systemd[1]: Started session-100.scope - Session 100 of User core. Jul 15 05:27:43.900211 sshd[9277]: Connection closed by 139.178.89.65 port 54044 Jul 15 05:27:43.902203 sshd-session[9251]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:43.906653 systemd-logind[1567]: Session 100 logged out. Waiting for processes to exit. Jul 15 05:27:43.907538 systemd[1]: sshd@99-157.180.39.85:22-139.178.89.65:54044.service: Deactivated successfully. Jul 15 05:27:43.910359 systemd[1]: session-100.scope: Deactivated successfully. Jul 15 05:27:43.913795 systemd-logind[1567]: Removed session 100. Jul 15 05:27:49.067812 systemd[1]: Started sshd@100-157.180.39.85:22-139.178.89.65:54048.service - OpenSSH per-connection server daemon (139.178.89.65:54048). Jul 15 05:27:50.056083 sshd[9289]: Accepted publickey for core from 139.178.89.65 port 54048 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:50.060552 sshd-session[9289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:50.073198 systemd-logind[1567]: New session 101 of user core. Jul 15 05:27:50.081505 systemd[1]: Started session-101.scope - Session 101 of User core. 
Jul 15 05:27:50.835780 sshd[9294]: Connection closed by 139.178.89.65 port 54048 Jul 15 05:27:50.836285 sshd-session[9289]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:50.840503 systemd-logind[1567]: Session 101 logged out. Waiting for processes to exit. Jul 15 05:27:50.840774 systemd[1]: sshd@100-157.180.39.85:22-139.178.89.65:54048.service: Deactivated successfully. Jul 15 05:27:50.842751 systemd[1]: session-101.scope: Deactivated successfully. Jul 15 05:27:50.844324 systemd-logind[1567]: Removed session 101. Jul 15 05:27:51.575809 containerd[1591]: time="2025-07-15T05:27:51.575757387Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"ffc452147e679cb6fc29cd739523a4a99193eb30abb496f11195e1ee9ad6feb0\" pid:9317 exited_at:{seconds:1752557271 nanos:575304542}" Jul 15 05:27:55.789406 containerd[1591]: time="2025-07-15T05:27:55.789372085Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"c1b7595887cefefa315c50c298de9e2aa54d613b62ea3a1969b3e9ae433ea93e\" pid:9338 exited_at:{seconds:1752557275 nanos:788934200}" Jul 15 05:27:56.002802 systemd[1]: Started sshd@101-157.180.39.85:22-139.178.89.65:45724.service - OpenSSH per-connection server daemon (139.178.89.65:45724). Jul 15 05:27:56.978592 sshd[9350]: Accepted publickey for core from 139.178.89.65 port 45724 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:27:56.980298 sshd-session[9350]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:27:56.985051 systemd-logind[1567]: New session 102 of user core. Jul 15 05:27:56.990189 systemd[1]: Started session-102.scope - Session 102 of User core. 
Jul 15 05:27:57.780845 sshd[9356]: Connection closed by 139.178.89.65 port 45724 Jul 15 05:27:57.781868 sshd-session[9350]: pam_unix(sshd:session): session closed for user core Jul 15 05:27:57.789059 systemd[1]: sshd@101-157.180.39.85:22-139.178.89.65:45724.service: Deactivated successfully. Jul 15 05:27:57.793476 systemd[1]: session-102.scope: Deactivated successfully. Jul 15 05:27:57.795883 systemd-logind[1567]: Session 102 logged out. Waiting for processes to exit. Jul 15 05:27:57.798964 systemd-logind[1567]: Removed session 102. Jul 15 05:28:02.951087 systemd[1]: Started sshd@102-157.180.39.85:22-139.178.89.65:47560.service - OpenSSH per-connection server daemon (139.178.89.65:47560). Jul 15 05:28:03.942725 sshd[9368]: Accepted publickey for core from 139.178.89.65 port 47560 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:28:03.945308 sshd-session[9368]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:28:03.953146 systemd-logind[1567]: New session 103 of user core. Jul 15 05:28:03.959139 systemd[1]: Started session-103.scope - Session 103 of User core. Jul 15 05:28:04.698355 sshd[9371]: Connection closed by 139.178.89.65 port 47560 Jul 15 05:28:04.700759 sshd-session[9368]: pam_unix(sshd:session): session closed for user core Jul 15 05:28:04.712007 systemd[1]: sshd@102-157.180.39.85:22-139.178.89.65:47560.service: Deactivated successfully. Jul 15 05:28:04.716809 systemd[1]: session-103.scope: Deactivated successfully. Jul 15 05:28:04.719913 systemd-logind[1567]: Session 103 logged out. Waiting for processes to exit. Jul 15 05:28:04.724273 systemd-logind[1567]: Removed session 103. 
Jul 15 05:28:04.768000 containerd[1591]: time="2025-07-15T05:28:04.767933150Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"e64da3c4a2d845378ec1a147e9b6932710ad55f748ad790eb7482038176391c6\" pid:9396 exited_at:{seconds:1752557284 nanos:767433376}" Jul 15 05:28:09.882391 systemd[1]: Started sshd@103-157.180.39.85:22-139.178.89.65:59978.service - OpenSSH per-connection server daemon (139.178.89.65:59978). Jul 15 05:28:10.825562 containerd[1591]: time="2025-07-15T05:28:10.825521349Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"350f3374ce57005c5d83c7e00bc8efc5935df45267135f4f9b92ce1a823b4554\" pid:9421 exited_at:{seconds:1752557290 nanos:825191742}" Jul 15 05:28:10.886250 sshd[9406]: Accepted publickey for core from 139.178.89.65 port 59978 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:28:10.889655 sshd-session[9406]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:28:10.898248 systemd-logind[1567]: New session 104 of user core. Jul 15 05:28:10.903228 systemd[1]: Started session-104.scope - Session 104 of User core. Jul 15 05:28:11.678214 sshd[9432]: Connection closed by 139.178.89.65 port 59978 Jul 15 05:28:11.679221 sshd-session[9406]: pam_unix(sshd:session): session closed for user core Jul 15 05:28:11.686368 systemd[1]: sshd@103-157.180.39.85:22-139.178.89.65:59978.service: Deactivated successfully. Jul 15 05:28:11.690964 systemd[1]: session-104.scope: Deactivated successfully. Jul 15 05:28:11.694202 systemd-logind[1567]: Session 104 logged out. Waiting for processes to exit. Jul 15 05:28:11.698676 systemd-logind[1567]: Removed session 104. Jul 15 05:28:16.852126 systemd[1]: Started sshd@104-157.180.39.85:22-139.178.89.65:59984.service - OpenSSH per-connection server daemon (139.178.89.65:59984). 
Jul 15 05:28:17.852513 sshd[9444]: Accepted publickey for core from 139.178.89.65 port 59984 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:28:17.853944 sshd-session[9444]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:28:17.860331 systemd-logind[1567]: New session 105 of user core. Jul 15 05:28:17.867045 systemd[1]: Started session-105.scope - Session 105 of User core. Jul 15 05:28:18.641034 sshd[9447]: Connection closed by 139.178.89.65 port 59984 Jul 15 05:28:18.642045 sshd-session[9444]: pam_unix(sshd:session): session closed for user core Jul 15 05:28:18.649342 systemd[1]: sshd@104-157.180.39.85:22-139.178.89.65:59984.service: Deactivated successfully. Jul 15 05:28:18.654457 systemd[1]: session-105.scope: Deactivated successfully. Jul 15 05:28:18.657441 systemd-logind[1567]: Session 105 logged out. Waiting for processes to exit. Jul 15 05:28:18.660390 systemd-logind[1567]: Removed session 105. Jul 15 05:28:23.812516 systemd[1]: Started sshd@105-157.180.39.85:22-139.178.89.65:43154.service - OpenSSH per-connection server daemon (139.178.89.65:43154). Jul 15 05:28:24.798310 sshd[9459]: Accepted publickey for core from 139.178.89.65 port 43154 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:28:24.800827 sshd-session[9459]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:28:24.810064 systemd-logind[1567]: New session 106 of user core. Jul 15 05:28:24.817246 systemd[1]: Started session-106.scope - Session 106 of User core. Jul 15 05:28:25.570836 sshd[9462]: Connection closed by 139.178.89.65 port 43154 Jul 15 05:28:25.571835 sshd-session[9459]: pam_unix(sshd:session): session closed for user core Jul 15 05:28:25.579694 systemd-logind[1567]: Session 106 logged out. Waiting for processes to exit. Jul 15 05:28:25.580256 systemd[1]: sshd@105-157.180.39.85:22-139.178.89.65:43154.service: Deactivated successfully. 
Jul 15 05:28:25.584999 systemd[1]: session-106.scope: Deactivated successfully. Jul 15 05:28:25.589832 systemd-logind[1567]: Removed session 106. Jul 15 05:28:25.748567 containerd[1591]: time="2025-07-15T05:28:25.748510961Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"819ed960bc391b39837949061c54f6e9d9a4d355e5b006cc604e83185f13bbda\" pid:9486 exited_at:{seconds:1752557305 nanos:748220424}" Jul 15 05:28:30.738694 systemd[1]: Started sshd@106-157.180.39.85:22-139.178.89.65:43764.service - OpenSSH per-connection server daemon (139.178.89.65:43764). Jul 15 05:28:31.728702 sshd[9500]: Accepted publickey for core from 139.178.89.65 port 43764 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:28:31.732063 sshd-session[9500]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:28:31.742952 systemd-logind[1567]: New session 107 of user core. Jul 15 05:28:31.749613 systemd[1]: Started session-107.scope - Session 107 of User core. Jul 15 05:28:32.611031 sshd[9503]: Connection closed by 139.178.89.65 port 43764 Jul 15 05:28:32.611972 sshd-session[9500]: pam_unix(sshd:session): session closed for user core Jul 15 05:28:32.618816 systemd[1]: sshd@106-157.180.39.85:22-139.178.89.65:43764.service: Deactivated successfully. Jul 15 05:28:32.622006 systemd[1]: session-107.scope: Deactivated successfully. Jul 15 05:28:32.624783 systemd-logind[1567]: Session 107 logged out. Waiting for processes to exit. Jul 15 05:28:32.627697 systemd-logind[1567]: Removed session 107. 
Jul 15 05:28:34.744780 containerd[1591]: time="2025-07-15T05:28:34.744424498Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"82d7a62076e277e12a948e38a3a665f5bc6ef49b669d5a755b51554d8835e00e\" pid:9526 exited_at:{seconds:1752557314 nanos:744139231}" Jul 15 05:28:37.785005 systemd[1]: Started sshd@107-157.180.39.85:22-139.178.89.65:43774.service - OpenSSH per-connection server daemon (139.178.89.65:43774). Jul 15 05:28:38.807714 sshd[9535]: Accepted publickey for core from 139.178.89.65 port 43774 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:28:38.810401 sshd-session[9535]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:28:38.820078 systemd-logind[1567]: New session 108 of user core. Jul 15 05:28:38.826276 systemd[1]: Started session-108.scope - Session 108 of User core. Jul 15 05:28:39.651204 sshd[9538]: Connection closed by 139.178.89.65 port 43774 Jul 15 05:28:39.652457 sshd-session[9535]: pam_unix(sshd:session): session closed for user core Jul 15 05:28:39.659473 systemd[1]: sshd@107-157.180.39.85:22-139.178.89.65:43774.service: Deactivated successfully. Jul 15 05:28:39.663953 systemd[1]: session-108.scope: Deactivated successfully. Jul 15 05:28:39.666457 systemd-logind[1567]: Session 108 logged out. Waiting for processes to exit. Jul 15 05:28:39.668766 systemd-logind[1567]: Removed session 108. 
Jul 15 05:28:40.812828 containerd[1591]: time="2025-07-15T05:28:40.812685324Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"082d2c9046622b8673b84a411bc5f032ae50a5742237ea87999089026213ab43\" pid:9561 exited_at:{seconds:1752557320 nanos:812432946}" Jul 15 05:28:42.492978 containerd[1591]: time="2025-07-15T05:28:42.492925419Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"55b69beee101962b893bb597f244a71868f552cf81e348fe65ac88288d420eec\" pid:9584 exited_at:{seconds:1752557322 nanos:492603662}" Jul 15 05:28:44.828276 systemd[1]: Started sshd@108-157.180.39.85:22-139.178.89.65:44064.service - OpenSSH per-connection server daemon (139.178.89.65:44064). Jul 15 05:28:45.820039 sshd[9595]: Accepted publickey for core from 139.178.89.65 port 44064 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:28:45.822806 sshd-session[9595]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:28:45.828163 systemd-logind[1567]: New session 109 of user core. Jul 15 05:28:45.833371 systemd[1]: Started session-109.scope - Session 109 of User core. Jul 15 05:28:46.580062 sshd[9598]: Connection closed by 139.178.89.65 port 44064 Jul 15 05:28:46.580572 sshd-session[9595]: pam_unix(sshd:session): session closed for user core Jul 15 05:28:46.592185 systemd[1]: sshd@108-157.180.39.85:22-139.178.89.65:44064.service: Deactivated successfully. Jul 15 05:28:46.596207 systemd[1]: session-109.scope: Deactivated successfully. Jul 15 05:28:46.597876 systemd-logind[1567]: Session 109 logged out. Waiting for processes to exit. Jul 15 05:28:46.599538 systemd-logind[1567]: Removed session 109. 
Jul 15 05:28:51.586034 containerd[1591]: time="2025-07-15T05:28:51.585226153Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"0742b2495c35ff755cd7d9adbf3e0db4299a8b7eb943f5a7e19782d9661a2e97\" pid:9623 exited_at:{seconds:1752557331 nanos:584268573}" Jul 15 05:28:51.751056 systemd[1]: Started sshd@109-157.180.39.85:22-139.178.89.65:60926.service - OpenSSH per-connection server daemon (139.178.89.65:60926). Jul 15 05:28:52.762124 sshd[9633]: Accepted publickey for core from 139.178.89.65 port 60926 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:28:52.765499 sshd-session[9633]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:28:52.775130 systemd-logind[1567]: New session 110 of user core. Jul 15 05:28:52.783242 systemd[1]: Started session-110.scope - Session 110 of User core. Jul 15 05:28:53.874215 sshd[9636]: Connection closed by 139.178.89.65 port 60926 Jul 15 05:28:53.877210 sshd-session[9633]: pam_unix(sshd:session): session closed for user core Jul 15 05:28:53.881980 systemd-logind[1567]: Session 110 logged out. Waiting for processes to exit. Jul 15 05:28:53.883048 systemd[1]: sshd@109-157.180.39.85:22-139.178.89.65:60926.service: Deactivated successfully. Jul 15 05:28:53.887250 systemd[1]: session-110.scope: Deactivated successfully. Jul 15 05:28:53.891449 systemd-logind[1567]: Removed session 110. Jul 15 05:28:55.738192 containerd[1591]: time="2025-07-15T05:28:55.738139864Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"cdc5a6b7398c54ebc9cd5cc9645c70f53013693487ed3f8c019e2fadbfa9dd02\" pid:9659 exited_at:{seconds:1752557335 nanos:737037905}" Jul 15 05:28:59.049337 systemd[1]: Started sshd@110-157.180.39.85:22-139.178.89.65:60938.service - OpenSSH per-connection server daemon (139.178.89.65:60938). 
Jul 15 05:29:00.085348 sshd[9674]: Accepted publickey for core from 139.178.89.65 port 60938 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:29:00.087119 sshd-session[9674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:29:00.091479 systemd-logind[1567]: New session 111 of user core. Jul 15 05:29:00.099240 systemd[1]: Started session-111.scope - Session 111 of User core. Jul 15 05:29:01.028248 sshd[9677]: Connection closed by 139.178.89.65 port 60938 Jul 15 05:29:01.029414 sshd-session[9674]: pam_unix(sshd:session): session closed for user core Jul 15 05:29:01.037710 systemd[1]: sshd@110-157.180.39.85:22-139.178.89.65:60938.service: Deactivated successfully. Jul 15 05:29:01.044218 systemd[1]: session-111.scope: Deactivated successfully. Jul 15 05:29:01.046195 systemd-logind[1567]: Session 111 logged out. Waiting for processes to exit. Jul 15 05:29:01.048498 systemd-logind[1567]: Removed session 111. Jul 15 05:29:04.768998 containerd[1591]: time="2025-07-15T05:29:04.768831951Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"e2b78be98a6fa5469f7ac6a624057c5b7a0d71944513b39f03978210f6e800a7\" pid:9699 exited_at:{seconds:1752557344 nanos:768613093}" Jul 15 05:29:06.202196 systemd[1]: Started sshd@111-157.180.39.85:22-139.178.89.65:45918.service - OpenSSH per-connection server daemon (139.178.89.65:45918). Jul 15 05:29:07.209200 sshd[9709]: Accepted publickey for core from 139.178.89.65 port 45918 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:29:07.211835 sshd-session[9709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:29:07.220804 systemd-logind[1567]: New session 112 of user core. Jul 15 05:29:07.225332 systemd[1]: Started session-112.scope - Session 112 of User core. 
Jul 15 05:29:08.012812 sshd[9719]: Connection closed by 139.178.89.65 port 45918 Jul 15 05:29:08.014280 sshd-session[9709]: pam_unix(sshd:session): session closed for user core Jul 15 05:29:08.023133 systemd-logind[1567]: Session 112 logged out. Waiting for processes to exit. Jul 15 05:29:08.023861 systemd[1]: sshd@111-157.180.39.85:22-139.178.89.65:45918.service: Deactivated successfully. Jul 15 05:29:08.028706 systemd[1]: session-112.scope: Deactivated successfully. Jul 15 05:29:08.035931 systemd-logind[1567]: Removed session 112. Jul 15 05:29:10.831454 containerd[1591]: time="2025-07-15T05:29:10.831395023Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"69ead14ad40fae207da60ef34378aa330bcff9121fe886814c22c56841cd3492\" pid:9742 exited_at:{seconds:1752557350 nanos:830876218}" Jul 15 05:29:13.188641 systemd[1]: Started sshd@112-157.180.39.85:22-139.178.89.65:40292.service - OpenSSH per-connection server daemon (139.178.89.65:40292). Jul 15 05:29:14.189205 sshd[9753]: Accepted publickey for core from 139.178.89.65 port 40292 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM Jul 15 05:29:14.191907 sshd-session[9753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 05:29:14.199708 systemd-logind[1567]: New session 113 of user core. Jul 15 05:29:14.208247 systemd[1]: Started session-113.scope - Session 113 of User core. Jul 15 05:29:14.924554 sshd[9770]: Connection closed by 139.178.89.65 port 40292 Jul 15 05:29:14.925899 sshd-session[9753]: pam_unix(sshd:session): session closed for user core Jul 15 05:29:14.930938 systemd[1]: sshd@112-157.180.39.85:22-139.178.89.65:40292.service: Deactivated successfully. Jul 15 05:29:14.933249 systemd[1]: session-113.scope: Deactivated successfully. Jul 15 05:29:14.934943 systemd-logind[1567]: Session 113 logged out. Waiting for processes to exit. 
Jul 15 05:29:14.936560 systemd-logind[1567]: Removed session 113.
Jul 15 05:29:20.096934 systemd[1]: Started sshd@113-157.180.39.85:22-139.178.89.65:54304.service - OpenSSH per-connection server daemon (139.178.89.65:54304).
Jul 15 05:29:21.102196 sshd[9782]: Accepted publickey for core from 139.178.89.65 port 54304 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:29:21.103801 sshd-session[9782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:29:21.107302 systemd-logind[1567]: New session 114 of user core.
Jul 15 05:29:21.113109 systemd[1]: Started session-114.scope - Session 114 of User core.
Jul 15 05:29:21.877225 sshd[9785]: Connection closed by 139.178.89.65 port 54304
Jul 15 05:29:21.878713 sshd-session[9782]: pam_unix(sshd:session): session closed for user core
Jul 15 05:29:21.886642 systemd[1]: sshd@113-157.180.39.85:22-139.178.89.65:54304.service: Deactivated successfully.
Jul 15 05:29:21.890332 systemd[1]: session-114.scope: Deactivated successfully.
Jul 15 05:29:21.892470 systemd-logind[1567]: Session 114 logged out. Waiting for processes to exit.
Jul 15 05:29:21.895655 systemd-logind[1567]: Removed session 114.
Jul 15 05:29:25.729257 containerd[1591]: time="2025-07-15T05:29:25.729223020Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"120bf3d00bbc9cc7ebe1d3a6c5ce4bc9a82ad7dd182144eae273c2a7ba17f517\" pid:9808 exited_at:{seconds:1752557365 nanos:728872653}"
Jul 15 05:29:27.054465 systemd[1]: Started sshd@114-157.180.39.85:22-139.178.89.65:54316.service - OpenSSH per-connection server daemon (139.178.89.65:54316).
Jul 15 05:29:28.061043 sshd[9822]: Accepted publickey for core from 139.178.89.65 port 54316 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:29:28.062841 sshd-session[9822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:29:28.068307 systemd-logind[1567]: New session 115 of user core.
Jul 15 05:29:28.072157 systemd[1]: Started session-115.scope - Session 115 of User core.
Jul 15 05:29:28.824630 sshd[9825]: Connection closed by 139.178.89.65 port 54316
Jul 15 05:29:28.828384 sshd-session[9822]: pam_unix(sshd:session): session closed for user core
Jul 15 05:29:28.836083 systemd[1]: sshd@114-157.180.39.85:22-139.178.89.65:54316.service: Deactivated successfully.
Jul 15 05:29:28.840236 systemd[1]: session-115.scope: Deactivated successfully.
Jul 15 05:29:28.841969 systemd-logind[1567]: Session 115 logged out. Waiting for processes to exit.
Jul 15 05:29:28.844829 systemd-logind[1567]: Removed session 115.
Jul 15 05:29:34.001680 systemd[1]: Started sshd@115-157.180.39.85:22-139.178.89.65:45272.service - OpenSSH per-connection server daemon (139.178.89.65:45272).
Jul 15 05:29:34.759735 containerd[1591]: time="2025-07-15T05:29:34.759525493Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"a2a1381ca71f6ac588adfc543db09c3eba8f3ee52f854c664fc6ea3e1f98e6e4\" pid:9851 exited_at:{seconds:1752557374 nanos:759112427}"
Jul 15 05:29:34.999888 sshd[9837]: Accepted publickey for core from 139.178.89.65 port 45272 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:29:35.002680 sshd-session[9837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:29:35.011559 systemd-logind[1567]: New session 116 of user core.
Jul 15 05:29:35.017688 systemd[1]: Started session-116.scope - Session 116 of User core.
Jul 15 05:29:35.776178 sshd[9860]: Connection closed by 139.178.89.65 port 45272
Jul 15 05:29:35.777221 sshd-session[9837]: pam_unix(sshd:session): session closed for user core
Jul 15 05:29:35.785175 systemd[1]: sshd@115-157.180.39.85:22-139.178.89.65:45272.service: Deactivated successfully.
Jul 15 05:29:35.789585 systemd[1]: session-116.scope: Deactivated successfully.
Jul 15 05:29:35.793313 systemd-logind[1567]: Session 116 logged out. Waiting for processes to exit.
Jul 15 05:29:35.795894 systemd-logind[1567]: Removed session 116.
Jul 15 05:29:40.845906 containerd[1591]: time="2025-07-15T05:29:40.845832015Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"cc1ff07e567e1d42efd9e527bbef8fe2cb55432014ccf32574049082d8e715fc\" pid:9884 exited_at:{seconds:1752557380 nanos:845431399}"
Jul 15 05:29:40.947754 systemd[1]: Started sshd@116-157.180.39.85:22-139.178.89.65:55610.service - OpenSSH per-connection server daemon (139.178.89.65:55610).
Jul 15 05:29:41.948199 sshd[9895]: Accepted publickey for core from 139.178.89.65 port 55610 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:29:41.950703 sshd-session[9895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:29:41.959101 systemd-logind[1567]: New session 117 of user core.
Jul 15 05:29:41.973282 systemd[1]: Started session-117.scope - Session 117 of User core.
Jul 15 05:29:42.469565 containerd[1591]: time="2025-07-15T05:29:42.469502715Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"88e3e557181e9b9b826661708f6037cfcbe8b8d3a2de839ed982a7ef774c2df7\" pid:9911 exited_at:{seconds:1752557382 nanos:468213256}"
Jul 15 05:29:42.673307 sshd[9898]: Connection closed by 139.178.89.65 port 55610
Jul 15 05:29:42.674816 sshd-session[9895]: pam_unix(sshd:session): session closed for user core
Jul 15 05:29:42.682391 systemd-logind[1567]: Session 117 logged out. Waiting for processes to exit.
Jul 15 05:29:42.683700 systemd[1]: sshd@116-157.180.39.85:22-139.178.89.65:55610.service: Deactivated successfully.
Jul 15 05:29:42.687651 systemd[1]: session-117.scope: Deactivated successfully.
Jul 15 05:29:42.693175 systemd-logind[1567]: Removed session 117.
Jul 15 05:29:47.847841 systemd[1]: Started sshd@117-157.180.39.85:22-139.178.89.65:55618.service - OpenSSH per-connection server daemon (139.178.89.65:55618).
Jul 15 05:29:48.856585 sshd[9932]: Accepted publickey for core from 139.178.89.65 port 55618 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:29:48.859763 sshd-session[9932]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:29:48.868135 systemd-logind[1567]: New session 118 of user core.
Jul 15 05:29:48.874174 systemd[1]: Started session-118.scope - Session 118 of User core.
Jul 15 05:29:49.618054 sshd[9935]: Connection closed by 139.178.89.65 port 55618
Jul 15 05:29:49.618867 sshd-session[9932]: pam_unix(sshd:session): session closed for user core
Jul 15 05:29:49.627544 systemd-logind[1567]: Session 118 logged out. Waiting for processes to exit.
Jul 15 05:29:49.628516 systemd[1]: sshd@117-157.180.39.85:22-139.178.89.65:55618.service: Deactivated successfully.
Jul 15 05:29:49.632644 systemd[1]: session-118.scope: Deactivated successfully.
Jul 15 05:29:49.636261 systemd-logind[1567]: Removed session 118.
Jul 15 05:29:51.554500 containerd[1591]: time="2025-07-15T05:29:51.554452990Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"c66303c4f60806179aec7315e6155c827ea3d80d5852ede12409073efd132dc2\" pid:9961 exited_at:{seconds:1752557391 nanos:554152623}"
Jul 15 05:29:54.792418 systemd[1]: Started sshd@118-157.180.39.85:22-139.178.89.65:47746.service - OpenSSH per-connection server daemon (139.178.89.65:47746).
Jul 15 05:29:54.796839 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories...
Jul 15 05:29:54.854944 systemd-tmpfiles[9972]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jul 15 05:29:54.855325 systemd-tmpfiles[9972]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jul 15 05:29:54.855603 systemd-tmpfiles[9972]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jul 15 05:29:54.856294 systemd-tmpfiles[9972]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jul 15 05:29:54.857578 systemd-tmpfiles[9972]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jul 15 05:29:54.857836 systemd-tmpfiles[9972]: ACLs are not supported, ignoring.
Jul 15 05:29:54.857948 systemd-tmpfiles[9972]: ACLs are not supported, ignoring.
Jul 15 05:29:54.863866 systemd-tmpfiles[9972]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 05:29:54.863878 systemd-tmpfiles[9972]: Skipping /boot
Jul 15 05:29:54.871162 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jul 15 05:29:54.871415 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories.
Jul 15 05:29:54.875765 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jul 15 05:29:55.705569 containerd[1591]: time="2025-07-15T05:29:55.705458526Z" level=info msg="TaskExit event in podsandbox handler container_id:\"87af270add8c61091888bd10ba9da0b61a290df1dcd1c45e8bd59a8af425818f\" id:\"b6f688fc73feefb908398583d5dad182671e4245f298941f8fe9ac670a7c5685\" pid:9989 exited_at:{seconds:1752557395 nanos:705200598}"
Jul 15 05:29:55.790837 sshd[9971]: Accepted publickey for core from 139.178.89.65 port 47746 ssh2: RSA SHA256:je//JvwAvCoYiHr1RrgFTcyG5vuK3YNw75TtQlLXdFM
Jul 15 05:29:55.794099 sshd-session[9971]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 05:29:55.803752 systemd-logind[1567]: New session 119 of user core.
Jul 15 05:29:55.812273 systemd[1]: Started session-119.scope - Session 119 of User core.
Jul 15 05:29:56.650673 sshd[10002]: Connection closed by 139.178.89.65 port 47746
Jul 15 05:29:56.653161 sshd-session[9971]: pam_unix(sshd:session): session closed for user core
Jul 15 05:29:56.656407 systemd[1]: sshd@118-157.180.39.85:22-139.178.89.65:47746.service: Deactivated successfully.
Jul 15 05:29:56.659633 systemd[1]: session-119.scope: Deactivated successfully.
Jul 15 05:29:56.663049 systemd-logind[1567]: Session 119 logged out. Waiting for processes to exit.
Jul 15 05:29:56.663901 systemd-logind[1567]: Removed session 119.
Jul 15 05:30:04.745427 containerd[1591]: time="2025-07-15T05:30:04.745301727Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ecc96da98e69b59d4998e563e8cd9c0081547a430a3bdb37fcca15973200c970\" id:\"175d652e864d02ed43cbc613915d55e2f183ea620f43d528b674705c41cdb88e\" pid:10028 exited_at:{seconds:1752557404 nanos:744929500}"
Jul 15 05:30:10.830326 containerd[1591]: time="2025-07-15T05:30:10.830277120Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9428bd6e1d7b46eab1fee560c89f8d47314c8d522bc6b7a656d2d6d9133dfefe\" id:\"12493508c947df1ca04142d5830c8a3573133a6bb12eef1d0c3fd2babc65a7e9\" pid:10051 exited_at:{seconds:1752557410 nanos:829844824}"
Jul 15 05:30:13.073469 systemd[1]: cri-containerd-d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e.scope: Deactivated successfully.
Jul 15 05:30:13.075371 systemd[1]: cri-containerd-d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e.scope: Consumed 9.050s CPU time, 83.9M memory peak, 91.7M read from disk.
Jul 15 05:30:13.150892 containerd[1591]: time="2025-07-15T05:30:13.150861452Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e\" id:\"d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e\" pid:2612 exit_status:1 exited_at:{seconds:1752557413 nanos:136759435}"
Jul 15 05:30:13.157874 containerd[1591]: time="2025-07-15T05:30:13.157852491Z" level=info msg="received exit event container_id:\"d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e\" id:\"d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e\" pid:2612 exit_status:1 exited_at:{seconds:1752557413 nanos:136759435}"
Jul 15 05:30:13.208388 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e-rootfs.mount: Deactivated successfully.
Jul 15 05:30:13.312656 systemd[1]: cri-containerd-243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307.scope: Deactivated successfully.
Jul 15 05:30:13.313136 systemd[1]: cri-containerd-243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307.scope: Consumed 1min 4.286s CPU time, 115.7M memory peak, 68.1M read from disk.
Jul 15 05:30:13.319680 containerd[1591]: time="2025-07-15T05:30:13.319610958Z" level=info msg="TaskExit event in podsandbox handler container_id:\"243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307\" id:\"243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307\" pid:3099 exit_status:1 exited_at:{seconds:1752557413 nanos:316246008}"
Jul 15 05:30:13.320785 containerd[1591]: time="2025-07-15T05:30:13.320753038Z" level=info msg="received exit event container_id:\"243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307\" id:\"243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307\" pid:3099 exit_status:1 exited_at:{seconds:1752557413 nanos:316246008}"
Jul 15 05:30:13.345756 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307-rootfs.mount: Deactivated successfully.
Jul 15 05:30:13.399130 kubelet[2761]: I0715 05:30:13.399093 2761 scope.go:117] "RemoveContainer" containerID="d84ffd01b63dafef51e5b4c08b061e8480909e986a9fe3b0a079651a857e020e"
Jul 15 05:30:13.456655 containerd[1591]: time="2025-07-15T05:30:13.456602642Z" level=info msg="CreateContainer within sandbox \"f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Jul 15 05:30:13.493384 kubelet[2761]: E0715 05:30:13.493162 2761 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:37456->10.0.0.2:2379: read: connection timed out"
Jul 15 05:30:13.568261 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4170692965.mount: Deactivated successfully.
Jul 15 05:30:13.572049 containerd[1591]: time="2025-07-15T05:30:13.571993754Z" level=info msg="Container 5039cd3ae588ec97441d97c0aee6857d4a350b9f96b6e1ef987639fa52116164: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:30:13.588298 containerd[1591]: time="2025-07-15T05:30:13.588269022Z" level=info msg="CreateContainer within sandbox \"f17bb7f4ea97eb0f7606c0c99a76ffb4cd7f45632eb893f29bb012a768c354e0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"5039cd3ae588ec97441d97c0aee6857d4a350b9f96b6e1ef987639fa52116164\""
Jul 15 05:30:13.591116 containerd[1591]: time="2025-07-15T05:30:13.590166325Z" level=info msg="StartContainer for \"5039cd3ae588ec97441d97c0aee6857d4a350b9f96b6e1ef987639fa52116164\""
Jul 15 05:30:13.591116 containerd[1591]: time="2025-07-15T05:30:13.590826839Z" level=info msg="connecting to shim 5039cd3ae588ec97441d97c0aee6857d4a350b9f96b6e1ef987639fa52116164" address="unix:///run/containerd/s/502dd6cba8b8556aa3b0bd73d8174372fe66c5cba048a96c0d68fca7c52c4ae8" protocol=ttrpc version=3
Jul 15 05:30:13.648136 systemd[1]: Started cri-containerd-5039cd3ae588ec97441d97c0aee6857d4a350b9f96b6e1ef987639fa52116164.scope - libcontainer container 5039cd3ae588ec97441d97c0aee6857d4a350b9f96b6e1ef987639fa52116164.
Jul 15 05:30:13.700648 containerd[1591]: time="2025-07-15T05:30:13.700235384Z" level=info msg="StartContainer for \"5039cd3ae588ec97441d97c0aee6857d4a350b9f96b6e1ef987639fa52116164\" returns successfully"
Jul 15 05:30:14.208998 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount514300372.mount: Deactivated successfully.
Jul 15 05:30:14.381690 kubelet[2761]: I0715 05:30:14.381503 2761 scope.go:117] "RemoveContainer" containerID="243a100da6a48118c4599031453de009e00f859cdfcd436303f98bf48ee91307"
Jul 15 05:30:14.427120 containerd[1591]: time="2025-07-15T05:30:14.426999158Z" level=info msg="CreateContainer within sandbox \"72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jul 15 05:30:14.452051 containerd[1591]: time="2025-07-15T05:30:14.449801940Z" level=info msg="Container 61aa7da0db583ce2f4fdc720b0816c9aab1e0ccb1976eefe77bab1f9ade2d121: CDI devices from CRI Config.CDIDevices: []"
Jul 15 05:30:14.461593 containerd[1591]: time="2025-07-15T05:30:14.461468458Z" level=info msg="CreateContainer within sandbox \"72a76cd66dddaf7b086d078cb6161301e0c89cd7249db781ba35a162273349eb\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"61aa7da0db583ce2f4fdc720b0816c9aab1e0ccb1976eefe77bab1f9ade2d121\""
Jul 15 05:30:14.462782 containerd[1591]: time="2025-07-15T05:30:14.462729907Z" level=info msg="StartContainer for \"61aa7da0db583ce2f4fdc720b0816c9aab1e0ccb1976eefe77bab1f9ade2d121\""
Jul 15 05:30:14.464584 containerd[1591]: time="2025-07-15T05:30:14.464531121Z" level=info msg="connecting to shim 61aa7da0db583ce2f4fdc720b0816c9aab1e0ccb1976eefe77bab1f9ade2d121" address="unix:///run/containerd/s/4b1896cbc46f96fde2305040beb0d4ffc669bfdd67c63d81008e428134084bcc" protocol=ttrpc version=3
Jul 15 05:30:14.506215 systemd[1]: Started cri-containerd-61aa7da0db583ce2f4fdc720b0816c9aab1e0ccb1976eefe77bab1f9ade2d121.scope - libcontainer container 61aa7da0db583ce2f4fdc720b0816c9aab1e0ccb1976eefe77bab1f9ade2d121.
Jul 15 05:30:14.548785 containerd[1591]: time="2025-07-15T05:30:14.548747296Z" level=info msg="StartContainer for \"61aa7da0db583ce2f4fdc720b0816c9aab1e0ccb1976eefe77bab1f9ade2d121\" returns successfully"
Jul 15 05:30:17.607552 kubelet[2761]: E0715 05:30:17.584462 2761 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:42024->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4396-0-0-n-e83c776e20.185255ad205615e4 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4396-0-0-n-e83c776e20,UID:843de93c586da35ea89e9b7f902fcd04,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4396-0-0-n-e83c776e20,},FirstTimestamp:2025-07-15 05:30:07.060760036 +0000 UTC m=+857.712134082,LastTimestamp:2025-07-15 05:30:07.060760036 +0000 UTC m=+857.712134082,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4396-0-0-n-e83c776e20,}"
Jul 15 05:30:18.607246 systemd[1]: cri-containerd-426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d.scope: Deactivated successfully.
Jul 15 05:30:18.610430 systemd[1]: cri-containerd-426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d.scope: Consumed 6.374s CPU time, 40.6M memory peak, 47.7M read from disk.
Jul 15 05:30:18.611358 containerd[1591]: time="2025-07-15T05:30:18.611283303Z" level=info msg="TaskExit event in podsandbox handler container_id:\"426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d\" id:\"426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d\" pid:2595 exit_status:1 exited_at:{seconds:1752557418 nanos:608618186}"
Jul 15 05:30:18.612915 containerd[1591]: time="2025-07-15T05:30:18.612711571Z" level=info msg="received exit event container_id:\"426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d\" id:\"426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d\" pid:2595 exit_status:1 exited_at:{seconds:1752557418 nanos:608618186}"
Jul 15 05:30:18.654079 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-426304308dbc7f0e2ecaaaf36d8dee5714076cb15d299380b331344bc2a6918d-rootfs.mount: Deactivated successfully.