Aug 12 23:42:59.807572 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Aug 12 23:42:59.807600 kernel: Linux version 6.12.40-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue Aug 12 21:51:24 -00 2025
Aug 12 23:42:59.807612 kernel: KASLR enabled
Aug 12 23:42:59.807618 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Aug 12 23:42:59.807624 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Aug 12 23:42:59.807629 kernel: random: crng init done
Aug 12 23:42:59.807636 kernel: secureboot: Secure boot disabled
Aug 12 23:42:59.807642 kernel: ACPI: Early table checksum verification disabled
Aug 12 23:42:59.807648 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Aug 12 23:42:59.807654 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Aug 12 23:42:59.807717 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807725 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807732 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807740 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807748 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807760 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807768 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807776 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807784 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Aug 12 23:42:59.807791 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Aug 12 23:42:59.807798 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Aug 12 23:42:59.807806 kernel: ACPI: Use ACPI SPCR as default console: Yes
Aug 12 23:42:59.807813 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Aug 12 23:42:59.807820 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Aug 12 23:42:59.807828 kernel: Zone ranges:
Aug 12 23:42:59.807836 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Aug 12 23:42:59.807842 kernel: DMA32 empty
Aug 12 23:42:59.807849 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Aug 12 23:42:59.807855 kernel: Device empty
Aug 12 23:42:59.807861 kernel: Movable zone start for each node
Aug 12 23:42:59.807867 kernel: Early memory node ranges
Aug 12 23:42:59.807874 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Aug 12 23:42:59.807880 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Aug 12 23:42:59.807886 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Aug 12 23:42:59.807893 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Aug 12 23:42:59.807899 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Aug 12 23:42:59.807906 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Aug 12 23:42:59.807912 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Aug 12 23:42:59.807920 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Aug 12 23:42:59.807926 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Aug 12 23:42:59.807936 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Aug 12 23:42:59.807942 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Aug 12 23:42:59.807949 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Aug 12 23:42:59.807960 kernel: psci: probing for conduit method from ACPI.
Aug 12 23:42:59.807968 kernel: psci: PSCIv1.1 detected in firmware.
Aug 12 23:42:59.807975 kernel: psci: Using standard PSCI v0.2 function IDs
Aug 12 23:42:59.807983 kernel: psci: Trusted OS migration not required
Aug 12 23:42:59.807991 kernel: psci: SMC Calling Convention v1.1
Aug 12 23:42:59.807999 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Aug 12 23:42:59.808007 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Aug 12 23:42:59.808015 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Aug 12 23:42:59.808023 kernel: pcpu-alloc: [0] 0 [0] 1
Aug 12 23:42:59.808031 kernel: Detected PIPT I-cache on CPU0
Aug 12 23:42:59.808039 kernel: CPU features: detected: GIC system register CPU interface
Aug 12 23:42:59.808048 kernel: CPU features: detected: Spectre-v4
Aug 12 23:42:59.808056 kernel: CPU features: detected: Spectre-BHB
Aug 12 23:42:59.808063 kernel: CPU features: kernel page table isolation forced ON by KASLR
Aug 12 23:42:59.808069 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Aug 12 23:42:59.808076 kernel: CPU features: detected: ARM erratum 1418040
Aug 12 23:42:59.808083 kernel: CPU features: detected: SSBS not fully self-synchronizing
Aug 12 23:42:59.808089 kernel: alternatives: applying boot alternatives
Aug 12 23:42:59.808097 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=ce82f1ef836ba8581e59ce9db4eef4240d287b2b5f9937c28f0cd024f4dc9107
Aug 12 23:42:59.808104 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 12 23:42:59.808111 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Aug 12 23:42:59.808119 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Aug 12 23:42:59.808126 kernel: Fallback order for Node 0: 0
Aug 12 23:42:59.808133 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Aug 12 23:42:59.808139 kernel: Policy zone: Normal
Aug 12 23:42:59.808146 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 12 23:42:59.808153 kernel: software IO TLB: area num 2.
Aug 12 23:42:59.808160 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Aug 12 23:42:59.808167 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Aug 12 23:42:59.808173 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 12 23:42:59.808181 kernel: rcu: RCU event tracing is enabled.
Aug 12 23:42:59.808188 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Aug 12 23:42:59.808195 kernel: Trampoline variant of Tasks RCU enabled.
Aug 12 23:42:59.808203 kernel: Tracing variant of Tasks RCU enabled.
Aug 12 23:42:59.808210 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 12 23:42:59.808217 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Aug 12 23:42:59.808223 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 12 23:42:59.808231 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 12 23:42:59.808237 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Aug 12 23:42:59.808244 kernel: GICv3: 256 SPIs implemented
Aug 12 23:42:59.808251 kernel: GICv3: 0 Extended SPIs implemented
Aug 12 23:42:59.808257 kernel: Root IRQ handler: gic_handle_irq
Aug 12 23:42:59.808264 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Aug 12 23:42:59.808271 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Aug 12 23:42:59.808277 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Aug 12 23:42:59.808286 kernel: ITS [mem 0x08080000-0x0809ffff]
Aug 12 23:42:59.808292 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Aug 12 23:42:59.808299 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Aug 12 23:42:59.808306 kernel: GICv3: using LPI property table @0x0000000100120000
Aug 12 23:42:59.808313 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Aug 12 23:42:59.808320 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 12 23:42:59.808326 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Aug 12 23:42:59.808333 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Aug 12 23:42:59.808414 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Aug 12 23:42:59.808426 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Aug 12 23:42:59.808433 kernel: Console: colour dummy device 80x25
Aug 12 23:42:59.808444 kernel: ACPI: Core revision 20240827
Aug 12 23:42:59.808452 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Aug 12 23:42:59.808459 kernel: pid_max: default: 32768 minimum: 301
Aug 12 23:42:59.808466 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 12 23:42:59.808473 kernel: landlock: Up and running.
Aug 12 23:42:59.808480 kernel: SELinux: Initializing.
Aug 12 23:42:59.808487 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Aug 12 23:42:59.808494 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Aug 12 23:42:59.808501 kernel: rcu: Hierarchical SRCU implementation.
Aug 12 23:42:59.808509 kernel: rcu: Max phase no-delay instances is 400.
Aug 12 23:42:59.808516 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Aug 12 23:42:59.808523 kernel: Remapping and enabling EFI services.
Aug 12 23:42:59.808530 kernel: smp: Bringing up secondary CPUs ...
Aug 12 23:42:59.808537 kernel: Detected PIPT I-cache on CPU1
Aug 12 23:42:59.808558 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Aug 12 23:42:59.808567 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Aug 12 23:42:59.808574 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Aug 12 23:42:59.808581 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Aug 12 23:42:59.808591 kernel: smp: Brought up 1 node, 2 CPUs
Aug 12 23:42:59.808603 kernel: SMP: Total of 2 processors activated.
Aug 12 23:42:59.808610 kernel: CPU: All CPU(s) started at EL1
Aug 12 23:42:59.808619 kernel: CPU features: detected: 32-bit EL0 Support
Aug 12 23:42:59.808627 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Aug 12 23:42:59.808634 kernel: CPU features: detected: Common not Private translations
Aug 12 23:42:59.808642 kernel: CPU features: detected: CRC32 instructions
Aug 12 23:42:59.808649 kernel: CPU features: detected: Enhanced Virtualization Traps
Aug 12 23:42:59.808658 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Aug 12 23:42:59.808666 kernel: CPU features: detected: LSE atomic instructions
Aug 12 23:42:59.808673 kernel: CPU features: detected: Privileged Access Never
Aug 12 23:42:59.808681 kernel: CPU features: detected: RAS Extension Support
Aug 12 23:42:59.808688 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Aug 12 23:42:59.808696 kernel: alternatives: applying system-wide alternatives
Aug 12 23:42:59.808703 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Aug 12 23:42:59.808711 kernel: Memory: 3859044K/4096000K available (11136K kernel code, 2436K rwdata, 9080K rodata, 39488K init, 1038K bss, 215476K reserved, 16384K cma-reserved)
Aug 12 23:42:59.808719 kernel: devtmpfs: initialized
Aug 12 23:42:59.808728 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 12 23:42:59.808735 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Aug 12 23:42:59.808742 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Aug 12 23:42:59.808750 kernel: 0 pages in range for non-PLT usage
Aug 12 23:42:59.808757 kernel: 508432 pages in range for PLT usage
Aug 12 23:42:59.808765 kernel: pinctrl core: initialized pinctrl subsystem
Aug 12 23:42:59.808772 kernel: SMBIOS 3.0.0 present.
Aug 12 23:42:59.808779 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Aug 12 23:42:59.808786 kernel: DMI: Memory slots populated: 1/1
Aug 12 23:42:59.808795 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 12 23:42:59.808803 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Aug 12 23:42:59.808810 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Aug 12 23:42:59.808818 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Aug 12 23:42:59.808825 kernel: audit: initializing netlink subsys (disabled)
Aug 12 23:42:59.808832 kernel: audit: type=2000 audit(0.017:1): state=initialized audit_enabled=0 res=1
Aug 12 23:42:59.808840 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 12 23:42:59.808848 kernel: cpuidle: using governor menu
Aug 12 23:42:59.808855 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Aug 12 23:42:59.808863 kernel: ASID allocator initialised with 32768 entries
Aug 12 23:42:59.808871 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Aug 12 23:42:59.808878 kernel: Serial: AMBA PL011 UART driver
Aug 12 23:42:59.808885 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Aug 12 23:42:59.808893 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Aug 12 23:42:59.808900 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Aug 12 23:42:59.808907 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Aug 12 23:42:59.808915 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Aug 12 23:42:59.808922 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Aug 12 23:42:59.808931 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Aug 12 23:42:59.808938 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Aug 12 23:42:59.808945 kernel: ACPI: Added _OSI(Module Device)
Aug 12 23:42:59.808953 kernel: ACPI: Added _OSI(Processor Device)
Aug 12 23:42:59.808960 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Aug 12 23:42:59.808967 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Aug 12 23:42:59.808975 kernel: ACPI: Interpreter enabled
Aug 12 23:42:59.808982 kernel: ACPI: Using GIC for interrupt routing
Aug 12 23:42:59.808989 kernel: ACPI: MCFG table detected, 1 entries
Aug 12 23:42:59.808998 kernel: ACPI: CPU0 has been hot-added
Aug 12 23:42:59.809005 kernel: ACPI: CPU1 has been hot-added
Aug 12 23:42:59.809013 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Aug 12 23:42:59.809020 kernel: printk: legacy console [ttyAMA0] enabled
Aug 12 23:42:59.809028 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Aug 12 23:42:59.809191 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Aug 12 23:42:59.809262 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Aug 12 23:42:59.809329 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Aug 12 23:42:59.809427 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Aug 12 23:42:59.809490 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Aug 12 23:42:59.809500 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Aug 12 23:42:59.809508 kernel: PCI host bridge to bus 0000:00
Aug 12 23:42:59.809610 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Aug 12 23:42:59.809672 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Aug 12 23:42:59.809727 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Aug 12 23:42:59.809788 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Aug 12 23:42:59.809869 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Aug 12 23:42:59.809945 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Aug 12 23:42:59.810007 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Aug 12 23:42:59.810069 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Aug 12 23:42:59.810145 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.810226 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Aug 12 23:42:59.810288 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Aug 12 23:42:59.810365 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Aug 12 23:42:59.810434 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Aug 12 23:42:59.810503 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.810583 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Aug 12 23:42:59.810646 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Aug 12 23:42:59.810711 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Aug 12 23:42:59.810779 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.810842 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Aug 12 23:42:59.810903 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Aug 12 23:42:59.810964 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Aug 12 23:42:59.811025 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Aug 12 23:42:59.811097 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.811164 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Aug 12 23:42:59.811232 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Aug 12 23:42:59.811291 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Aug 12 23:42:59.811371 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Aug 12 23:42:59.811443 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.811506 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Aug 12 23:42:59.811583 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Aug 12 23:42:59.811651 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Aug 12 23:42:59.811712 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Aug 12 23:42:59.811781 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.811842 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Aug 12 23:42:59.811903 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Aug 12 23:42:59.811962 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Aug 12 23:42:59.812022 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Aug 12 23:42:59.812092 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.812153 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Aug 12 23:42:59.812249 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Aug 12 23:42:59.812315 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Aug 12 23:42:59.812416 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Aug 12 23:42:59.812495 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.812573 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Aug 12 23:42:59.812644 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Aug 12 23:42:59.813020 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Aug 12 23:42:59.813110 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Aug 12 23:42:59.813173 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Aug 12 23:42:59.813235 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Aug 12 23:42:59.813297 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Aug 12 23:42:59.813979 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Aug 12 23:42:59.814058 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Aug 12 23:42:59.814133 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Aug 12 23:42:59.814197 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Aug 12 23:42:59.814260 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Aug 12 23:42:59.814325 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Aug 12 23:42:59.814449 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Aug 12 23:42:59.814524 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Aug 12 23:42:59.814635 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Aug 12 23:42:59.814702 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Aug 12 23:42:59.814766 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Aug 12 23:42:59.814838 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Aug 12 23:42:59.814904 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Aug 12 23:42:59.814982 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Aug 12 23:42:59.815049 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Aug 12 23:42:59.815126 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Aug 12 23:42:59.815190 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Aug 12 23:42:59.815253 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Aug 12 23:42:59.815328 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Aug 12 23:42:59.815408 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Aug 12 23:42:59.815475 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Aug 12 23:42:59.815538 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Aug 12 23:42:59.817753 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Aug 12 23:42:59.817825 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Aug 12 23:42:59.817887 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Aug 12 23:42:59.817952 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Aug 12 23:42:59.818030 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Aug 12 23:42:59.818100 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Aug 12 23:42:59.818173 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Aug 12 23:42:59.818245 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Aug 12 23:42:59.818307 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Aug 12 23:42:59.818443 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Aug 12 23:42:59.818515 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Aug 12 23:42:59.819707 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Aug 12 23:42:59.819794 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Aug 12 23:42:59.819858 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Aug 12 23:42:59.819920 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Aug 12 23:42:59.819989 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Aug 12 23:42:59.820051 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Aug 12 23:42:59.820112 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Aug 12 23:42:59.820185 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Aug 12 23:42:59.820248 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Aug 12 23:42:59.820309 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Aug 12 23:42:59.820393 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Aug 12 23:42:59.820459 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Aug 12 23:42:59.820521 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Aug 12 23:42:59.821686 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Aug 12 23:42:59.821774 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Aug 12 23:42:59.821840 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Aug 12 23:42:59.821907 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Aug 12 23:42:59.821970 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Aug 12 23:42:59.822035 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Aug 12 23:42:59.822096 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Aug 12 23:42:59.822160 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Aug 12 23:42:59.822225 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Aug 12 23:42:59.822289 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Aug 12 23:42:59.822365 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Aug 12 23:42:59.822435 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Aug 12 23:42:59.822497 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Aug 12 23:42:59.822588 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Aug 12 23:42:59.822663 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Aug 12 23:42:59.822727 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Aug 12 23:42:59.822795 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Aug 12 23:42:59.822858 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Aug 12 23:42:59.822921 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Aug 12 23:42:59.822982 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Aug 12 23:42:59.823043 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Aug 12 23:42:59.823110 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Aug 12 23:42:59.823172 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Aug 12 23:42:59.823233 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Aug 12 23:42:59.823296 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Aug 12 23:42:59.823380 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Aug 12 23:42:59.823450 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Aug 12 23:42:59.823513 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Aug 12 23:42:59.825674 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Aug 12 23:42:59.825779 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Aug 12 23:42:59.825846 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Aug 12 23:42:59.825912 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Aug 12 23:42:59.825975 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Aug 12 23:42:59.826043 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Aug 12 23:42:59.826111 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Aug 12 23:42:59.826183 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Aug 12 23:42:59.826261 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Aug 12 23:42:59.826325 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Aug 12 23:42:59.826429 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Aug 12 23:42:59.826499 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Aug 12 23:42:59.826582 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Aug 12 23:42:59.826665 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Aug 12 23:42:59.826739 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Aug 12 23:42:59.826803 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Aug 12 23:42:59.826872 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Aug 12 23:42:59.826939 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Aug 12 23:42:59.827003 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Aug 12 23:42:59.827078 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Aug 12 23:42:59.827155 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Aug 12 23:42:59.827235 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Aug 12 23:42:59.827301 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Aug 12 23:42:59.827377 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Aug 12 23:42:59.827441 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Aug 12 23:42:59.827503 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Aug 12 23:42:59.830890 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Aug 12 23:42:59.830982 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Aug 12 23:42:59.831051 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Aug 12 23:42:59.831114 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Aug 12 23:42:59.831184 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Aug 12 23:42:59.831248 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Aug 12 23:42:59.831317 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Aug 12 23:42:59.831434 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Aug 12 23:42:59.831503 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Aug 12 23:42:59.831586 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Aug 12 23:42:59.831651 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Aug 12 23:42:59.831729 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Aug 12 23:42:59.831795 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Aug 12 23:42:59.831857 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Aug 12 23:42:59.831918 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Aug 12 23:42:59.831980 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Aug 12 23:42:59.832050 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Aug 12 23:42:59.832113 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Aug 12 23:42:59.832178 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Aug 12 23:42:59.832244 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Aug 12 23:42:59.832316 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Aug 12 23:42:59.832395 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Aug 12 23:42:59.832468 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Aug 12 23:42:59.832533 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Aug 12 23:42:59.833768 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Aug 12 23:42:59.833857 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Aug 12 23:42:59.833922 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Aug 12 23:42:59.833986 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Aug 12 23:42:59.834050 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Aug 12 23:42:59.834118 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Aug 12 23:42:59.834181 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Aug 12 23:42:59.834244 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Aug 12 23:42:59.834311 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Aug 12 23:42:59.834424 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Aug 12 23:42:59.834493 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Aug 12 23:42:59.834577 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Aug 12 23:42:59.834648 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Aug 12 23:42:59.834713 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Aug 12 23:42:59.834769 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Aug 12 23:42:59.834824 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Aug 12 23:42:59.834892 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Aug 12 23:42:59.834951 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Aug 12 23:42:59.835011 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Aug 12 23:42:59.835076 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Aug 12 23:42:59.835133 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Aug 12 23:42:59.835190 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Aug 12 23:42:59.835262 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Aug 12 23:42:59.835327 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Aug 12 23:42:59.835396 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Aug 12 23:42:59.835472 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Aug 12 23:42:59.835529 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Aug 12 23:42:59.837745 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Aug 12 23:42:59.837848 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Aug 12 23:42:59.837925 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Aug 12 23:42:59.837987 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Aug 12 23:42:59.838052 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Aug 12 23:42:59.838117 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Aug 12 23:42:59.838174 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Aug 12 23:42:59.838239 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Aug 
12 23:42:59.838296 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Aug 12 23:42:59.838367 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Aug 12 23:42:59.838435 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Aug 12 23:42:59.838499 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Aug 12 23:42:59.838575 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Aug 12 23:42:59.838647 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Aug 12 23:42:59.838704 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Aug 12 23:42:59.838760 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Aug 12 23:42:59.838770 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Aug 12 23:42:59.838778 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Aug 12 23:42:59.838786 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Aug 12 23:42:59.838796 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Aug 12 23:42:59.838804 kernel: iommu: Default domain type: Translated Aug 12 23:42:59.838813 kernel: iommu: DMA domain TLB invalidation policy: strict mode Aug 12 23:42:59.838820 kernel: efivars: Registered efivars operations Aug 12 23:42:59.838828 kernel: vgaarb: loaded Aug 12 23:42:59.838836 kernel: clocksource: Switched to clocksource arch_sys_counter Aug 12 23:42:59.838844 kernel: VFS: Disk quotas dquot_6.6.0 Aug 12 23:42:59.838852 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Aug 12 23:42:59.838859 kernel: pnp: PnP ACPI init Aug 12 23:42:59.838931 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Aug 12 23:42:59.838943 kernel: pnp: PnP ACPI: found 1 devices Aug 12 23:42:59.838951 kernel: NET: Registered PF_INET protocol family Aug 12 23:42:59.838959 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Aug 12 23:42:59.838967 kernel: 
tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Aug 12 23:42:59.838975 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Aug 12 23:42:59.838983 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Aug 12 23:42:59.838990 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Aug 12 23:42:59.839000 kernel: TCP: Hash tables configured (established 32768 bind 32768) Aug 12 23:42:59.839008 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 12 23:42:59.839016 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 12 23:42:59.839024 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Aug 12 23:42:59.839097 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Aug 12 23:42:59.839109 kernel: PCI: CLS 0 bytes, default 64 Aug 12 23:42:59.839117 kernel: kvm [1]: HYP mode not available Aug 12 23:42:59.839125 kernel: Initialise system trusted keyrings Aug 12 23:42:59.839133 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Aug 12 23:42:59.839143 kernel: Key type asymmetric registered Aug 12 23:42:59.839151 kernel: Asymmetric key parser 'x509' registered Aug 12 23:42:59.839158 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Aug 12 23:42:59.839166 kernel: io scheduler mq-deadline registered Aug 12 23:42:59.839175 kernel: io scheduler kyber registered Aug 12 23:42:59.839183 kernel: io scheduler bfq registered Aug 12 23:42:59.839191 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Aug 12 23:42:59.839259 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Aug 12 23:42:59.839323 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Aug 12 23:42:59.839435 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.839506 kernel: pcieport 0000:00:02.1: PME: Signaling 
with IRQ 51 Aug 12 23:42:59.841654 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Aug 12 23:42:59.841742 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.841811 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Aug 12 23:42:59.841876 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Aug 12 23:42:59.841938 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.842003 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Aug 12 23:42:59.842073 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Aug 12 23:42:59.842137 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.842203 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Aug 12 23:42:59.842265 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Aug 12 23:42:59.842327 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.842441 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Aug 12 23:42:59.842511 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Aug 12 23:42:59.843652 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.843740 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Aug 12 23:42:59.843806 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Aug 12 23:42:59.843872 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.843938 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Aug 12 
23:42:59.844000 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Aug 12 23:42:59.844062 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.844073 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Aug 12 23:42:59.844141 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Aug 12 23:42:59.844208 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Aug 12 23:42:59.844270 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 12 23:42:59.844280 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Aug 12 23:42:59.844289 kernel: ACPI: button: Power Button [PWRB] Aug 12 23:42:59.844298 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Aug 12 23:42:59.844381 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Aug 12 23:42:59.844451 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Aug 12 23:42:59.844465 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 12 23:42:59.844473 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Aug 12 23:42:59.844539 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Aug 12 23:42:59.845598 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Aug 12 23:42:59.845610 kernel: thunder_xcv, ver 1.0 Aug 12 23:42:59.845619 kernel: thunder_bgx, ver 1.0 Aug 12 23:42:59.845628 kernel: nicpf, ver 1.0 Aug 12 23:42:59.845636 kernel: nicvf, ver 1.0 Aug 12 23:42:59.845750 kernel: rtc-efi rtc-efi.0: registered as rtc0 Aug 12 23:42:59.845819 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-08-12T23:42:59 UTC (1755042179) Aug 12 23:42:59.845829 kernel: hid: raw HID events driver (C) Jiri Kosina Aug 12 23:42:59.845837 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Aug 12 
23:42:59.845845 kernel: NET: Registered PF_INET6 protocol family Aug 12 23:42:59.845853 kernel: watchdog: NMI not fully supported Aug 12 23:42:59.845861 kernel: watchdog: Hard watchdog permanently disabled Aug 12 23:42:59.845869 kernel: Segment Routing with IPv6 Aug 12 23:42:59.845877 kernel: In-situ OAM (IOAM) with IPv6 Aug 12 23:42:59.845886 kernel: NET: Registered PF_PACKET protocol family Aug 12 23:42:59.845894 kernel: Key type dns_resolver registered Aug 12 23:42:59.845902 kernel: registered taskstats version 1 Aug 12 23:42:59.845910 kernel: Loading compiled-in X.509 certificates Aug 12 23:42:59.845918 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.40-flatcar: e74bfacfa68399ed7282bf533dd5901fdb84b882' Aug 12 23:42:59.845926 kernel: Demotion targets for Node 0: null Aug 12 23:42:59.845934 kernel: Key type .fscrypt registered Aug 12 23:42:59.845941 kernel: Key type fscrypt-provisioning registered Aug 12 23:42:59.845949 kernel: ima: No TPM chip found, activating TPM-bypass! Aug 12 23:42:59.845958 kernel: ima: Allocated hash algorithm: sha1 Aug 12 23:42:59.845966 kernel: ima: No architecture policies found Aug 12 23:42:59.845974 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Aug 12 23:42:59.845982 kernel: clk: Disabling unused clocks Aug 12 23:42:59.845990 kernel: PM: genpd: Disabling unused power domains Aug 12 23:42:59.845997 kernel: Warning: unable to open an initial console. Aug 12 23:42:59.846006 kernel: Freeing unused kernel memory: 39488K Aug 12 23:42:59.846013 kernel: Run /init as init process Aug 12 23:42:59.846021 kernel: with arguments: Aug 12 23:42:59.846031 kernel: /init Aug 12 23:42:59.846038 kernel: with environment: Aug 12 23:42:59.846046 kernel: HOME=/ Aug 12 23:42:59.846054 kernel: TERM=linux Aug 12 23:42:59.846062 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Aug 12 23:42:59.846070 systemd[1]: Successfully made /usr/ read-only. 
Aug 12 23:42:59.846082 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 12 23:42:59.846090 systemd[1]: Detected virtualization kvm.
Aug 12 23:42:59.846100 systemd[1]: Detected architecture arm64.
Aug 12 23:42:59.846108 systemd[1]: Running in initrd.
Aug 12 23:42:59.846118 systemd[1]: No hostname configured, using default hostname.
Aug 12 23:42:59.846126 systemd[1]: Hostname set to .
Aug 12 23:42:59.846134 systemd[1]: Initializing machine ID from VM UUID.
Aug 12 23:42:59.846142 systemd[1]: Queued start job for default target initrd.target.
Aug 12 23:42:59.846151 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 12 23:42:59.846159 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 12 23:42:59.846170 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Aug 12 23:42:59.846179 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 12 23:42:59.846187 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Aug 12 23:42:59.846196 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Aug 12 23:42:59.846206 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Aug 12 23:42:59.846214 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Aug 12 23:42:59.846224 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 12 23:42:59.846233 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 12 23:42:59.846241 systemd[1]: Reached target paths.target - Path Units.
Aug 12 23:42:59.846249 systemd[1]: Reached target slices.target - Slice Units.
Aug 12 23:42:59.846257 systemd[1]: Reached target swap.target - Swaps.
Aug 12 23:42:59.846266 systemd[1]: Reached target timers.target - Timer Units.
Aug 12 23:42:59.846274 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Aug 12 23:42:59.847572 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 12 23:42:59.847582 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Aug 12 23:42:59.847595 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Aug 12 23:42:59.847604 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 12 23:42:59.847612 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 12 23:42:59.847621 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 12 23:42:59.847629 systemd[1]: Reached target sockets.target - Socket Units.
Aug 12 23:42:59.847637 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Aug 12 23:42:59.847646 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 12 23:42:59.847654 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Aug 12 23:42:59.847663 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Aug 12 23:42:59.847673 systemd[1]: Starting systemd-fsck-usr.service...
Aug 12 23:42:59.847682 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 12 23:42:59.847690 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 12 23:42:59.847699 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 12 23:42:59.847741 systemd-journald[244]: Collecting audit messages is disabled.
Aug 12 23:42:59.847765 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Aug 12 23:42:59.847775 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 12 23:42:59.847783 systemd[1]: Finished systemd-fsck-usr.service.
Aug 12 23:42:59.847794 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 12 23:42:59.847802 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 12 23:42:59.847811 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 12 23:42:59.847820 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 12 23:42:59.847829 systemd-journald[244]: Journal started
Aug 12 23:42:59.847848 systemd-journald[244]: Runtime Journal (/run/log/journal/0d3aca2e51e2402b81c3a10c438c8b9a) is 8M, max 76.5M, 68.5M free.
Aug 12 23:42:59.835584 systemd-modules-load[246]: Inserted module 'overlay'
Aug 12 23:42:59.850618 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 12 23:42:59.856566 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Aug 12 23:42:59.856717 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 12 23:42:59.860661 kernel: Bridge firewalling registered
Aug 12 23:42:59.859252 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 12 23:42:59.859890 systemd-modules-load[246]: Inserted module 'br_netfilter'
Aug 12 23:42:59.861621 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 12 23:42:59.868413 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 12 23:42:59.878578 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 12 23:42:59.882937 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Aug 12 23:42:59.886019 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 12 23:42:59.886560 systemd-tmpfiles[264]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Aug 12 23:42:59.899985 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 12 23:42:59.902760 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 12 23:42:59.906485 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 12 23:42:59.915427 dracut-cmdline[279]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=ce82f1ef836ba8581e59ce9db4eef4240d287b2b5f9937c28f0cd024f4dc9107
Aug 12 23:42:59.951125 systemd-resolved[287]: Positive Trust Anchors:
Aug 12 23:42:59.951915 systemd-resolved[287]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 12 23:42:59.952704 systemd-resolved[287]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 12 23:42:59.966816 systemd-resolved[287]: Defaulting to hostname 'linux'.
Aug 12 23:42:59.968382 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 12 23:42:59.969249 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 12 23:43:00.030604 kernel: SCSI subsystem initialized
Aug 12 23:43:00.034602 kernel: Loading iSCSI transport class v2.0-870.
Aug 12 23:43:00.043603 kernel: iscsi: registered transport (tcp)
Aug 12 23:43:00.057782 kernel: iscsi: registered transport (qla4xxx)
Aug 12 23:43:00.057926 kernel: QLogic iSCSI HBA Driver
Aug 12 23:43:00.083146 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 12 23:43:00.103393 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 12 23:43:00.105808 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 12 23:43:00.166645 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Aug 12 23:43:00.169726 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Aug 12 23:43:00.243646 kernel: raid6: neonx8 gen() 15566 MB/s
Aug 12 23:43:00.260668 kernel: raid6: neonx4 gen() 15710 MB/s
Aug 12 23:43:00.277648 kernel: raid6: neonx2 gen() 13053 MB/s
Aug 12 23:43:00.294626 kernel: raid6: neonx1 gen() 10360 MB/s
Aug 12 23:43:00.311604 kernel: raid6: int64x8 gen() 6858 MB/s
Aug 12 23:43:00.328643 kernel: raid6: int64x4 gen() 5557 MB/s
Aug 12 23:43:00.345620 kernel: raid6: int64x2 gen() 6032 MB/s
Aug 12 23:43:00.362623 kernel: raid6: int64x1 gen() 5002 MB/s
Aug 12 23:43:00.362706 kernel: raid6: using algorithm neonx4 gen() 15710 MB/s
Aug 12 23:43:00.379622 kernel: raid6: .... xor() 12265 MB/s, rmw enabled
Aug 12 23:43:00.379693 kernel: raid6: using neon recovery algorithm
Aug 12 23:43:00.384602 kernel: xor: measuring software checksum speed
Aug 12 23:43:00.385603 kernel: 8regs : 19222 MB/sec
Aug 12 23:43:00.385671 kernel: 32regs : 21607 MB/sec
Aug 12 23:43:00.385687 kernel: arm64_neon : 24631 MB/sec
Aug 12 23:43:00.386645 kernel: xor: using function: arm64_neon (24631 MB/sec)
Aug 12 23:43:00.444589 kernel: Btrfs loaded, zoned=no, fsverity=no
Aug 12 23:43:00.454960 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Aug 12 23:43:00.458510 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 12 23:43:00.493305 systemd-udevd[492]: Using default interface naming scheme 'v255'.
Aug 12 23:43:00.497962 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 12 23:43:00.504070 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Aug 12 23:43:00.543199 dracut-pre-trigger[502]: rd.md=0: removing MD RAID activation
Aug 12 23:43:00.581819 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 12 23:43:00.584678 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 12 23:43:00.665879 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 12 23:43:00.669514 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Aug 12 23:43:00.775572 kernel: ACPI: bus type USB registered
Aug 12 23:43:00.775644 kernel: usbcore: registered new interface driver usbfs
Aug 12 23:43:00.775655 kernel: usbcore: registered new interface driver hub
Aug 12 23:43:00.779583 kernel: usbcore: registered new device driver usb
Aug 12 23:43:00.783587 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Aug 12 23:43:00.788581 kernel: scsi host0: Virtio SCSI HBA
Aug 12 23:43:00.799148 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Aug 12 23:43:00.799242 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Aug 12 23:43:00.820536 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 12 23:43:00.822756 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 12 23:43:00.824459 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Aug 12 23:43:00.829539 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 12 23:43:00.833001 kernel: sd 0:0:0:1: Power-on or device reset occurred
Aug 12 23:43:00.833911 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Aug 12 23:43:00.834074 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Aug 12 23:43:00.834198 kernel: sd 0:0:0:1: [sda] Write Protect is off
Aug 12 23:43:00.834295 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Aug 12 23:43:00.834452 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Aug 12 23:43:00.837228 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Aug 12 23:43:00.838696 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Aug 12 23:43:00.843743 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Aug 12 23:43:00.843802 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Aug 12 23:43:00.844067 kernel: GPT:17805311 != 80003071
Aug 12 23:43:00.844080 kernel: GPT:Alternate GPT header not at the end of the disk.
Aug 12 23:43:00.844091 kernel: GPT:17805311 != 80003071
Aug 12 23:43:00.844102 kernel: GPT: Use GNU Parted to correct GPT errors.
Aug 12 23:43:00.844113 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 12 23:43:00.844124 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Aug 12 23:43:00.846593 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Aug 12 23:43:00.849474 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Aug 12 23:43:00.849699 kernel: sr 0:0:0:0: Power-on or device reset occurred
Aug 12 23:43:00.849812 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Aug 12 23:43:00.852061 kernel: hub 1-0:1.0: USB hub found
Aug 12 23:43:00.852251 kernel: hub 1-0:1.0: 4 ports detected
Aug 12 23:43:00.852406 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Aug 12 23:43:00.853801 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Aug 12 23:43:00.853839 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Aug 12 23:43:00.854988 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Aug 12 23:43:00.860311 kernel: hub 2-0:1.0: USB hub found
Aug 12 23:43:00.860520 kernel: hub 2-0:1.0: 4 ports detected
Aug 12 23:43:00.876040 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 12 23:43:00.927746 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Aug 12 23:43:00.950757 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Aug 12 23:43:00.960446 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Aug 12 23:43:00.968594 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Aug 12 23:43:00.969354 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Aug 12 23:43:00.972100 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Aug 12 23:43:00.990426 disk-uuid[593]: Primary Header is updated.
Aug 12 23:43:00.990426 disk-uuid[593]: Secondary Entries is updated.
Aug 12 23:43:00.990426 disk-uuid[593]: Secondary Header is updated.
Aug 12 23:43:00.996632 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Aug 12 23:43:00.998176 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 12 23:43:01.000880 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 12 23:43:01.001907 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Aug 12 23:43:01.005585 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Aug 12 23:43:01.009563 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 12 23:43:01.034162 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Aug 12 23:43:01.092629 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Aug 12 23:43:01.230928 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Aug 12 23:43:01.231087 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Aug 12 23:43:01.232125 kernel: usbcore: registered new interface driver usbhid
Aug 12 23:43:01.232833 kernel: usbhid: USB HID core driver
Aug 12 23:43:01.335583 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Aug 12 23:43:01.462611 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Aug 12 23:43:01.515627 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Aug 12 23:43:02.029425 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Aug 12 23:43:02.029494 disk-uuid[597]: The operation has completed successfully.
Aug 12 23:43:02.100201 systemd[1]: disk-uuid.service: Deactivated successfully.
Aug 12 23:43:02.100336 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Aug 12 23:43:02.129138 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Aug 12 23:43:02.152226 sh[624]: Success
Aug 12 23:43:02.169832 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Aug 12 23:43:02.169906 kernel: device-mapper: uevent: version 1.0.3
Aug 12 23:43:02.169934 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Aug 12 23:43:02.179573 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Aug 12 23:43:02.236704 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Aug 12 23:43:02.241675 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Aug 12 23:43:02.254183 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Aug 12 23:43:02.271435 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay'
Aug 12 23:43:02.271535 kernel: BTRFS: device fsid 7658cdd8-2ee4-4f84-82be-1f808605c89c devid 1 transid 42 /dev/mapper/usr (254:0) scanned by mount (636)
Aug 12 23:43:02.274953 kernel: BTRFS info (device dm-0): first mount of filesystem 7658cdd8-2ee4-4f84-82be-1f808605c89c
Aug 12 23:43:02.275017 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Aug 12 23:43:02.275053 kernel: BTRFS info (device dm-0): using free-space-tree
Aug 12 23:43:02.284861 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Aug 12 23:43:02.286173 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Aug 12 23:43:02.286957 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Aug 12 23:43:02.287833 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Aug 12 23:43:02.290975 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Aug 12 23:43:02.338629 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (667)
Aug 12 23:43:02.341103 kernel: BTRFS info (device sda6): first mount of filesystem cff59a55-3bd9-4c36-9f7f-aabedbf210fb
Aug 12 23:43:02.341154 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 12 23:43:02.341168 kernel: BTRFS info (device sda6): using free-space-tree
Aug 12 23:43:02.353597 kernel: BTRFS info (device sda6): last unmount of filesystem cff59a55-3bd9-4c36-9f7f-aabedbf210fb
Aug 12 23:43:02.354662 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Aug 12 23:43:02.359225 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Aug 12 23:43:02.452060 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 12 23:43:02.456654 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Aug 12 23:43:02.496795 systemd-networkd[807]: lo: Link UP
Aug 12 23:43:02.496810 systemd-networkd[807]: lo: Gained carrier
Aug 12 23:43:02.498391 systemd-networkd[807]: Enumeration completed
Aug 12 23:43:02.498508 systemd[1]: Started systemd-networkd.service - Network Configuration.
Aug 12 23:43:02.499279 systemd[1]: Reached target network.target - Network.
Aug 12 23:43:02.500175 systemd-networkd[807]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 12 23:43:02.500179 systemd-networkd[807]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 12 23:43:02.500994 systemd-networkd[807]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 12 23:43:02.500999 systemd-networkd[807]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Aug 12 23:43:02.502180 systemd-networkd[807]: eth0: Link UP
Aug 12 23:43:02.502344 systemd-networkd[807]: eth1: Link UP
Aug 12 23:43:02.502498 systemd-networkd[807]: eth0: Gained carrier
Aug 12 23:43:02.502512 systemd-networkd[807]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 12 23:43:02.513071 systemd-networkd[807]: eth1: Gained carrier
Aug 12 23:43:02.513090 systemd-networkd[807]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Aug 12 23:43:02.534898 ignition[718]: Ignition 2.21.0
Aug 12 23:43:02.535517 ignition[718]: Stage: fetch-offline
Aug 12 23:43:02.535580 ignition[718]: no configs at "/usr/lib/ignition/base.d"
Aug 12 23:43:02.535590 ignition[718]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 12 23:43:02.535815 ignition[718]: parsed url from cmdline: ""
Aug 12 23:43:02.535818 ignition[718]: no config URL provided
Aug 12 23:43:02.535823 ignition[718]: reading system config file "/usr/lib/ignition/user.ign"
Aug 12 23:43:02.539021 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 12 23:43:02.535831 ignition[718]: no config at "/usr/lib/ignition/user.ign"
Aug 12 23:43:02.535836 ignition[718]: failed to fetch config: resource requires networking
Aug 12 23:43:02.537107 ignition[718]: Ignition finished successfully
Aug 12 23:43:02.542658 systemd-networkd[807]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Aug 12 23:43:02.542881 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Aug 12 23:43:02.549724 systemd-networkd[807]: eth0: DHCPv4 address 138.199.237.168/32, gateway 172.31.1.1 acquired from 172.31.1.1
Aug 12 23:43:02.569542 ignition[815]: Ignition 2.21.0
Aug 12 23:43:02.570189 ignition[815]: Stage: fetch
Aug 12 23:43:02.570393 ignition[815]: no configs at "/usr/lib/ignition/base.d"
Aug 12 23:43:02.570404 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 12 23:43:02.570498 ignition[815]: parsed url from cmdline: ""
Aug 12 23:43:02.570501 ignition[815]: no config URL provided
Aug 12 23:43:02.570506 ignition[815]: reading system config file "/usr/lib/ignition/user.ign"
Aug 12 23:43:02.570513 ignition[815]: no config at "/usr/lib/ignition/user.ign"
Aug 12 23:43:02.570637 ignition[815]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Aug 12 23:43:02.579611 ignition[815]: GET result: OK
Aug 12 23:43:02.580098 ignition[815]: parsing config with SHA512: 9bacf994ffcbd563c5fd9cd531f87bc81a429fd68db34b383f1e57bd5d2ed044898c7786837cd927b87e4b513fd7662859f86a55dd9b10d444b24d2596ad2c6e
Aug 12 23:43:02.587583 unknown[815]: fetched base config from "system"
Aug 12 23:43:02.588567 unknown[815]: fetched base config from "system"
Aug 12 23:43:02.589043 unknown[815]: fetched user config from "hetzner"
Aug 12 23:43:02.589426 ignition[815]: fetch: fetch complete
Aug 12 23:43:02.589432 ignition[815]: fetch: fetch passed
Aug 12 23:43:02.589507 ignition[815]: Ignition finished successfully
Aug 12 23:43:02.593773 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Aug 12 23:43:02.597002 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Aug 12 23:43:02.625047 ignition[822]: Ignition 2.21.0
Aug 12 23:43:02.625059 ignition[822]: Stage: kargs
Aug 12 23:43:02.625219 ignition[822]: no configs at "/usr/lib/ignition/base.d"
Aug 12 23:43:02.625228 ignition[822]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 12 23:43:02.630829 ignition[822]: kargs: kargs passed
Aug 12 23:43:02.630900 ignition[822]: Ignition finished successfully
Aug 12 23:43:02.633347 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Aug 12 23:43:02.635137 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Aug 12 23:43:02.664481 ignition[828]: Ignition 2.21.0
Aug 12 23:43:02.664497 ignition[828]: Stage: disks
Aug 12 23:43:02.665792 ignition[828]: no configs at "/usr/lib/ignition/base.d"
Aug 12 23:43:02.665809 ignition[828]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 12 23:43:02.668357 ignition[828]: disks: disks passed
Aug 12 23:43:02.668427 ignition[828]: Ignition finished successfully
Aug 12 23:43:02.671589 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Aug 12 23:43:02.672410 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Aug 12 23:43:02.673328 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Aug 12 23:43:02.674482 systemd[1]: Reached target local-fs.target - Local File Systems.
Aug 12 23:43:02.676241 systemd[1]: Reached target sysinit.target - System Initialization.
Aug 12 23:43:02.677629 systemd[1]: Reached target basic.target - Basic System.
Aug 12 23:43:02.680518 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Aug 12 23:43:02.714445 systemd-fsck[836]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Aug 12 23:43:02.720248 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Aug 12 23:43:02.722993 systemd[1]: Mounting sysroot.mount - /sysroot...
Aug 12 23:43:02.806985 kernel: EXT4-fs (sda9): mounted filesystem d634334e-91a3-4b77-89ab-775bdd78a572 r/w with ordered data mode. Quota mode: none.
Aug 12 23:43:02.808433 systemd[1]: Mounted sysroot.mount - /sysroot.
Aug 12 23:43:02.810719 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Aug 12 23:43:02.813861 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 12 23:43:02.816714 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Aug 12 23:43:02.820440 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Aug 12 23:43:02.821150 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Aug 12 23:43:02.821182 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 12 23:43:02.835620 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Aug 12 23:43:02.837109 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Aug 12 23:43:02.850604 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (844)
Aug 12 23:43:02.855494 kernel: BTRFS info (device sda6): first mount of filesystem cff59a55-3bd9-4c36-9f7f-aabedbf210fb
Aug 12 23:43:02.855586 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 12 23:43:02.856590 kernel: BTRFS info (device sda6): using free-space-tree
Aug 12 23:43:02.868538 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 12 23:43:02.896017 coreos-metadata[846]: Aug 12 23:43:02.895 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Aug 12 23:43:02.899086 coreos-metadata[846]: Aug 12 23:43:02.899 INFO Fetch successful
Aug 12 23:43:02.902443 coreos-metadata[846]: Aug 12 23:43:02.901 INFO wrote hostname ci-4372-1-0-9-13fe44d47a to /sysroot/etc/hostname
Aug 12 23:43:02.903799 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 12 23:43:02.906985 initrd-setup-root[871]: cut: /sysroot/etc/passwd: No such file or directory
Aug 12 23:43:02.913325 initrd-setup-root[879]: cut: /sysroot/etc/group: No such file or directory
Aug 12 23:43:02.919392 initrd-setup-root[886]: cut: /sysroot/etc/shadow: No such file or directory
Aug 12 23:43:02.925448 initrd-setup-root[893]: cut: /sysroot/etc/gshadow: No such file or directory
Aug 12 23:43:03.034733 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Aug 12 23:43:03.036708 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Aug 12 23:43:03.038815 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Aug 12 23:43:03.053570 kernel: BTRFS info (device sda6): last unmount of filesystem cff59a55-3bd9-4c36-9f7f-aabedbf210fb
Aug 12 23:43:03.078720 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Aug 12 23:43:03.086463 ignition[960]: INFO : Ignition 2.21.0
Aug 12 23:43:03.086463 ignition[960]: INFO : Stage: mount
Aug 12 23:43:03.090066 ignition[960]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 12 23:43:03.090066 ignition[960]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 12 23:43:03.090066 ignition[960]: INFO : mount: mount passed
Aug 12 23:43:03.090066 ignition[960]: INFO : Ignition finished successfully
Aug 12 23:43:03.090663 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Aug 12 23:43:03.094249 systemd[1]: Starting ignition-files.service - Ignition (files)...
Aug 12 23:43:03.270036 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Aug 12 23:43:03.274739 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Aug 12 23:43:03.306615 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (973)
Aug 12 23:43:03.309072 kernel: BTRFS info (device sda6): first mount of filesystem cff59a55-3bd9-4c36-9f7f-aabedbf210fb
Aug 12 23:43:03.309130 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Aug 12 23:43:03.309941 kernel: BTRFS info (device sda6): using free-space-tree
Aug 12 23:43:03.318334 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Aug 12 23:43:03.352246 ignition[990]: INFO : Ignition 2.21.0
Aug 12 23:43:03.352246 ignition[990]: INFO : Stage: files
Aug 12 23:43:03.353470 ignition[990]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 12 23:43:03.353470 ignition[990]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 12 23:43:03.355109 ignition[990]: DEBUG : files: compiled without relabeling support, skipping
Aug 12 23:43:03.355109 ignition[990]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Aug 12 23:43:03.355109 ignition[990]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Aug 12 23:43:03.360081 ignition[990]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Aug 12 23:43:03.360081 ignition[990]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Aug 12 23:43:03.364820 ignition[990]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Aug 12 23:43:03.360512 unknown[990]: wrote ssh authorized keys file for user: core
Aug 12 23:43:03.367962 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Aug 12 23:43:03.367962 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Aug 12 23:43:03.715861 systemd-networkd[807]: eth1: Gained IPv6LL
Aug 12 23:43:03.971959 systemd-networkd[807]: eth0: Gained IPv6LL
Aug 12 23:43:05.368099 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Aug 12 23:43:17.934369 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Aug 12 23:43:17.936522 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Aug 12 23:43:17.936522 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 12 23:43:17.936522 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Aug 12 23:43:17.936522 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Aug 12 23:43:17.936522 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 12 23:43:17.936522 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Aug 12 23:43:17.936522 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 12 23:43:17.936522 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Aug 12 23:43:17.947196 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Aug 12 23:43:17.947196 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Aug 12 23:43:17.947196 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Aug 12 23:43:17.947196 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Aug 12 23:43:17.947196 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Aug 12 23:43:17.947196 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
Aug 12 23:43:18.039304 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Aug 12 23:43:18.262176 ignition[990]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Aug 12 23:43:18.262176 ignition[990]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Aug 12 23:43:18.265564 ignition[990]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 12 23:43:18.268813 ignition[990]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Aug 12 23:43:18.269935 ignition[990]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Aug 12 23:43:18.269935 ignition[990]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Aug 12 23:43:18.269935 ignition[990]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Aug 12 23:43:18.274947 ignition[990]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Aug 12 23:43:18.274947 ignition[990]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Aug 12 23:43:18.274947 ignition[990]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Aug 12 23:43:18.274947 ignition[990]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Aug 12 23:43:18.274947 ignition[990]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Aug 12 23:43:18.274947 ignition[990]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Aug 12 23:43:18.274947 ignition[990]: INFO : files: files passed
Aug 12 23:43:18.274947 ignition[990]: INFO : Ignition finished successfully
Aug 12 23:43:18.271886 systemd[1]: Finished ignition-files.service - Ignition (files).
Aug 12 23:43:18.274949 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Aug 12 23:43:18.279752 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Aug 12 23:43:18.298799 systemd[1]: ignition-quench.service: Deactivated successfully.
Aug 12 23:43:18.298935 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Aug 12 23:43:18.305565 initrd-setup-root-after-ignition[1019]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 12 23:43:18.305565 initrd-setup-root-after-ignition[1019]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Aug 12 23:43:18.308387 initrd-setup-root-after-ignition[1023]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Aug 12 23:43:18.312601 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 12 23:43:18.313695 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Aug 12 23:43:18.315903 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Aug 12 23:43:18.372802 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Aug 12 23:43:18.372985 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Aug 12 23:43:18.375460 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Aug 12 23:43:18.376599 systemd[1]: Reached target initrd.target - Initrd Default Target.
Aug 12 23:43:18.377894 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Aug 12 23:43:18.378740 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Aug 12 23:43:18.411626 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 12 23:43:18.414075 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Aug 12 23:43:18.441529 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Aug 12 23:43:18.443015 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Aug 12 23:43:18.443716 systemd[1]: Stopped target timers.target - Timer Units.
Aug 12 23:43:18.445150 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Aug 12 23:43:18.445284 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Aug 12 23:43:18.446800 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Aug 12 23:43:18.447420 systemd[1]: Stopped target basic.target - Basic System.
Aug 12 23:43:18.448476 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 12 23:43:18.449556 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Aug 12 23:43:18.451434 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Aug 12 23:43:18.453109 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Aug 12 23:43:18.454768 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Aug 12 23:43:18.455819 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Aug 12 23:43:18.456914 systemd[1]: Stopped target sysinit.target - System Initialization.
Aug 12 23:43:18.458092 systemd[1]: Stopped target local-fs.target - Local File Systems.
Aug 12 23:43:18.459221 systemd[1]: Stopped target swap.target - Swaps.
Aug 12 23:43:18.460079 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Aug 12 23:43:18.460228 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Aug 12 23:43:18.461593 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Aug 12 23:43:18.462214 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 12 23:43:18.463307 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Aug 12 23:43:18.463798 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 12 23:43:18.465199 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Aug 12 23:43:18.465326 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Aug 12 23:43:18.467027 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Aug 12 23:43:18.467164 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Aug 12 23:43:18.468299 systemd[1]: ignition-files.service: Deactivated successfully.
Aug 12 23:43:18.468403 systemd[1]: Stopped ignition-files.service - Ignition (files).
Aug 12 23:43:18.469302 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Aug 12 23:43:18.469397 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Aug 12 23:43:18.471245 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Aug 12 23:43:18.473161 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Aug 12 23:43:18.473301 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 12 23:43:18.483523 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Aug 12 23:43:18.486670 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Aug 12 23:43:18.486983 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 12 23:43:18.491774 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Aug 12 23:43:18.491893 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 12 23:43:18.498329 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Aug 12 23:43:18.499358 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Aug 12 23:43:18.505590 ignition[1043]: INFO : Ignition 2.21.0
Aug 12 23:43:18.505590 ignition[1043]: INFO : Stage: umount
Aug 12 23:43:18.505590 ignition[1043]: INFO : no configs at "/usr/lib/ignition/base.d"
Aug 12 23:43:18.505590 ignition[1043]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Aug 12 23:43:18.510233 ignition[1043]: INFO : umount: umount passed
Aug 12 23:43:18.510233 ignition[1043]: INFO : Ignition finished successfully
Aug 12 23:43:18.509843 systemd[1]: ignition-mount.service: Deactivated successfully.
Aug 12 23:43:18.510924 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Aug 12 23:43:18.512242 systemd[1]: ignition-disks.service: Deactivated successfully.
Aug 12 23:43:18.512335 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Aug 12 23:43:18.513097 systemd[1]: ignition-kargs.service: Deactivated successfully.
Aug 12 23:43:18.513163 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Aug 12 23:43:18.514099 systemd[1]: ignition-fetch.service: Deactivated successfully.
Aug 12 23:43:18.514185 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Aug 12 23:43:18.515476 systemd[1]: Stopped target network.target - Network.
Aug 12 23:43:18.518176 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Aug 12 23:43:18.518250 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Aug 12 23:43:18.519251 systemd[1]: Stopped target paths.target - Path Units.
Aug 12 23:43:18.521245 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Aug 12 23:43:18.525639 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 12 23:43:18.526471 systemd[1]: Stopped target slices.target - Slice Units.
Aug 12 23:43:18.527385 systemd[1]: Stopped target sockets.target - Socket Units.
Aug 12 23:43:18.528590 systemd[1]: iscsid.socket: Deactivated successfully.
Aug 12 23:43:18.528651 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Aug 12 23:43:18.530086 systemd[1]: iscsiuio.socket: Deactivated successfully.
Aug 12 23:43:18.532112 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 12 23:43:18.532776 systemd[1]: ignition-setup.service: Deactivated successfully.
Aug 12 23:43:18.532855 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Aug 12 23:43:18.533606 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Aug 12 23:43:18.533647 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Aug 12 23:43:18.534814 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Aug 12 23:43:18.535764 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Aug 12 23:43:18.537523 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Aug 12 23:43:18.538053 systemd[1]: sysroot-boot.service: Deactivated successfully.
Aug 12 23:43:18.538192 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Aug 12 23:43:18.540039 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Aug 12 23:43:18.540200 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Aug 12 23:43:18.546133 systemd[1]: systemd-resolved.service: Deactivated successfully.
Aug 12 23:43:18.546267 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Aug 12 23:43:18.550432 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Aug 12 23:43:18.550701 systemd[1]: systemd-networkd.service: Deactivated successfully.
Aug 12 23:43:18.550823 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Aug 12 23:43:18.553215 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Aug 12 23:43:18.554455 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Aug 12 23:43:18.555233 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Aug 12 23:43:18.555272 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Aug 12 23:43:18.557258 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Aug 12 23:43:18.558039 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Aug 12 23:43:18.558106 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Aug 12 23:43:18.558898 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Aug 12 23:43:18.558945 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Aug 12 23:43:18.563197 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Aug 12 23:43:18.563260 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Aug 12 23:43:18.563891 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Aug 12 23:43:18.563928 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 12 23:43:18.566085 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 12 23:43:18.568295 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Aug 12 23:43:18.568364 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Aug 12 23:43:18.576271 systemd[1]: systemd-udevd.service: Deactivated successfully.
Aug 12 23:43:18.577591 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 12 23:43:18.578515 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Aug 12 23:43:18.578627 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Aug 12 23:43:18.580323 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Aug 12 23:43:18.580359 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 12 23:43:18.583243 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Aug 12 23:43:18.583303 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Aug 12 23:43:18.585081 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Aug 12 23:43:18.585355 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Aug 12 23:43:18.586732 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Aug 12 23:43:18.586790 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 12 23:43:18.589223 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Aug 12 23:43:18.591614 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Aug 12 23:43:18.591694 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Aug 12 23:43:18.594905 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Aug 12 23:43:18.594986 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 12 23:43:18.596192 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 12 23:43:18.596244 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 12 23:43:18.601325 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Aug 12 23:43:18.601402 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Aug 12 23:43:18.601451 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Aug 12 23:43:18.601925 systemd[1]: network-cleanup.service: Deactivated successfully.
Aug 12 23:43:18.602051 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Aug 12 23:43:18.610918 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Aug 12 23:43:18.611170 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Aug 12 23:43:18.612434 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Aug 12 23:43:18.614490 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Aug 12 23:43:18.640234 systemd[1]: Switching root.
Aug 12 23:43:18.688043 systemd-journald[244]: Journal stopped
Aug 12 23:43:19.710655 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Aug 12 23:43:19.710717 kernel: SELinux: policy capability network_peer_controls=1
Aug 12 23:43:19.710731 kernel: SELinux: policy capability open_perms=1
Aug 12 23:43:19.710740 kernel: SELinux: policy capability extended_socket_class=1
Aug 12 23:43:19.710749 kernel: SELinux: policy capability always_check_network=0
Aug 12 23:43:19.710758 kernel: SELinux: policy capability cgroup_seclabel=1
Aug 12 23:43:19.710767 kernel: SELinux: policy capability nnp_nosuid_transition=1
Aug 12 23:43:19.710776 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Aug 12 23:43:19.710785 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Aug 12 23:43:19.710801 kernel: SELinux: policy capability userspace_initial_context=0
Aug 12 23:43:19.710811 kernel: audit: type=1403 audit(1755042198.859:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Aug 12 23:43:19.710822 systemd[1]: Successfully loaded SELinux policy in 58.538ms.
Aug 12 23:43:19.710841 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.422ms.
Aug 12 23:43:19.710855 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 12 23:43:19.710866 systemd[1]: Detected virtualization kvm.
Aug 12 23:43:19.710876 systemd[1]: Detected architecture arm64.
Aug 12 23:43:19.710886 systemd[1]: Detected first boot.
Aug 12 23:43:19.710899 systemd[1]: Hostname set to .
Aug 12 23:43:19.710910 systemd[1]: Initializing machine ID from VM UUID.
Aug 12 23:43:19.710921 zram_generator::config[1087]: No configuration found.
Aug 12 23:43:19.710932 kernel: NET: Registered PF_VSOCK protocol family
Aug 12 23:43:19.710941 systemd[1]: Populated /etc with preset unit settings.
Aug 12 23:43:19.710952 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Aug 12 23:43:19.710963 systemd[1]: initrd-switch-root.service: Deactivated successfully. Aug 12 23:43:19.710976 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Aug 12 23:43:19.710986 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Aug 12 23:43:19.710997 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Aug 12 23:43:19.711007 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Aug 12 23:43:19.711017 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Aug 12 23:43:19.711027 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Aug 12 23:43:19.711038 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Aug 12 23:43:19.711050 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Aug 12 23:43:19.711063 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Aug 12 23:43:19.711073 systemd[1]: Created slice user.slice - User and Session Slice. Aug 12 23:43:19.711084 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 12 23:43:19.711094 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 12 23:43:19.711140 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Aug 12 23:43:19.711154 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Aug 12 23:43:19.711165 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Aug 12 23:43:19.711176 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
Aug 12 23:43:19.711189 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Aug 12 23:43:19.711199 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 12 23:43:19.711209 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 12 23:43:19.711219 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Aug 12 23:43:19.711229 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Aug 12 23:43:19.711239 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Aug 12 23:43:19.711249 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Aug 12 23:43:19.711260 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 12 23:43:19.711271 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 12 23:43:19.711281 systemd[1]: Reached target slices.target - Slice Units. Aug 12 23:43:19.711295 systemd[1]: Reached target swap.target - Swaps. Aug 12 23:43:19.711305 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Aug 12 23:43:19.711315 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Aug 12 23:43:19.711325 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Aug 12 23:43:19.711335 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 12 23:43:19.711345 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 12 23:43:19.711357 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 12 23:43:19.711367 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Aug 12 23:43:19.711377 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Aug 12 23:43:19.711387 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... 
Aug 12 23:43:19.711397 systemd[1]: Mounting media.mount - External Media Directory... Aug 12 23:43:19.711407 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Aug 12 23:43:19.711417 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Aug 12 23:43:19.711426 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Aug 12 23:43:19.711437 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Aug 12 23:43:19.711448 systemd[1]: Reached target machines.target - Containers. Aug 12 23:43:19.711458 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Aug 12 23:43:19.711468 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 12 23:43:19.711478 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 12 23:43:19.711488 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Aug 12 23:43:19.711498 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 12 23:43:19.711508 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 12 23:43:19.711518 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 12 23:43:19.711529 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Aug 12 23:43:19.711539 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 12 23:43:19.714286 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Aug 12 23:43:19.714310 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Aug 12 23:43:19.714321 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. 
Aug 12 23:43:19.714331 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Aug 12 23:43:19.714341 systemd[1]: Stopped systemd-fsck-usr.service. Aug 12 23:43:19.714353 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 12 23:43:19.714364 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 12 23:43:19.714382 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 12 23:43:19.714392 kernel: loop: module loaded Aug 12 23:43:19.714403 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 12 23:43:19.714415 kernel: fuse: init (API version 7.41) Aug 12 23:43:19.714427 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Aug 12 23:43:19.714437 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Aug 12 23:43:19.714447 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 12 23:43:19.714458 systemd[1]: verity-setup.service: Deactivated successfully. Aug 12 23:43:19.714468 systemd[1]: Stopped verity-setup.service. Aug 12 23:43:19.714479 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Aug 12 23:43:19.714490 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Aug 12 23:43:19.714500 systemd[1]: Mounted media.mount - External Media Directory. Aug 12 23:43:19.714510 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Aug 12 23:43:19.714520 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Aug 12 23:43:19.717612 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Aug 12 23:43:19.717630 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. 
Aug 12 23:43:19.717641 systemd[1]: modprobe@configfs.service: Deactivated successfully. Aug 12 23:43:19.717652 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Aug 12 23:43:19.717674 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 12 23:43:19.717686 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 12 23:43:19.717696 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 12 23:43:19.717708 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 12 23:43:19.717720 systemd[1]: modprobe@fuse.service: Deactivated successfully. Aug 12 23:43:19.717731 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Aug 12 23:43:19.717741 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 12 23:43:19.717751 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 12 23:43:19.717761 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 12 23:43:19.717774 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Aug 12 23:43:19.717806 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Aug 12 23:43:19.717819 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 12 23:43:19.717830 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Aug 12 23:43:19.717840 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Aug 12 23:43:19.717850 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Aug 12 23:43:19.717860 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 12 23:43:19.717871 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. 
Aug 12 23:43:19.717881 kernel: ACPI: bus type drm_connector registered Aug 12 23:43:19.717894 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Aug 12 23:43:19.717953 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 12 23:43:19.717966 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Aug 12 23:43:19.717977 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 12 23:43:19.718026 systemd-journald[1155]: Collecting audit messages is disabled. Aug 12 23:43:19.718053 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Aug 12 23:43:19.718063 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 12 23:43:19.718077 systemd-journald[1155]: Journal started Aug 12 23:43:19.718100 systemd-journald[1155]: Runtime Journal (/run/log/journal/0d3aca2e51e2402b81c3a10c438c8b9a) is 8M, max 76.5M, 68.5M free. Aug 12 23:43:19.380410 systemd[1]: Queued start job for default target multi-user.target. Aug 12 23:43:19.406182 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Aug 12 23:43:19.407056 systemd[1]: systemd-journald.service: Deactivated successfully. Aug 12 23:43:19.726566 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Aug 12 23:43:19.735960 systemd[1]: Starting systemd-sysusers.service - Create System Users... Aug 12 23:43:19.742882 systemd[1]: Started systemd-journald.service - Journal Service. Aug 12 23:43:19.742604 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 12 23:43:19.743724 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 12 23:43:19.747667 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. 
Aug 12 23:43:19.748834 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Aug 12 23:43:19.751112 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Aug 12 23:43:19.753079 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Aug 12 23:43:19.755151 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Aug 12 23:43:19.790422 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Aug 12 23:43:19.801883 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Aug 12 23:43:19.810810 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Aug 12 23:43:19.815611 kernel: loop0: detected capacity change from 0 to 107312 Aug 12 23:43:19.817830 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 12 23:43:19.834499 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 12 23:43:19.846791 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Aug 12 23:43:19.859056 systemd-journald[1155]: Time spent on flushing to /var/log/journal/0d3aca2e51e2402b81c3a10c438c8b9a is 38.392ms for 1170 entries. Aug 12 23:43:19.859056 systemd-journald[1155]: System Journal (/var/log/journal/0d3aca2e51e2402b81c3a10c438c8b9a) is 8M, max 584.8M, 576.8M free. Aug 12 23:43:19.918892 systemd-journald[1155]: Received client request to flush runtime journal. Aug 12 23:43:19.918979 kernel: loop1: detected capacity change from 0 to 203944 Aug 12 23:43:19.919014 kernel: loop2: detected capacity change from 0 to 8 Aug 12 23:43:19.870927 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Aug 12 23:43:19.884630 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 12 23:43:19.897366 systemd[1]: Finished systemd-sysusers.service - Create System Users. 
Aug 12 23:43:19.907003 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 12 23:43:19.922362 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Aug 12 23:43:19.937781 kernel: loop3: detected capacity change from 0 to 138376 Aug 12 23:43:19.966524 systemd-tmpfiles[1222]: ACLs are not supported, ignoring. Aug 12 23:43:19.967713 systemd-tmpfiles[1222]: ACLs are not supported, ignoring. Aug 12 23:43:19.978714 kernel: loop4: detected capacity change from 0 to 107312 Aug 12 23:43:19.981023 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 12 23:43:19.996627 kernel: loop5: detected capacity change from 0 to 203944 Aug 12 23:43:20.012579 kernel: loop6: detected capacity change from 0 to 8 Aug 12 23:43:20.015595 kernel: loop7: detected capacity change from 0 to 138376 Aug 12 23:43:20.027752 (sd-merge)[1228]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Aug 12 23:43:20.028264 (sd-merge)[1228]: Merged extensions into '/usr'. Aug 12 23:43:20.033634 systemd[1]: Reload requested from client PID 1186 ('systemd-sysext') (unit systemd-sysext.service)... Aug 12 23:43:20.033801 systemd[1]: Reloading... Aug 12 23:43:20.197657 zram_generator::config[1258]: No configuration found. Aug 12 23:43:20.288060 ldconfig[1183]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Aug 12 23:43:20.315693 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 12 23:43:20.392288 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Aug 12 23:43:20.392730 systemd[1]: Reloading finished in 357 ms. Aug 12 23:43:20.426582 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. 
Aug 12 23:43:20.427697 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Aug 12 23:43:20.441779 systemd[1]: Starting ensure-sysext.service... Aug 12 23:43:20.444214 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 12 23:43:20.470679 systemd[1]: Reload requested from client PID 1292 ('systemctl') (unit ensure-sysext.service)... Aug 12 23:43:20.470697 systemd[1]: Reloading... Aug 12 23:43:20.493002 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Aug 12 23:43:20.493417 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Aug 12 23:43:20.495012 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Aug 12 23:43:20.495805 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Aug 12 23:43:20.498592 systemd-tmpfiles[1293]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Aug 12 23:43:20.499601 systemd-tmpfiles[1293]: ACLs are not supported, ignoring. Aug 12 23:43:20.499778 systemd-tmpfiles[1293]: ACLs are not supported, ignoring. Aug 12 23:43:20.502679 systemd-tmpfiles[1293]: Detected autofs mount point /boot during canonicalization of boot. Aug 12 23:43:20.502931 systemd-tmpfiles[1293]: Skipping /boot Aug 12 23:43:20.520406 systemd-tmpfiles[1293]: Detected autofs mount point /boot during canonicalization of boot. Aug 12 23:43:20.520422 systemd-tmpfiles[1293]: Skipping /boot Aug 12 23:43:20.565581 zram_generator::config[1320]: No configuration found. Aug 12 23:43:20.665532 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
Aug 12 23:43:20.740201 systemd[1]: Reloading finished in 269 ms. Aug 12 23:43:20.755540 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Aug 12 23:43:20.761518 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 12 23:43:20.770729 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 12 23:43:20.774230 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Aug 12 23:43:20.781314 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Aug 12 23:43:20.787336 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 12 23:43:20.797030 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 12 23:43:20.800467 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Aug 12 23:43:20.807772 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 12 23:43:20.809377 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 12 23:43:20.814655 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 12 23:43:20.822839 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 12 23:43:20.823682 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 12 23:43:20.824271 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 12 23:43:20.827696 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
Aug 12 23:43:20.827865 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 12 23:43:20.827960 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 12 23:43:20.834917 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Aug 12 23:43:20.843154 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 12 23:43:20.845473 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 12 23:43:20.848538 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 12 23:43:20.848726 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 12 23:43:20.849434 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 12 23:43:20.850903 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 12 23:43:20.851069 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 12 23:43:20.860356 systemd[1]: Finished ensure-sysext.service. Aug 12 23:43:20.861530 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 12 23:43:20.869630 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Aug 12 23:43:20.874755 systemd[1]: Starting systemd-update-done.service - Update is Completed... Aug 12 23:43:20.887038 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. 
Aug 12 23:43:20.891985 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 12 23:43:20.894868 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 12 23:43:20.895806 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 12 23:43:20.902272 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 12 23:43:20.903534 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 12 23:43:20.903801 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 12 23:43:20.904811 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 12 23:43:20.916849 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 12 23:43:20.923197 augenrules[1401]: No rules Aug 12 23:43:20.924451 systemd[1]: audit-rules.service: Deactivated successfully. Aug 12 23:43:20.925108 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 12 23:43:20.928077 systemd-udevd[1370]: Using default interface naming scheme 'v255'. Aug 12 23:43:20.941639 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 12 23:43:20.944137 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 12 23:43:20.950171 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 12 23:43:20.967360 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 12 23:43:20.975806 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 12 23:43:21.111322 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. 
Aug 12 23:43:21.191693 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 12 23:43:21.192504 systemd[1]: Reached target time-set.target - System Time Set. Aug 12 23:43:21.269017 systemd-resolved[1364]: Positive Trust Anchors: Aug 12 23:43:21.269041 systemd-resolved[1364]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 12 23:43:21.269072 systemd-resolved[1364]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 12 23:43:21.270970 systemd-networkd[1419]: lo: Link UP Aug 12 23:43:21.271528 systemd-networkd[1419]: lo: Gained carrier Aug 12 23:43:21.276696 systemd-resolved[1364]: Using system hostname 'ci-4372-1-0-9-13fe44d47a'. Aug 12 23:43:21.277790 systemd-networkd[1419]: Enumeration completed Aug 12 23:43:21.277921 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 12 23:43:21.280704 systemd-networkd[1419]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 12 23:43:21.280713 systemd-networkd[1419]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 12 23:43:21.281463 systemd-networkd[1419]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 12 23:43:21.281467 systemd-networkd[1419]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Aug 12 23:43:21.282177 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 12 23:43:21.282316 systemd-networkd[1419]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 12 23:43:21.282349 systemd-networkd[1419]: eth0: Link UP Aug 12 23:43:21.282483 systemd-networkd[1419]: eth0: Gained carrier Aug 12 23:43:21.282494 systemd-networkd[1419]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 12 23:43:21.285812 systemd-networkd[1419]: eth1: Link UP Aug 12 23:43:21.286482 systemd-networkd[1419]: eth1: Gained carrier Aug 12 23:43:21.286801 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 12 23:43:21.288115 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 12 23:43:21.288657 systemd-networkd[1419]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 12 23:43:21.290947 systemd[1]: Reached target network.target - Network. Aug 12 23:43:21.292642 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 12 23:43:21.293703 systemd[1]: Reached target sysinit.target - System Initialization. Aug 12 23:43:21.294989 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 12 23:43:21.296760 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 12 23:43:21.297691 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 12 23:43:21.299769 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 12 23:43:21.300503 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. 
Aug 12 23:43:21.301448 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 12 23:43:21.301486 systemd[1]: Reached target paths.target - Path Units. Aug 12 23:43:21.302648 systemd[1]: Reached target timers.target - Timer Units. Aug 12 23:43:21.305627 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 12 23:43:21.309704 systemd[1]: Starting docker.socket - Docker Socket for the API... Aug 12 23:43:21.314368 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Aug 12 23:43:21.316437 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Aug 12 23:43:21.317882 systemd[1]: Reached target ssh-access.target - SSH Access Available. Aug 12 23:43:21.322675 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 12 23:43:21.323907 systemd-networkd[1419]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Aug 12 23:43:21.324387 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection. Aug 12 23:43:21.324456 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection. Aug 12 23:43:21.324972 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Aug 12 23:43:21.326419 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 12 23:43:21.330963 systemd[1]: Reached target sockets.target - Socket Units. Aug 12 23:43:21.331590 systemd[1]: Reached target basic.target - Basic System. Aug 12 23:43:21.332138 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 12 23:43:21.332174 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 12 23:43:21.340040 systemd[1]: Starting containerd.service - containerd container runtime... 
Aug 12 23:43:21.343819 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Aug 12 23:43:21.345907 systemd-networkd[1419]: eth0: DHCPv4 address 138.199.237.168/32, gateway 172.31.1.1 acquired from 172.31.1.1 Aug 12 23:43:21.347460 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Aug 12 23:43:21.350209 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection. Aug 12 23:43:21.350896 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Aug 12 23:43:21.354533 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 12 23:43:21.360609 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 12 23:43:21.361188 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 12 23:43:21.370580 kernel: mousedev: PS/2 mouse device common for all mice Aug 12 23:43:21.369532 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 12 23:43:21.379773 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 12 23:43:21.384883 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Aug 12 23:43:21.389225 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 12 23:43:21.395035 jq[1472]: false Aug 12 23:43:21.396605 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 12 23:43:21.401198 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Aug 12 23:43:21.402164 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 12 23:43:21.403673 systemd[1]: Starting update-engine.service - Update Engine... 
Aug 12 23:43:21.417267 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 12 23:43:21.420619 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 12 23:43:21.422385 extend-filesystems[1473]: Found /dev/sda6 Aug 12 23:43:21.430603 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 12 23:43:21.431910 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 12 23:43:21.432201 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 12 23:43:21.446573 extend-filesystems[1473]: Found /dev/sda9 Aug 12 23:43:21.450951 extend-filesystems[1473]: Checking size of /dev/sda9 Aug 12 23:43:21.456846 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 12 23:43:21.460662 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 12 23:43:21.464951 systemd[1]: motdgen.service: Deactivated successfully. Aug 12 23:43:21.465844 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. 
Aug 12 23:43:21.480258 jq[1487]: true
Aug 12 23:43:21.504171 tar[1495]: linux-arm64/helm
Aug 12 23:43:21.507503 coreos-metadata[1469]: Aug 12 23:43:21.507 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Aug 12 23:43:21.508958 (ntainerd)[1506]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Aug 12 23:43:21.520914 coreos-metadata[1469]: Aug 12 23:43:21.518 INFO Fetch successful
Aug 12 23:43:21.520914 coreos-metadata[1469]: Aug 12 23:43:21.518 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Aug 12 23:43:21.520914 coreos-metadata[1469]: Aug 12 23:43:21.518 INFO Fetch successful
Aug 12 23:43:21.521039 jq[1512]: true
Aug 12 23:43:21.528829 extend-filesystems[1473]: Resized partition /dev/sda9
Aug 12 23:43:21.533133 update_engine[1485]: I20250812 23:43:21.530668  1485 main.cc:92] Flatcar Update Engine starting
Aug 12 23:43:21.536904 extend-filesystems[1519]: resize2fs 1.47.2 (1-Jan-2025)
Aug 12 23:43:21.535932 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Aug 12 23:43:21.535737 dbus-daemon[1470]: [system] SELinux support is enabled
Aug 12 23:43:21.539782 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Aug 12 23:43:21.539820 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Aug 12 23:43:21.541358 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Aug 12 23:43:21.541392 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Aug 12 23:43:21.548577 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Aug 12 23:43:21.553724 systemd[1]: Started update-engine.service - Update Engine.
Aug 12 23:43:21.558479 update_engine[1485]: I20250812 23:43:21.558393  1485 update_check_scheduler.cc:74] Next update check in 5m39s
Aug 12 23:43:21.607850 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Aug 12 23:43:21.655856 systemd-logind[1481]: New seat seat0.
Aug 12 23:43:21.656867 bash[1536]: Updated "/home/core/.ssh/authorized_keys"
Aug 12 23:43:21.658222 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Aug 12 23:43:21.658021 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Aug 12 23:43:21.673423 systemd[1]: Starting sshkeys.service...
Aug 12 23:43:21.674700 systemd[1]: Started systemd-logind.service - User Login Management.
Aug 12 23:43:21.678581 extend-filesystems[1519]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Aug 12 23:43:21.678581 extend-filesystems[1519]: old_desc_blocks = 1, new_desc_blocks = 5
Aug 12 23:43:21.678581 extend-filesystems[1519]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Aug 12 23:43:21.684978 extend-filesystems[1473]: Resized filesystem in /dev/sda9
Aug 12 23:43:21.684876 systemd[1]: extend-filesystems.service: Deactivated successfully.
Aug 12 23:43:21.690904 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Aug 12 23:43:21.701569 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Aug 12 23:43:21.705304 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Aug 12 23:43:21.708118 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Aug 12 23:43:21.711806 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Aug 12 23:43:21.718357 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Aug 12 23:43:21.723779 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Aug 12 23:43:21.756182 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Aug 12 23:43:21.765631 coreos-metadata[1549]: Aug 12 23:43:21.765 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Aug 12 23:43:21.766366 coreos-metadata[1549]: Aug 12 23:43:21.766 INFO Fetch successful
Aug 12 23:43:21.775867 unknown[1549]: wrote ssh authorized keys file for user: core
Aug 12 23:43:21.803848 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Aug 12 23:43:21.803935 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Aug 12 23:43:21.803952 kernel: [drm] features: -context_init
Aug 12 23:43:21.815868 kernel: [drm] number of scanouts: 1
Aug 12 23:43:21.815928 kernel: [drm] number of cap sets: 0
Aug 12 23:43:21.822575 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Aug 12 23:43:21.837194 update-ssh-keys[1555]: Updated "/home/core/.ssh/authorized_keys"
Aug 12 23:43:21.838577 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Aug 12 23:43:21.842719 systemd[1]: Finished sshkeys.service.
Aug 12 23:43:21.941647 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Aug 12 23:43:21.944874 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Aug 12 23:43:22.052779 containerd[1506]: time="2025-08-12T23:43:22Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Aug 12 23:43:22.060621 containerd[1506]: time="2025-08-12T23:43:22.060466600Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
Aug 12 23:43:22.101322 containerd[1506]: time="2025-08-12T23:43:22.100038600Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.92µs"
Aug 12 23:43:22.103610 containerd[1506]: time="2025-08-12T23:43:22.103566320Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Aug 12 23:43:22.104746 containerd[1506]: time="2025-08-12T23:43:22.104709600Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Aug 12 23:43:22.104996 containerd[1506]: time="2025-08-12T23:43:22.104974040Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Aug 12 23:43:22.105104 containerd[1506]: time="2025-08-12T23:43:22.105068800Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Aug 12 23:43:22.105188 containerd[1506]: time="2025-08-12T23:43:22.105173240Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Aug 12 23:43:22.105691 containerd[1506]: time="2025-08-12T23:43:22.105662880Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Aug 12 23:43:22.106273 containerd[1506]: time="2025-08-12T23:43:22.106244720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Aug 12 23:43:22.106630 containerd[1506]: time="2025-08-12T23:43:22.106602160Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Aug 12 23:43:22.107596 containerd[1506]: time="2025-08-12T23:43:22.107573560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Aug 12 23:43:22.107664 containerd[1506]: time="2025-08-12T23:43:22.107650320Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Aug 12 23:43:22.107736 containerd[1506]: time="2025-08-12T23:43:22.107721720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Aug 12 23:43:22.107878 containerd[1506]: time="2025-08-12T23:43:22.107862280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Aug 12 23:43:22.108646 containerd[1506]: time="2025-08-12T23:43:22.108621800Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Aug 12 23:43:22.109305 containerd[1506]: time="2025-08-12T23:43:22.109278720Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Aug 12 23:43:22.109475 containerd[1506]: time="2025-08-12T23:43:22.109456680Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Aug 12 23:43:22.110803 containerd[1506]: time="2025-08-12T23:43:22.110769160Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Aug 12 23:43:22.114165 containerd[1506]: time="2025-08-12T23:43:22.114108800Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Aug 12 23:43:22.114284 containerd[1506]: time="2025-08-12T23:43:22.114259320Z" level=info msg="metadata content store policy set" policy=shared
Aug 12 23:43:22.120538 containerd[1506]: time="2025-08-12T23:43:22.120434920Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Aug 12 23:43:22.120538 containerd[1506]: time="2025-08-12T23:43:22.120512800Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Aug 12 23:43:22.120538 containerd[1506]: time="2025-08-12T23:43:22.120528880Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Aug 12 23:43:22.120648 containerd[1506]: time="2025-08-12T23:43:22.120542080Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Aug 12 23:43:22.120648 containerd[1506]: time="2025-08-12T23:43:22.120590400Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Aug 12 23:43:22.120648 containerd[1506]: time="2025-08-12T23:43:22.120607200Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Aug 12 23:43:22.120648 containerd[1506]: time="2025-08-12T23:43:22.120624600Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Aug 12 23:43:22.120648 containerd[1506]: time="2025-08-12T23:43:22.120641200Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Aug 12 23:43:22.120750 containerd[1506]: time="2025-08-12T23:43:22.120654440Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Aug 12 23:43:22.120750 containerd[1506]: time="2025-08-12T23:43:22.120665080Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Aug 12 23:43:22.120750 containerd[1506]: time="2025-08-12T23:43:22.120676000Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Aug 12 23:43:22.120750 containerd[1506]: time="2025-08-12T23:43:22.120693400Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Aug 12 23:43:22.120856 containerd[1506]: time="2025-08-12T23:43:22.120830120Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Aug 12 23:43:22.120880 containerd[1506]: time="2025-08-12T23:43:22.120862200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Aug 12 23:43:22.120898 containerd[1506]: time="2025-08-12T23:43:22.120878720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Aug 12 23:43:22.120898 containerd[1506]: time="2025-08-12T23:43:22.120891960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Aug 12 23:43:22.120935 containerd[1506]: time="2025-08-12T23:43:22.120902840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Aug 12 23:43:22.120935 containerd[1506]: time="2025-08-12T23:43:22.120914080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Aug 12 23:43:22.120935 containerd[1506]: time="2025-08-12T23:43:22.120924720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Aug 12 23:43:22.120988 containerd[1506]: time="2025-08-12T23:43:22.120935400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Aug 12 23:43:22.120988 containerd[1506]: time="2025-08-12T23:43:22.120948920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Aug 12 23:43:22.120988 containerd[1506]: time="2025-08-12T23:43:22.120959760Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Aug 12 23:43:22.120988 containerd[1506]: time="2025-08-12T23:43:22.120976360Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Aug 12 23:43:22.121244 containerd[1506]: time="2025-08-12T23:43:22.121215240Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Aug 12 23:43:22.121276 containerd[1506]: time="2025-08-12T23:43:22.121244240Z" level=info msg="Start snapshots syncer"
Aug 12 23:43:22.121294 containerd[1506]: time="2025-08-12T23:43:22.121280800Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Aug 12 23:43:22.125335 containerd[1506]: time="2025-08-12T23:43:22.121526440Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Aug 12 23:43:22.125335 containerd[1506]: time="2025-08-12T23:43:22.125068160Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Aug 12 23:43:22.125490 containerd[1506]: time="2025-08-12T23:43:22.125199880Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Aug 12 23:43:22.125963 containerd[1506]: time="2025-08-12T23:43:22.125346640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Aug 12 23:43:22.126007 containerd[1506]: time="2025-08-12T23:43:22.125971800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Aug 12 23:43:22.126007 containerd[1506]: time="2025-08-12T23:43:22.125987600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Aug 12 23:43:22.126007 containerd[1506]: time="2025-08-12T23:43:22.125998400Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Aug 12 23:43:22.126069 containerd[1506]: time="2025-08-12T23:43:22.126015960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Aug 12 23:43:22.126069 containerd[1506]: time="2025-08-12T23:43:22.126028760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Aug 12 23:43:22.126069 containerd[1506]: time="2025-08-12T23:43:22.126041000Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Aug 12 23:43:22.126233 containerd[1506]: time="2025-08-12T23:43:22.126136200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Aug 12 23:43:22.126233 containerd[1506]: time="2025-08-12T23:43:22.126158640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Aug 12 23:43:22.126233 containerd[1506]: time="2025-08-12T23:43:22.126171760Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Aug 12 23:43:22.126233 containerd[1506]: time="2025-08-12T23:43:22.126218000Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Aug 12 23:43:22.126399 containerd[1506]: time="2025-08-12T23:43:22.126237920Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Aug 12 23:43:22.126399 containerd[1506]: time="2025-08-12T23:43:22.126248600Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Aug 12 23:43:22.126399 containerd[1506]: time="2025-08-12T23:43:22.126258760Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Aug 12 23:43:22.126399 containerd[1506]: time="2025-08-12T23:43:22.126266080Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Aug 12 23:43:22.126399 containerd[1506]: time="2025-08-12T23:43:22.126277760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Aug 12 23:43:22.126399 containerd[1506]: time="2025-08-12T23:43:22.126308080Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Aug 12 23:43:22.126399 containerd[1506]: time="2025-08-12T23:43:22.126389760Z" level=info msg="runtime interface created"
Aug 12 23:43:22.126399 containerd[1506]: time="2025-08-12T23:43:22.126395160Z" level=info msg="created NRI interface"
Aug 12 23:43:22.126651 containerd[1506]: time="2025-08-12T23:43:22.126405720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Aug 12 23:43:22.126651 containerd[1506]: time="2025-08-12T23:43:22.126420560Z" level=info msg="Connect containerd service"
Aug 12 23:43:22.126651 containerd[1506]: time="2025-08-12T23:43:22.126454080Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Aug 12 23:43:22.132382 containerd[1506]: time="2025-08-12T23:43:22.131129720Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Aug 12 23:43:22.202677 locksmithd[1520]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Aug 12 23:43:22.243849 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 12 23:43:22.259383 systemd-logind[1481]: Watching system buttons on /dev/input/event0 (Power Button)
Aug 12 23:43:22.301715 systemd-logind[1481]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Aug 12 23:43:22.354930 containerd[1506]: time="2025-08-12T23:43:22.354842240Z" level=info msg="Start subscribing containerd event"
Aug 12 23:43:22.355022 containerd[1506]: time="2025-08-12T23:43:22.354941760Z" level=info msg="Start recovering state"
Aug 12 23:43:22.355163 containerd[1506]: time="2025-08-12T23:43:22.355140200Z" level=info msg="Start event monitor"
Aug 12 23:43:22.355430 containerd[1506]: time="2025-08-12T23:43:22.355397880Z" level=info msg="Start cni network conf syncer for default"
Aug 12 23:43:22.355755 containerd[1506]: time="2025-08-12T23:43:22.355487840Z" level=info msg="Start streaming server"
Aug 12 23:43:22.355755 containerd[1506]: time="2025-08-12T23:43:22.355502600Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Aug 12 23:43:22.355755 containerd[1506]: time="2025-08-12T23:43:22.355515160Z" level=info msg="runtime interface starting up..."
Aug 12 23:43:22.355755 containerd[1506]: time="2025-08-12T23:43:22.355521280Z" level=info msg="starting plugins..."
Aug 12 23:43:22.355755 containerd[1506]: time="2025-08-12T23:43:22.355556680Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Aug 12 23:43:22.355755 containerd[1506]: time="2025-08-12T23:43:22.355664760Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Aug 12 23:43:22.355955 containerd[1506]: time="2025-08-12T23:43:22.355779400Z" level=info msg=serving... address=/run/containerd/containerd.sock
Aug 12 23:43:22.356123 containerd[1506]: time="2025-08-12T23:43:22.356093960Z" level=info msg="containerd successfully booted in 0.306692s"
Aug 12 23:43:22.356258 systemd[1]: Started containerd.service - containerd container runtime.
Aug 12 23:43:22.431358 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 12 23:43:22.557713 tar[1495]: linux-arm64/LICENSE
Aug 12 23:43:22.558163 tar[1495]: linux-arm64/README.md
Aug 12 23:43:22.576294 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Aug 12 23:43:22.595684 systemd-networkd[1419]: eth0: Gained IPv6LL
Aug 12 23:43:22.596901 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection.
Aug 12 23:43:22.600909 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Aug 12 23:43:22.603868 systemd[1]: Reached target network-online.target - Network is Online.
Aug 12 23:43:22.608775 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 12 23:43:22.613863 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Aug 12 23:43:22.666605 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Aug 12 23:43:22.787703 systemd-networkd[1419]: eth1: Gained IPv6LL
Aug 12 23:43:22.790380 systemd-timesyncd[1391]: Network configuration changed, trying to establish connection.
Aug 12 23:43:22.815759 sshd_keygen[1505]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Aug 12 23:43:22.850816 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Aug 12 23:43:22.856029 systemd[1]: Starting issuegen.service - Generate /run/issue...
Aug 12 23:43:22.880435 systemd[1]: issuegen.service: Deactivated successfully.
Aug 12 23:43:22.880858 systemd[1]: Finished issuegen.service - Generate /run/issue.
Aug 12 23:43:22.886937 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Aug 12 23:43:22.904365 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Aug 12 23:43:22.908720 systemd[1]: Started getty@tty1.service - Getty on tty1.
Aug 12 23:43:22.914151 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Aug 12 23:43:22.915415 systemd[1]: Reached target getty.target - Login Prompts.
Aug 12 23:43:23.491801 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 12 23:43:23.494230 systemd[1]: Reached target multi-user.target - Multi-User System.
Aug 12 23:43:23.496093 systemd[1]: Startup finished in 2.476s (kernel) + 19.233s (initrd) + 4.694s (userspace) = 26.404s.
Aug 12 23:43:23.510374 (kubelet)[1644]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 12 23:43:24.092462 kubelet[1644]: E0812 23:43:24.092359    1644 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 12 23:43:24.096571 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 12 23:43:24.096800 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 12 23:43:24.097834 systemd[1]: kubelet.service: Consumed 981ms CPU time, 255.8M memory peak.
Aug 12 23:43:34.303270 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Aug 12 23:43:34.306571 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 12 23:43:34.471648 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 12 23:43:34.484519 (kubelet)[1663]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 12 23:43:34.537296 kubelet[1663]: E0812 23:43:34.537238    1663 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 12 23:43:34.542030 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 12 23:43:34.542211 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 12 23:43:34.543816 systemd[1]: kubelet.service: Consumed 176ms CPU time, 105.2M memory peak.
Aug 12 23:43:44.554057 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Aug 12 23:43:44.558521 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Aug 12 23:43:44.742604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Aug 12 23:43:44.758368 (kubelet)[1678]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Aug 12 23:43:44.812313 kubelet[1678]: E0812 23:43:44.812176    1678 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Aug 12 23:43:44.816220 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Aug 12 23:43:44.816528 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Aug 12 23:43:44.818765 systemd[1]: kubelet.service: Consumed 189ms CPU time, 107.5M memory peak.
Aug 12 23:43:46.214106 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Aug 12 23:43:46.216913 systemd[1]: Started sshd@0-138.199.237.168:22-139.178.68.195:42920.service - OpenSSH per-connection server daemon (139.178.68.195:42920).
Aug 12 23:43:47.255311 sshd[1686]: Accepted publickey for core from 139.178.68.195 port 42920 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:43:47.261080 sshd-session[1686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:43:47.271224 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Aug 12 23:43:47.272579 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Aug 12 23:43:47.283076 systemd-logind[1481]: New session 1 of user core.
Aug 12 23:43:47.300409 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Aug 12 23:43:47.305691 systemd[1]: Starting user@500.service - User Manager for UID 500...
Aug 12 23:43:47.322739 (systemd)[1690]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Aug 12 23:43:47.326076 systemd-logind[1481]: New session c1 of user core.
Aug 12 23:43:47.487583 systemd[1690]: Queued start job for default target default.target.
Aug 12 23:43:47.502905 systemd[1690]: Created slice app.slice - User Application Slice.
Aug 12 23:43:47.502991 systemd[1690]: Reached target paths.target - Paths.
Aug 12 23:43:47.503186 systemd[1690]: Reached target timers.target - Timers.
Aug 12 23:43:47.507932 systemd[1690]: Starting dbus.socket - D-Bus User Message Bus Socket...
Aug 12 23:43:47.541615 systemd[1690]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Aug 12 23:43:47.541839 systemd[1690]: Reached target sockets.target - Sockets.
Aug 12 23:43:47.541925 systemd[1690]: Reached target basic.target - Basic System.
Aug 12 23:43:47.541978 systemd[1690]: Reached target default.target - Main User Target.
Aug 12 23:43:47.542026 systemd[1690]: Startup finished in 207ms.
Aug 12 23:43:47.542249 systemd[1]: Started user@500.service - User Manager for UID 500.
Aug 12 23:43:47.550934 systemd[1]: Started session-1.scope - Session 1 of User core.
Aug 12 23:43:48.267015 systemd[1]: Started sshd@1-138.199.237.168:22-139.178.68.195:42934.service - OpenSSH per-connection server daemon (139.178.68.195:42934).
Aug 12 23:43:49.291461 sshd[1701]: Accepted publickey for core from 139.178.68.195 port 42934 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:43:49.294798 sshd-session[1701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:43:49.302793 systemd-logind[1481]: New session 2 of user core.
Aug 12 23:43:49.312893 systemd[1]: Started session-2.scope - Session 2 of User core.
Aug 12 23:43:49.980178 sshd[1703]: Connection closed by 139.178.68.195 port 42934
Aug 12 23:43:49.981146 sshd-session[1701]: pam_unix(sshd:session): session closed for user core
Aug 12 23:43:49.986609 systemd[1]: sshd@1-138.199.237.168:22-139.178.68.195:42934.service: Deactivated successfully.
Aug 12 23:43:49.989647 systemd[1]: session-2.scope: Deactivated successfully.
Aug 12 23:43:49.993083 systemd-logind[1481]: Session 2 logged out. Waiting for processes to exit.
Aug 12 23:43:49.994933 systemd-logind[1481]: Removed session 2.
Aug 12 23:43:50.159774 systemd[1]: Started sshd@2-138.199.237.168:22-139.178.68.195:42950.service - OpenSSH per-connection server daemon (139.178.68.195:42950).
Aug 12 23:43:51.175369 sshd[1709]: Accepted publickey for core from 139.178.68.195 port 42950 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:43:51.178376 sshd-session[1709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:43:51.186878 systemd-logind[1481]: New session 3 of user core.
Aug 12 23:43:51.196981 systemd[1]: Started session-3.scope - Session 3 of User core.
Aug 12 23:43:51.862472 sshd[1711]: Connection closed by 139.178.68.195 port 42950 Aug 12 23:43:51.863749 sshd-session[1709]: pam_unix(sshd:session): session closed for user core Aug 12 23:43:51.869236 systemd-logind[1481]: Session 3 logged out. Waiting for processes to exit. Aug 12 23:43:51.869737 systemd[1]: sshd@2-138.199.237.168:22-139.178.68.195:42950.service: Deactivated successfully. Aug 12 23:43:51.874101 systemd[1]: session-3.scope: Deactivated successfully. Aug 12 23:43:51.878256 systemd-logind[1481]: Removed session 3. Aug 12 23:43:52.042473 systemd[1]: Started sshd@3-138.199.237.168:22-139.178.68.195:58976.service - OpenSSH per-connection server daemon (139.178.68.195:58976). Aug 12 23:43:52.904145 systemd-timesyncd[1391]: Contacted time server 88.198.200.96:123 (2.flatcar.pool.ntp.org). Aug 12 23:43:52.904248 systemd-timesyncd[1391]: Initial clock synchronization to Tue 2025-08-12 23:43:52.684355 UTC. Aug 12 23:43:53.059714 sshd[1717]: Accepted publickey for core from 139.178.68.195 port 58976 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:43:53.062099 sshd-session[1717]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:43:53.070386 systemd-logind[1481]: New session 4 of user core. Aug 12 23:43:53.076941 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 12 23:43:53.731777 sshd[1719]: Connection closed by 139.178.68.195 port 58976 Aug 12 23:43:53.732984 sshd-session[1717]: pam_unix(sshd:session): session closed for user core Aug 12 23:43:53.737748 systemd[1]: sshd@3-138.199.237.168:22-139.178.68.195:58976.service: Deactivated successfully. Aug 12 23:43:53.740164 systemd[1]: session-4.scope: Deactivated successfully. Aug 12 23:43:53.741290 systemd-logind[1481]: Session 4 logged out. Waiting for processes to exit. Aug 12 23:43:53.743003 systemd-logind[1481]: Removed session 4. 
Aug 12 23:43:53.898931 systemd[1]: Started sshd@4-138.199.237.168:22-139.178.68.195:58984.service - OpenSSH per-connection server daemon (139.178.68.195:58984). Aug 12 23:43:54.891421 sshd[1725]: Accepted publickey for core from 139.178.68.195 port 58984 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:43:54.893482 sshd-session[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:43:54.894606 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Aug 12 23:43:54.897793 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 12 23:43:54.903695 systemd-logind[1481]: New session 5 of user core. Aug 12 23:43:54.908959 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 12 23:43:55.053392 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 12 23:43:55.076634 (kubelet)[1736]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 12 23:43:55.122874 kubelet[1736]: E0812 23:43:55.122823 1736 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 12 23:43:55.125684 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 12 23:43:55.125846 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 12 23:43:55.126806 systemd[1]: kubelet.service: Consumed 172ms CPU time, 105.2M memory peak. 
Aug 12 23:43:55.419522 sudo[1742]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 12 23:43:55.419900 sudo[1742]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 12 23:43:55.439185 sudo[1742]: pam_unix(sudo:session): session closed for user root Aug 12 23:43:55.598626 sshd[1730]: Connection closed by 139.178.68.195 port 58984 Aug 12 23:43:55.599810 sshd-session[1725]: pam_unix(sshd:session): session closed for user core Aug 12 23:43:55.605024 systemd[1]: sshd@4-138.199.237.168:22-139.178.68.195:58984.service: Deactivated successfully. Aug 12 23:43:55.607605 systemd[1]: session-5.scope: Deactivated successfully. Aug 12 23:43:55.610786 systemd-logind[1481]: Session 5 logged out. Waiting for processes to exit. Aug 12 23:43:55.613096 systemd-logind[1481]: Removed session 5. Aug 12 23:43:55.770494 systemd[1]: Started sshd@5-138.199.237.168:22-139.178.68.195:58998.service - OpenSSH per-connection server daemon (139.178.68.195:58998). Aug 12 23:43:56.770423 sshd[1748]: Accepted publickey for core from 139.178.68.195 port 58998 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:43:56.772684 sshd-session[1748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:43:56.780329 systemd-logind[1481]: New session 6 of user core. Aug 12 23:43:56.785998 systemd[1]: Started session-6.scope - Session 6 of User core. 
Aug 12 23:43:57.292260 sudo[1752]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 12 23:43:57.292587 sudo[1752]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 12 23:43:57.299003 sudo[1752]: pam_unix(sudo:session): session closed for user root Aug 12 23:43:57.306162 sudo[1751]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 12 23:43:57.306449 sudo[1751]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 12 23:43:57.318988 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 12 23:43:57.380934 augenrules[1774]: No rules Aug 12 23:43:57.382899 systemd[1]: audit-rules.service: Deactivated successfully. Aug 12 23:43:57.383218 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 12 23:43:57.384568 sudo[1751]: pam_unix(sudo:session): session closed for user root Aug 12 23:43:57.544744 sshd[1750]: Connection closed by 139.178.68.195 port 58998 Aug 12 23:43:57.545439 sshd-session[1748]: pam_unix(sshd:session): session closed for user core Aug 12 23:43:57.551450 systemd[1]: sshd@5-138.199.237.168:22-139.178.68.195:58998.service: Deactivated successfully. Aug 12 23:43:57.554155 systemd[1]: session-6.scope: Deactivated successfully. Aug 12 23:43:57.558986 systemd-logind[1481]: Session 6 logged out. Waiting for processes to exit. Aug 12 23:43:57.560506 systemd-logind[1481]: Removed session 6. Aug 12 23:43:57.730723 systemd[1]: Started sshd@6-138.199.237.168:22-139.178.68.195:59010.service - OpenSSH per-connection server daemon (139.178.68.195:59010). 
Aug 12 23:43:58.742882 sshd[1783]: Accepted publickey for core from 139.178.68.195 port 59010 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:43:58.744622 sshd-session[1783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:43:58.750568 systemd-logind[1481]: New session 7 of user core. Aug 12 23:43:58.757841 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 12 23:43:59.265719 sudo[1786]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 12 23:43:59.266091 sudo[1786]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 12 23:43:59.617890 systemd[1]: Starting docker.service - Docker Application Container Engine... Aug 12 23:43:59.627163 (dockerd)[1804]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 12 23:43:59.877719 dockerd[1804]: time="2025-08-12T23:43:59.876723428Z" level=info msg="Starting up" Aug 12 23:43:59.880107 dockerd[1804]: time="2025-08-12T23:43:59.880035953Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 12 23:43:59.917507 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport812414186-merged.mount: Deactivated successfully. Aug 12 23:43:59.948108 dockerd[1804]: time="2025-08-12T23:43:59.947843952Z" level=info msg="Loading containers: start." Aug 12 23:43:59.960587 kernel: Initializing XFRM netlink socket Aug 12 23:44:00.234456 systemd-networkd[1419]: docker0: Link UP Aug 12 23:44:00.240625 dockerd[1804]: time="2025-08-12T23:44:00.240559581Z" level=info msg="Loading containers: done." 
Aug 12 23:44:00.262140 dockerd[1804]: time="2025-08-12T23:44:00.262082343Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 12 23:44:00.262307 dockerd[1804]: time="2025-08-12T23:44:00.262189615Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Aug 12 23:44:00.262337 dockerd[1804]: time="2025-08-12T23:44:00.262306738Z" level=info msg="Initializing buildkit" Aug 12 23:44:00.293332 dockerd[1804]: time="2025-08-12T23:44:00.293136489Z" level=info msg="Completed buildkit initialization" Aug 12 23:44:00.302289 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 12 23:44:00.302712 dockerd[1804]: time="2025-08-12T23:44:00.301825808Z" level=info msg="Daemon has completed initialization" Aug 12 23:44:00.302943 dockerd[1804]: time="2025-08-12T23:44:00.302825990Z" level=info msg="API listen on /run/docker.sock" Aug 12 23:44:00.913090 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1577057182-merged.mount: Deactivated successfully. Aug 12 23:44:01.377588 containerd[1506]: time="2025-08-12T23:44:01.377174746Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.11\"" Aug 12 23:44:02.021452 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3662993655.mount: Deactivated successfully. 
Aug 12 23:44:03.279103 containerd[1506]: time="2025-08-12T23:44:03.278301737Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:03.280605 containerd[1506]: time="2025-08-12T23:44:03.280516508Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.11: active requests=0, bytes read=25651905" Aug 12 23:44:03.282039 containerd[1506]: time="2025-08-12T23:44:03.281965091Z" level=info msg="ImageCreate event name:\"sha256:00a68b619a4bfa14c989a2181a7aa0726a5cb1272a7f65394e6a594ad6eade27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:03.285903 containerd[1506]: time="2025-08-12T23:44:03.285844276Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a3d1c4440817725a1b503a7ccce94f3dce2b208ebf257b405dc2d97817df3dde\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:03.287226 containerd[1506]: time="2025-08-12T23:44:03.287059875Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.11\" with image id \"sha256:00a68b619a4bfa14c989a2181a7aa0726a5cb1272a7f65394e6a594ad6eade27\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a3d1c4440817725a1b503a7ccce94f3dce2b208ebf257b405dc2d97817df3dde\", size \"25648613\" in 1.909835851s" Aug 12 23:44:03.287226 containerd[1506]: time="2025-08-12T23:44:03.287102326Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.11\" returns image reference \"sha256:00a68b619a4bfa14c989a2181a7aa0726a5cb1272a7f65394e6a594ad6eade27\"" Aug 12 23:44:03.289132 containerd[1506]: time="2025-08-12T23:44:03.289100353Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.11\"" Aug 12 23:44:04.738048 containerd[1506]: time="2025-08-12T23:44:04.737968109Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.11\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:04.739972 containerd[1506]: time="2025-08-12T23:44:04.739565392Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.11: active requests=0, bytes read=22460303" Aug 12 23:44:04.741046 containerd[1506]: time="2025-08-12T23:44:04.740991922Z" level=info msg="ImageCreate event name:\"sha256:5c5dc52b837451e0fe6108fdfb9cfa431191ce227ce71d103dec8a8c655c4e71\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:04.745677 containerd[1506]: time="2025-08-12T23:44:04.744704356Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:0f19de157f3d251f5ddeb6e9d026895bc55cb02592874b326fa345c57e5e2848\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:04.746020 containerd[1506]: time="2025-08-12T23:44:04.745987279Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.11\" with image id \"sha256:5c5dc52b837451e0fe6108fdfb9cfa431191ce227ce71d103dec8a8c655c4e71\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:0f19de157f3d251f5ddeb6e9d026895bc55cb02592874b326fa345c57e5e2848\", size \"23996073\" in 1.456733353s" Aug 12 23:44:04.746111 containerd[1506]: time="2025-08-12T23:44:04.746097816Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.11\" returns image reference \"sha256:5c5dc52b837451e0fe6108fdfb9cfa431191ce227ce71d103dec8a8c655c4e71\"" Aug 12 23:44:04.747565 containerd[1506]: time="2025-08-12T23:44:04.747353114Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.11\"" Aug 12 23:44:05.303200 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Aug 12 23:44:05.306426 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 12 23:44:05.476056 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Aug 12 23:44:05.505273 (kubelet)[2075]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 12 23:44:05.559353 kubelet[2075]: E0812 23:44:05.559185 2075 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 12 23:44:05.562612 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 12 23:44:05.562796 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 12 23:44:05.563262 systemd[1]: kubelet.service: Consumed 172ms CPU time, 107M memory peak. Aug 12 23:44:06.126220 containerd[1506]: time="2025-08-12T23:44:06.126177149Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:06.128076 containerd[1506]: time="2025-08-12T23:44:06.128027555Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.11: active requests=0, bytes read=17125109" Aug 12 23:44:06.129568 containerd[1506]: time="2025-08-12T23:44:06.129248818Z" level=info msg="ImageCreate event name:\"sha256:89be0efdc4ab1793b9b1b05e836e33dc50f5b2911b57609b315b58608b2d3746\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:06.132624 containerd[1506]: time="2025-08-12T23:44:06.132588504Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:1a9b59b3bfa6c1f1911f6f865a795620c461d079e413061bb71981cadd67f39d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:06.134571 containerd[1506]: time="2025-08-12T23:44:06.134509048Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.11\" with image id 
\"sha256:89be0efdc4ab1793b9b1b05e836e33dc50f5b2911b57609b315b58608b2d3746\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:1a9b59b3bfa6c1f1911f6f865a795620c461d079e413061bb71981cadd67f39d\", size \"18660897\" in 1.387117526s" Aug 12 23:44:06.134704 containerd[1506]: time="2025-08-12T23:44:06.134686426Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.11\" returns image reference \"sha256:89be0efdc4ab1793b9b1b05e836e33dc50f5b2911b57609b315b58608b2d3746\"" Aug 12 23:44:06.135387 containerd[1506]: time="2025-08-12T23:44:06.135335192Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.11\"" Aug 12 23:44:07.096579 update_engine[1485]: I20250812 23:44:07.096062 1485 update_attempter.cc:509] Updating boot flags... Aug 12 23:44:07.162083 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4200018042.mount: Deactivated successfully. Aug 12 23:44:07.658124 containerd[1506]: time="2025-08-12T23:44:07.658008991Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:07.660288 containerd[1506]: time="2025-08-12T23:44:07.660223776Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.11: active requests=0, bytes read=26916019" Aug 12 23:44:07.661151 containerd[1506]: time="2025-08-12T23:44:07.661097662Z" level=info msg="ImageCreate event name:\"sha256:7d1e7db6660181423f98acbe3a495b3fe5cec9b85cdef245540cc2cb3b180ab0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:07.664417 containerd[1506]: time="2025-08-12T23:44:07.664355214Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a31da847792c5e7e92e91b78da1ad21d693e4b2b48d0e9f4610c8764dc2a5d79\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:07.665156 containerd[1506]: time="2025-08-12T23:44:07.664798291Z" level=info msg="Pulled image 
\"registry.k8s.io/kube-proxy:v1.31.11\" with image id \"sha256:7d1e7db6660181423f98acbe3a495b3fe5cec9b85cdef245540cc2cb3b180ab0\", repo tag \"registry.k8s.io/kube-proxy:v1.31.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:a31da847792c5e7e92e91b78da1ad21d693e4b2b48d0e9f4610c8764dc2a5d79\", size \"26915012\" in 1.529421136s" Aug 12 23:44:07.665156 containerd[1506]: time="2025-08-12T23:44:07.664844375Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.11\" returns image reference \"sha256:7d1e7db6660181423f98acbe3a495b3fe5cec9b85cdef245540cc2cb3b180ab0\"" Aug 12 23:44:07.666773 containerd[1506]: time="2025-08-12T23:44:07.666513440Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Aug 12 23:44:08.208710 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2441597705.mount: Deactivated successfully. Aug 12 23:44:08.951575 containerd[1506]: time="2025-08-12T23:44:08.950259580Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:08.952399 containerd[1506]: time="2025-08-12T23:44:08.952334467Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714" Aug 12 23:44:08.953050 containerd[1506]: time="2025-08-12T23:44:08.953004536Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:08.958054 containerd[1506]: time="2025-08-12T23:44:08.957983865Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:08.960229 containerd[1506]: time="2025-08-12T23:44:08.960191738Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id 
\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.293609306s" Aug 12 23:44:08.960353 containerd[1506]: time="2025-08-12T23:44:08.960336958Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Aug 12 23:44:08.960986 containerd[1506]: time="2025-08-12T23:44:08.960943941Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 12 23:44:09.480045 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1815062879.mount: Deactivated successfully. Aug 12 23:44:09.488024 containerd[1506]: time="2025-08-12T23:44:09.487936745Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 12 23:44:09.489760 containerd[1506]: time="2025-08-12T23:44:09.489676409Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Aug 12 23:44:09.491805 containerd[1506]: time="2025-08-12T23:44:09.491696202Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 12 23:44:09.495463 containerd[1506]: time="2025-08-12T23:44:09.495377952Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 12 23:44:09.498950 containerd[1506]: time="2025-08-12T23:44:09.498279478Z" level=info msg="Pulled 
image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 537.289659ms" Aug 12 23:44:09.498950 containerd[1506]: time="2025-08-12T23:44:09.498354435Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Aug 12 23:44:09.499207 containerd[1506]: time="2025-08-12T23:44:09.499147257Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\"" Aug 12 23:44:10.083616 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2401215664.mount: Deactivated successfully. Aug 12 23:44:11.749099 containerd[1506]: time="2025-08-12T23:44:11.749044510Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:11.751383 containerd[1506]: time="2025-08-12T23:44:11.751323441Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406533" Aug 12 23:44:11.752832 containerd[1506]: time="2025-08-12T23:44:11.752763978Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:11.756885 containerd[1506]: time="2025-08-12T23:44:11.756823800Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:11.758506 containerd[1506]: time="2025-08-12T23:44:11.758334921Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag 
\"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.259145913s" Aug 12 23:44:11.758506 containerd[1506]: time="2025-08-12T23:44:11.758377455Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\"" Aug 12 23:44:15.803219 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Aug 12 23:44:15.806773 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 12 23:44:16.014752 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 12 23:44:16.023043 (kubelet)[2253]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 12 23:44:16.069661 kubelet[2253]: E0812 23:44:16.068652 2253 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 12 23:44:16.071287 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 12 23:44:16.071422 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 12 23:44:16.072103 systemd[1]: kubelet.service: Consumed 161ms CPU time, 105.1M memory peak. Aug 12 23:44:17.356562 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 12 23:44:17.356778 systemd[1]: kubelet.service: Consumed 161ms CPU time, 105.1M memory peak. Aug 12 23:44:17.360307 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 12 23:44:17.389595 systemd[1]: Reload requested from client PID 2267 ('systemctl') (unit session-7.scope)... 
Aug 12 23:44:17.389614 systemd[1]: Reloading... Aug 12 23:44:17.515793 zram_generator::config[2307]: No configuration found. Aug 12 23:44:17.596210 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 12 23:44:17.699688 systemd[1]: Reloading finished in 309 ms. Aug 12 23:44:17.764821 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 12 23:44:17.764911 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 12 23:44:17.765222 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 12 23:44:17.765271 systemd[1]: kubelet.service: Consumed 109ms CPU time, 95M memory peak. Aug 12 23:44:17.768477 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 12 23:44:17.922311 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 12 23:44:17.932649 (kubelet)[2359]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 12 23:44:17.984756 kubelet[2359]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 12 23:44:17.985071 kubelet[2359]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Aug 12 23:44:17.985125 kubelet[2359]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Aug 12 23:44:17.985267 kubelet[2359]: I0812 23:44:17.985229 2359 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 12 23:44:18.368608 kubelet[2359]: I0812 23:44:18.368407 2359 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Aug 12 23:44:18.368608 kubelet[2359]: I0812 23:44:18.368447 2359 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 12 23:44:18.368809 kubelet[2359]: I0812 23:44:18.368784 2359 server.go:934] "Client rotation is on, will bootstrap in background" Aug 12 23:44:18.397767 kubelet[2359]: E0812 23:44:18.397720 2359 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://138.199.237.168:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 138.199.237.168:6443: connect: connection refused" logger="UnhandledError" Aug 12 23:44:18.400734 kubelet[2359]: I0812 23:44:18.400517 2359 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 12 23:44:18.413297 kubelet[2359]: I0812 23:44:18.413263 2359 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 12 23:44:18.418981 kubelet[2359]: I0812 23:44:18.418855 2359 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Aug 12 23:44:18.420582 kubelet[2359]: I0812 23:44:18.420174 2359 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Aug 12 23:44:18.420582 kubelet[2359]: I0812 23:44:18.420364 2359 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 12 23:44:18.420756 kubelet[2359]: I0812 23:44:18.420391 2359 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-1-0-9-13fe44d47a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Aug 12 23:44:18.420970 kubelet[2359]: I0812 23:44:18.420941 2359 topology_manager.go:138] "Creating topology manager with none policy" Aug 12 23:44:18.421028 kubelet[2359]: I0812 23:44:18.421020 2359 container_manager_linux.go:300] "Creating device plugin manager" Aug 12 23:44:18.421321 kubelet[2359]: I0812 23:44:18.421307 2359 state_mem.go:36] "Initialized new in-memory state store" Aug 12 23:44:18.424615 kubelet[2359]: I0812 23:44:18.424584 2359 kubelet.go:408] "Attempting to sync node with API server" Aug 12 23:44:18.424743 kubelet[2359]: I0812 23:44:18.424732 2359 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 12 23:44:18.424813 kubelet[2359]: I0812 23:44:18.424804 2359 kubelet.go:314] "Adding apiserver pod source" Aug 12 23:44:18.424930 kubelet[2359]: I0812 23:44:18.424921 2359 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 12 23:44:18.426426 kubelet[2359]: W0812 23:44:18.426239 2359 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://138.199.237.168:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-1-0-9-13fe44d47a&limit=500&resourceVersion=0": dial tcp 138.199.237.168:6443: connect: connection refused Aug 12 23:44:18.426426 kubelet[2359]: E0812 23:44:18.426316 2359 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://138.199.237.168:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-1-0-9-13fe44d47a&limit=500&resourceVersion=0\": dial tcp 138.199.237.168:6443: connect: connection refused" logger="UnhandledError" Aug 12 23:44:18.430227 kubelet[2359]: W0812 23:44:18.429815 2359 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://138.199.237.168:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 
138.199.237.168:6443: connect: connection refused Aug 12 23:44:18.430227 kubelet[2359]: E0812 23:44:18.429865 2359 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://138.199.237.168:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 138.199.237.168:6443: connect: connection refused" logger="UnhandledError" Aug 12 23:44:18.430643 kubelet[2359]: I0812 23:44:18.430624 2359 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Aug 12 23:44:18.431617 kubelet[2359]: I0812 23:44:18.431599 2359 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 12 23:44:18.431950 kubelet[2359]: W0812 23:44:18.431925 2359 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Aug 12 23:44:18.433465 kubelet[2359]: I0812 23:44:18.433439 2359 server.go:1274] "Started kubelet" Aug 12 23:44:18.435604 kubelet[2359]: I0812 23:44:18.435243 2359 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Aug 12 23:44:18.436419 kubelet[2359]: I0812 23:44:18.436393 2359 server.go:449] "Adding debug handlers to kubelet server" Aug 12 23:44:18.437158 kubelet[2359]: I0812 23:44:18.437103 2359 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 12 23:44:18.437502 kubelet[2359]: I0812 23:44:18.437486 2359 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 12 23:44:18.439503 kubelet[2359]: E0812 23:44:18.437745 2359 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://138.199.237.168:6443/api/v1/namespaces/default/events\": dial tcp 138.199.237.168:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372-1-0-9-13fe44d47a.185b29a2abb90d1d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-1-0-9-13fe44d47a,UID:ci-4372-1-0-9-13fe44d47a,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-9-13fe44d47a,},FirstTimestamp:2025-08-12 23:44:18.433412381 +0000 UTC m=+0.496405414,LastTimestamp:2025-08-12 23:44:18.433412381 +0000 UTC m=+0.496405414,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-9-13fe44d47a,}" Aug 12 23:44:18.440012 kubelet[2359]: I0812 23:44:18.439941 2359 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 12 23:44:18.440451 kubelet[2359]: I0812 23:44:18.440420 2359 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 12 23:44:18.444159 kubelet[2359]: E0812 23:44:18.444139 2359 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372-1-0-9-13fe44d47a\" not found" Aug 12 23:44:18.445205 kubelet[2359]: I0812 23:44:18.444292 2359 volume_manager.go:289] "Starting Kubelet Volume Manager" Aug 12 23:44:18.445205 kubelet[2359]: I0812 23:44:18.444529 2359 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Aug 12 23:44:18.445205 kubelet[2359]: I0812 23:44:18.444596 2359 reconciler.go:26] "Reconciler: start to sync state" Aug 12 23:44:18.445205 kubelet[2359]: W0812 23:44:18.445022 2359 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://138.199.237.168:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 138.199.237.168:6443: connect: connection refused Aug 12 23:44:18.445205 kubelet[2359]: E0812 23:44:18.445064 2359 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://138.199.237.168:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 138.199.237.168:6443: connect: connection refused" logger="UnhandledError" Aug 12 23:44:18.445395 kubelet[2359]: E0812 23:44:18.445379 2359 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 12 23:44:18.446993 kubelet[2359]: I0812 23:44:18.446968 2359 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 12 23:44:18.447769 kubelet[2359]: E0812 23:44:18.447730 2359 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.237.168:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-9-13fe44d47a?timeout=10s\": dial tcp 138.199.237.168:6443: connect: connection refused" interval="200ms" Aug 12 23:44:18.448673 kubelet[2359]: I0812 23:44:18.448653 2359 factory.go:221] Registration of the containerd container factory successfully Aug 12 23:44:18.448762 kubelet[2359]: I0812 23:44:18.448752 2359 factory.go:221] Registration of the systemd container factory successfully Aug 12 23:44:18.464800 kubelet[2359]: I0812 23:44:18.464728 2359 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Aug 12 23:44:18.466347 kubelet[2359]: I0812 23:44:18.466299 2359 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Aug 12 23:44:18.466347 kubelet[2359]: I0812 23:44:18.466333 2359 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 12 23:44:18.466347 kubelet[2359]: I0812 23:44:18.466352 2359 kubelet.go:2321] "Starting kubelet main sync loop" Aug 12 23:44:18.466643 kubelet[2359]: E0812 23:44:18.466398 2359 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 12 23:44:18.474449 kubelet[2359]: W0812 23:44:18.474341 2359 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://138.199.237.168:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 138.199.237.168:6443: connect: connection refused Aug 12 23:44:18.474449 kubelet[2359]: E0812 23:44:18.474411 2359 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://138.199.237.168:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 138.199.237.168:6443: connect: connection refused" logger="UnhandledError" Aug 12 23:44:18.475886 kubelet[2359]: I0812 23:44:18.475835 2359 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 12 23:44:18.475886 kubelet[2359]: I0812 23:44:18.475869 2359 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 12 23:44:18.476014 kubelet[2359]: I0812 23:44:18.475915 2359 state_mem.go:36] "Initialized new in-memory state store" Aug 12 23:44:18.478637 kubelet[2359]: I0812 23:44:18.478592 2359 policy_none.go:49] "None policy: Start" Aug 12 23:44:18.479601 kubelet[2359]: I0812 23:44:18.479574 2359 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 12 23:44:18.480144 kubelet[2359]: I0812 23:44:18.479748 2359 state_mem.go:35] "Initializing new in-memory state store" Aug 12 23:44:18.487302 systemd[1]: Created slice kubepods.slice - 
libcontainer container kubepods.slice. Aug 12 23:44:18.497851 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 12 23:44:18.503989 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Aug 12 23:44:18.518664 kubelet[2359]: I0812 23:44:18.518595 2359 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 12 23:44:18.519600 kubelet[2359]: I0812 23:44:18.519314 2359 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 12 23:44:18.520688 kubelet[2359]: I0812 23:44:18.519988 2359 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 12 23:44:18.520688 kubelet[2359]: I0812 23:44:18.520447 2359 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 12 23:44:18.524110 kubelet[2359]: E0812 23:44:18.524078 2359 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372-1-0-9-13fe44d47a\" not found" Aug 12 23:44:18.584575 systemd[1]: Created slice kubepods-burstable-poda544c3f042ca3d854dd7922b10173e2a.slice - libcontainer container kubepods-burstable-poda544c3f042ca3d854dd7922b10173e2a.slice. Aug 12 23:44:18.603903 systemd[1]: Created slice kubepods-burstable-pode8d19474dfc8d326ab01cdba6d792b65.slice - libcontainer container kubepods-burstable-pode8d19474dfc8d326ab01cdba6d792b65.slice. Aug 12 23:44:18.611883 systemd[1]: Created slice kubepods-burstable-podd4e23bc46888b0613976a24bbbdad562.slice - libcontainer container kubepods-burstable-podd4e23bc46888b0613976a24bbbdad562.slice. 
Aug 12 23:44:18.625717 kubelet[2359]: I0812 23:44:18.623969 2359 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.626588 kubelet[2359]: E0812 23:44:18.626519 2359 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://138.199.237.168:6443/api/v1/nodes\": dial tcp 138.199.237.168:6443: connect: connection refused" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645298 kubelet[2359]: I0812 23:44:18.645222 2359 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a544c3f042ca3d854dd7922b10173e2a-k8s-certs\") pod \"kube-apiserver-ci-4372-1-0-9-13fe44d47a\" (UID: \"a544c3f042ca3d854dd7922b10173e2a\") " pod="kube-system/kube-apiserver-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645298 kubelet[2359]: I0812 23:44:18.645295 2359 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-kubeconfig\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645504 kubelet[2359]: I0812 23:44:18.645337 2359 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645504 kubelet[2359]: I0812 23:44:18.645373 2359 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a544c3f042ca3d854dd7922b10173e2a-ca-certs\") 
pod \"kube-apiserver-ci-4372-1-0-9-13fe44d47a\" (UID: \"a544c3f042ca3d854dd7922b10173e2a\") " pod="kube-system/kube-apiserver-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645504 kubelet[2359]: I0812 23:44:18.645408 2359 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4e23bc46888b0613976a24bbbdad562-kubeconfig\") pod \"kube-scheduler-ci-4372-1-0-9-13fe44d47a\" (UID: \"d4e23bc46888b0613976a24bbbdad562\") " pod="kube-system/kube-scheduler-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645504 kubelet[2359]: I0812 23:44:18.645442 2359 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a544c3f042ca3d854dd7922b10173e2a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-1-0-9-13fe44d47a\" (UID: \"a544c3f042ca3d854dd7922b10173e2a\") " pod="kube-system/kube-apiserver-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645504 kubelet[2359]: I0812 23:44:18.645480 2359 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-ca-certs\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645836 kubelet[2359]: I0812 23:44:18.645517 2359 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.645836 kubelet[2359]: I0812 23:44:18.645582 2359 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-k8s-certs\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.648378 kubelet[2359]: E0812 23:44:18.648319 2359 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.237.168:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-9-13fe44d47a?timeout=10s\": dial tcp 138.199.237.168:6443: connect: connection refused" interval="400ms" Aug 12 23:44:18.829308 kubelet[2359]: I0812 23:44:18.829262 2359 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.829983 kubelet[2359]: E0812 23:44:18.829938 2359 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://138.199.237.168:6443/api/v1/nodes\": dial tcp 138.199.237.168:6443: connect: connection refused" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:18.901265 containerd[1506]: time="2025-08-12T23:44:18.901124021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-1-0-9-13fe44d47a,Uid:a544c3f042ca3d854dd7922b10173e2a,Namespace:kube-system,Attempt:0,}" Aug 12 23:44:18.908655 containerd[1506]: time="2025-08-12T23:44:18.908566802Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-1-0-9-13fe44d47a,Uid:e8d19474dfc8d326ab01cdba6d792b65,Namespace:kube-system,Attempt:0,}" Aug 12 23:44:18.917878 containerd[1506]: time="2025-08-12T23:44:18.917826970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-1-0-9-13fe44d47a,Uid:d4e23bc46888b0613976a24bbbdad562,Namespace:kube-system,Attempt:0,}" Aug 12 23:44:18.951934 containerd[1506]: time="2025-08-12T23:44:18.951852865Z" level=info msg="connecting to shim 
63cc7c1f668225095cca793da3294eacfac5cb36edf22dfd0f68a8d971374f0e" address="unix:///run/containerd/s/18b6a29b1b4e5462e0c5fac0dd98e68fa2c0485a4ec7d19bba00de3410204a12" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:44:18.969589 containerd[1506]: time="2025-08-12T23:44:18.969521313Z" level=info msg="connecting to shim f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629" address="unix:///run/containerd/s/a8520bbacd5295a36db3101ab2017144523800f93aa489e6d09253939f80adc4" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:44:18.985895 containerd[1506]: time="2025-08-12T23:44:18.985833164Z" level=info msg="connecting to shim c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc" address="unix:///run/containerd/s/0d20e9facbc3d0dc8dfda71bd807d29995c4d24802c38827b7a650a57bbe53a8" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:44:19.008739 systemd[1]: Started cri-containerd-63cc7c1f668225095cca793da3294eacfac5cb36edf22dfd0f68a8d971374f0e.scope - libcontainer container 63cc7c1f668225095cca793da3294eacfac5cb36edf22dfd0f68a8d971374f0e. Aug 12 23:44:19.014269 systemd[1]: Started cri-containerd-f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629.scope - libcontainer container f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629. Aug 12 23:44:19.033501 systemd[1]: Started cri-containerd-c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc.scope - libcontainer container c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc. 
Aug 12 23:44:19.050059 kubelet[2359]: E0812 23:44:19.049989 2359 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.199.237.168:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-9-13fe44d47a?timeout=10s\": dial tcp 138.199.237.168:6443: connect: connection refused" interval="800ms" Aug 12 23:44:19.089642 containerd[1506]: time="2025-08-12T23:44:19.087530789Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-1-0-9-13fe44d47a,Uid:a544c3f042ca3d854dd7922b10173e2a,Namespace:kube-system,Attempt:0,} returns sandbox id \"63cc7c1f668225095cca793da3294eacfac5cb36edf22dfd0f68a8d971374f0e\"" Aug 12 23:44:19.095945 containerd[1506]: time="2025-08-12T23:44:19.095669644Z" level=info msg="CreateContainer within sandbox \"63cc7c1f668225095cca793da3294eacfac5cb36edf22dfd0f68a8d971374f0e\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 12 23:44:19.111328 containerd[1506]: time="2025-08-12T23:44:19.111230167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-1-0-9-13fe44d47a,Uid:e8d19474dfc8d326ab01cdba6d792b65,Namespace:kube-system,Attempt:0,} returns sandbox id \"f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629\"" Aug 12 23:44:19.115228 containerd[1506]: time="2025-08-12T23:44:19.114858271Z" level=info msg="Container eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:44:19.115757 containerd[1506]: time="2025-08-12T23:44:19.115719696Z" level=info msg="CreateContainer within sandbox \"f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 12 23:44:19.124741 containerd[1506]: time="2025-08-12T23:44:19.124648197Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4372-1-0-9-13fe44d47a,Uid:d4e23bc46888b0613976a24bbbdad562,Namespace:kube-system,Attempt:0,} returns sandbox id \"c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc\"" Aug 12 23:44:19.129770 containerd[1506]: time="2025-08-12T23:44:19.129710078Z" level=info msg="CreateContainer within sandbox \"c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 12 23:44:19.133541 containerd[1506]: time="2025-08-12T23:44:19.133181955Z" level=info msg="Container 23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:44:19.137279 containerd[1506]: time="2025-08-12T23:44:19.137160361Z" level=info msg="CreateContainer within sandbox \"63cc7c1f668225095cca793da3294eacfac5cb36edf22dfd0f68a8d971374f0e\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9\"" Aug 12 23:44:19.138326 containerd[1506]: time="2025-08-12T23:44:19.138275729Z" level=info msg="StartContainer for \"eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9\"" Aug 12 23:44:19.139989 containerd[1506]: time="2025-08-12T23:44:19.139953258Z" level=info msg="connecting to shim eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9" address="unix:///run/containerd/s/18b6a29b1b4e5462e0c5fac0dd98e68fa2c0485a4ec7d19bba00de3410204a12" protocol=ttrpc version=3 Aug 12 23:44:19.149441 containerd[1506]: time="2025-08-12T23:44:19.149288252Z" level=info msg="Container 0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:44:19.150495 containerd[1506]: time="2025-08-12T23:44:19.150442387Z" level=info msg="CreateContainer within sandbox \"f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629\" for 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd\"" Aug 12 23:44:19.152994 containerd[1506]: time="2025-08-12T23:44:19.152607060Z" level=info msg="StartContainer for \"23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd\"" Aug 12 23:44:19.155898 containerd[1506]: time="2025-08-12T23:44:19.155842380Z" level=info msg="connecting to shim 23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd" address="unix:///run/containerd/s/a8520bbacd5295a36db3101ab2017144523800f93aa489e6d09253939f80adc4" protocol=ttrpc version=3 Aug 12 23:44:19.163161 containerd[1506]: time="2025-08-12T23:44:19.163125765Z" level=info msg="CreateContainer within sandbox \"c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140\"" Aug 12 23:44:19.166064 containerd[1506]: time="2025-08-12T23:44:19.166033444Z" level=info msg="StartContainer for \"0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140\"" Aug 12 23:44:19.168422 containerd[1506]: time="2025-08-12T23:44:19.168376724Z" level=info msg="connecting to shim 0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140" address="unix:///run/containerd/s/0d20e9facbc3d0dc8dfda71bd807d29995c4d24802c38827b7a650a57bbe53a8" protocol=ttrpc version=3 Aug 12 23:44:19.169054 systemd[1]: Started cri-containerd-eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9.scope - libcontainer container eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9. Aug 12 23:44:19.186574 systemd[1]: Started cri-containerd-23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd.scope - libcontainer container 23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd. 
Aug 12 23:44:19.213944 systemd[1]: Started cri-containerd-0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140.scope - libcontainer container 0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140. Aug 12 23:44:19.237301 kubelet[2359]: I0812 23:44:19.236094 2359 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:19.237301 kubelet[2359]: E0812 23:44:19.236538 2359 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://138.199.237.168:6443/api/v1/nodes\": dial tcp 138.199.237.168:6443: connect: connection refused" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:19.272122 containerd[1506]: time="2025-08-12T23:44:19.272072282Z" level=info msg="StartContainer for \"eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9\" returns successfully" Aug 12 23:44:19.287685 containerd[1506]: time="2025-08-12T23:44:19.287631886Z" level=info msg="StartContainer for \"23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd\" returns successfully" Aug 12 23:44:19.319294 kubelet[2359]: W0812 23:44:19.319164 2359 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://138.199.237.168:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 138.199.237.168:6443: connect: connection refused Aug 12 23:44:19.319294 kubelet[2359]: E0812 23:44:19.319250 2359 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://138.199.237.168:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 138.199.237.168:6443: connect: connection refused" logger="UnhandledError" Aug 12 23:44:19.330357 containerd[1506]: time="2025-08-12T23:44:19.330161716Z" level=info msg="StartContainer for \"0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140\" returns successfully" Aug 12 
23:44:20.040805 kubelet[2359]: I0812 23:44:20.040770 2359 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:22.105284 kubelet[2359]: E0812 23:44:22.105239 2359 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372-1-0-9-13fe44d47a\" not found" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:22.191730 kubelet[2359]: I0812 23:44:22.191690 2359 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:22.191877 kubelet[2359]: E0812 23:44:22.191732 2359 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4372-1-0-9-13fe44d47a\": node \"ci-4372-1-0-9-13fe44d47a\" not found" Aug 12 23:44:22.429674 kubelet[2359]: I0812 23:44:22.429530 2359 apiserver.go:52] "Watching apiserver" Aug 12 23:44:22.445066 kubelet[2359]: I0812 23:44:22.445020 2359 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 12 23:44:24.399040 systemd[1]: Reload requested from client PID 2626 ('systemctl') (unit session-7.scope)... Aug 12 23:44:24.399066 systemd[1]: Reloading... Aug 12 23:44:24.505590 zram_generator::config[2670]: No configuration found. Aug 12 23:44:24.589891 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Aug 12 23:44:24.706816 systemd[1]: Reloading finished in 307 ms. Aug 12 23:44:24.739608 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 12 23:44:24.756159 systemd[1]: kubelet.service: Deactivated successfully. Aug 12 23:44:24.757625 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 12 23:44:24.757765 systemd[1]: kubelet.service: Consumed 963ms CPU time, 125.1M memory peak. 
Aug 12 23:44:24.760880 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 12 23:44:24.924415 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 12 23:44:24.935146 (kubelet)[2715]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 12 23:44:24.999680 kubelet[2715]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 12 23:44:25.000815 kubelet[2715]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Aug 12 23:44:25.001075 kubelet[2715]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 12 23:44:25.001484 kubelet[2715]: I0812 23:44:25.001265 2715 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 12 23:44:25.014256 kubelet[2715]: I0812 23:44:25.013220 2715 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Aug 12 23:44:25.014256 kubelet[2715]: I0812 23:44:25.013255 2715 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 12 23:44:25.014256 kubelet[2715]: I0812 23:44:25.013528 2715 server.go:934] "Client rotation is on, will bootstrap in background" Aug 12 23:44:25.015686 kubelet[2715]: I0812 23:44:25.015637 2715 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Aug 12 23:44:25.018439 kubelet[2715]: I0812 23:44:25.018370 2715 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 12 23:44:25.027054 kubelet[2715]: I0812 23:44:25.027022 2715 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 12 23:44:25.031603 kubelet[2715]: I0812 23:44:25.031139 2715 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Aug 12 23:44:25.031603 kubelet[2715]: I0812 23:44:25.031429 2715 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Aug 12 23:44:25.032158 kubelet[2715]: I0812 23:44:25.032116 2715 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 12 23:44:25.032664 kubelet[2715]: I0812 23:44:25.032253 2715 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-1-0-9-13fe44d47a","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 12 23:44:25.033005 kubelet[2715]: I0812 23:44:25.032978 2715 topology_manager.go:138] "Creating topology manager with none policy" Aug 12 23:44:25.033129 kubelet[2715]: I0812 23:44:25.033112 2715 container_manager_linux.go:300] "Creating device plugin manager" Aug 12 23:44:25.033281 kubelet[2715]: I0812 23:44:25.033264 2715 state_mem.go:36] "Initialized new in-memory state store" Aug 12 23:44:25.033631 kubelet[2715]: I0812 23:44:25.033601 2715 kubelet.go:408] "Attempting to sync node with API server" Aug 12 23:44:25.034246 kubelet[2715]: I0812 23:44:25.033940 2715 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 12 23:44:25.034405 kubelet[2715]: I0812 23:44:25.034385 2715 kubelet.go:314] "Adding apiserver pod source" Aug 12 23:44:25.035025 kubelet[2715]: I0812 23:44:25.034696 2715 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 12 23:44:25.040867 kubelet[2715]: I0812 23:44:25.040834 2715 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Aug 12 23:44:25.041975 kubelet[2715]: I0812 23:44:25.041942 2715 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Aug 12 23:44:25.045214 kubelet[2715]: I0812 23:44:25.045155 2715 server.go:1274] "Started kubelet" Aug 12 23:44:25.048332 kubelet[2715]: I0812 23:44:25.048299 2715 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 
12 23:44:25.049747 kubelet[2715]: I0812 23:44:25.049609 2715 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Aug 12 23:44:25.051554 kubelet[2715]: I0812 23:44:25.050566 2715 server.go:449] "Adding debug handlers to kubelet server" Aug 12 23:44:25.051554 kubelet[2715]: I0812 23:44:25.051472 2715 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 12 23:44:25.053570 kubelet[2715]: I0812 23:44:25.051951 2715 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 12 23:44:25.053570 kubelet[2715]: I0812 23:44:25.052703 2715 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 12 23:44:25.055459 kubelet[2715]: I0812 23:44:25.054772 2715 volume_manager.go:289] "Starting Kubelet Volume Manager" Aug 12 23:44:25.055459 kubelet[2715]: E0812 23:44:25.055037 2715 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372-1-0-9-13fe44d47a\" not found" Aug 12 23:44:25.055815 kubelet[2715]: I0812 23:44:25.055793 2715 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Aug 12 23:44:25.055958 kubelet[2715]: I0812 23:44:25.055944 2715 reconciler.go:26] "Reconciler: start to sync state" Aug 12 23:44:25.064540 kubelet[2715]: I0812 23:44:25.063987 2715 factory.go:221] Registration of the systemd container factory successfully Aug 12 23:44:25.064824 kubelet[2715]: I0812 23:44:25.064796 2715 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 12 23:44:25.083719 kubelet[2715]: I0812 23:44:25.083664 2715 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Aug 12 23:44:25.085798 kubelet[2715]: I0812 23:44:25.085763 2715 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Aug 12 23:44:25.085798 kubelet[2715]: I0812 23:44:25.085799 2715 status_manager.go:217] "Starting to sync pod status with apiserver" Aug 12 23:44:25.085892 kubelet[2715]: I0812 23:44:25.085820 2715 kubelet.go:2321] "Starting kubelet main sync loop" Aug 12 23:44:25.085892 kubelet[2715]: E0812 23:44:25.085869 2715 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 12 23:44:25.104557 kubelet[2715]: I0812 23:44:25.104470 2715 factory.go:221] Registration of the containerd container factory successfully Aug 12 23:44:25.106341 kubelet[2715]: E0812 23:44:25.106297 2715 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 12 23:44:25.168471 kubelet[2715]: I0812 23:44:25.168396 2715 cpu_manager.go:214] "Starting CPU manager" policy="none" Aug 12 23:44:25.168471 kubelet[2715]: I0812 23:44:25.168421 2715 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Aug 12 23:44:25.168471 kubelet[2715]: I0812 23:44:25.168445 2715 state_mem.go:36] "Initialized new in-memory state store" Aug 12 23:44:25.168710 kubelet[2715]: I0812 23:44:25.168627 2715 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 12 23:44:25.168710 kubelet[2715]: I0812 23:44:25.168639 2715 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 12 23:44:25.168710 kubelet[2715]: I0812 23:44:25.168658 2715 policy_none.go:49] "None policy: Start" Aug 12 23:44:25.170556 kubelet[2715]: I0812 23:44:25.170516 2715 memory_manager.go:170] "Starting memorymanager" policy="None" Aug 12 23:44:25.170666 kubelet[2715]: I0812 23:44:25.170572 2715 state_mem.go:35] "Initializing new in-memory state store" Aug 12 23:44:25.170778 kubelet[2715]: 
I0812 23:44:25.170748 2715 state_mem.go:75] "Updated machine memory state" Aug 12 23:44:25.176588 kubelet[2715]: I0812 23:44:25.176538 2715 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Aug 12 23:44:25.176932 kubelet[2715]: I0812 23:44:25.176766 2715 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 12 23:44:25.176932 kubelet[2715]: I0812 23:44:25.176788 2715 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 12 23:44:25.178301 kubelet[2715]: I0812 23:44:25.178150 2715 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 12 23:44:25.199590 kubelet[2715]: E0812 23:44:25.199497 2715 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4372-1-0-9-13fe44d47a\" already exists" pod="kube-system/kube-scheduler-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.290371 kubelet[2715]: I0812 23:44:25.290230 2715 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.302575 kubelet[2715]: I0812 23:44:25.302416 2715 kubelet_node_status.go:111] "Node was previously registered" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.302722 kubelet[2715]: I0812 23:44:25.302540 2715 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358080 kubelet[2715]: I0812 23:44:25.357805 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-ca-certs\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358080 kubelet[2715]: I0812 23:44:25.357849 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358080 kubelet[2715]: I0812 23:44:25.357871 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-k8s-certs\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358080 kubelet[2715]: I0812 23:44:25.357892 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358080 kubelet[2715]: I0812 23:44:25.357914 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a544c3f042ca3d854dd7922b10173e2a-k8s-certs\") pod \"kube-apiserver-ci-4372-1-0-9-13fe44d47a\" (UID: \"a544c3f042ca3d854dd7922b10173e2a\") " pod="kube-system/kube-apiserver-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358329 kubelet[2715]: I0812 23:44:25.357932 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a544c3f042ca3d854dd7922b10173e2a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-1-0-9-13fe44d47a\" (UID: \"a544c3f042ca3d854dd7922b10173e2a\") " 
pod="kube-system/kube-apiserver-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358329 kubelet[2715]: I0812 23:44:25.357955 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e8d19474dfc8d326ab01cdba6d792b65-kubeconfig\") pod \"kube-controller-manager-ci-4372-1-0-9-13fe44d47a\" (UID: \"e8d19474dfc8d326ab01cdba6d792b65\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358329 kubelet[2715]: I0812 23:44:25.357973 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d4e23bc46888b0613976a24bbbdad562-kubeconfig\") pod \"kube-scheduler-ci-4372-1-0-9-13fe44d47a\" (UID: \"d4e23bc46888b0613976a24bbbdad562\") " pod="kube-system/kube-scheduler-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:25.358329 kubelet[2715]: I0812 23:44:25.357990 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a544c3f042ca3d854dd7922b10173e2a-ca-certs\") pod \"kube-apiserver-ci-4372-1-0-9-13fe44d47a\" (UID: \"a544c3f042ca3d854dd7922b10173e2a\") " pod="kube-system/kube-apiserver-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:26.036602 kubelet[2715]: I0812 23:44:26.036509 2715 apiserver.go:52] "Watching apiserver" Aug 12 23:44:26.057273 kubelet[2715]: I0812 23:44:26.057232 2715 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Aug 12 23:44:26.155385 kubelet[2715]: E0812 23:44:26.154482 2715 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4372-1-0-9-13fe44d47a\" already exists" pod="kube-system/kube-apiserver-ci-4372-1-0-9-13fe44d47a" Aug 12 23:44:26.214454 kubelet[2715]: I0812 23:44:26.214257 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-scheduler-ci-4372-1-0-9-13fe44d47a" podStartSLOduration=2.214237339 podStartE2EDuration="2.214237339s" podCreationTimestamp="2025-08-12 23:44:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-12 23:44:26.193810947 +0000 UTC m=+1.253373540" watchObservedRunningTime="2025-08-12 23:44:26.214237339 +0000 UTC m=+1.273799892" Aug 12 23:44:26.241189 kubelet[2715]: I0812 23:44:26.241130 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372-1-0-9-13fe44d47a" podStartSLOduration=1.241109859 podStartE2EDuration="1.241109859s" podCreationTimestamp="2025-08-12 23:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-12 23:44:26.215505898 +0000 UTC m=+1.275068451" watchObservedRunningTime="2025-08-12 23:44:26.241109859 +0000 UTC m=+1.300672412" Aug 12 23:44:26.260610 kubelet[2715]: I0812 23:44:26.260305 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372-1-0-9-13fe44d47a" podStartSLOduration=1.260287735 podStartE2EDuration="1.260287735s" podCreationTimestamp="2025-08-12 23:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-12 23:44:26.243029119 +0000 UTC m=+1.302591672" watchObservedRunningTime="2025-08-12 23:44:26.260287735 +0000 UTC m=+1.319850288" Aug 12 23:44:30.981266 kubelet[2715]: I0812 23:44:30.981215 2715 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 12 23:44:30.982893 containerd[1506]: time="2025-08-12T23:44:30.982676138Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Aug 12 23:44:30.983566 kubelet[2715]: I0812 23:44:30.983452 2715 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Aug 12 23:44:31.858267 systemd[1]: Created slice kubepods-besteffort-pod22e0de47_31fe_4bb9_8288_96dd26587a39.slice - libcontainer container kubepods-besteffort-pod22e0de47_31fe_4bb9_8288_96dd26587a39.slice.
Aug 12 23:44:31.902599 kubelet[2715]: I0812 23:44:31.902462 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/22e0de47-31fe-4bb9-8288-96dd26587a39-kube-proxy\") pod \"kube-proxy-zhmnm\" (UID: \"22e0de47-31fe-4bb9-8288-96dd26587a39\") " pod="kube-system/kube-proxy-zhmnm"
Aug 12 23:44:31.902599 kubelet[2715]: I0812 23:44:31.902524 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/22e0de47-31fe-4bb9-8288-96dd26587a39-lib-modules\") pod \"kube-proxy-zhmnm\" (UID: \"22e0de47-31fe-4bb9-8288-96dd26587a39\") " pod="kube-system/kube-proxy-zhmnm"
Aug 12 23:44:31.903743 kubelet[2715]: I0812 23:44:31.902710 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/22e0de47-31fe-4bb9-8288-96dd26587a39-xtables-lock\") pod \"kube-proxy-zhmnm\" (UID: \"22e0de47-31fe-4bb9-8288-96dd26587a39\") " pod="kube-system/kube-proxy-zhmnm"
Aug 12 23:44:31.903743 kubelet[2715]: I0812 23:44:31.902738 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwsbn\" (UniqueName: \"kubernetes.io/projected/22e0de47-31fe-4bb9-8288-96dd26587a39-kube-api-access-jwsbn\") pod \"kube-proxy-zhmnm\" (UID: \"22e0de47-31fe-4bb9-8288-96dd26587a39\") " pod="kube-system/kube-proxy-zhmnm"
Aug 12 23:44:32.124945 systemd[1]: Created slice kubepods-besteffort-pod073439f0_0995_4773_9d8f_c86682871c43.slice - libcontainer container kubepods-besteffort-pod073439f0_0995_4773_9d8f_c86682871c43.slice.
Aug 12 23:44:32.170730 containerd[1506]: time="2025-08-12T23:44:32.170661221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zhmnm,Uid:22e0de47-31fe-4bb9-8288-96dd26587a39,Namespace:kube-system,Attempt:0,}"
Aug 12 23:44:32.192160 containerd[1506]: time="2025-08-12T23:44:32.192107056Z" level=info msg="connecting to shim c7fc6da222bf254f8461cb4ffcbe3651533563caf86cd74c24ae70b7d356e6b7" address="unix:///run/containerd/s/55c6d80d9573a1c648c376d32d31d8282ef015c7f104faabf8ae0b95cae900eb" namespace=k8s.io protocol=ttrpc version=3
Aug 12 23:44:32.204942 kubelet[2715]: I0812 23:44:32.204872 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv8n8\" (UniqueName: \"kubernetes.io/projected/073439f0-0995-4773-9d8f-c86682871c43-kube-api-access-zv8n8\") pod \"tigera-operator-5bf8dfcb4-xrzh4\" (UID: \"073439f0-0995-4773-9d8f-c86682871c43\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-xrzh4"
Aug 12 23:44:32.205517 kubelet[2715]: I0812 23:44:32.204959 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/073439f0-0995-4773-9d8f-c86682871c43-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-xrzh4\" (UID: \"073439f0-0995-4773-9d8f-c86682871c43\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-xrzh4"
Aug 12 23:44:32.226894 systemd[1]: Started cri-containerd-c7fc6da222bf254f8461cb4ffcbe3651533563caf86cd74c24ae70b7d356e6b7.scope - libcontainer container c7fc6da222bf254f8461cb4ffcbe3651533563caf86cd74c24ae70b7d356e6b7.
Aug 12 23:44:32.262772 containerd[1506]: time="2025-08-12T23:44:32.262446751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-zhmnm,Uid:22e0de47-31fe-4bb9-8288-96dd26587a39,Namespace:kube-system,Attempt:0,} returns sandbox id \"c7fc6da222bf254f8461cb4ffcbe3651533563caf86cd74c24ae70b7d356e6b7\""
Aug 12 23:44:32.268069 containerd[1506]: time="2025-08-12T23:44:32.268010373Z" level=info msg="CreateContainer within sandbox \"c7fc6da222bf254f8461cb4ffcbe3651533563caf86cd74c24ae70b7d356e6b7\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Aug 12 23:44:32.282880 containerd[1506]: time="2025-08-12T23:44:32.282671850Z" level=info msg="Container 90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50: CDI devices from CRI Config.CDIDevices: []"
Aug 12 23:44:32.288776 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2791418967.mount: Deactivated successfully.
Aug 12 23:44:32.298018 containerd[1506]: time="2025-08-12T23:44:32.297913192Z" level=info msg="CreateContainer within sandbox \"c7fc6da222bf254f8461cb4ffcbe3651533563caf86cd74c24ae70b7d356e6b7\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50\""
Aug 12 23:44:32.299140 containerd[1506]: time="2025-08-12T23:44:32.299027717Z" level=info msg="StartContainer for \"90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50\""
Aug 12 23:44:32.301344 containerd[1506]: time="2025-08-12T23:44:32.301302691Z" level=info msg="connecting to shim 90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50" address="unix:///run/containerd/s/55c6d80d9573a1c648c376d32d31d8282ef015c7f104faabf8ae0b95cae900eb" protocol=ttrpc version=3
Aug 12 23:44:32.325021 systemd[1]: Started cri-containerd-90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50.scope - libcontainer container 90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50.
Aug 12 23:44:32.376582 containerd[1506]: time="2025-08-12T23:44:32.376469645Z" level=info msg="StartContainer for \"90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50\" returns successfully"
Aug 12 23:44:32.433013 containerd[1506]: time="2025-08-12T23:44:32.432643278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-xrzh4,Uid:073439f0-0995-4773-9d8f-c86682871c43,Namespace:tigera-operator,Attempt:0,}"
Aug 12 23:44:32.467407 containerd[1506]: time="2025-08-12T23:44:32.466698401Z" level=info msg="connecting to shim 539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc" address="unix:///run/containerd/s/38ecee549daf5f0eae0042259ae5a5e7d5db771c4d5c7bb2b30d71c86a574f99" namespace=k8s.io protocol=ttrpc version=3
Aug 12 23:44:32.495788 systemd[1]: Started cri-containerd-539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc.scope - libcontainer container 539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc.
Aug 12 23:44:32.567675 containerd[1506]: time="2025-08-12T23:44:32.567422809Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-xrzh4,Uid:073439f0-0995-4773-9d8f-c86682871c43,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc\""
Aug 12 23:44:32.572396 containerd[1506]: time="2025-08-12T23:44:32.572158618Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\""
Aug 12 23:44:34.232760 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1749044327.mount: Deactivated successfully.
Aug 12 23:44:34.786365 kubelet[2715]: I0812 23:44:34.785368 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-zhmnm" podStartSLOduration=3.785348888 podStartE2EDuration="3.785348888s" podCreationTimestamp="2025-08-12 23:44:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-12 23:44:33.173212325 +0000 UTC m=+8.232774878" watchObservedRunningTime="2025-08-12 23:44:34.785348888 +0000 UTC m=+9.844911441"
Aug 12 23:44:34.850684 containerd[1506]: time="2025-08-12T23:44:34.850614792Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 12 23:44:34.852333 containerd[1506]: time="2025-08-12T23:44:34.851914402Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610"
Aug 12 23:44:34.856681 containerd[1506]: time="2025-08-12T23:44:34.855487880Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 12 23:44:34.858227 containerd[1506]: time="2025-08-12T23:44:34.858172349Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 12 23:44:34.859301 containerd[1506]: time="2025-08-12T23:44:34.859252217Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 2.287022312s"
Aug 12 23:44:34.859301 containerd[1506]: time="2025-08-12T23:44:34.859299382Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\""
Aug 12 23:44:34.862052 containerd[1506]: time="2025-08-12T23:44:34.862010294Z" level=info msg="CreateContainer within sandbox \"539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Aug 12 23:44:34.873151 containerd[1506]: time="2025-08-12T23:44:34.872604636Z" level=info msg="Container 057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4: CDI devices from CRI Config.CDIDevices: []"
Aug 12 23:44:34.890085 containerd[1506]: time="2025-08-12T23:44:34.890023502Z" level=info msg="CreateContainer within sandbox \"539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4\""
Aug 12 23:44:34.892914 containerd[1506]: time="2025-08-12T23:44:34.892853386Z" level=info msg="StartContainer for \"057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4\""
Aug 12 23:44:34.894411 containerd[1506]: time="2025-08-12T23:44:34.894365138Z" level=info msg="connecting to shim 057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4" address="unix:///run/containerd/s/38ecee549daf5f0eae0042259ae5a5e7d5db771c4d5c7bb2b30d71c86a574f99" protocol=ttrpc version=3
Aug 12 23:44:34.922780 systemd[1]: Started cri-containerd-057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4.scope - libcontainer container 057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4.
Aug 12 23:44:34.958134 containerd[1506]: time="2025-08-12T23:44:34.958086966Z" level=info msg="StartContainer for \"057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4\" returns successfully"
Aug 12 23:44:38.692093 kubelet[2715]: I0812 23:44:38.691850 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-xrzh4" podStartSLOduration=4.402155257 podStartE2EDuration="6.69182561s" podCreationTimestamp="2025-08-12 23:44:32 +0000 UTC" firstStartedPulling="2025-08-12 23:44:32.570825229 +0000 UTC m=+7.630387822" lastFinishedPulling="2025-08-12 23:44:34.860495622 +0000 UTC m=+9.920058175" observedRunningTime="2025-08-12 23:44:35.228518862 +0000 UTC m=+10.288081455" watchObservedRunningTime="2025-08-12 23:44:38.69182561 +0000 UTC m=+13.751388163"
Aug 12 23:44:41.102784 sudo[1786]: pam_unix(sudo:session): session closed for user root
Aug 12 23:44:41.266734 sshd[1785]: Connection closed by 139.178.68.195 port 59010
Aug 12 23:44:41.267201 sshd-session[1783]: pam_unix(sshd:session): session closed for user core
Aug 12 23:44:41.273880 systemd[1]: sshd@6-138.199.237.168:22-139.178.68.195:59010.service: Deactivated successfully.
Aug 12 23:44:41.280817 systemd[1]: session-7.scope: Deactivated successfully.
Aug 12 23:44:41.283736 systemd[1]: session-7.scope: Consumed 7.499s CPU time, 227M memory peak.
Aug 12 23:44:41.285774 systemd-logind[1481]: Session 7 logged out. Waiting for processes to exit.
Aug 12 23:44:41.291962 systemd-logind[1481]: Removed session 7.
Aug 12 23:44:48.786400 systemd[1]: Created slice kubepods-besteffort-pod51b1ba88_2a5c_4634_a29b_d421ccab222d.slice - libcontainer container kubepods-besteffort-pod51b1ba88_2a5c_4634_a29b_d421ccab222d.slice.
Aug 12 23:44:48.819194 kubelet[2715]: I0812 23:44:48.819089 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/51b1ba88-2a5c-4634-a29b-d421ccab222d-typha-certs\") pod \"calico-typha-6674d8dd55-tmrsw\" (UID: \"51b1ba88-2a5c-4634-a29b-d421ccab222d\") " pod="calico-system/calico-typha-6674d8dd55-tmrsw"
Aug 12 23:44:48.819194 kubelet[2715]: I0812 23:44:48.819146 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lr75s\" (UniqueName: \"kubernetes.io/projected/51b1ba88-2a5c-4634-a29b-d421ccab222d-kube-api-access-lr75s\") pod \"calico-typha-6674d8dd55-tmrsw\" (UID: \"51b1ba88-2a5c-4634-a29b-d421ccab222d\") " pod="calico-system/calico-typha-6674d8dd55-tmrsw"
Aug 12 23:44:48.819194 kubelet[2715]: I0812 23:44:48.819172 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51b1ba88-2a5c-4634-a29b-d421ccab222d-tigera-ca-bundle\") pod \"calico-typha-6674d8dd55-tmrsw\" (UID: \"51b1ba88-2a5c-4634-a29b-d421ccab222d\") " pod="calico-system/calico-typha-6674d8dd55-tmrsw"
Aug 12 23:44:48.976097 systemd[1]: Created slice kubepods-besteffort-podb6ee9040_aa21_40b9_9587_cf75dcf52bb4.slice - libcontainer container kubepods-besteffort-podb6ee9040_aa21_40b9_9587_cf75dcf52bb4.slice.
Aug 12 23:44:49.021871 kubelet[2715]: I0812 23:44:49.021812 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-cni-log-dir\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.021871 kubelet[2715]: I0812 23:44:49.021869 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-tigera-ca-bundle\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.021871 kubelet[2715]: I0812 23:44:49.021886 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-var-run-calico\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022083 kubelet[2715]: I0812 23:44:49.021904 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-lib-modules\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022083 kubelet[2715]: I0812 23:44:49.021922 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-policysync\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022083 kubelet[2715]: I0812 23:44:49.021938 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-xtables-lock\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022083 kubelet[2715]: I0812 23:44:49.021955 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-cni-net-dir\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022083 kubelet[2715]: I0812 23:44:49.021970 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj7c6\" (UniqueName: \"kubernetes.io/projected/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-kube-api-access-rj7c6\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022188 kubelet[2715]: I0812 23:44:49.021986 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-cni-bin-dir\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022188 kubelet[2715]: I0812 23:44:49.022002 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-node-certs\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022188 kubelet[2715]: I0812 23:44:49.022034 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-var-lib-calico\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.022188 kubelet[2715]: I0812 23:44:49.022056 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/b6ee9040-aa21-40b9-9587-cf75dcf52bb4-flexvol-driver-host\") pod \"calico-node-9mp88\" (UID: \"b6ee9040-aa21-40b9-9587-cf75dcf52bb4\") " pod="calico-system/calico-node-9mp88"
Aug 12 23:44:49.092411 containerd[1506]: time="2025-08-12T23:44:49.092243117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6674d8dd55-tmrsw,Uid:51b1ba88-2a5c-4634-a29b-d421ccab222d,Namespace:calico-system,Attempt:0,}"
Aug 12 23:44:49.125458 kubelet[2715]: E0812 23:44:49.125166 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 12 23:44:49.125458 kubelet[2715]: W0812 23:44:49.125254 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 12 23:44:49.125458 kubelet[2715]: E0812 23:44:49.125278 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 12 23:44:49.126215 kubelet[2715]: E0812 23:44:49.126113 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 12 23:44:49.126466 kubelet[2715]: W0812 23:44:49.126284 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 12 23:44:49.126659 kubelet[2715]: E0812 23:44:49.126643 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 12 23:44:49.128193 kubelet[2715]: E0812 23:44:49.127683 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 12 23:44:49.128193 kubelet[2715]: W0812 23:44:49.128038 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 12 23:44:49.128193 kubelet[2715]: E0812 23:44:49.128067 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 12 23:44:49.135474 kubelet[2715]: E0812 23:44:49.130877 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 12 23:44:49.135474 kubelet[2715]: W0812 23:44:49.133442 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 12 23:44:49.135474 kubelet[2715]: E0812 23:44:49.133481 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 12 23:44:49.135633 containerd[1506]: time="2025-08-12T23:44:49.131304745Z" level=info msg="connecting to shim 18e92bc5265e6cc58262f023836a3764eb67b50d7c035544275cfcaff56cc1ae" address="unix:///run/containerd/s/cf0f7a8d565ddb7e57f716423c0ecd0af02d1558a15a71b564ba183ac89dde3a" namespace=k8s.io protocol=ttrpc version=3
Aug 12 23:44:49.137621 kubelet[2715]: E0812 23:44:49.137460 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 12 23:44:49.137743 kubelet[2715]: W0812 23:44:49.137721 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 12 23:44:49.137867 kubelet[2715]: E0812 23:44:49.137831 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 12 23:44:49.138263 kubelet[2715]: E0812 23:44:49.138232 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 12 23:44:49.141136 kubelet[2715]: W0812 23:44:49.140333 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 12 23:44:49.141136 kubelet[2715]: E0812 23:44:49.140488 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input"
Aug 12 23:44:49.141136 kubelet[2715]: E0812 23:44:49.140825 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Aug 12 23:44:49.141136 kubelet[2715]: W0812 23:44:49.140838 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Aug 12 23:44:49.141136 kubelet[2715]: E0812 23:44:49.140875 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Aug 12 23:44:49.141898 kubelet[2715]: E0812 23:44:49.141462 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.141898 kubelet[2715]: W0812 23:44:49.141478 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.141898 kubelet[2715]: E0812 23:44:49.141499 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.143229 kubelet[2715]: E0812 23:44:49.142343 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.143229 kubelet[2715]: W0812 23:44:49.142459 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.143229 kubelet[2715]: E0812 23:44:49.142488 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.143229 kubelet[2715]: E0812 23:44:49.142911 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.143229 kubelet[2715]: W0812 23:44:49.142925 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.143229 kubelet[2715]: E0812 23:44:49.142961 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.143949 kubelet[2715]: E0812 23:44:49.143620 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.143949 kubelet[2715]: W0812 23:44:49.143636 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.143949 kubelet[2715]: E0812 23:44:49.143654 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.144683 kubelet[2715]: E0812 23:44:49.144279 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.144683 kubelet[2715]: W0812 23:44:49.144295 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.144683 kubelet[2715]: E0812 23:44:49.144309 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.145492 kubelet[2715]: E0812 23:44:49.145262 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.145492 kubelet[2715]: W0812 23:44:49.145283 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.145492 kubelet[2715]: E0812 23:44:49.145299 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.146501 kubelet[2715]: E0812 23:44:49.146423 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.146501 kubelet[2715]: W0812 23:44:49.146444 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.146501 kubelet[2715]: E0812 23:44:49.146459 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.153929 kubelet[2715]: E0812 23:44:49.153847 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.154111 kubelet[2715]: W0812 23:44:49.154089 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.154531 kubelet[2715]: E0812 23:44:49.154508 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.201643 kubelet[2715]: E0812 23:44:49.201300 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-znzb2" podUID="e6d039b0-18c9-437e-88dc-fc77d8757ee7" Aug 12 23:44:49.211854 systemd[1]: Started cri-containerd-18e92bc5265e6cc58262f023836a3764eb67b50d7c035544275cfcaff56cc1ae.scope - libcontainer container 18e92bc5265e6cc58262f023836a3764eb67b50d7c035544275cfcaff56cc1ae. 
Aug 12 23:44:49.216873 kubelet[2715]: E0812 23:44:49.216840 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.216873 kubelet[2715]: W0812 23:44:49.216867 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.217053 kubelet[2715]: E0812 23:44:49.216902 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.217216 kubelet[2715]: E0812 23:44:49.217195 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.217254 kubelet[2715]: W0812 23:44:49.217226 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.217254 kubelet[2715]: E0812 23:44:49.217241 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.217482 kubelet[2715]: E0812 23:44:49.217464 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.217482 kubelet[2715]: W0812 23:44:49.217480 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.217678 kubelet[2715]: E0812 23:44:49.217497 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.218209 kubelet[2715]: E0812 23:44:49.218176 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.218209 kubelet[2715]: W0812 23:44:49.218212 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.218323 kubelet[2715]: E0812 23:44:49.218227 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.219804 kubelet[2715]: E0812 23:44:49.219768 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.219804 kubelet[2715]: W0812 23:44:49.219792 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.219804 kubelet[2715]: E0812 23:44:49.219807 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.220181 kubelet[2715]: E0812 23:44:49.219984 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.220181 kubelet[2715]: W0812 23:44:49.219999 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.220181 kubelet[2715]: E0812 23:44:49.220036 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.220181 kubelet[2715]: E0812 23:44:49.220170 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.220301 kubelet[2715]: W0812 23:44:49.220178 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.220301 kubelet[2715]: E0812 23:44:49.220206 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.220383 kubelet[2715]: E0812 23:44:49.220358 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.220383 kubelet[2715]: W0812 23:44:49.220374 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.220430 kubelet[2715]: E0812 23:44:49.220383 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.221200 kubelet[2715]: E0812 23:44:49.221168 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.221200 kubelet[2715]: W0812 23:44:49.221191 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.221305 kubelet[2715]: E0812 23:44:49.221205 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.221609 kubelet[2715]: E0812 23:44:49.221581 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.221609 kubelet[2715]: W0812 23:44:49.221601 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.221609 kubelet[2715]: E0812 23:44:49.221613 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.222613 kubelet[2715]: E0812 23:44:49.222584 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.222613 kubelet[2715]: W0812 23:44:49.222606 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.222714 kubelet[2715]: E0812 23:44:49.222620 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.223955 kubelet[2715]: E0812 23:44:49.223922 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.223955 kubelet[2715]: W0812 23:44:49.223944 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.223955 kubelet[2715]: E0812 23:44:49.223958 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.224727 kubelet[2715]: E0812 23:44:49.224699 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.224727 kubelet[2715]: W0812 23:44:49.224718 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.224875 kubelet[2715]: E0812 23:44:49.224731 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.225877 kubelet[2715]: E0812 23:44:49.225848 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.225877 kubelet[2715]: W0812 23:44:49.225868 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.225877 kubelet[2715]: E0812 23:44:49.225881 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.226768 kubelet[2715]: E0812 23:44:49.226734 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.226768 kubelet[2715]: W0812 23:44:49.226758 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.226768 kubelet[2715]: E0812 23:44:49.226770 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.228259 kubelet[2715]: E0812 23:44:49.228210 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.228259 kubelet[2715]: W0812 23:44:49.228241 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.228259 kubelet[2715]: E0812 23:44:49.228254 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.228679 kubelet[2715]: E0812 23:44:49.228477 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.228679 kubelet[2715]: W0812 23:44:49.228491 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.228679 kubelet[2715]: E0812 23:44:49.228509 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.228679 kubelet[2715]: E0812 23:44:49.228671 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.228679 kubelet[2715]: W0812 23:44:49.228680 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.228679 kubelet[2715]: E0812 23:44:49.228688 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.229267 kubelet[2715]: E0812 23:44:49.228799 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.229267 kubelet[2715]: W0812 23:44:49.228806 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.229267 kubelet[2715]: E0812 23:44:49.228813 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.229267 kubelet[2715]: E0812 23:44:49.228923 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.229267 kubelet[2715]: W0812 23:44:49.228929 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.229267 kubelet[2715]: E0812 23:44:49.228936 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.229267 kubelet[2715]: E0812 23:44:49.229205 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.229267 kubelet[2715]: W0812 23:44:49.229214 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.229267 kubelet[2715]: E0812 23:44:49.229222 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.229496 kubelet[2715]: I0812 23:44:49.229252 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzbfj\" (UniqueName: \"kubernetes.io/projected/e6d039b0-18c9-437e-88dc-fc77d8757ee7-kube-api-access-hzbfj\") pod \"csi-node-driver-znzb2\" (UID: \"e6d039b0-18c9-437e-88dc-fc77d8757ee7\") " pod="calico-system/csi-node-driver-znzb2" Aug 12 23:44:49.229496 kubelet[2715]: E0812 23:44:49.229398 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.229496 kubelet[2715]: W0812 23:44:49.229408 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.229892 kubelet[2715]: E0812 23:44:49.229561 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.229892 kubelet[2715]: I0812 23:44:49.229583 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/e6d039b0-18c9-437e-88dc-fc77d8757ee7-socket-dir\") pod \"csi-node-driver-znzb2\" (UID: \"e6d039b0-18c9-437e-88dc-fc77d8757ee7\") " pod="calico-system/csi-node-driver-znzb2" Aug 12 23:44:49.229892 kubelet[2715]: E0812 23:44:49.229633 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.229892 kubelet[2715]: W0812 23:44:49.229639 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.229892 kubelet[2715]: E0812 23:44:49.229656 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.229892 kubelet[2715]: E0812 23:44:49.229781 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.229892 kubelet[2715]: W0812 23:44:49.229788 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.229892 kubelet[2715]: E0812 23:44:49.229796 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.230806 kubelet[2715]: E0812 23:44:49.229928 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.230806 kubelet[2715]: W0812 23:44:49.229935 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.230806 kubelet[2715]: E0812 23:44:49.229951 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.230806 kubelet[2715]: I0812 23:44:49.229970 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/e6d039b0-18c9-437e-88dc-fc77d8757ee7-varrun\") pod \"csi-node-driver-znzb2\" (UID: \"e6d039b0-18c9-437e-88dc-fc77d8757ee7\") " pod="calico-system/csi-node-driver-znzb2" Aug 12 23:44:49.230806 kubelet[2715]: E0812 23:44:49.230143 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.230806 kubelet[2715]: W0812 23:44:49.230153 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.230806 kubelet[2715]: E0812 23:44:49.230165 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.230806 kubelet[2715]: I0812 23:44:49.230183 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e6d039b0-18c9-437e-88dc-fc77d8757ee7-kubelet-dir\") pod \"csi-node-driver-znzb2\" (UID: \"e6d039b0-18c9-437e-88dc-fc77d8757ee7\") " pod="calico-system/csi-node-driver-znzb2" Aug 12 23:44:49.230806 kubelet[2715]: E0812 23:44:49.230427 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.231008 kubelet[2715]: W0812 23:44:49.230438 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.231008 kubelet[2715]: E0812 23:44:49.230453 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.231008 kubelet[2715]: I0812 23:44:49.230469 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/e6d039b0-18c9-437e-88dc-fc77d8757ee7-registration-dir\") pod \"csi-node-driver-znzb2\" (UID: \"e6d039b0-18c9-437e-88dc-fc77d8757ee7\") " pod="calico-system/csi-node-driver-znzb2" Aug 12 23:44:49.231978 kubelet[2715]: E0812 23:44:49.231957 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.232458 kubelet[2715]: W0812 23:44:49.232057 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.232614 kubelet[2715]: E0812 23:44:49.232541 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.232824 kubelet[2715]: E0812 23:44:49.232803 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.232824 kubelet[2715]: W0812 23:44:49.232819 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.233005 kubelet[2715]: E0812 23:44:49.232836 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.233005 kubelet[2715]: E0812 23:44:49.232957 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.233005 kubelet[2715]: W0812 23:44:49.232964 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.233005 kubelet[2715]: E0812 23:44:49.232978 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.233388 kubelet[2715]: E0812 23:44:49.233242 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.233388 kubelet[2715]: W0812 23:44:49.233252 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.233388 kubelet[2715]: E0812 23:44:49.233297 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.234111 kubelet[2715]: E0812 23:44:49.233590 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.234111 kubelet[2715]: W0812 23:44:49.233600 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.234111 kubelet[2715]: E0812 23:44:49.233610 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.234602 kubelet[2715]: E0812 23:44:49.234324 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.234602 kubelet[2715]: W0812 23:44:49.234344 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.234602 kubelet[2715]: E0812 23:44:49.234356 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.234602 kubelet[2715]: E0812 23:44:49.234504 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.234602 kubelet[2715]: W0812 23:44:49.234510 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.234602 kubelet[2715]: E0812 23:44:49.234518 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.234771 kubelet[2715]: E0812 23:44:49.234670 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.234771 kubelet[2715]: W0812 23:44:49.234678 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.234771 kubelet[2715]: E0812 23:44:49.234687 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.287791 containerd[1506]: time="2025-08-12T23:44:49.287385090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9mp88,Uid:b6ee9040-aa21-40b9-9587-cf75dcf52bb4,Namespace:calico-system,Attempt:0,}" Aug 12 23:44:49.326283 containerd[1506]: time="2025-08-12T23:44:49.326243188Z" level=info msg="connecting to shim 3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572" address="unix:///run/containerd/s/86967d9467dc7f0b6454055d38c79265bfa02b575d9385748022299ef5327b2d" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:44:49.332586 kubelet[2715]: E0812 23:44:49.332512 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.332586 kubelet[2715]: W0812 23:44:49.332584 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.332586 kubelet[2715]: E0812 23:44:49.332608 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.333448 kubelet[2715]: E0812 23:44:49.333358 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.333448 kubelet[2715]: W0812 23:44:49.333402 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.333448 kubelet[2715]: E0812 23:44:49.333429 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.333966 kubelet[2715]: E0812 23:44:49.333825 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.333966 kubelet[2715]: W0812 23:44:49.333856 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.333966 kubelet[2715]: E0812 23:44:49.333882 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.334580 kubelet[2715]: E0812 23:44:49.334400 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.334970 kubelet[2715]: W0812 23:44:49.334742 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.334970 kubelet[2715]: E0812 23:44:49.334775 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.335798 kubelet[2715]: E0812 23:44:49.335594 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.335798 kubelet[2715]: W0812 23:44:49.335611 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.335798 kubelet[2715]: E0812 23:44:49.335658 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.336052 kubelet[2715]: E0812 23:44:49.336035 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.336303 kubelet[2715]: W0812 23:44:49.336190 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.336303 kubelet[2715]: E0812 23:44:49.336260 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.337709 kubelet[2715]: E0812 23:44:49.337505 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.337709 kubelet[2715]: W0812 23:44:49.337520 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.337709 kubelet[2715]: E0812 23:44:49.337570 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.337988 kubelet[2715]: E0812 23:44:49.337971 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.338237 kubelet[2715]: W0812 23:44:49.338139 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.338237 kubelet[2715]: E0812 23:44:49.338186 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.338562 kubelet[2715]: E0812 23:44:49.338492 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.338664 kubelet[2715]: W0812 23:44:49.338648 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.339186 kubelet[2715]: E0812 23:44:49.339001 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.339186 kubelet[2715]: E0812 23:44:49.339064 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.339186 kubelet[2715]: W0812 23:44:49.339075 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.339186 kubelet[2715]: E0812 23:44:49.339110 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.339581 kubelet[2715]: E0812 23:44:49.339445 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.339581 kubelet[2715]: W0812 23:44:49.339460 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.339581 kubelet[2715]: E0812 23:44:49.339494 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.340581 kubelet[2715]: E0812 23:44:49.340272 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.341038 kubelet[2715]: W0812 23:44:49.340725 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.341038 kubelet[2715]: E0812 23:44:49.340761 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.342113 kubelet[2715]: E0812 23:44:49.342093 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.342345 kubelet[2715]: W0812 23:44:49.342195 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.342345 kubelet[2715]: E0812 23:44:49.342244 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.342519 kubelet[2715]: E0812 23:44:49.342437 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.344724 kubelet[2715]: W0812 23:44:49.344580 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.344724 kubelet[2715]: E0812 23:44:49.344629 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.344869 kubelet[2715]: E0812 23:44:49.344855 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.345308 kubelet[2715]: W0812 23:44:49.345185 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.345308 kubelet[2715]: E0812 23:44:49.345245 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.345575 kubelet[2715]: E0812 23:44:49.345459 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.345575 kubelet[2715]: W0812 23:44:49.345472 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.345575 kubelet[2715]: E0812 23:44:49.345503 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.348536 kubelet[2715]: E0812 23:44:49.346864 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.348536 kubelet[2715]: W0812 23:44:49.346880 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.348536 kubelet[2715]: E0812 23:44:49.346954 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.349144 kubelet[2715]: E0812 23:44:49.349103 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.349420 kubelet[2715]: W0812 23:44:49.349301 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.349420 kubelet[2715]: E0812 23:44:49.349358 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.350442 kubelet[2715]: E0812 23:44:49.350197 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.350442 kubelet[2715]: W0812 23:44:49.350213 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.350442 kubelet[2715]: E0812 23:44:49.350275 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.350734 kubelet[2715]: E0812 23:44:49.350716 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.350904 kubelet[2715]: W0812 23:44:49.350886 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.351156 kubelet[2715]: E0812 23:44:49.351009 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.351820 kubelet[2715]: E0812 23:44:49.351759 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.352328 kubelet[2715]: W0812 23:44:49.352052 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.352328 kubelet[2715]: E0812 23:44:49.352146 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.352656 kubelet[2715]: E0812 23:44:49.352626 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.353280 kubelet[2715]: W0812 23:44:49.352846 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.353280 kubelet[2715]: E0812 23:44:49.352907 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.353790 kubelet[2715]: E0812 23:44:49.353701 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.353965 kubelet[2715]: W0812 23:44:49.353948 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.354203 kubelet[2715]: E0812 23:44:49.354123 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.354842 kubelet[2715]: E0812 23:44:49.354808 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.355354 kubelet[2715]: W0812 23:44:49.354936 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.355354 kubelet[2715]: E0812 23:44:49.354993 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.355964 kubelet[2715]: E0812 23:44:49.355785 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.355964 kubelet[2715]: W0812 23:44:49.355908 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.355964 kubelet[2715]: E0812 23:44:49.355928 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:49.376077 containerd[1506]: time="2025-08-12T23:44:49.375702804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6674d8dd55-tmrsw,Uid:51b1ba88-2a5c-4634-a29b-d421ccab222d,Namespace:calico-system,Attempt:0,} returns sandbox id \"18e92bc5265e6cc58262f023836a3764eb67b50d7c035544275cfcaff56cc1ae\"" Aug 12 23:44:49.377170 kubelet[2715]: E0812 23:44:49.377145 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:49.377315 kubelet[2715]: W0812 23:44:49.377298 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:49.377404 kubelet[2715]: E0812 23:44:49.377391 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:49.383712 containerd[1506]: time="2025-08-12T23:44:49.383300455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 12 23:44:49.414010 systemd[1]: Started cri-containerd-3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572.scope - libcontainer container 3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572. Aug 12 23:44:49.474628 containerd[1506]: time="2025-08-12T23:44:49.474261898Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9mp88,Uid:b6ee9040-aa21-40b9-9587-cf75dcf52bb4,Namespace:calico-system,Attempt:0,} returns sandbox id \"3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572\"" Aug 12 23:44:50.702156 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1132939091.mount: Deactivated successfully. Aug 12 23:44:51.086399 kubelet[2715]: E0812 23:44:51.086347 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-znzb2" podUID="e6d039b0-18c9-437e-88dc-fc77d8757ee7" Aug 12 23:44:51.197564 containerd[1506]: time="2025-08-12T23:44:51.197482758Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:51.198966 containerd[1506]: time="2025-08-12T23:44:51.198908662Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207" Aug 12 23:44:51.200909 containerd[1506]: time="2025-08-12T23:44:51.200829988Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:51.203539 containerd[1506]: time="2025-08-12T23:44:51.203400864Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:51.204902 containerd[1506]: time="2025-08-12T23:44:51.204836129Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 1.821492872s" Aug 12 23:44:51.204902 containerd[1506]: time="2025-08-12T23:44:51.204885371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\"" Aug 12 23:44:51.206950 containerd[1506]: time="2025-08-12T23:44:51.206459162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 12 23:44:51.225436 containerd[1506]: time="2025-08-12T23:44:51.225395615Z" level=info msg="CreateContainer within sandbox \"18e92bc5265e6cc58262f023836a3764eb67b50d7c035544275cfcaff56cc1ae\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 12 23:44:51.238947 containerd[1506]: time="2025-08-12T23:44:51.238042784Z" level=info msg="Container 54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:44:51.254654 containerd[1506]: time="2025-08-12T23:44:51.254443563Z" level=info msg="CreateContainer within sandbox \"18e92bc5265e6cc58262f023836a3764eb67b50d7c035544275cfcaff56cc1ae\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3\"" Aug 12 23:44:51.256378 containerd[1506]: time="2025-08-12T23:44:51.256321008Z" level=info msg="StartContainer for 
\"54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3\"" Aug 12 23:44:51.262914 containerd[1506]: time="2025-08-12T23:44:51.262734016Z" level=info msg="connecting to shim 54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3" address="unix:///run/containerd/s/cf0f7a8d565ddb7e57f716423c0ecd0af02d1558a15a71b564ba183ac89dde3a" protocol=ttrpc version=3 Aug 12 23:44:51.291311 systemd[1]: Started cri-containerd-54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3.scope - libcontainer container 54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3. Aug 12 23:44:51.354358 containerd[1506]: time="2025-08-12T23:44:51.354214096Z" level=info msg="StartContainer for \"54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3\" returns successfully" Aug 12 23:44:52.249933 kubelet[2715]: E0812 23:44:52.249695 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:52.249933 kubelet[2715]: W0812 23:44:52.249724 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:52.249933 kubelet[2715]: E0812 23:44:52.249749 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:52.250955 kubelet[2715]: E0812 23:44:52.250770 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:52.250955 kubelet[2715]: W0812 23:44:52.250791 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:52.250955 kubelet[2715]: E0812 23:44:52.250859 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:52.251405 kubelet[2715]: E0812 23:44:52.251269 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:52.251405 kubelet[2715]: W0812 23:44:52.251286 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:52.251405 kubelet[2715]: E0812 23:44:52.251301 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:52.251701 kubelet[2715]: E0812 23:44:52.251686 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:52.251883 kubelet[2715]: W0812 23:44:52.251761 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:52.251883 kubelet[2715]: E0812 23:44:52.251781 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:52.252061 kubelet[2715]: E0812 23:44:52.252049 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:52.252348 kubelet[2715]: W0812 23:44:52.252188 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:52.252348 kubelet[2715]: E0812 23:44:52.252248 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:52.252566 kubelet[2715]: E0812 23:44:52.252530 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:52.252645 kubelet[2715]: W0812 23:44:52.252632 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:52.252769 kubelet[2715]: E0812 23:44:52.252693 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 12 23:44:52.252900 kubelet[2715]: E0812 23:44:52.252890 2715 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 12 23:44:52.252972 kubelet[2715]: W0812 23:44:52.252959 2715 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 12 23:44:52.253093 kubelet[2715]: E0812 23:44:52.253018 2715 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 12 23:44:52.550412 containerd[1506]: time="2025-08-12T23:44:52.550045922Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:52.557184 containerd[1506]: time="2025-08-12T23:44:52.557133869Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Aug 12 23:44:52.559909 containerd[1506]: time="2025-08-12T23:44:52.559842466Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:52.563563 containerd[1506]: time="2025-08-12T23:44:52.563495385Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:52.564976 containerd[1506]: time="2025-08-12T23:44:52.564286939Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.357786815s" Aug 12 23:44:52.564976 containerd[1506]: time="2025-08-12T23:44:52.564340061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Aug 12 23:44:52.570356 containerd[1506]: time="2025-08-12T23:44:52.570295039Z" level=info msg="CreateContainer within sandbox \"3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 12 23:44:52.584611 containerd[1506]: time="2025-08-12T23:44:52.584237683Z" level=info msg="Container 5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:44:52.591228 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3061334533.mount: Deactivated successfully. Aug 12 23:44:52.597928 containerd[1506]: time="2025-08-12T23:44:52.597877994Z" level=info msg="CreateContainer within sandbox \"3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2\"" Aug 12 23:44:52.599360 containerd[1506]: time="2025-08-12T23:44:52.599314096Z" level=info msg="StartContainer for \"5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2\"" Aug 12 23:44:52.601251 containerd[1506]: time="2025-08-12T23:44:52.601212858Z" level=info msg="connecting to shim 5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2" address="unix:///run/containerd/s/86967d9467dc7f0b6454055d38c79265bfa02b575d9385748022299ef5327b2d" protocol=ttrpc version=3 Aug 12 23:44:52.629917 systemd[1]: Started cri-containerd-5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2.scope - libcontainer container 5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2. Aug 12 23:44:52.686362 containerd[1506]: time="2025-08-12T23:44:52.686312623Z" level=info msg="StartContainer for \"5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2\" returns successfully" Aug 12 23:44:52.703994 systemd[1]: cri-containerd-5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2.scope: Deactivated successfully. 
Aug 12 23:44:52.707627 containerd[1506]: time="2025-08-12T23:44:52.707513022Z" level=info msg="received exit event container_id:\"5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2\" id:\"5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2\" pid:3393 exited_at:{seconds:1755042292 nanos:706927636}" Aug 12 23:44:52.707962 containerd[1506]: time="2025-08-12T23:44:52.707797594Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2\" id:\"5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2\" pid:3393 exited_at:{seconds:1755042292 nanos:706927636}" Aug 12 23:44:52.732075 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2-rootfs.mount: Deactivated successfully. Aug 12 23:44:53.088201 kubelet[2715]: E0812 23:44:53.086800 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-znzb2" podUID="e6d039b0-18c9-437e-88dc-fc77d8757ee7" Aug 12 23:44:53.235391 kubelet[2715]: I0812 23:44:53.235337 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 12 23:44:53.237140 containerd[1506]: time="2025-08-12T23:44:53.237031290Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 12 23:44:53.259422 kubelet[2715]: I0812 23:44:53.259333 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6674d8dd55-tmrsw" podStartSLOduration=3.433756953 podStartE2EDuration="5.259300258s" podCreationTimestamp="2025-08-12 23:44:48 +0000 UTC" firstStartedPulling="2025-08-12 23:44:49.380575962 +0000 UTC m=+24.440138515" lastFinishedPulling="2025-08-12 23:44:51.206119227 +0000 UTC m=+26.265681820" 
observedRunningTime="2025-08-12 23:44:52.258643382 +0000 UTC m=+27.318205935" watchObservedRunningTime="2025-08-12 23:44:53.259300258 +0000 UTC m=+28.318862851" Aug 12 23:44:55.087164 kubelet[2715]: E0812 23:44:55.087119 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-znzb2" podUID="e6d039b0-18c9-437e-88dc-fc77d8757ee7" Aug 12 23:44:55.901158 containerd[1506]: time="2025-08-12T23:44:55.901112892Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:55.903054 containerd[1506]: time="2025-08-12T23:44:55.903015446Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Aug 12 23:44:55.904572 containerd[1506]: time="2025-08-12T23:44:55.904462302Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:55.906603 containerd[1506]: time="2025-08-12T23:44:55.906436978Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:44:55.907891 containerd[1506]: time="2025-08-12T23:44:55.907136605Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 2.670062994s" Aug 12 23:44:55.907891 containerd[1506]: 
time="2025-08-12T23:44:55.907172407Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Aug 12 23:44:55.912522 containerd[1506]: time="2025-08-12T23:44:55.911756624Z" level=info msg="CreateContainer within sandbox \"3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 12 23:44:55.924573 containerd[1506]: time="2025-08-12T23:44:55.923757649Z" level=info msg="Container ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:44:55.939352 containerd[1506]: time="2025-08-12T23:44:55.939290411Z" level=info msg="CreateContainer within sandbox \"3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b\"" Aug 12 23:44:55.940919 containerd[1506]: time="2025-08-12T23:44:55.940757628Z" level=info msg="StartContainer for \"ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b\"" Aug 12 23:44:55.944898 containerd[1506]: time="2025-08-12T23:44:55.944843226Z" level=info msg="connecting to shim ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b" address="unix:///run/containerd/s/86967d9467dc7f0b6454055d38c79265bfa02b575d9385748022299ef5327b2d" protocol=ttrpc version=3 Aug 12 23:44:55.971776 systemd[1]: Started cri-containerd-ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b.scope - libcontainer container ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b. 
Aug 12 23:44:56.025701 containerd[1506]: time="2025-08-12T23:44:56.025609122Z" level=info msg="StartContainer for \"ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b\" returns successfully" Aug 12 23:44:56.551303 containerd[1506]: time="2025-08-12T23:44:56.551040695Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 12 23:44:56.559642 systemd[1]: cri-containerd-ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b.scope: Deactivated successfully. Aug 12 23:44:56.560167 systemd[1]: cri-containerd-ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b.scope: Consumed 507ms CPU time, 186.5M memory peak, 165.8M written to disk. Aug 12 23:44:56.562052 containerd[1506]: time="2025-08-12T23:44:56.562015546Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b\" id:\"ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b\" pid:3457 exited_at:{seconds:1755042296 nanos:559152359}" Aug 12 23:44:56.562689 containerd[1506]: time="2025-08-12T23:44:56.562591367Z" level=info msg="received exit event container_id:\"ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b\" id:\"ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b\" pid:3457 exited_at:{seconds:1755042296 nanos:559152359}" Aug 12 23:44:56.567024 kubelet[2715]: I0812 23:44:56.566929 2715 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Aug 12 23:44:56.600766 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b-rootfs.mount: Deactivated successfully. 
Aug 12 23:44:56.620798 systemd[1]: Created slice kubepods-burstable-poda62fab08_ab88_42e3_ab7b_29db68e0485c.slice - libcontainer container kubepods-burstable-poda62fab08_ab88_42e3_ab7b_29db68e0485c.slice. Aug 12 23:44:56.639004 systemd[1]: Created slice kubepods-burstable-podf78d3697_5a7e_4c57_950b_a6d5080b5a5b.slice - libcontainer container kubepods-burstable-podf78d3697_5a7e_4c57_950b_a6d5080b5a5b.slice. Aug 12 23:44:56.649069 systemd[1]: Created slice kubepods-besteffort-poda9e9c567_de9b_4a68_9d2c_640b2dafaf2f.slice - libcontainer container kubepods-besteffort-poda9e9c567_de9b_4a68_9d2c_640b2dafaf2f.slice. Aug 12 23:44:56.663606 systemd[1]: Created slice kubepods-besteffort-pod285f6b1f_9c43_4330_9ef8_c92ed81153fb.slice - libcontainer container kubepods-besteffort-pod285f6b1f_9c43_4330_9ef8_c92ed81153fb.slice. Aug 12 23:44:56.681987 systemd[1]: Created slice kubepods-besteffort-podd4e2f489_94af_4e39_ac07_2a80367a7943.slice - libcontainer container kubepods-besteffort-podd4e2f489_94af_4e39_ac07_2a80367a7943.slice. 
Aug 12 23:44:56.695314 kubelet[2715]: W0812 23:44:56.695162 2715 reflector.go:561] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:ci-4372-1-0-9-13fe44d47a" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4372-1-0-9-13fe44d47a' and this object
Aug 12 23:44:56.695314 kubelet[2715]: E0812 23:44:56.695266 2715 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:ci-4372-1-0-9-13fe44d47a\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4372-1-0-9-13fe44d47a' and this object" logger="UnhandledError"
Aug 12 23:44:56.702147 systemd[1]: Created slice kubepods-besteffort-pod241f24c8_4f1a_4329_8952_6c015f2ffbc0.slice - libcontainer container kubepods-besteffort-pod241f24c8_4f1a_4329_8952_6c015f2ffbc0.slice.
Aug 12 23:44:56.714109 systemd[1]: Created slice kubepods-besteffort-pod24f2d0d9_d57f_4291_90c0_84286d227df3.slice - libcontainer container kubepods-besteffort-pod24f2d0d9_d57f_4291_90c0_84286d227df3.slice.
Aug 12 23:44:56.722604 systemd[1]: Created slice kubepods-besteffort-pod90377728_1937_4fd3_8a8d_e255d8053f39.slice - libcontainer container kubepods-besteffort-pod90377728_1937_4fd3_8a8d_e255d8053f39.slice.
Aug 12 23:44:56.792935 kubelet[2715]: I0812 23:44:56.792899 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/24f2d0d9-d57f-4291-90c0-84286d227df3-goldmane-key-pair\") pod \"goldmane-58fd7646b9-q57c4\" (UID: \"24f2d0d9-d57f-4291-90c0-84286d227df3\") " pod="calico-system/goldmane-58fd7646b9-q57c4"
Aug 12 23:44:56.793435 kubelet[2715]: I0812 23:44:56.793397 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbkf\" (UniqueName: \"kubernetes.io/projected/d4e2f489-94af-4e39-ac07-2a80367a7943-kube-api-access-hdbkf\") pod \"calico-apiserver-6464b99944-nztd8\" (UID: \"d4e2f489-94af-4e39-ac07-2a80367a7943\") " pod="calico-apiserver/calico-apiserver-6464b99944-nztd8"
Aug 12 23:44:56.793596 kubelet[2715]: I0812 23:44:56.793579 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pv57p\" (UniqueName: \"kubernetes.io/projected/24f2d0d9-d57f-4291-90c0-84286d227df3-kube-api-access-pv57p\") pod \"goldmane-58fd7646b9-q57c4\" (UID: \"24f2d0d9-d57f-4291-90c0-84286d227df3\") " pod="calico-system/goldmane-58fd7646b9-q57c4"
Aug 12 23:44:56.793691 kubelet[2715]: I0812 23:44:56.793677 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66frt\" (UniqueName: \"kubernetes.io/projected/285f6b1f-9c43-4330-9ef8-c92ed81153fb-kube-api-access-66frt\") pod \"calico-kube-controllers-5cbb45b49d-bnxpq\" (UID: \"285f6b1f-9c43-4330-9ef8-c92ed81153fb\") " pod="calico-system/calico-kube-controllers-5cbb45b49d-bnxpq"
Aug 12 23:44:56.793775 kubelet[2715]: I0812 23:44:56.793762 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f-calico-apiserver-certs\") pod \"calico-apiserver-7f8d4ddcc7-zgq55\" (UID: \"a9e9c567-de9b-4a68-9d2c-640b2dafaf2f\") " pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-zgq55"
Aug 12 23:44:56.793855 kubelet[2715]: I0812 23:44:56.793841 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90377728-1937-4fd3-8a8d-e255d8053f39-whisker-ca-bundle\") pod \"whisker-774b6f877d-xrgft\" (UID: \"90377728-1937-4fd3-8a8d-e255d8053f39\") " pod="calico-system/whisker-774b6f877d-xrgft"
Aug 12 23:44:56.793926 kubelet[2715]: I0812 23:44:56.793914 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jccm6\" (UniqueName: \"kubernetes.io/projected/f78d3697-5a7e-4c57-950b-a6d5080b5a5b-kube-api-access-jccm6\") pod \"coredns-7c65d6cfc9-drk6f\" (UID: \"f78d3697-5a7e-4c57-950b-a6d5080b5a5b\") " pod="kube-system/coredns-7c65d6cfc9-drk6f"
Aug 12 23:44:56.794018 kubelet[2715]: I0812 23:44:56.794005 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/241f24c8-4f1a-4329-8952-6c015f2ffbc0-calico-apiserver-certs\") pod \"calico-apiserver-7f8d4ddcc7-r7f9g\" (UID: \"241f24c8-4f1a-4329-8952-6c015f2ffbc0\") " pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-r7f9g"
Aug 12 23:44:56.794102 kubelet[2715]: I0812 23:44:56.794086 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f2d0d9-d57f-4291-90c0-84286d227df3-config\") pod \"goldmane-58fd7646b9-q57c4\" (UID: \"24f2d0d9-d57f-4291-90c0-84286d227df3\") " pod="calico-system/goldmane-58fd7646b9-q57c4"
Aug 12 23:44:56.794174 kubelet[2715]: I0812 23:44:56.794162 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/285f6b1f-9c43-4330-9ef8-c92ed81153fb-tigera-ca-bundle\") pod \"calico-kube-controllers-5cbb45b49d-bnxpq\" (UID: \"285f6b1f-9c43-4330-9ef8-c92ed81153fb\") " pod="calico-system/calico-kube-controllers-5cbb45b49d-bnxpq"
Aug 12 23:44:56.794289 kubelet[2715]: I0812 23:44:56.794230 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/24f2d0d9-d57f-4291-90c0-84286d227df3-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-q57c4\" (UID: \"24f2d0d9-d57f-4291-90c0-84286d227df3\") " pod="calico-system/goldmane-58fd7646b9-q57c4"
Aug 12 23:44:56.794376 kubelet[2715]: I0812 23:44:56.794361 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns94x\" (UniqueName: \"kubernetes.io/projected/241f24c8-4f1a-4329-8952-6c015f2ffbc0-kube-api-access-ns94x\") pod \"calico-apiserver-7f8d4ddcc7-r7f9g\" (UID: \"241f24c8-4f1a-4329-8952-6c015f2ffbc0\") " pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-r7f9g"
Aug 12 23:44:56.794462 kubelet[2715]: I0812 23:44:56.794437 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/90377728-1937-4fd3-8a8d-e255d8053f39-whisker-backend-key-pair\") pod \"whisker-774b6f877d-xrgft\" (UID: \"90377728-1937-4fd3-8a8d-e255d8053f39\") " pod="calico-system/whisker-774b6f877d-xrgft"
Aug 12 23:44:56.794581 kubelet[2715]: I0812 23:44:56.794532 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a62fab08-ab88-42e3-ab7b-29db68e0485c-config-volume\") pod \"coredns-7c65d6cfc9-dbjg4\" (UID: \"a62fab08-ab88-42e3-ab7b-29db68e0485c\") " pod="kube-system/coredns-7c65d6cfc9-dbjg4"
Aug 12 23:44:56.794677 kubelet[2715]: I0812 23:44:56.794662 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f78d3697-5a7e-4c57-950b-a6d5080b5a5b-config-volume\") pod \"coredns-7c65d6cfc9-drk6f\" (UID: \"f78d3697-5a7e-4c57-950b-a6d5080b5a5b\") " pod="kube-system/coredns-7c65d6cfc9-drk6f"
Aug 12 23:44:56.794916 kubelet[2715]: I0812 23:44:56.794743 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mzl6\" (UniqueName: \"kubernetes.io/projected/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f-kube-api-access-2mzl6\") pod \"calico-apiserver-7f8d4ddcc7-zgq55\" (UID: \"a9e9c567-de9b-4a68-9d2c-640b2dafaf2f\") " pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-zgq55"
Aug 12 23:44:56.795011 kubelet[2715]: I0812 23:44:56.794997 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llwhf\" (UniqueName: \"kubernetes.io/projected/a62fab08-ab88-42e3-ab7b-29db68e0485c-kube-api-access-llwhf\") pod \"coredns-7c65d6cfc9-dbjg4\" (UID: \"a62fab08-ab88-42e3-ab7b-29db68e0485c\") " pod="kube-system/coredns-7c65d6cfc9-dbjg4"
Aug 12 23:44:56.795260 kubelet[2715]: I0812 23:44:56.795229 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d4e2f489-94af-4e39-ac07-2a80367a7943-calico-apiserver-certs\") pod \"calico-apiserver-6464b99944-nztd8\" (UID: \"d4e2f489-94af-4e39-ac07-2a80367a7943\") " pod="calico-apiserver/calico-apiserver-6464b99944-nztd8"
Aug 12 23:44:56.796722 kubelet[2715]: I0812 23:44:56.795352 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9njl7\" (UniqueName: \"kubernetes.io/projected/90377728-1937-4fd3-8a8d-e255d8053f39-kube-api-access-9njl7\") pod \"whisker-774b6f877d-xrgft\" (UID: \"90377728-1937-4fd3-8a8d-e255d8053f39\") " pod="calico-system/whisker-774b6f877d-xrgft"
Aug 12 23:44:56.974021 containerd[1506]: time="2025-08-12T23:44:56.973265849Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cbb45b49d-bnxpq,Uid:285f6b1f-9c43-4330-9ef8-c92ed81153fb,Namespace:calico-system,Attempt:0,}"
Aug 12 23:44:56.988277 containerd[1506]: time="2025-08-12T23:44:56.988208128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6464b99944-nztd8,Uid:d4e2f489-94af-4e39-ac07-2a80367a7943,Namespace:calico-apiserver,Attempt:0,}"
Aug 12 23:44:57.015799 containerd[1506]: time="2025-08-12T23:44:57.015607294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f8d4ddcc7-r7f9g,Uid:241f24c8-4f1a-4329-8952-6c015f2ffbc0,Namespace:calico-apiserver,Attempt:0,}"
Aug 12 23:44:57.032310 containerd[1506]: time="2025-08-12T23:44:57.032270216Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-774b6f877d-xrgft,Uid:90377728-1937-4fd3-8a8d-e255d8053f39,Namespace:calico-system,Attempt:0,}"
Aug 12 23:44:57.097429 systemd[1]: Created slice kubepods-besteffort-pode6d039b0_18c9_437e_88dc_fc77d8757ee7.slice - libcontainer container kubepods-besteffort-pode6d039b0_18c9_437e_88dc_fc77d8757ee7.slice.
Aug 12 23:44:57.102463 containerd[1506]: time="2025-08-12T23:44:57.102409072Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-znzb2,Uid:e6d039b0-18c9-437e-88dc-fc77d8757ee7,Namespace:calico-system,Attempt:0,}"
Aug 12 23:44:57.131005 containerd[1506]: time="2025-08-12T23:44:57.130922422Z" level=error msg="Failed to destroy network for sandbox \"d292aae70966f968e22a085ae128b88121743c6c1d2fa2944fccf47273563846\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.138928 containerd[1506]: time="2025-08-12T23:44:57.138485816Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6464b99944-nztd8,Uid:d4e2f489-94af-4e39-ac07-2a80367a7943,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d292aae70966f968e22a085ae128b88121743c6c1d2fa2944fccf47273563846\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.140934 kubelet[2715]: E0812 23:44:57.139294 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d292aae70966f968e22a085ae128b88121743c6c1d2fa2944fccf47273563846\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.140934 kubelet[2715]: E0812 23:44:57.140508 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d292aae70966f968e22a085ae128b88121743c6c1d2fa2944fccf47273563846\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6464b99944-nztd8"
Aug 12 23:44:57.140934 kubelet[2715]: E0812 23:44:57.140532 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d292aae70966f968e22a085ae128b88121743c6c1d2fa2944fccf47273563846\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6464b99944-nztd8"
Aug 12 23:44:57.141153 kubelet[2715]: E0812 23:44:57.140601 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6464b99944-nztd8_calico-apiserver(d4e2f489-94af-4e39-ac07-2a80367a7943)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6464b99944-nztd8_calico-apiserver(d4e2f489-94af-4e39-ac07-2a80367a7943)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d292aae70966f968e22a085ae128b88121743c6c1d2fa2944fccf47273563846\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6464b99944-nztd8" podUID="d4e2f489-94af-4e39-ac07-2a80367a7943"
Aug 12 23:44:57.153138 containerd[1506]: time="2025-08-12T23:44:57.153078423Z" level=error msg="Failed to destroy network for sandbox \"b79a2330816b18386416622ca33f54b5f3a3dacadbbb86904b1b1161d333fa1c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.155542 containerd[1506]: time="2025-08-12T23:44:57.155402067Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cbb45b49d-bnxpq,Uid:285f6b1f-9c43-4330-9ef8-c92ed81153fb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b79a2330816b18386416622ca33f54b5f3a3dacadbbb86904b1b1161d333fa1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.156134 kubelet[2715]: E0812 23:44:57.156093 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b79a2330816b18386416622ca33f54b5f3a3dacadbbb86904b1b1161d333fa1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.156527 kubelet[2715]: E0812 23:44:57.156295 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b79a2330816b18386416622ca33f54b5f3a3dacadbbb86904b1b1161d333fa1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cbb45b49d-bnxpq"
Aug 12 23:44:57.156527 kubelet[2715]: E0812 23:44:57.156334 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b79a2330816b18386416622ca33f54b5f3a3dacadbbb86904b1b1161d333fa1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5cbb45b49d-bnxpq"
Aug 12 23:44:57.156527 kubelet[2715]: E0812 23:44:57.156482 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5cbb45b49d-bnxpq_calico-system(285f6b1f-9c43-4330-9ef8-c92ed81153fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5cbb45b49d-bnxpq_calico-system(285f6b1f-9c43-4330-9ef8-c92ed81153fb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b79a2330816b18386416622ca33f54b5f3a3dacadbbb86904b1b1161d333fa1c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5cbb45b49d-bnxpq" podUID="285f6b1f-9c43-4330-9ef8-c92ed81153fb"
Aug 12 23:44:57.174109 containerd[1506]: time="2025-08-12T23:44:57.174054422Z" level=error msg="Failed to destroy network for sandbox \"03e4f523d2a526fe7d7f32e793d0048825a8bdefd05c4cce4c1637b8a3cfb937\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.176912 containerd[1506]: time="2025-08-12T23:44:57.176850203Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-774b6f877d-xrgft,Uid:90377728-1937-4fd3-8a8d-e255d8053f39,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"03e4f523d2a526fe7d7f32e793d0048825a8bdefd05c4cce4c1637b8a3cfb937\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.177227 kubelet[2715]: E0812 23:44:57.177183 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03e4f523d2a526fe7d7f32e793d0048825a8bdefd05c4cce4c1637b8a3cfb937\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.177448 kubelet[2715]: E0812 23:44:57.177418 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03e4f523d2a526fe7d7f32e793d0048825a8bdefd05c4cce4c1637b8a3cfb937\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-774b6f877d-xrgft"
Aug 12 23:44:57.177737 kubelet[2715]: E0812 23:44:57.177673 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"03e4f523d2a526fe7d7f32e793d0048825a8bdefd05c4cce4c1637b8a3cfb937\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-774b6f877d-xrgft"
Aug 12 23:44:57.178333 kubelet[2715]: E0812 23:44:57.178287 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-774b6f877d-xrgft_calico-system(90377728-1937-4fd3-8a8d-e255d8053f39)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-774b6f877d-xrgft_calico-system(90377728-1937-4fd3-8a8d-e255d8053f39)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"03e4f523d2a526fe7d7f32e793d0048825a8bdefd05c4cce4c1637b8a3cfb937\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-774b6f877d-xrgft" podUID="90377728-1937-4fd3-8a8d-e255d8053f39"
Aug 12 23:44:57.180402 containerd[1506]: time="2025-08-12T23:44:57.180275607Z" level=error msg="Failed to destroy network for sandbox \"dd902fe0122699e2416fa61bffd9f8812292a8089d26e6c13441ffa6c8e7f56b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.182292 containerd[1506]: time="2025-08-12T23:44:57.182067431Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f8d4ddcc7-r7f9g,Uid:241f24c8-4f1a-4329-8952-6c015f2ffbc0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd902fe0122699e2416fa61bffd9f8812292a8089d26e6c13441ffa6c8e7f56b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.184041 kubelet[2715]: E0812 23:44:57.183665 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd902fe0122699e2416fa61bffd9f8812292a8089d26e6c13441ffa6c8e7f56b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.184041 kubelet[2715]: E0812 23:44:57.183726 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd902fe0122699e2416fa61bffd9f8812292a8089d26e6c13441ffa6c8e7f56b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-r7f9g"
Aug 12 23:44:57.184041 kubelet[2715]: E0812 23:44:57.183776 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd902fe0122699e2416fa61bffd9f8812292a8089d26e6c13441ffa6c8e7f56b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-r7f9g"
Aug 12 23:44:57.184191 kubelet[2715]: E0812 23:44:57.183819 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f8d4ddcc7-r7f9g_calico-apiserver(241f24c8-4f1a-4329-8952-6c015f2ffbc0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7f8d4ddcc7-r7f9g_calico-apiserver(241f24c8-4f1a-4329-8952-6c015f2ffbc0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd902fe0122699e2416fa61bffd9f8812292a8089d26e6c13441ffa6c8e7f56b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-r7f9g" podUID="241f24c8-4f1a-4329-8952-6c015f2ffbc0"
Aug 12 23:44:57.199009 containerd[1506]: time="2025-08-12T23:44:57.198960002Z" level=error msg="Failed to destroy network for sandbox \"8af609aefe4c88102fffb0c14f0274317318958e7052c945c39b540be2623d08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.201511 containerd[1506]: time="2025-08-12T23:44:57.200702745Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-znzb2,Uid:e6d039b0-18c9-437e-88dc-fc77d8757ee7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8af609aefe4c88102fffb0c14f0274317318958e7052c945c39b540be2623d08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.201713 kubelet[2715]: E0812 23:44:57.201019 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8af609aefe4c88102fffb0c14f0274317318958e7052c945c39b540be2623d08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.201713 kubelet[2715]: E0812 23:44:57.201085 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8af609aefe4c88102fffb0c14f0274317318958e7052c945c39b540be2623d08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-znzb2"
Aug 12 23:44:57.201713 kubelet[2715]: E0812 23:44:57.201130 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8af609aefe4c88102fffb0c14f0274317318958e7052c945c39b540be2623d08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-znzb2"
Aug 12 23:44:57.201879 kubelet[2715]: E0812 23:44:57.201186 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-znzb2_calico-system(e6d039b0-18c9-437e-88dc-fc77d8757ee7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-znzb2_calico-system(e6d039b0-18c9-437e-88dc-fc77d8757ee7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8af609aefe4c88102fffb0c14f0274317318958e7052c945c39b540be2623d08\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-znzb2" podUID="e6d039b0-18c9-437e-88dc-fc77d8757ee7"
Aug 12 23:44:57.234292 containerd[1506]: time="2025-08-12T23:44:57.234115353Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-dbjg4,Uid:a62fab08-ab88-42e3-ab7b-29db68e0485c,Namespace:kube-system,Attempt:0,}"
Aug 12 23:44:57.248574 containerd[1506]: time="2025-08-12T23:44:57.248349748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-drk6f,Uid:f78d3697-5a7e-4c57-950b-a6d5080b5a5b,Namespace:kube-system,Attempt:0,}"
Aug 12 23:44:57.257672 containerd[1506]: time="2025-08-12T23:44:57.257631003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f8d4ddcc7-zgq55,Uid:a9e9c567-de9b-4a68-9d2c-640b2dafaf2f,Namespace:calico-apiserver,Attempt:0,}"
Aug 12 23:44:57.271832 containerd[1506]: time="2025-08-12T23:44:57.271737953Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\""
Aug 12 23:44:57.362665 containerd[1506]: time="2025-08-12T23:44:57.362607558Z" level=error msg="Failed to destroy network for sandbox \"7fac785ef270d7e5813bad83b064fe1721dec0c63b1583585de2c0f8d71accb3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.365295 containerd[1506]: time="2025-08-12T23:44:57.365215453Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-dbjg4,Uid:a62fab08-ab88-42e3-ab7b-29db68e0485c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fac785ef270d7e5813bad83b064fe1721dec0c63b1583585de2c0f8d71accb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.366308 kubelet[2715]: E0812 23:44:57.365757 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fac785ef270d7e5813bad83b064fe1721dec0c63b1583585de2c0f8d71accb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.366308 kubelet[2715]: E0812 23:44:57.365826 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fac785ef270d7e5813bad83b064fe1721dec0c63b1583585de2c0f8d71accb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-dbjg4"
Aug 12 23:44:57.366308 kubelet[2715]: E0812 23:44:57.365849 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7fac785ef270d7e5813bad83b064fe1721dec0c63b1583585de2c0f8d71accb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-dbjg4"
Aug 12 23:44:57.366538 kubelet[2715]: E0812 23:44:57.365897 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-dbjg4_kube-system(a62fab08-ab88-42e3-ab7b-29db68e0485c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-dbjg4_kube-system(a62fab08-ab88-42e3-ab7b-29db68e0485c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7fac785ef270d7e5813bad83b064fe1721dec0c63b1583585de2c0f8d71accb3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-dbjg4" podUID="a62fab08-ab88-42e3-ab7b-29db68e0485c"
Aug 12 23:44:57.378773 containerd[1506]: time="2025-08-12T23:44:57.378669019Z" level=error msg="Failed to destroy network for sandbox \"fd618f984d48fb2d10f746beb2f6e8ac143abb9e718bc668528e7c91b9e2f5ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.382742 containerd[1506]: time="2025-08-12T23:44:57.382436995Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-drk6f,Uid:f78d3697-5a7e-4c57-950b-a6d5080b5a5b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd618f984d48fb2d10f746beb2f6e8ac143abb9e718bc668528e7c91b9e2f5ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.383334 kubelet[2715]: E0812 23:44:57.383283 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd618f984d48fb2d10f746beb2f6e8ac143abb9e718bc668528e7c91b9e2f5ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.383536 kubelet[2715]: E0812 23:44:57.383503 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd618f984d48fb2d10f746beb2f6e8ac143abb9e718bc668528e7c91b9e2f5ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-drk6f"
Aug 12 23:44:57.384282 kubelet[2715]: E0812 23:44:57.383680 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fd618f984d48fb2d10f746beb2f6e8ac143abb9e718bc668528e7c91b9e2f5ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-drk6f"
Aug 12 23:44:57.384857 kubelet[2715]: E0812 23:44:57.384783 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-drk6f_kube-system(f78d3697-5a7e-4c57-950b-a6d5080b5a5b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-drk6f_kube-system(f78d3697-5a7e-4c57-950b-a6d5080b5a5b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fd618f984d48fb2d10f746beb2f6e8ac143abb9e718bc668528e7c91b9e2f5ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-drk6f" podUID="f78d3697-5a7e-4c57-950b-a6d5080b5a5b"
Aug 12 23:44:57.391829 containerd[1506]: time="2025-08-12T23:44:57.391695490Z" level=error msg="Failed to destroy network for sandbox \"f86457e1a1f68f27ec1e74e9b8b049abd83949fe2ad25ec9bfd6e7f05f8ba472\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.393953 containerd[1506]: time="2025-08-12T23:44:57.393900130Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f8d4ddcc7-zgq55,Uid:a9e9c567-de9b-4a68-9d2c-640b2dafaf2f,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f86457e1a1f68f27ec1e74e9b8b049abd83949fe2ad25ec9bfd6e7f05f8ba472\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.394445 kubelet[2715]: E0812 23:44:57.394135 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f86457e1a1f68f27ec1e74e9b8b049abd83949fe2ad25ec9bfd6e7f05f8ba472\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.394445 kubelet[2715]: E0812 23:44:57.394196 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f86457e1a1f68f27ec1e74e9b8b049abd83949fe2ad25ec9bfd6e7f05f8ba472\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-zgq55"
Aug 12 23:44:57.394445 kubelet[2715]: E0812 23:44:57.394226 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f86457e1a1f68f27ec1e74e9b8b049abd83949fe2ad25ec9bfd6e7f05f8ba472\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-zgq55"
Aug 12 23:44:57.394998 kubelet[2715]: E0812 23:44:57.394736 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7f8d4ddcc7-zgq55_calico-apiserver(a9e9c567-de9b-4a68-9d2c-640b2dafaf2f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7f8d4ddcc7-zgq55_calico-apiserver(a9e9c567-de9b-4a68-9d2c-640b2dafaf2f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f86457e1a1f68f27ec1e74e9b8b049abd83949fe2ad25ec9bfd6e7f05f8ba472\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-zgq55" podUID="a9e9c567-de9b-4a68-9d2c-640b2dafaf2f"
Aug 12 23:44:57.921453 containerd[1506]: time="2025-08-12T23:44:57.921256115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-q57c4,Uid:24f2d0d9-d57f-4291-90c0-84286d227df3,Namespace:calico-system,Attempt:0,}"
Aug 12 23:44:57.989455 containerd[1506]: time="2025-08-12T23:44:57.989349617Z" level=error msg="Failed to destroy network for sandbox \"0df6b551e659ea1a7643222dc64eb8f6f4219f7da0b5ef032433c5854fd9f8a6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.992090 systemd[1]: run-netns-cni\x2d28329998\x2d196e\x2d68f7\x2de9f6\x2d18f43e9cbd5b.mount: Deactivated successfully.
Aug 12 23:44:57.993774 containerd[1506]: time="2025-08-12T23:44:57.993431604Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-q57c4,Uid:24f2d0d9-d57f-4291-90c0-84286d227df3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df6b551e659ea1a7643222dc64eb8f6f4219f7da0b5ef032433c5854fd9f8a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.993936 kubelet[2715]: E0812 23:44:57.993861 2715 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df6b551e659ea1a7643222dc64eb8f6f4219f7da0b5ef032433c5854fd9f8a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Aug 12 23:44:57.993936 kubelet[2715]: E0812 23:44:57.993928 2715 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df6b551e659ea1a7643222dc64eb8f6f4219f7da0b5ef032433c5854fd9f8a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-q57c4"
Aug 12 23:44:57.994219 kubelet[2715]: E0812 23:44:57.993947 2715 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0df6b551e659ea1a7643222dc64eb8f6f4219f7da0b5ef032433c5854fd9f8a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
pod="calico-system/goldmane-58fd7646b9-q57c4" Aug 12 23:44:57.994219 kubelet[2715]: E0812 23:44:57.994001 2715 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-q57c4_calico-system(24f2d0d9-d57f-4291-90c0-84286d227df3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-q57c4_calico-system(24f2d0d9-d57f-4291-90c0-84286d227df3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0df6b551e659ea1a7643222dc64eb8f6f4219f7da0b5ef032433c5854fd9f8a6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-q57c4" podUID="24f2d0d9-d57f-4291-90c0-84286d227df3" Aug 12 23:45:00.254563 kubelet[2715]: I0812 23:45:00.254295 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 12 23:45:01.822285 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3047964818.mount: Deactivated successfully. 
Aug 12 23:45:01.851620 containerd[1506]: time="2025-08-12T23:45:01.851371925Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:01.852790 containerd[1506]: time="2025-08-12T23:45:01.852738728Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Aug 12 23:45:01.853637 containerd[1506]: time="2025-08-12T23:45:01.853594476Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:01.856173 containerd[1506]: time="2025-08-12T23:45:01.856073555Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:01.856991 containerd[1506]: time="2025-08-12T23:45:01.856950783Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 4.58497322s" Aug 12 23:45:01.857093 containerd[1506]: time="2025-08-12T23:45:01.857075507Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Aug 12 23:45:01.879711 containerd[1506]: time="2025-08-12T23:45:01.879658146Z" level=info msg="CreateContainer within sandbox \"3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 12 23:45:01.915173 containerd[1506]: time="2025-08-12T23:45:01.915121197Z" level=info msg="Container 
769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:01.937909 containerd[1506]: time="2025-08-12T23:45:01.937856482Z" level=info msg="CreateContainer within sandbox \"3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\"" Aug 12 23:45:01.938837 containerd[1506]: time="2025-08-12T23:45:01.938787111Z" level=info msg="StartContainer for \"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\"" Aug 12 23:45:01.941244 containerd[1506]: time="2025-08-12T23:45:01.941185308Z" level=info msg="connecting to shim 769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab" address="unix:///run/containerd/s/86967d9467dc7f0b6454055d38c79265bfa02b575d9385748022299ef5327b2d" protocol=ttrpc version=3 Aug 12 23:45:02.004530 systemd[1]: Started cri-containerd-769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab.scope - libcontainer container 769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab. Aug 12 23:45:02.059432 containerd[1506]: time="2025-08-12T23:45:02.059347021Z" level=info msg="StartContainer for \"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" returns successfully" Aug 12 23:45:02.208572 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 12 23:45:02.208688 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Aug 12 23:45:02.320853 kubelet[2715]: I0812 23:45:02.320769 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9mp88" podStartSLOduration=1.939336516 podStartE2EDuration="14.320750477s" podCreationTimestamp="2025-08-12 23:44:48 +0000 UTC" firstStartedPulling="2025-08-12 23:44:49.476798102 +0000 UTC m=+24.536360655" lastFinishedPulling="2025-08-12 23:45:01.858212063 +0000 UTC m=+36.917774616" observedRunningTime="2025-08-12 23:45:02.319923932 +0000 UTC m=+37.379486525" watchObservedRunningTime="2025-08-12 23:45:02.320750477 +0000 UTC m=+37.380313030" Aug 12 23:45:02.446410 kubelet[2715]: I0812 23:45:02.445707 2715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/90377728-1937-4fd3-8a8d-e255d8053f39-whisker-backend-key-pair\") pod \"90377728-1937-4fd3-8a8d-e255d8053f39\" (UID: \"90377728-1937-4fd3-8a8d-e255d8053f39\") " Aug 12 23:45:02.446410 kubelet[2715]: I0812 23:45:02.445768 2715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9njl7\" (UniqueName: \"kubernetes.io/projected/90377728-1937-4fd3-8a8d-e255d8053f39-kube-api-access-9njl7\") pod \"90377728-1937-4fd3-8a8d-e255d8053f39\" (UID: \"90377728-1937-4fd3-8a8d-e255d8053f39\") " Aug 12 23:45:02.446410 kubelet[2715]: I0812 23:45:02.445825 2715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90377728-1937-4fd3-8a8d-e255d8053f39-whisker-ca-bundle\") pod \"90377728-1937-4fd3-8a8d-e255d8053f39\" (UID: \"90377728-1937-4fd3-8a8d-e255d8053f39\") " Aug 12 23:45:02.446864 kubelet[2715]: I0812 23:45:02.446824 2715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90377728-1937-4fd3-8a8d-e255d8053f39-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod 
"90377728-1937-4fd3-8a8d-e255d8053f39" (UID: "90377728-1937-4fd3-8a8d-e255d8053f39"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Aug 12 23:45:02.468211 kubelet[2715]: I0812 23:45:02.468152 2715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90377728-1937-4fd3-8a8d-e255d8053f39-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "90377728-1937-4fd3-8a8d-e255d8053f39" (UID: "90377728-1937-4fd3-8a8d-e255d8053f39"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 12 23:45:02.468596 kubelet[2715]: I0812 23:45:02.468525 2715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90377728-1937-4fd3-8a8d-e255d8053f39-kube-api-access-9njl7" (OuterVolumeSpecName: "kube-api-access-9njl7") pod "90377728-1937-4fd3-8a8d-e255d8053f39" (UID: "90377728-1937-4fd3-8a8d-e255d8053f39"). InnerVolumeSpecName "kube-api-access-9njl7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 12 23:45:02.546940 kubelet[2715]: I0812 23:45:02.546882 2715 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/90377728-1937-4fd3-8a8d-e255d8053f39-whisker-backend-key-pair\") on node \"ci-4372-1-0-9-13fe44d47a\" DevicePath \"\"" Aug 12 23:45:02.546940 kubelet[2715]: I0812 23:45:02.546926 2715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9njl7\" (UniqueName: \"kubernetes.io/projected/90377728-1937-4fd3-8a8d-e255d8053f39-kube-api-access-9njl7\") on node \"ci-4372-1-0-9-13fe44d47a\" DevicePath \"\"" Aug 12 23:45:02.546940 kubelet[2715]: I0812 23:45:02.546937 2715 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90377728-1937-4fd3-8a8d-e255d8053f39-whisker-ca-bundle\") on node \"ci-4372-1-0-9-13fe44d47a\" DevicePath \"\"" Aug 12 23:45:02.821848 systemd[1]: var-lib-kubelet-pods-90377728\x2d1937\x2d4fd3\x2d8a8d\x2de255d8053f39-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9njl7.mount: Deactivated successfully. Aug 12 23:45:02.822011 systemd[1]: var-lib-kubelet-pods-90377728\x2d1937\x2d4fd3\x2d8a8d\x2de255d8053f39-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 12 23:45:03.098433 systemd[1]: Removed slice kubepods-besteffort-pod90377728_1937_4fd3_8a8d_e255d8053f39.slice - libcontainer container kubepods-besteffort-pod90377728_1937_4fd3_8a8d_e255d8053f39.slice. Aug 12 23:45:03.299999 kubelet[2715]: I0812 23:45:03.299925 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 12 23:45:03.382262 systemd[1]: Created slice kubepods-besteffort-poda02830e2_5d47_44a5_88a8_259af6ad22f6.slice - libcontainer container kubepods-besteffort-poda02830e2_5d47_44a5_88a8_259af6ad22f6.slice. 
Aug 12 23:45:03.553349 kubelet[2715]: I0812 23:45:03.553261 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcfjh\" (UniqueName: \"kubernetes.io/projected/a02830e2-5d47-44a5-88a8-259af6ad22f6-kube-api-access-jcfjh\") pod \"whisker-7dd654f94b-k8v4p\" (UID: \"a02830e2-5d47-44a5-88a8-259af6ad22f6\") " pod="calico-system/whisker-7dd654f94b-k8v4p" Aug 12 23:45:03.553349 kubelet[2715]: I0812 23:45:03.553321 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/a02830e2-5d47-44a5-88a8-259af6ad22f6-whisker-backend-key-pair\") pod \"whisker-7dd654f94b-k8v4p\" (UID: \"a02830e2-5d47-44a5-88a8-259af6ad22f6\") " pod="calico-system/whisker-7dd654f94b-k8v4p" Aug 12 23:45:03.553349 kubelet[2715]: I0812 23:45:03.553353 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a02830e2-5d47-44a5-88a8-259af6ad22f6-whisker-ca-bundle\") pod \"whisker-7dd654f94b-k8v4p\" (UID: \"a02830e2-5d47-44a5-88a8-259af6ad22f6\") " pod="calico-system/whisker-7dd654f94b-k8v4p" Aug 12 23:45:03.687529 containerd[1506]: time="2025-08-12T23:45:03.687081011Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7dd654f94b-k8v4p,Uid:a02830e2-5d47-44a5-88a8-259af6ad22f6,Namespace:calico-system,Attempt:0,}" Aug 12 23:45:03.887236 systemd-networkd[1419]: calie8682fc2064: Link UP Aug 12 23:45:03.889166 systemd-networkd[1419]: calie8682fc2064: Gained carrier Aug 12 23:45:03.918611 containerd[1506]: 2025-08-12 23:45:03.715 [INFO][3807] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 12 23:45:03.918611 containerd[1506]: 2025-08-12 23:45:03.757 [INFO][3807] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0 whisker-7dd654f94b- calico-system a02830e2-5d47-44a5-88a8-259af6ad22f6 927 0 2025-08-12 23:45:03 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7dd654f94b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a whisker-7dd654f94b-k8v4p eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie8682fc2064 [] [] }} ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Namespace="calico-system" Pod="whisker-7dd654f94b-k8v4p" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-" Aug 12 23:45:03.918611 containerd[1506]: 2025-08-12 23:45:03.757 [INFO][3807] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Namespace="calico-system" Pod="whisker-7dd654f94b-k8v4p" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" Aug 12 23:45:03.918611 containerd[1506]: 2025-08-12 23:45:03.820 [INFO][3859] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" HandleID="k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Workload="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.820 [INFO][3859] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" HandleID="k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Workload="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003a37c0), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"whisker-7dd654f94b-k8v4p", "timestamp":"2025-08-12 23:45:03.820382666 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.820 [INFO][3859] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.820 [INFO][3859] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.820 [INFO][3859] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.834 [INFO][3859] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.841 [INFO][3859] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.847 [INFO][3859] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.850 [INFO][3859] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.918853 containerd[1506]: 2025-08-12 23:45:03.852 [INFO][3859] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.919030 containerd[1506]: 2025-08-12 23:45:03.852 [INFO][3859] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 
handle="k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.919030 containerd[1506]: 2025-08-12 23:45:03.855 [INFO][3859] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9 Aug 12 23:45:03.919030 containerd[1506]: 2025-08-12 23:45:03.862 [INFO][3859] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.919030 containerd[1506]: 2025-08-12 23:45:03.870 [INFO][3859] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.65/26] block=192.168.3.64/26 handle="k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.919030 containerd[1506]: 2025-08-12 23:45:03.870 [INFO][3859] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.65/26] handle="k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:03.919030 containerd[1506]: 2025-08-12 23:45:03.870 [INFO][3859] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 12 23:45:03.919030 containerd[1506]: 2025-08-12 23:45:03.870 [INFO][3859] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.65/26] IPv6=[] ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" HandleID="k8s-pod-network.3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Workload="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" Aug 12 23:45:03.919152 containerd[1506]: 2025-08-12 23:45:03.874 [INFO][3807] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Namespace="calico-system" Pod="whisker-7dd654f94b-k8v4p" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0", GenerateName:"whisker-7dd654f94b-", Namespace:"calico-system", SelfLink:"", UID:"a02830e2-5d47-44a5-88a8-259af6ad22f6", ResourceVersion:"927", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 45, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7dd654f94b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"whisker-7dd654f94b-k8v4p", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.3.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"calie8682fc2064", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:03.919152 containerd[1506]: 2025-08-12 23:45:03.874 [INFO][3807] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.65/32] ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Namespace="calico-system" Pod="whisker-7dd654f94b-k8v4p" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" Aug 12 23:45:03.919217 containerd[1506]: 2025-08-12 23:45:03.875 [INFO][3807] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie8682fc2064 ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Namespace="calico-system" Pod="whisker-7dd654f94b-k8v4p" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" Aug 12 23:45:03.919217 containerd[1506]: 2025-08-12 23:45:03.891 [INFO][3807] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Namespace="calico-system" Pod="whisker-7dd654f94b-k8v4p" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" Aug 12 23:45:03.919256 containerd[1506]: 2025-08-12 23:45:03.891 [INFO][3807] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Namespace="calico-system" Pod="whisker-7dd654f94b-k8v4p" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0", GenerateName:"whisker-7dd654f94b-", Namespace:"calico-system", SelfLink:"", 
UID:"a02830e2-5d47-44a5-88a8-259af6ad22f6", ResourceVersion:"927", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 45, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7dd654f94b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9", Pod:"whisker-7dd654f94b-k8v4p", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.3.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie8682fc2064", MAC:"8e:fd:a2:dd:42:6e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:03.919302 containerd[1506]: 2025-08-12 23:45:03.914 [INFO][3807] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" Namespace="calico-system" Pod="whisker-7dd654f94b-k8v4p" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-whisker--7dd654f94b--k8v4p-eth0" Aug 12 23:45:03.998057 containerd[1506]: time="2025-08-12T23:45:03.997708928Z" level=info msg="connecting to shim 3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9" address="unix:///run/containerd/s/78088b8727e818abe70b04362fc5cc268aae8e38a957ba05e1fbe0bfde9b7661" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:04.046747 systemd[1]: Started 
cri-containerd-3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9.scope - libcontainer container 3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9. Aug 12 23:45:04.105116 containerd[1506]: time="2025-08-12T23:45:04.104966875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7dd654f94b-k8v4p,Uid:a02830e2-5d47-44a5-88a8-259af6ad22f6,Namespace:calico-system,Attempt:0,} returns sandbox id \"3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9\"" Aug 12 23:45:04.108620 containerd[1506]: time="2025-08-12T23:45:04.107897521Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 12 23:45:04.543111 systemd-networkd[1419]: vxlan.calico: Link UP Aug 12 23:45:04.543127 systemd-networkd[1419]: vxlan.calico: Gained carrier Aug 12 23:45:05.093332 kubelet[2715]: I0812 23:45:05.093278 2715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90377728-1937-4fd3-8a8d-e255d8053f39" path="/var/lib/kubelet/pods/90377728-1937-4fd3-8a8d-e255d8053f39/volumes" Aug 12 23:45:05.764936 systemd-networkd[1419]: calie8682fc2064: Gained IPv6LL Aug 12 23:45:06.083835 systemd-networkd[1419]: vxlan.calico: Gained IPv6LL Aug 12 23:45:06.128460 containerd[1506]: time="2025-08-12T23:45:06.128376957Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:06.129848 containerd[1506]: time="2025-08-12T23:45:06.129183460Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Aug 12 23:45:06.131008 containerd[1506]: time="2025-08-12T23:45:06.130874187Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:06.134499 containerd[1506]: time="2025-08-12T23:45:06.134300362Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:06.136478 containerd[1506]: time="2025-08-12T23:45:06.135190587Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 2.02640772s" Aug 12 23:45:06.136478 containerd[1506]: time="2025-08-12T23:45:06.135233068Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Aug 12 23:45:06.140718 containerd[1506]: time="2025-08-12T23:45:06.140677060Z" level=info msg="CreateContainer within sandbox \"3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 12 23:45:06.150867 containerd[1506]: time="2025-08-12T23:45:06.150812983Z" level=info msg="Container f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:06.162735 containerd[1506]: time="2025-08-12T23:45:06.162537229Z" level=info msg="CreateContainer within sandbox \"3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01\"" Aug 12 23:45:06.163782 containerd[1506]: time="2025-08-12T23:45:06.163731703Z" level=info msg="StartContainer for \"f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01\"" Aug 12 23:45:06.165269 containerd[1506]: time="2025-08-12T23:45:06.165224624Z" level=info msg="connecting to shim 
f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01" address="unix:///run/containerd/s/78088b8727e818abe70b04362fc5cc268aae8e38a957ba05e1fbe0bfde9b7661" protocol=ttrpc version=3 Aug 12 23:45:06.195985 systemd[1]: Started cri-containerd-f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01.scope - libcontainer container f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01. Aug 12 23:45:06.249102 containerd[1506]: time="2025-08-12T23:45:06.249047801Z" level=info msg="StartContainer for \"f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01\" returns successfully" Aug 12 23:45:06.252179 containerd[1506]: time="2025-08-12T23:45:06.252008563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 12 23:45:09.042163 kubelet[2715]: I0812 23:45:09.041912 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 12 23:45:09.088210 containerd[1506]: time="2025-08-12T23:45:09.088025328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-q57c4,Uid:24f2d0d9-d57f-4291-90c0-84286d227df3,Namespace:calico-system,Attempt:0,}" Aug 12 23:45:09.248192 containerd[1506]: time="2025-08-12T23:45:09.248132495Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"bfb374c12d6608fa78798f41859b92d7d4c5e2a9a883fd42739ac0a4fb96f34f\" pid:4136 exited_at:{seconds:1755042309 nanos:246445051}" Aug 12 23:45:09.393321 systemd-networkd[1419]: cali676c77411f1: Link UP Aug 12 23:45:09.395107 systemd-networkd[1419]: cali676c77411f1: Gained carrier Aug 12 23:45:09.448380 containerd[1506]: 2025-08-12 23:45:09.202 [INFO][4129] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0 goldmane-58fd7646b9- calico-system 24f2d0d9-d57f-4291-90c0-84286d227df3 843 0 2025-08-12 23:44:49 
+0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a goldmane-58fd7646b9-q57c4 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali676c77411f1 [] [] }} ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Namespace="calico-system" Pod="goldmane-58fd7646b9-q57c4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-" Aug 12 23:45:09.448380 containerd[1506]: 2025-08-12 23:45:09.203 [INFO][4129] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Namespace="calico-system" Pod="goldmane-58fd7646b9-q57c4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" Aug 12 23:45:09.448380 containerd[1506]: 2025-08-12 23:45:09.271 [INFO][4154] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" HandleID="k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Workload="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.271 [INFO][4154] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" HandleID="k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Workload="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c160), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"goldmane-58fd7646b9-q57c4", "timestamp":"2025-08-12 23:45:09.270971049 +0000 UTC"}, 
Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.271 [INFO][4154] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.271 [INFO][4154] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.271 [INFO][4154] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.303 [INFO][4154] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.328 [INFO][4154] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.341 [INFO][4154] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.348 [INFO][4154] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449520 containerd[1506]: 2025-08-12 23:45:09.353 [INFO][4154] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449838 containerd[1506]: 2025-08-12 23:45:09.353 [INFO][4154] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449838 containerd[1506]: 2025-08-12 23:45:09.357 [INFO][4154] 
ipam/ipam.go 1764: Creating new handle: k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804 Aug 12 23:45:09.449838 containerd[1506]: 2025-08-12 23:45:09.366 [INFO][4154] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449838 containerd[1506]: 2025-08-12 23:45:09.379 [INFO][4154] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.66/26] block=192.168.3.64/26 handle="k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449838 containerd[1506]: 2025-08-12 23:45:09.379 [INFO][4154] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.66/26] handle="k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:09.449838 containerd[1506]: 2025-08-12 23:45:09.379 [INFO][4154] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 12 23:45:09.449838 containerd[1506]: 2025-08-12 23:45:09.379 [INFO][4154] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.66/26] IPv6=[] ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" HandleID="k8s-pod-network.cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Workload="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" Aug 12 23:45:09.449968 containerd[1506]: 2025-08-12 23:45:09.386 [INFO][4129] cni-plugin/k8s.go 418: Populated endpoint ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Namespace="calico-system" Pod="goldmane-58fd7646b9-q57c4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"24f2d0d9-d57f-4291-90c0-84286d227df3", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"goldmane-58fd7646b9-q57c4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.3.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali676c77411f1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:09.450023 containerd[1506]: 2025-08-12 23:45:09.387 [INFO][4129] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.66/32] ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Namespace="calico-system" Pod="goldmane-58fd7646b9-q57c4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" Aug 12 23:45:09.450023 containerd[1506]: 2025-08-12 23:45:09.387 [INFO][4129] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali676c77411f1 ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Namespace="calico-system" Pod="goldmane-58fd7646b9-q57c4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" Aug 12 23:45:09.450023 containerd[1506]: 2025-08-12 23:45:09.401 [INFO][4129] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Namespace="calico-system" Pod="goldmane-58fd7646b9-q57c4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" Aug 12 23:45:09.450088 containerd[1506]: 2025-08-12 23:45:09.407 [INFO][4129] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Namespace="calico-system" Pod="goldmane-58fd7646b9-q57c4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0", GenerateName:"goldmane-58fd7646b9-", 
Namespace:"calico-system", SelfLink:"", UID:"24f2d0d9-d57f-4291-90c0-84286d227df3", ResourceVersion:"843", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804", Pod:"goldmane-58fd7646b9-q57c4", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.3.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali676c77411f1", MAC:"82:e2:d1:d2:ee:c9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:09.450134 containerd[1506]: 2025-08-12 23:45:09.442 [INFO][4129] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" Namespace="calico-system" Pod="goldmane-58fd7646b9-q57c4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-goldmane--58fd7646b9--q57c4-eth0" Aug 12 23:45:09.519569 containerd[1506]: time="2025-08-12T23:45:09.517651669Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"e651fe9be2a7ced398a2d25479162ad67ebb72ccfb7184d5c4c3176c56b3bcc2\" pid:4173 exited_at:{seconds:1755042309 nanos:509660541}" Aug 12 23:45:09.538182 
containerd[1506]: time="2025-08-12T23:45:09.538114722Z" level=info msg="connecting to shim cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804" address="unix:///run/containerd/s/f7fd641bca54eeff8a16a495a8585b7b214acd362f2af44b9cf52eacd96a18a3" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:09.593296 systemd[1]: Started cri-containerd-cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804.scope - libcontainer container cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804. Aug 12 23:45:09.712104 containerd[1506]: time="2025-08-12T23:45:09.711251228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-q57c4,Uid:24f2d0d9-d57f-4291-90c0-84286d227df3,Namespace:calico-system,Attempt:0,} returns sandbox id \"cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804\"" Aug 12 23:45:09.762127 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4096822017.mount: Deactivated successfully. Aug 12 23:45:09.784611 containerd[1506]: time="2025-08-12T23:45:09.783929879Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:09.785899 containerd[1506]: time="2025-08-12T23:45:09.785860290Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Aug 12 23:45:09.786862 containerd[1506]: time="2025-08-12T23:45:09.786827635Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:09.791099 containerd[1506]: time="2025-08-12T23:45:09.791066745Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:09.793399 containerd[1506]: 
time="2025-08-12T23:45:09.793226921Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 3.5405791s" Aug 12 23:45:09.793749 containerd[1506]: time="2025-08-12T23:45:09.793597291Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Aug 12 23:45:09.795979 containerd[1506]: time="2025-08-12T23:45:09.795854670Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 12 23:45:09.799997 containerd[1506]: time="2025-08-12T23:45:09.799800852Z" level=info msg="CreateContainer within sandbox \"3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 12 23:45:09.807747 containerd[1506]: time="2025-08-12T23:45:09.807673697Z" level=info msg="Container 787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:09.822633 containerd[1506]: time="2025-08-12T23:45:09.822554965Z" level=info msg="CreateContainer within sandbox \"3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288\"" Aug 12 23:45:09.823929 containerd[1506]: time="2025-08-12T23:45:09.823886879Z" level=info msg="StartContainer for \"787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288\"" Aug 12 23:45:09.825725 containerd[1506]: time="2025-08-12T23:45:09.825625364Z" level=info msg="connecting to shim 
787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288" address="unix:///run/containerd/s/78088b8727e818abe70b04362fc5cc268aae8e38a957ba05e1fbe0bfde9b7661" protocol=ttrpc version=3 Aug 12 23:45:09.910800 systemd[1]: Started cri-containerd-787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288.scope - libcontainer container 787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288. Aug 12 23:45:09.962624 containerd[1506]: time="2025-08-12T23:45:09.962474046Z" level=info msg="StartContainer for \"787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288\" returns successfully" Aug 12 23:45:10.087842 containerd[1506]: time="2025-08-12T23:45:10.087780460Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f8d4ddcc7-r7f9g,Uid:241f24c8-4f1a-4329-8952-6c015f2ffbc0,Namespace:calico-apiserver,Attempt:0,}" Aug 12 23:45:10.088351 containerd[1506]: time="2025-08-12T23:45:10.088088988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6464b99944-nztd8,Uid:d4e2f489-94af-4e39-ac07-2a80367a7943,Namespace:calico-apiserver,Attempt:0,}" Aug 12 23:45:10.259168 systemd-networkd[1419]: cali77a5728c533: Link UP Aug 12 23:45:10.259527 systemd-networkd[1419]: cali77a5728c533: Gained carrier Aug 12 23:45:10.288569 containerd[1506]: 2025-08-12 23:45:10.156 [INFO][4287] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0 calico-apiserver-7f8d4ddcc7- calico-apiserver 241f24c8-4f1a-4329-8952-6c015f2ffbc0 851 0 2025-08-12 23:44:44 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7f8d4ddcc7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a calico-apiserver-7f8d4ddcc7-r7f9g eth0 calico-apiserver [] 
[] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali77a5728c533 [] [] }} ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-r7f9g" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-" Aug 12 23:45:10.288569 containerd[1506]: 2025-08-12 23:45:10.156 [INFO][4287] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-r7f9g" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:45:10.288569 containerd[1506]: 2025-08-12 23:45:10.193 [INFO][4310] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.194 [INFO][4310] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d38e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"calico-apiserver-7f8d4ddcc7-r7f9g", "timestamp":"2025-08-12 23:45:10.193424473 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} 
Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.194 [INFO][4310] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.194 [INFO][4310] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.194 [INFO][4310] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.208 [INFO][4310] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.215 [INFO][4310] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.221 [INFO][4310] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.225 [INFO][4310] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288782 containerd[1506]: 2025-08-12 23:45:10.228 [INFO][4310] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288966 containerd[1506]: 2025-08-12 23:45:10.229 [INFO][4310] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288966 containerd[1506]: 2025-08-12 23:45:10.231 [INFO][4310] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf Aug 12 23:45:10.288966 containerd[1506]: 2025-08-12 23:45:10.236 [INFO][4310] ipam/ipam.go 1243: Writing block in 
order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288966 containerd[1506]: 2025-08-12 23:45:10.246 [INFO][4310] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.67/26] block=192.168.3.64/26 handle="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288966 containerd[1506]: 2025-08-12 23:45:10.246 [INFO][4310] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.67/26] handle="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.288966 containerd[1506]: 2025-08-12 23:45:10.247 [INFO][4310] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:45:10.288966 containerd[1506]: 2025-08-12 23:45:10.247 [INFO][4310] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.67/26] IPv6=[] ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:45:10.289171 containerd[1506]: 2025-08-12 23:45:10.251 [INFO][4287] cni-plugin/k8s.go 418: Populated endpoint ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-r7f9g" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0", GenerateName:"calico-apiserver-7f8d4ddcc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"241f24c8-4f1a-4329-8952-6c015f2ffbc0", 
ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f8d4ddcc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"calico-apiserver-7f8d4ddcc7-r7f9g", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali77a5728c533", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:10.289228 containerd[1506]: 2025-08-12 23:45:10.252 [INFO][4287] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.67/32] ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-r7f9g" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:45:10.289228 containerd[1506]: 2025-08-12 23:45:10.252 [INFO][4287] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali77a5728c533 ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-r7f9g" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:45:10.289228 
containerd[1506]: 2025-08-12 23:45:10.263 [INFO][4287] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-r7f9g" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:45:10.289292 containerd[1506]: 2025-08-12 23:45:10.264 [INFO][4287] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-r7f9g" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0", GenerateName:"calico-apiserver-7f8d4ddcc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"241f24c8-4f1a-4329-8952-6c015f2ffbc0", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f8d4ddcc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf", Pod:"calico-apiserver-7f8d4ddcc7-r7f9g", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali77a5728c533", MAC:"ee:bd:30:7a:36:c8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:10.289338 containerd[1506]: 2025-08-12 23:45:10.285 [INFO][4287] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-r7f9g" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:45:10.326017 containerd[1506]: time="2025-08-12T23:45:10.325897729Z" level=info msg="connecting to shim 69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" address="unix:///run/containerd/s/ba2aa9f7b63306fe5a9699a7403082cdcc8b935709310afa7d9a88cef385ab9f" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:10.384520 kubelet[2715]: I0812 23:45:10.384364 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7dd654f94b-k8v4p" podStartSLOduration=1.695987938 podStartE2EDuration="7.383906487s" podCreationTimestamp="2025-08-12 23:45:03 +0000 UTC" firstStartedPulling="2025-08-12 23:45:04.107465548 +0000 UTC m=+39.167028061" lastFinishedPulling="2025-08-12 23:45:09.795384097 +0000 UTC m=+44.854946610" observedRunningTime="2025-08-12 23:45:10.382328287 +0000 UTC m=+45.441890840" watchObservedRunningTime="2025-08-12 23:45:10.383906487 +0000 UTC m=+45.443469040" Aug 12 23:45:10.390732 systemd[1]: Started cri-containerd-69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf.scope - libcontainer container 69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf. 
Aug 12 23:45:10.422812 systemd-networkd[1419]: cali51737d6b27c: Link UP Aug 12 23:45:10.424665 systemd-networkd[1419]: cali51737d6b27c: Gained carrier Aug 12 23:45:10.450834 containerd[1506]: 2025-08-12 23:45:10.155 [INFO][4285] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0 calico-apiserver-6464b99944- calico-apiserver d4e2f489-94af-4e39-ac07-2a80367a7943 852 0 2025-08-12 23:44:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6464b99944 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a calico-apiserver-6464b99944-nztd8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali51737d6b27c [] [] }} ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-nztd8" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-" Aug 12 23:45:10.450834 containerd[1506]: 2025-08-12 23:45:10.156 [INFO][4285] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-nztd8" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" Aug 12 23:45:10.450834 containerd[1506]: 2025-08-12 23:45:10.199 [INFO][4312] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" HandleID="k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" Aug 12 
23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.199 [INFO][4312] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" HandleID="k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b230), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"calico-apiserver-6464b99944-nztd8", "timestamp":"2025-08-12 23:45:10.199165619 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 12 23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.199 [INFO][4312] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.247 [INFO][4312] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 12 23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.247 [INFO][4312] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.313 [INFO][4312] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.328 [INFO][4312] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.341 [INFO][4312] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.349 [INFO][4312] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.451057 containerd[1506]: 2025-08-12 23:45:10.360 [INFO][4312] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.452018 containerd[1506]: 2025-08-12 23:45:10.361 [INFO][4312] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.452018 containerd[1506]: 2025-08-12 23:45:10.370 [INFO][4312] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80 Aug 12 23:45:10.452018 containerd[1506]: 2025-08-12 23:45:10.395 [INFO][4312] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.452018 containerd[1506]: 2025-08-12 23:45:10.411 [INFO][4312] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.3.68/26] block=192.168.3.64/26 handle="k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.452018 containerd[1506]: 2025-08-12 23:45:10.411 [INFO][4312] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.68/26] handle="k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:10.452018 containerd[1506]: 2025-08-12 23:45:10.411 [INFO][4312] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:45:10.452018 containerd[1506]: 2025-08-12 23:45:10.412 [INFO][4312] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.68/26] IPv6=[] ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" HandleID="k8s-pod-network.227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" Aug 12 23:45:10.452166 containerd[1506]: 2025-08-12 23:45:10.416 [INFO][4285] cni-plugin/k8s.go 418: Populated endpoint ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-nztd8" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0", GenerateName:"calico-apiserver-6464b99944-", Namespace:"calico-apiserver", SelfLink:"", UID:"d4e2f489-94af-4e39-ac07-2a80367a7943", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"6464b99944", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"calico-apiserver-6464b99944-nztd8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali51737d6b27c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:10.452228 containerd[1506]: 2025-08-12 23:45:10.417 [INFO][4285] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.68/32] ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-nztd8" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" Aug 12 23:45:10.452228 containerd[1506]: 2025-08-12 23:45:10.417 [INFO][4285] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali51737d6b27c ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-nztd8" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" Aug 12 23:45:10.452228 containerd[1506]: 2025-08-12 23:45:10.426 [INFO][4285] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-nztd8" 
WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" Aug 12 23:45:10.452294 containerd[1506]: 2025-08-12 23:45:10.427 [INFO][4285] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-nztd8" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0", GenerateName:"calico-apiserver-6464b99944-", Namespace:"calico-apiserver", SelfLink:"", UID:"d4e2f489-94af-4e39-ac07-2a80367a7943", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6464b99944", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80", Pod:"calico-apiserver-6464b99944-nztd8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali51737d6b27c", MAC:"a6:86:83:37:5d:a3", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:10.452339 containerd[1506]: 2025-08-12 23:45:10.447 [INFO][4285] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-nztd8" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--nztd8-eth0" Aug 12 23:45:10.487110 containerd[1506]: time="2025-08-12T23:45:10.487057076Z" level=info msg="connecting to shim 227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80" address="unix:///run/containerd/s/3a470f5eec070e471522e0e7fb903488035aa068ab1624f8b662485c7f374958" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:10.515833 systemd[1]: Started cri-containerd-227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80.scope - libcontainer container 227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80. 
Aug 12 23:45:10.523376 containerd[1506]: time="2025-08-12T23:45:10.523313480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f8d4ddcc7-r7f9g,Uid:241f24c8-4f1a-4329-8952-6c015f2ffbc0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\"" Aug 12 23:45:10.577264 containerd[1506]: time="2025-08-12T23:45:10.577221533Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6464b99944-nztd8,Uid:d4e2f489-94af-4e39-ac07-2a80367a7943,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80\"" Aug 12 23:45:10.755991 systemd-networkd[1419]: cali676c77411f1: Gained IPv6LL Aug 12 23:45:11.088308 containerd[1506]: time="2025-08-12T23:45:11.087952945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-znzb2,Uid:e6d039b0-18c9-437e-88dc-fc77d8757ee7,Namespace:calico-system,Attempt:0,}" Aug 12 23:45:11.258753 systemd-networkd[1419]: califc7c65268f7: Link UP Aug 12 23:45:11.260119 systemd-networkd[1419]: califc7c65268f7: Gained carrier Aug 12 23:45:11.281315 containerd[1506]: 2025-08-12 23:45:11.154 [INFO][4433] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0 csi-node-driver- calico-system e6d039b0-18c9-437e-88dc-fc77d8757ee7 715 0 2025-08-12 23:44:49 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a csi-node-driver-znzb2 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] califc7c65268f7 [] [] }} 
ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Namespace="calico-system" Pod="csi-node-driver-znzb2" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-" Aug 12 23:45:11.281315 containerd[1506]: 2025-08-12 23:45:11.154 [INFO][4433] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Namespace="calico-system" Pod="csi-node-driver-znzb2" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" Aug 12 23:45:11.281315 containerd[1506]: 2025-08-12 23:45:11.196 [INFO][4445] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" HandleID="k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Workload="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.197 [INFO][4445] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" HandleID="k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Workload="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"csi-node-driver-znzb2", "timestamp":"2025-08-12 23:45:11.196884666 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.197 [INFO][4445] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.197 [INFO][4445] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.197 [INFO][4445] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.212 [INFO][4445] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.218 [INFO][4445] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.225 [INFO][4445] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.228 [INFO][4445] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282099 containerd[1506]: 2025-08-12 23:45:11.232 [INFO][4445] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282355 containerd[1506]: 2025-08-12 23:45:11.232 [INFO][4445] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282355 containerd[1506]: 2025-08-12 23:45:11.235 [INFO][4445] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e Aug 12 23:45:11.282355 containerd[1506]: 2025-08-12 23:45:11.241 [INFO][4445] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" 
host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282355 containerd[1506]: 2025-08-12 23:45:11.251 [INFO][4445] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.69/26] block=192.168.3.64/26 handle="k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282355 containerd[1506]: 2025-08-12 23:45:11.251 [INFO][4445] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.69/26] handle="k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:11.282355 containerd[1506]: 2025-08-12 23:45:11.251 [INFO][4445] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:45:11.282355 containerd[1506]: 2025-08-12 23:45:11.251 [INFO][4445] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.69/26] IPv6=[] ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" HandleID="k8s-pod-network.776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Workload="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" Aug 12 23:45:11.283684 containerd[1506]: 2025-08-12 23:45:11.254 [INFO][4433] cni-plugin/k8s.go 418: Populated endpoint ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Namespace="calico-system" Pod="csi-node-driver-znzb2" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e6d039b0-18c9-437e-88dc-fc77d8757ee7", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"csi-node-driver-znzb2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.3.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califc7c65268f7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:11.283768 containerd[1506]: 2025-08-12 23:45:11.254 [INFO][4433] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.69/32] ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Namespace="calico-system" Pod="csi-node-driver-znzb2" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" Aug 12 23:45:11.283768 containerd[1506]: 2025-08-12 23:45:11.254 [INFO][4433] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califc7c65268f7 ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Namespace="calico-system" Pod="csi-node-driver-znzb2" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" Aug 12 23:45:11.283768 containerd[1506]: 2025-08-12 23:45:11.259 [INFO][4433] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Namespace="calico-system" 
Pod="csi-node-driver-znzb2" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" Aug 12 23:45:11.283848 containerd[1506]: 2025-08-12 23:45:11.259 [INFO][4433] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Namespace="calico-system" Pod="csi-node-driver-znzb2" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"e6d039b0-18c9-437e-88dc-fc77d8757ee7", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e", Pod:"csi-node-driver-znzb2", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.3.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"califc7c65268f7", MAC:"8e:a4:a3:cc:25:9b", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:11.283900 containerd[1506]: 2025-08-12 23:45:11.276 [INFO][4433] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" Namespace="calico-system" Pod="csi-node-driver-znzb2" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-csi--node--driver--znzb2-eth0" Aug 12 23:45:11.315678 containerd[1506]: time="2025-08-12T23:45:11.315542390Z" level=info msg="connecting to shim 776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e" address="unix:///run/containerd/s/977a04e5d07edabeac157109881b93c0d98561bad1bc865bb55d34d124ca62fe" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:11.349883 systemd[1]: Started cri-containerd-776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e.scope - libcontainer container 776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e. 
Aug 12 23:45:11.387300 containerd[1506]: time="2025-08-12T23:45:11.387137778Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-znzb2,Uid:e6d039b0-18c9-437e-88dc-fc77d8757ee7,Namespace:calico-system,Attempt:0,} returns sandbox id \"776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e\"" Aug 12 23:45:11.652754 systemd-networkd[1419]: cali77a5728c533: Gained IPv6LL Aug 12 23:45:12.037411 systemd-networkd[1419]: cali51737d6b27c: Gained IPv6LL Aug 12 23:45:12.088407 containerd[1506]: time="2025-08-12T23:45:12.087924560Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-dbjg4,Uid:a62fab08-ab88-42e3-ab7b-29db68e0485c,Namespace:kube-system,Attempt:0,}" Aug 12 23:45:12.088407 containerd[1506]: time="2025-08-12T23:45:12.088173766Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-drk6f,Uid:f78d3697-5a7e-4c57-950b-a6d5080b5a5b,Namespace:kube-system,Attempt:0,}" Aug 12 23:45:12.089076 containerd[1506]: time="2025-08-12T23:45:12.089045067Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f8d4ddcc7-zgq55,Uid:a9e9c567-de9b-4a68-9d2c-640b2dafaf2f,Namespace:calico-apiserver,Attempt:0,}" Aug 12 23:45:12.089416 containerd[1506]: time="2025-08-12T23:45:12.089208871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5cbb45b49d-bnxpq,Uid:285f6b1f-9c43-4330-9ef8-c92ed81153fb,Namespace:calico-system,Attempt:0,}" Aug 12 23:45:12.345480 systemd-networkd[1419]: cali8fcc04ff5ce: Link UP Aug 12 23:45:12.346000 systemd-networkd[1419]: cali8fcc04ff5ce: Gained carrier Aug 12 23:45:12.379075 containerd[1506]: 2025-08-12 23:45:12.166 [INFO][4503] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0 coredns-7c65d6cfc9- kube-system f78d3697-5a7e-4c57-950b-a6d5080b5a5b 849 0 2025-08-12 23:44:32 +0000 UTC map[k8s-app:kube-dns 
pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a coredns-7c65d6cfc9-drk6f eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8fcc04ff5ce [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Namespace="kube-system" Pod="coredns-7c65d6cfc9-drk6f" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-" Aug 12 23:45:12.379075 containerd[1506]: 2025-08-12 23:45:12.167 [INFO][4503] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Namespace="kube-system" Pod="coredns-7c65d6cfc9-drk6f" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" Aug 12 23:45:12.379075 containerd[1506]: 2025-08-12 23:45:12.234 [INFO][4551] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" HandleID="k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Workload="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.234 [INFO][4551] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" HandleID="k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Workload="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3220), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"coredns-7c65d6cfc9-drk6f", "timestamp":"2025-08-12 23:45:12.234137662 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.235 [INFO][4551] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.235 [INFO][4551] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.235 [INFO][4551] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.266 [INFO][4551] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.277 [INFO][4551] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.293 [INFO][4551] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.296 [INFO][4551] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380323 containerd[1506]: 2025-08-12 23:45:12.301 [INFO][4551] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380809 containerd[1506]: 2025-08-12 23:45:12.301 [INFO][4551] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380809 containerd[1506]: 2025-08-12 23:45:12.304 [INFO][4551] ipam/ipam.go 1764: Creating new 
handle: k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777 Aug 12 23:45:12.380809 containerd[1506]: 2025-08-12 23:45:12.312 [INFO][4551] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380809 containerd[1506]: 2025-08-12 23:45:12.325 [INFO][4551] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.70/26] block=192.168.3.64/26 handle="k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380809 containerd[1506]: 2025-08-12 23:45:12.325 [INFO][4551] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.70/26] handle="k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.380809 containerd[1506]: 2025-08-12 23:45:12.325 [INFO][4551] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 12 23:45:12.380809 containerd[1506]: 2025-08-12 23:45:12.326 [INFO][4551] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.70/26] IPv6=[] ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" HandleID="k8s-pod-network.9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Workload="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" Aug 12 23:45:12.381834 containerd[1506]: 2025-08-12 23:45:12.329 [INFO][4503] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Namespace="kube-system" Pod="coredns-7c65d6cfc9-drk6f" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f78d3697-5a7e-4c57-950b-a6d5080b5a5b", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"coredns-7c65d6cfc9-drk6f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.3.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali8fcc04ff5ce", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:12.381834 containerd[1506]: 2025-08-12 23:45:12.329 [INFO][4503] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.70/32] ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Namespace="kube-system" Pod="coredns-7c65d6cfc9-drk6f" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" Aug 12 23:45:12.381834 containerd[1506]: 2025-08-12 23:45:12.329 [INFO][4503] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8fcc04ff5ce ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Namespace="kube-system" Pod="coredns-7c65d6cfc9-drk6f" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" Aug 12 23:45:12.381834 containerd[1506]: 2025-08-12 23:45:12.346 [INFO][4503] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Namespace="kube-system" Pod="coredns-7c65d6cfc9-drk6f" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" Aug 12 23:45:12.381834 containerd[1506]: 2025-08-12 23:45:12.348 [INFO][4503] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Namespace="kube-system" Pod="coredns-7c65d6cfc9-drk6f" 
WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"f78d3697-5a7e-4c57-950b-a6d5080b5a5b", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777", Pod:"coredns-7c65d6cfc9-drk6f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.3.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8fcc04ff5ce", MAC:"d2:ca:55:db:38:7f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:12.381834 
containerd[1506]: 2025-08-12 23:45:12.373 [INFO][4503] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" Namespace="kube-system" Pod="coredns-7c65d6cfc9-drk6f" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--drk6f-eth0" Aug 12 23:45:12.436068 containerd[1506]: time="2025-08-12T23:45:12.436014048Z" level=info msg="connecting to shim 9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777" address="unix:///run/containerd/s/bf10a73b4b5d7b53f199b9d6443084eff17afd5f406a5da4a414ab200f448a0b" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:12.461283 systemd-networkd[1419]: cali06c80f6778a: Link UP Aug 12 23:45:12.463886 systemd-networkd[1419]: cali06c80f6778a: Gained carrier Aug 12 23:45:12.489901 systemd[1]: Started cri-containerd-9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777.scope - libcontainer container 9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777. 
Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.217 [INFO][4528] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0 calico-kube-controllers-5cbb45b49d- calico-system 285f6b1f-9c43-4330-9ef8-c92ed81153fb 853 0 2025-08-12 23:44:49 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5cbb45b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a calico-kube-controllers-5cbb45b49d-bnxpq eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali06c80f6778a [] [] }} ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Namespace="calico-system" Pod="calico-kube-controllers-5cbb45b49d-bnxpq" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.220 [INFO][4528] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Namespace="calico-system" Pod="calico-kube-controllers-5cbb45b49d-bnxpq" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.295 [INFO][4564] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" HandleID="k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.296 [INFO][4564] ipam/ipam_plugin.go 
265: Auto assigning IP ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" HandleID="k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002dd600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"calico-kube-controllers-5cbb45b49d-bnxpq", "timestamp":"2025-08-12 23:45:12.295156557 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.296 [INFO][4564] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.326 [INFO][4564] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.326 [INFO][4564] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.373 [INFO][4564] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.387 [INFO][4564] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.397 [INFO][4564] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.405 [INFO][4564] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.415 [INFO][4564] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.415 [INFO][4564] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.419 [INFO][4564] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1 Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.430 [INFO][4564] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.444 [INFO][4564] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.3.71/26] block=192.168.3.64/26 handle="k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.444 [INFO][4564] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.71/26] handle="k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.444 [INFO][4564] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:45:12.501053 containerd[1506]: 2025-08-12 23:45:12.444 [INFO][4564] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.71/26] IPv6=[] ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" HandleID="k8s-pod-network.86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" Aug 12 23:45:12.502511 containerd[1506]: 2025-08-12 23:45:12.451 [INFO][4528] cni-plugin/k8s.go 418: Populated endpoint ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Namespace="calico-system" Pod="calico-kube-controllers-5cbb45b49d-bnxpq" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0", GenerateName:"calico-kube-controllers-5cbb45b49d-", Namespace:"calico-system", SelfLink:"", UID:"285f6b1f-9c43-4330-9ef8-c92ed81153fb", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cbb45b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"calico-kube-controllers-5cbb45b49d-bnxpq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.3.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali06c80f6778a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:12.502511 containerd[1506]: 2025-08-12 23:45:12.452 [INFO][4528] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.71/32] ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Namespace="calico-system" Pod="calico-kube-controllers-5cbb45b49d-bnxpq" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" Aug 12 23:45:12.502511 containerd[1506]: 2025-08-12 23:45:12.452 [INFO][4528] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali06c80f6778a ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Namespace="calico-system" Pod="calico-kube-controllers-5cbb45b49d-bnxpq" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" Aug 12 23:45:12.502511 containerd[1506]: 2025-08-12 23:45:12.466 [INFO][4528] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Namespace="calico-system" Pod="calico-kube-controllers-5cbb45b49d-bnxpq" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" Aug 12 23:45:12.502511 containerd[1506]: 2025-08-12 23:45:12.471 [INFO][4528] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Namespace="calico-system" Pod="calico-kube-controllers-5cbb45b49d-bnxpq" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0", GenerateName:"calico-kube-controllers-5cbb45b49d-", Namespace:"calico-system", SelfLink:"", UID:"285f6b1f-9c43-4330-9ef8-c92ed81153fb", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5cbb45b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1", Pod:"calico-kube-controllers-5cbb45b49d-bnxpq", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.3.71/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali06c80f6778a", MAC:"f2:7c:06:ab:57:f5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:12.502511 containerd[1506]: 2025-08-12 23:45:12.496 [INFO][4528] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" Namespace="calico-system" Pod="calico-kube-controllers-5cbb45b49d-bnxpq" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--kube--controllers--5cbb45b49d--bnxpq-eth0" Aug 12 23:45:12.549485 containerd[1506]: time="2025-08-12T23:45:12.549297184Z" level=info msg="connecting to shim 86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1" address="unix:///run/containerd/s/c3c56d978201f8b333d3d32db7baaac00fba5b24287127408f517c1dd28f0604" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:12.595375 systemd-networkd[1419]: calicdb23de850c: Link UP Aug 12 23:45:12.598787 systemd-networkd[1419]: calicdb23de850c: Gained carrier Aug 12 23:45:12.629837 systemd[1]: Started cri-containerd-86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1.scope - libcontainer container 86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1. 
Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.258 [INFO][4532] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0 calico-apiserver-7f8d4ddcc7- calico-apiserver a9e9c567-de9b-4a68-9d2c-640b2dafaf2f 850 0 2025-08-12 23:44:44 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7f8d4ddcc7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a calico-apiserver-7f8d4ddcc7-zgq55 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calicdb23de850c [] [] }} ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-zgq55" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.258 [INFO][4532] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-zgq55" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.318 [INFO][4575] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.319 [INFO][4575] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000341820), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"calico-apiserver-7f8d4ddcc7-zgq55", "timestamp":"2025-08-12 23:45:12.318403167 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.319 [INFO][4575] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.445 [INFO][4575] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.445 [INFO][4575] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.483 [INFO][4575] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.498 [INFO][4575] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.517 [INFO][4575] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.522 [INFO][4575] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.530 [INFO][4575] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.530 [INFO][4575] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.534 [INFO][4575] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.549 [INFO][4575] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.572 [INFO][4575] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.3.72/26] block=192.168.3.64/26 handle="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.572 [INFO][4575] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.72/26] handle="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.572 [INFO][4575] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:45:12.640256 containerd[1506]: 2025-08-12 23:45:12.573 [INFO][4575] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.72/26] IPv6=[] ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:45:12.642020 containerd[1506]: 2025-08-12 23:45:12.580 [INFO][4532] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-zgq55" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0", GenerateName:"calico-apiserver-7f8d4ddcc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"a9e9c567-de9b-4a68-9d2c-640b2dafaf2f", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"7f8d4ddcc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"calico-apiserver-7f8d4ddcc7-zgq55", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicdb23de850c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:12.642020 containerd[1506]: 2025-08-12 23:45:12.581 [INFO][4532] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.72/32] ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-zgq55" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:45:12.642020 containerd[1506]: 2025-08-12 23:45:12.581 [INFO][4532] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicdb23de850c ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-zgq55" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:45:12.642020 containerd[1506]: 2025-08-12 23:45:12.601 [INFO][4532] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-zgq55" 
WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:45:12.642020 containerd[1506]: 2025-08-12 23:45:12.606 [INFO][4532] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-zgq55" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0", GenerateName:"calico-apiserver-7f8d4ddcc7-", Namespace:"calico-apiserver", SelfLink:"", UID:"a9e9c567-de9b-4a68-9d2c-640b2dafaf2f", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7f8d4ddcc7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c", Pod:"calico-apiserver-7f8d4ddcc7-zgq55", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calicdb23de850c", MAC:"ea:19:86:a3:d5:76", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:12.642020 containerd[1506]: 2025-08-12 23:45:12.633 [INFO][4532] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Namespace="calico-apiserver" Pod="calico-apiserver-7f8d4ddcc7-zgq55" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:45:12.714517 systemd-networkd[1419]: cali9d33dacc38c: Link UP Aug 12 23:45:12.715855 systemd-networkd[1419]: cali9d33dacc38c: Gained carrier Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.235 [INFO][4516] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0 coredns-7c65d6cfc9- kube-system a62fab08-ab88-42e3-ab7b-29db68e0485c 839 0 2025-08-12 23:44:32 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a coredns-7c65d6cfc9-dbjg4 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9d33dacc38c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-dbjg4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.236 [INFO][4516] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-dbjg4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" Aug 12 23:45:12.758713 containerd[1506]: 
2025-08-12 23:45:12.321 [INFO][4570] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" HandleID="k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Workload="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.321 [INFO][4570] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" HandleID="k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Workload="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003719b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"coredns-7c65d6cfc9-dbjg4", "timestamp":"2025-08-12 23:45:12.321337718 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.321 [INFO][4570] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.573 [INFO][4570] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.573 [INFO][4570] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a' Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.600 [INFO][4570] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.617 [INFO][4570] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.643 [INFO][4570] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.647 [INFO][4570] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.665 [INFO][4570] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.666 [INFO][4570] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.673 [INFO][4570] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2 Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.685 [INFO][4570] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.701 [INFO][4570] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.3.73/26] block=192.168.3.64/26 handle="k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.701 [INFO][4570] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.73/26] handle="k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" host="ci-4372-1-0-9-13fe44d47a" Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.701 [INFO][4570] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:45:12.758713 containerd[1506]: 2025-08-12 23:45:12.701 [INFO][4570] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.73/26] IPv6=[] ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" HandleID="k8s-pod-network.dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Workload="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" Aug 12 23:45:12.760846 containerd[1506]: 2025-08-12 23:45:12.705 [INFO][4516] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-dbjg4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"a62fab08-ab88-42e3-ab7b-29db68e0485c", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"coredns-7c65d6cfc9-dbjg4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.3.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d33dacc38c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:12.760846 containerd[1506]: 2025-08-12 23:45:12.705 [INFO][4516] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.73/32] ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-dbjg4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" Aug 12 23:45:12.760846 containerd[1506]: 2025-08-12 23:45:12.705 [INFO][4516] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9d33dacc38c ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-dbjg4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" Aug 12 23:45:12.760846 containerd[1506]: 2025-08-12 23:45:12.717 [INFO][4516] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-dbjg4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" Aug 12 23:45:12.760846 containerd[1506]: 2025-08-12 23:45:12.720 [INFO][4516] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-dbjg4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"a62fab08-ab88-42e3-ab7b-29db68e0485c", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 44, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2", Pod:"coredns-7c65d6cfc9-dbjg4", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.3.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d33dacc38c", 
MAC:"be:70:8f:9e:2b:98", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 12 23:45:12.760846 containerd[1506]: 2025-08-12 23:45:12.749 [INFO][4516] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" Namespace="kube-system" Pod="coredns-7c65d6cfc9-dbjg4" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-coredns--7c65d6cfc9--dbjg4-eth0" Aug 12 23:45:12.760846 containerd[1506]: time="2025-08-12T23:45:12.758756636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-drk6f,Uid:f78d3697-5a7e-4c57-950b-a6d5080b5a5b,Namespace:kube-system,Attempt:0,} returns sandbox id \"9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777\"" Aug 12 23:45:12.769336 containerd[1506]: time="2025-08-12T23:45:12.769161851Z" level=info msg="CreateContainer within sandbox \"9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 12 23:45:12.771946 containerd[1506]: time="2025-08-12T23:45:12.771164140Z" level=info msg="connecting to shim 6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" address="unix:///run/containerd/s/cffd78bb2d1542aed1e94b84712f4c7ec73e472b9b7930c775eae007ebe346e0" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:12.815905 containerd[1506]: time="2025-08-12T23:45:12.815845074Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-5cbb45b49d-bnxpq,Uid:285f6b1f-9c43-4330-9ef8-c92ed81153fb,Namespace:calico-system,Attempt:0,} returns sandbox id \"86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1\"" Aug 12 23:45:12.816295 containerd[1506]: time="2025-08-12T23:45:12.816090160Z" level=info msg="Container 7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:12.826115 containerd[1506]: time="2025-08-12T23:45:12.826075805Z" level=info msg="connecting to shim dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2" address="unix:///run/containerd/s/e7f1b2eb994b301b422e0be010077f262a32b205f165bda1033a4780fd49bc0e" namespace=k8s.io protocol=ttrpc version=3 Aug 12 23:45:12.832166 containerd[1506]: time="2025-08-12T23:45:12.832128033Z" level=info msg="CreateContainer within sandbox \"9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3\"" Aug 12 23:45:12.834512 containerd[1506]: time="2025-08-12T23:45:12.833038216Z" level=info msg="StartContainer for \"7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3\"" Aug 12 23:45:12.834512 containerd[1506]: time="2025-08-12T23:45:12.833919757Z" level=info msg="connecting to shim 7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3" address="unix:///run/containerd/s/bf10a73b4b5d7b53f199b9d6443084eff17afd5f406a5da4a414ab200f448a0b" protocol=ttrpc version=3 Aug 12 23:45:12.842845 systemd[1]: Started cri-containerd-6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c.scope - libcontainer container 6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c. 
Aug 12 23:45:12.867769 systemd[1]: Started cri-containerd-7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3.scope - libcontainer container 7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3. Aug 12 23:45:12.869848 systemd-networkd[1419]: califc7c65268f7: Gained IPv6LL Aug 12 23:45:12.886821 systemd[1]: Started cri-containerd-dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2.scope - libcontainer container dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2. Aug 12 23:45:12.931207 containerd[1506]: time="2025-08-12T23:45:12.931159860Z" level=info msg="StartContainer for \"7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3\" returns successfully" Aug 12 23:45:12.958070 containerd[1506]: time="2025-08-12T23:45:12.957995597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-dbjg4,Uid:a62fab08-ab88-42e3-ab7b-29db68e0485c,Namespace:kube-system,Attempt:0,} returns sandbox id \"dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2\"" Aug 12 23:45:12.967135 containerd[1506]: time="2025-08-12T23:45:12.967047379Z" level=info msg="CreateContainer within sandbox \"dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 12 23:45:12.990845 containerd[1506]: time="2025-08-12T23:45:12.990755120Z" level=info msg="Container f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:13.008751 containerd[1506]: time="2025-08-12T23:45:13.008567673Z" level=info msg="CreateContainer within sandbox \"dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2\"" Aug 12 23:45:13.010863 containerd[1506]: time="2025-08-12T23:45:13.010412277Z" level=info msg="StartContainer for 
\"f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2\"" Aug 12 23:45:13.017209 containerd[1506]: time="2025-08-12T23:45:13.017171920Z" level=info msg="connecting to shim f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2" address="unix:///run/containerd/s/e7f1b2eb994b301b422e0be010077f262a32b205f165bda1033a4780fd49bc0e" protocol=ttrpc version=3 Aug 12 23:45:13.029335 containerd[1506]: time="2025-08-12T23:45:13.029265691Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7f8d4ddcc7-zgq55,Uid:a9e9c567-de9b-4a68-9d2c-640b2dafaf2f,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\"" Aug 12 23:45:13.053927 systemd[1]: Started cri-containerd-f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2.scope - libcontainer container f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2. Aug 12 23:45:13.118609 containerd[1506]: time="2025-08-12T23:45:13.118187550Z" level=info msg="StartContainer for \"f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2\" returns successfully" Aug 12 23:45:13.380593 systemd-networkd[1419]: cali8fcc04ff5ce: Gained IPv6LL Aug 12 23:45:13.411998 kubelet[2715]: I0812 23:45:13.411760 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-drk6f" podStartSLOduration=41.411728171 podStartE2EDuration="41.411728171s" podCreationTimestamp="2025-08-12 23:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-12 23:45:13.407897359 +0000 UTC m=+48.467459912" watchObservedRunningTime="2025-08-12 23:45:13.411728171 +0000 UTC m=+48.471290724" Aug 12 23:45:13.464154 kubelet[2715]: I0812 23:45:13.463244 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-dbjg4" podStartSLOduration=41.46321981 
podStartE2EDuration="41.46321981s" podCreationTimestamp="2025-08-12 23:44:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-12 23:45:13.463088286 +0000 UTC m=+48.522650839" watchObservedRunningTime="2025-08-12 23:45:13.46321981 +0000 UTC m=+48.522782363" Aug 12 23:45:13.507934 systemd-networkd[1419]: cali06c80f6778a: Gained IPv6LL Aug 12 23:45:13.835365 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2636210369.mount: Deactivated successfully. Aug 12 23:45:13.956930 systemd-networkd[1419]: calicdb23de850c: Gained IPv6LL Aug 12 23:45:14.276017 systemd-networkd[1419]: cali9d33dacc38c: Gained IPv6LL Aug 12 23:45:14.312920 containerd[1506]: time="2025-08-12T23:45:14.312218702Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:14.313954 containerd[1506]: time="2025-08-12T23:45:14.313814220Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Aug 12 23:45:14.315370 containerd[1506]: time="2025-08-12T23:45:14.315080250Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:14.321181 containerd[1506]: time="2025-08-12T23:45:14.321012430Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:14.325682 containerd[1506]: time="2025-08-12T23:45:14.325324412Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 4.529215696s" Aug 12 23:45:14.325682 containerd[1506]: time="2025-08-12T23:45:14.325462415Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Aug 12 23:45:14.328512 containerd[1506]: time="2025-08-12T23:45:14.328379604Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 12 23:45:14.330631 containerd[1506]: time="2025-08-12T23:45:14.330115965Z" level=info msg="CreateContainer within sandbox \"cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 12 23:45:14.343738 containerd[1506]: time="2025-08-12T23:45:14.343686486Z" level=info msg="Container 1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:14.363165 containerd[1506]: time="2025-08-12T23:45:14.363113345Z" level=info msg="CreateContainer within sandbox \"cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\"" Aug 12 23:45:14.365608 containerd[1506]: time="2025-08-12T23:45:14.364474297Z" level=info msg="StartContainer for \"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\"" Aug 12 23:45:14.367927 containerd[1506]: time="2025-08-12T23:45:14.367891298Z" level=info msg="connecting to shim 1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b" address="unix:///run/containerd/s/f7fd641bca54eeff8a16a495a8585b7b214acd362f2af44b9cf52eacd96a18a3" protocol=ttrpc version=3 Aug 12 23:45:14.402923 systemd[1]: Started cri-containerd-1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b.scope - libcontainer container 
1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b. Aug 12 23:45:14.539089 containerd[1506]: time="2025-08-12T23:45:14.538949621Z" level=info msg="StartContainer for \"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" returns successfully" Aug 12 23:45:15.632941 containerd[1506]: time="2025-08-12T23:45:15.632863391Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"0926e47debfcee17ad5b556db2de043b4f72db90f012d0c42661291a521db61a\" pid:4945 exit_status:1 exited_at:{seconds:1755042315 nanos:629982724}" Aug 12 23:45:16.793076 containerd[1506]: time="2025-08-12T23:45:16.793020787Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"44d448aec49e5545786a776bbf613a070ff303c4ae829c0781913e0831bc680a\" pid:4977 exited_at:{seconds:1755042316 nanos:790709894}" Aug 12 23:45:16.836382 kubelet[2715]: I0812 23:45:16.836297 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-q57c4" podStartSLOduration=23.226610831 podStartE2EDuration="27.836274377s" podCreationTimestamp="2025-08-12 23:44:49 +0000 UTC" firstStartedPulling="2025-08-12 23:45:09.717296065 +0000 UTC m=+44.776858618" lastFinishedPulling="2025-08-12 23:45:14.326959611 +0000 UTC m=+49.386522164" observedRunningTime="2025-08-12 23:45:15.464798884 +0000 UTC m=+50.524361437" watchObservedRunningTime="2025-08-12 23:45:16.836274377 +0000 UTC m=+51.895836930" Aug 12 23:45:17.064651 containerd[1506]: time="2025-08-12T23:45:17.064511377Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:17.066483 containerd[1506]: time="2025-08-12T23:45:17.065954889Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes 
read=44517149" Aug 12 23:45:17.067678 containerd[1506]: time="2025-08-12T23:45:17.067633327Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:17.070408 containerd[1506]: time="2025-08-12T23:45:17.070365228Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:17.071565 containerd[1506]: time="2025-08-12T23:45:17.071509174Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 2.742781482s" Aug 12 23:45:17.071757 containerd[1506]: time="2025-08-12T23:45:17.071736859Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 12 23:45:17.073170 containerd[1506]: time="2025-08-12T23:45:17.073127371Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 12 23:45:17.075594 containerd[1506]: time="2025-08-12T23:45:17.074943612Z" level=info msg="CreateContainer within sandbox \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 12 23:45:17.088799 containerd[1506]: time="2025-08-12T23:45:17.088744963Z" level=info msg="Container 17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:17.112640 containerd[1506]: time="2025-08-12T23:45:17.112560699Z" level=info 
msg="CreateContainer within sandbox \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\"" Aug 12 23:45:17.115920 containerd[1506]: time="2025-08-12T23:45:17.115882854Z" level=info msg="StartContainer for \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\"" Aug 12 23:45:17.119075 containerd[1506]: time="2025-08-12T23:45:17.119040925Z" level=info msg="connecting to shim 17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987" address="unix:///run/containerd/s/ba2aa9f7b63306fe5a9699a7403082cdcc8b935709310afa7d9a88cef385ab9f" protocol=ttrpc version=3 Aug 12 23:45:17.155079 systemd[1]: Started cri-containerd-17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987.scope - libcontainer container 17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987. Aug 12 23:45:17.223837 containerd[1506]: time="2025-08-12T23:45:17.223783005Z" level=info msg="StartContainer for \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" returns successfully" Aug 12 23:45:17.468263 containerd[1506]: time="2025-08-12T23:45:17.468147752Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:17.470575 containerd[1506]: time="2025-08-12T23:45:17.469521463Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 12 23:45:17.473874 containerd[1506]: time="2025-08-12T23:45:17.473819679Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 400.651668ms" Aug 12 23:45:17.473874 containerd[1506]: time="2025-08-12T23:45:17.473865920Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 12 23:45:17.475566 containerd[1506]: time="2025-08-12T23:45:17.474986426Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 12 23:45:17.478622 containerd[1506]: time="2025-08-12T23:45:17.478533146Z" level=info msg="CreateContainer within sandbox \"227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 12 23:45:17.491595 containerd[1506]: time="2025-08-12T23:45:17.491037547Z" level=info msg="Container 30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:17.509144 containerd[1506]: time="2025-08-12T23:45:17.509082594Z" level=info msg="CreateContainer within sandbox \"227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa\"" Aug 12 23:45:17.512086 containerd[1506]: time="2025-08-12T23:45:17.512028500Z" level=info msg="StartContainer for \"30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa\"" Aug 12 23:45:17.522948 containerd[1506]: time="2025-08-12T23:45:17.522889145Z" level=info msg="connecting to shim 30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa" address="unix:///run/containerd/s/3a470f5eec070e471522e0e7fb903488035aa068ab1624f8b662485c7f374958" protocol=ttrpc version=3 Aug 12 23:45:17.564750 systemd[1]: Started cri-containerd-30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa.scope - libcontainer 
container 30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa. Aug 12 23:45:17.680620 containerd[1506]: time="2025-08-12T23:45:17.680520697Z" level=info msg="StartContainer for \"30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa\" returns successfully" Aug 12 23:45:18.453146 kubelet[2715]: I0812 23:45:18.453101 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 12 23:45:18.468839 kubelet[2715]: I0812 23:45:18.468157 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-r7f9g" podStartSLOduration=27.921876651 podStartE2EDuration="34.468136933s" podCreationTimestamp="2025-08-12 23:44:44 +0000 UTC" firstStartedPulling="2025-08-12 23:45:10.526345677 +0000 UTC m=+45.585908230" lastFinishedPulling="2025-08-12 23:45:17.072605959 +0000 UTC m=+52.132168512" observedRunningTime="2025-08-12 23:45:17.470911294 +0000 UTC m=+52.530473847" watchObservedRunningTime="2025-08-12 23:45:18.468136933 +0000 UTC m=+53.527699486" Aug 12 23:45:18.468839 kubelet[2715]: I0812 23:45:18.468371 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6464b99944-nztd8" podStartSLOduration=25.57307115 podStartE2EDuration="32.468366779s" podCreationTimestamp="2025-08-12 23:44:46 +0000 UTC" firstStartedPulling="2025-08-12 23:45:10.579583594 +0000 UTC m=+45.639146147" lastFinishedPulling="2025-08-12 23:45:17.474879223 +0000 UTC m=+52.534441776" observedRunningTime="2025-08-12 23:45:18.466976588 +0000 UTC m=+53.526539141" watchObservedRunningTime="2025-08-12 23:45:18.468366779 +0000 UTC m=+53.527929332" Aug 12 23:45:19.456343 kubelet[2715]: I0812 23:45:19.456296 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 12 23:45:20.925140 containerd[1506]: time="2025-08-12T23:45:20.925073697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:20.926695 containerd[1506]: time="2025-08-12T23:45:20.926193201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Aug 12 23:45:20.928584 containerd[1506]: time="2025-08-12T23:45:20.927944039Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:20.933573 containerd[1506]: time="2025-08-12T23:45:20.931596678Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:20.933573 containerd[1506]: time="2025-08-12T23:45:20.932320974Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 3.457295628s" Aug 12 23:45:20.933573 containerd[1506]: time="2025-08-12T23:45:20.932357735Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Aug 12 23:45:20.935906 containerd[1506]: time="2025-08-12T23:45:20.935859810Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 12 23:45:20.940036 containerd[1506]: time="2025-08-12T23:45:20.940000260Z" level=info msg="CreateContainer within sandbox \"776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 12 23:45:20.961758 containerd[1506]: time="2025-08-12T23:45:20.961709129Z" level=info msg="Container 
4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:20.981267 containerd[1506]: time="2025-08-12T23:45:20.981209991Z" level=info msg="CreateContainer within sandbox \"776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7\"" Aug 12 23:45:20.984603 containerd[1506]: time="2025-08-12T23:45:20.983736205Z" level=info msg="StartContainer for \"4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7\"" Aug 12 23:45:20.985400 containerd[1506]: time="2025-08-12T23:45:20.985361761Z" level=info msg="connecting to shim 4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7" address="unix:///run/containerd/s/977a04e5d07edabeac157109881b93c0d98561bad1bc865bb55d34d124ca62fe" protocol=ttrpc version=3 Aug 12 23:45:21.024861 systemd[1]: Started cri-containerd-4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7.scope - libcontainer container 4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7. 
Aug 12 23:45:21.204382 containerd[1506]: time="2025-08-12T23:45:21.204230399Z" level=info msg="StartContainer for \"4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7\" returns successfully" Aug 12 23:45:25.336566 containerd[1506]: time="2025-08-12T23:45:25.335832018Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:25.338025 containerd[1506]: time="2025-08-12T23:45:25.337971822Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Aug 12 23:45:25.339190 containerd[1506]: time="2025-08-12T23:45:25.339133286Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:25.343018 containerd[1506]: time="2025-08-12T23:45:25.342589756Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 12 23:45:25.344303 containerd[1506]: time="2025-08-12T23:45:25.344250670Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 4.408349699s" Aug 12 23:45:25.344303 containerd[1506]: time="2025-08-12T23:45:25.344295311Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Aug 12 23:45:25.347486 containerd[1506]: time="2025-08-12T23:45:25.347442816Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 12 23:45:25.369636 containerd[1506]: time="2025-08-12T23:45:25.367856393Z" level=info msg="CreateContainer within sandbox \"86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 12 23:45:25.398775 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3557894416.mount: Deactivated successfully. Aug 12 23:45:25.402682 containerd[1506]: time="2025-08-12T23:45:25.402628464Z" level=info msg="Container 19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:25.416742 containerd[1506]: time="2025-08-12T23:45:25.416695352Z" level=info msg="CreateContainer within sandbox \"86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\"" Aug 12 23:45:25.417937 containerd[1506]: time="2025-08-12T23:45:25.417906617Z" level=info msg="StartContainer for \"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\"" Aug 12 23:45:25.422561 containerd[1506]: time="2025-08-12T23:45:25.422459670Z" level=info msg="connecting to shim 19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc" address="unix:///run/containerd/s/c3c56d978201f8b333d3d32db7baaac00fba5b24287127408f517c1dd28f0604" protocol=ttrpc version=3 Aug 12 23:45:25.457170 systemd[1]: Started cri-containerd-19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc.scope - libcontainer container 19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc. 
Aug 12 23:45:25.556410 containerd[1506]: time="2025-08-12T23:45:25.556375568Z" level=info msg="StartContainer for \"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" returns successfully"
Aug 12 23:45:25.753779 containerd[1506]: time="2025-08-12T23:45:25.753629122Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 12 23:45:25.755756 containerd[1506]: time="2025-08-12T23:45:25.755699964Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77"
Aug 12 23:45:25.758415 containerd[1506]: time="2025-08-12T23:45:25.758221816Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 410.285949ms"
Aug 12 23:45:25.758415 containerd[1506]: time="2025-08-12T23:45:25.758280257Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\""
Aug 12 23:45:25.761700 containerd[1506]: time="2025-08-12T23:45:25.761619645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\""
Aug 12 23:45:25.764163 containerd[1506]: time="2025-08-12T23:45:25.764115736Z" level=info msg="CreateContainer within sandbox \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Aug 12 23:45:25.778572 containerd[1506]: time="2025-08-12T23:45:25.777776295Z" level=info msg="Container ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90: CDI devices from CRI Config.CDIDevices: []"
Aug 12 23:45:25.792787 containerd[1506]: time="2025-08-12T23:45:25.792722561Z" level=info msg="CreateContainer within sandbox \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\""
Aug 12 23:45:25.795819 containerd[1506]: time="2025-08-12T23:45:25.795739503Z" level=info msg="StartContainer for \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\""
Aug 12 23:45:25.797783 containerd[1506]: time="2025-08-12T23:45:25.797716823Z" level=info msg="connecting to shim ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90" address="unix:///run/containerd/s/cffd78bb2d1542aed1e94b84712f4c7ec73e472b9b7930c775eae007ebe346e0" protocol=ttrpc version=3
Aug 12 23:45:25.828997 systemd[1]: Started cri-containerd-ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90.scope - libcontainer container ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90.
Aug 12 23:45:25.911028 containerd[1506]: time="2025-08-12T23:45:25.910983339Z" level=info msg="StartContainer for \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" returns successfully"
Aug 12 23:45:26.544779 kubelet[2715]: I0812 23:45:26.543884 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7f8d4ddcc7-zgq55" podStartSLOduration=29.815549795 podStartE2EDuration="42.543863976s" podCreationTimestamp="2025-08-12 23:44:44 +0000 UTC" firstStartedPulling="2025-08-12 23:45:13.032034998 +0000 UTC m=+48.091597551" lastFinishedPulling="2025-08-12 23:45:25.760349179 +0000 UTC m=+60.819911732" observedRunningTime="2025-08-12 23:45:26.542966198 +0000 UTC m=+61.602528751" watchObservedRunningTime="2025-08-12 23:45:26.543863976 +0000 UTC m=+61.603426529"
Aug 12 23:45:26.599046 containerd[1506]: time="2025-08-12T23:45:26.598877891Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"54517f97ab5f13af376cbc3b58c93673db20f068cef7c3fd568d5df6e89e42ed\" pid:5208 exited_at:{seconds:1755042326 nanos:597272738}"
Aug 12 23:45:26.636076 kubelet[2715]: I0812 23:45:26.635821 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5cbb45b49d-bnxpq" podStartSLOduration=25.109808388 podStartE2EDuration="37.635805559s" podCreationTimestamp="2025-08-12 23:44:49 +0000 UTC" firstStartedPulling="2025-08-12 23:45:12.819887613 +0000 UTC m=+47.879450126" lastFinishedPulling="2025-08-12 23:45:25.345884744 +0000 UTC m=+60.405447297" observedRunningTime="2025-08-12 23:45:26.575060208 +0000 UTC m=+61.634622761" watchObservedRunningTime="2025-08-12 23:45:26.635805559 +0000 UTC m=+61.695368112"
Aug 12 23:45:27.041715 containerd[1506]: time="2025-08-12T23:45:27.041651932Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"3f0afb6ed19db1b17dfecfb4beab1e2c69eebef236fbe13177b911c8ff10928e\" pid:5231 exited_at:{seconds:1755042327 nanos:40976239}"
Aug 12 23:45:27.800138 containerd[1506]: time="2025-08-12T23:45:27.798891814Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 12 23:45:27.801973 containerd[1506]: time="2025-08-12T23:45:27.801928635Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366"
Aug 12 23:45:27.804610 containerd[1506]: time="2025-08-12T23:45:27.804119319Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 12 23:45:27.811968 containerd[1506]: time="2025-08-12T23:45:27.811924716Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Aug 12 23:45:27.815434 containerd[1506]: time="2025-08-12T23:45:27.815283263Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 2.053591297s"
Aug 12 23:45:27.815434 containerd[1506]: time="2025-08-12T23:45:27.815329544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\""
Aug 12 23:45:27.832906 containerd[1506]: time="2025-08-12T23:45:27.832862576Z" level=info msg="CreateContainer within sandbox \"776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Aug 12 23:45:27.847596 containerd[1506]: time="2025-08-12T23:45:27.846918578Z" level=info msg="Container bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270: CDI devices from CRI Config.CDIDevices: []"
Aug 12 23:45:27.860614 containerd[1506]: time="2025-08-12T23:45:27.860576652Z" level=info msg="CreateContainer within sandbox \"776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270\""
Aug 12 23:45:27.880226 containerd[1506]: time="2025-08-12T23:45:27.880172046Z" level=info msg="StartContainer for \"bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270\""
Aug 12 23:45:27.883410 containerd[1506]: time="2025-08-12T23:45:27.883360310Z" level=info msg="connecting to shim bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270" address="unix:///run/containerd/s/977a04e5d07edabeac157109881b93c0d98561bad1bc865bb55d34d124ca62fe" protocol=ttrpc version=3
Aug 12 23:45:27.915204 systemd[1]: Started cri-containerd-bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270.scope - libcontainer container bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270.
Aug 12 23:45:28.030327 containerd[1506]: time="2025-08-12T23:45:28.030262454Z" level=info msg="StartContainer for \"bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270\" returns successfully"
Aug 12 23:45:28.348986 containerd[1506]: time="2025-08-12T23:45:28.348937037Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"a03fee30c1bfaf18ac7a720e944155c0a51f8120bea472cd9bd2c92d42ab5611\" pid:5274 exited_at:{seconds:1755042328 nanos:348601671}"
Aug 12 23:45:29.231974 kubelet[2715]: I0812 23:45:29.231841 2715 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Aug 12 23:45:29.235564 kubelet[2715]: I0812 23:45:29.235432 2715 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Aug 12 23:45:30.775584 kubelet[2715]: I0812 23:45:30.775069 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-znzb2" podStartSLOduration=25.340267699 podStartE2EDuration="41.775047146s" podCreationTimestamp="2025-08-12 23:44:49 +0000 UTC" firstStartedPulling="2025-08-12 23:45:11.389371474 +0000 UTC m=+46.448934027" lastFinishedPulling="2025-08-12 23:45:27.824150921 +0000 UTC m=+62.883713474" observedRunningTime="2025-08-12 23:45:28.554214324 +0000 UTC m=+63.613776877" watchObservedRunningTime="2025-08-12 23:45:30.775047146 +0000 UTC m=+65.834609739"
Aug 12 23:45:39.187565 containerd[1506]: time="2025-08-12T23:45:39.187130741Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"3044189f3c7ff0830965737c2b60c058b93fe66d9dbdb550b75979bcd49180e3\" pid:5316 exited_at:{seconds:1755042339 nanos:186802902}"
Aug 12 23:45:42.014698 kubelet[2715]: I0812 23:45:42.014530 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Aug 12 23:45:56.360582 kubelet[2715]: I0812 23:45:56.358924 2715 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Aug 12 23:45:56.510181 containerd[1506]: time="2025-08-12T23:45:56.509987569Z" level=info msg="StopContainer for \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" with timeout 30 (s)"
Aug 12 23:45:56.510857 containerd[1506]: time="2025-08-12T23:45:56.510713491Z" level=info msg="Stop container \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" with signal terminated"
Aug 12 23:45:56.575656 systemd[1]: Created slice kubepods-besteffort-pod68ac659b_22d5_4a57_9eea_9e7c1c84a48d.slice - libcontainer container kubepods-besteffort-pod68ac659b_22d5_4a57_9eea_9e7c1c84a48d.slice.
Aug 12 23:45:56.607731 systemd[1]: cri-containerd-17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987.scope: Deactivated successfully.
Aug 12 23:45:56.608087 systemd[1]: cri-containerd-17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987.scope: Consumed 1.143s CPU time, 50.5M memory peak, 4K read from disk.
Aug 12 23:45:56.613642 containerd[1506]: time="2025-08-12T23:45:56.613221416Z" level=info msg="received exit event container_id:\"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" id:\"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" pid:5004 exit_status:1 exited_at:{seconds:1755042356 nanos:611916413}"
Aug 12 23:45:56.614045 containerd[1506]: time="2025-08-12T23:45:56.614011979Z" level=info msg="TaskExit event in podsandbox handler container_id:\"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" id:\"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" pid:5004 exit_status:1 exited_at:{seconds:1755042356 nanos:611916413}"
Aug 12 23:45:56.641518 kubelet[2715]: I0812 23:45:56.641376 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsn9k\" (UniqueName: \"kubernetes.io/projected/68ac659b-22d5-4a57-9eea-9e7c1c84a48d-kube-api-access-qsn9k\") pod \"calico-apiserver-6464b99944-sd79w\" (UID: \"68ac659b-22d5-4a57-9eea-9e7c1c84a48d\") " pod="calico-apiserver/calico-apiserver-6464b99944-sd79w"
Aug 12 23:45:56.643455 kubelet[2715]: I0812 23:45:56.642866 2715 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/68ac659b-22d5-4a57-9eea-9e7c1c84a48d-calico-apiserver-certs\") pod \"calico-apiserver-6464b99944-sd79w\" (UID: \"68ac659b-22d5-4a57-9eea-9e7c1c84a48d\") " pod="calico-apiserver/calico-apiserver-6464b99944-sd79w"
Aug 12 23:45:56.659178 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987-rootfs.mount: Deactivated successfully.
Aug 12 23:45:56.720710 containerd[1506]: time="2025-08-12T23:45:56.720531595Z" level=info msg="StopContainer for \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" returns successfully"
Aug 12 23:45:56.723645 containerd[1506]: time="2025-08-12T23:45:56.723010522Z" level=info msg="StopPodSandbox for \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\""
Aug 12 23:45:56.723645 containerd[1506]: time="2025-08-12T23:45:56.723116562Z" level=info msg="Container to stop \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" must be in running or unknown state, current state \"CONTAINER_EXITED\""
Aug 12 23:45:56.747996 systemd[1]: cri-containerd-69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf.scope: Deactivated successfully.
Aug 12 23:45:56.748399 systemd[1]: cri-containerd-69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf.scope: Consumed 37ms CPU time, 4.3M memory peak, 2.3M read from disk.
Aug 12 23:45:56.759969 containerd[1506]: time="2025-08-12T23:45:56.759909944Z" level=info msg="TaskExit event in podsandbox handler container_id:\"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" id:\"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" pid:4365 exit_status:137 exited_at:{seconds:1755042356 nanos:752480443}"
Aug 12 23:45:56.811635 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf-rootfs.mount: Deactivated successfully.
Aug 12 23:45:56.814494 containerd[1506]: time="2025-08-12T23:45:56.814438016Z" level=info msg="shim disconnected" id=69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf namespace=k8s.io
Aug 12 23:45:56.815943 containerd[1506]: time="2025-08-12T23:45:56.815227298Z" level=warning msg="cleaning up after shim disconnected" id=69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf namespace=k8s.io
Aug 12 23:45:56.816297 containerd[1506]: time="2025-08-12T23:45:56.816207861Z" level=info msg="cleaning up dead shim" namespace=k8s.io
Aug 12 23:45:56.850021 containerd[1506]: time="2025-08-12T23:45:56.849958954Z" level=info msg="received exit event sandbox_id:\"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" exit_status:137 exited_at:{seconds:1755042356 nanos:752480443}"
Aug 12 23:45:56.854585 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf-shm.mount: Deactivated successfully.
Aug 12 23:45:56.881263 containerd[1506]: time="2025-08-12T23:45:56.880778800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6464b99944-sd79w,Uid:68ac659b-22d5-4a57-9eea-9e7c1c84a48d,Namespace:calico-apiserver,Attempt:0,}"
Aug 12 23:45:56.941236 systemd-networkd[1419]: cali77a5728c533: Link DOWN
Aug 12 23:45:56.941244 systemd-networkd[1419]: cali77a5728c533: Lost carrier
Aug 12 23:45:57.066399 containerd[1506]: time="2025-08-12T23:45:57.066347539Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"fb4b6fda13dc892858812e75117d0ff2fcf9faeaef7b617409ba800ef3f8879c\" pid:5464 exited_at:{seconds:1755042357 nanos:63434690}"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:56.937 [INFO][5415] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:56.938 [INFO][5415] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" iface="eth0" netns="/var/run/netns/cni-ab4d1615-a659-8215-d6b6-1180b2383a69"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:56.939 [INFO][5415] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" iface="eth0" netns="/var/run/netns/cni-ab4d1615-a659-8215-d6b6-1180b2383a69"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:56.955 [INFO][5415] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" after=16.807126ms iface="eth0" netns="/var/run/netns/cni-ab4d1615-a659-8215-d6b6-1180b2383a69"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:56.955 [INFO][5415] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:56.955 [INFO][5415] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:57.047 [INFO][5435] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:57.047 [INFO][5435] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:57.047 [INFO][5435] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:57.149 [INFO][5435] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:57.149 [INFO][5435] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0"
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:57.163 [INFO][5435] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Aug 12 23:45:57.169714 containerd[1506]: 2025-08-12 23:45:57.166 [INFO][5415] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf"
Aug 12 23:45:57.170208 containerd[1506]: time="2025-08-12T23:45:57.170157585Z" level=info msg="TearDown network for sandbox \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" successfully"
Aug 12 23:45:57.170208 containerd[1506]: time="2025-08-12T23:45:57.170208505Z" level=info msg="StopPodSandbox for \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" returns successfully"
Aug 12 23:45:57.249070 kubelet[2715]: I0812 23:45:57.249033 2715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/241f24c8-4f1a-4329-8952-6c015f2ffbc0-calico-apiserver-certs\") pod \"241f24c8-4f1a-4329-8952-6c015f2ffbc0\" (UID: \"241f24c8-4f1a-4329-8952-6c015f2ffbc0\") "
Aug 12 23:45:57.249070 kubelet[2715]: I0812 23:45:57.249084 2715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ns94x\" (UniqueName: \"kubernetes.io/projected/241f24c8-4f1a-4329-8952-6c015f2ffbc0-kube-api-access-ns94x\") pod \"241f24c8-4f1a-4329-8952-6c015f2ffbc0\" (UID: \"241f24c8-4f1a-4329-8952-6c015f2ffbc0\") "
Aug 12 23:45:57.256243 kubelet[2715]: I0812 23:45:57.256189 2715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/241f24c8-4f1a-4329-8952-6c015f2ffbc0-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "241f24c8-4f1a-4329-8952-6c015f2ffbc0" (UID: "241f24c8-4f1a-4329-8952-6c015f2ffbc0"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Aug 12 23:45:57.257975 kubelet[2715]: I0812 23:45:57.257870 2715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/241f24c8-4f1a-4329-8952-6c015f2ffbc0-kube-api-access-ns94x" (OuterVolumeSpecName: "kube-api-access-ns94x") pod "241f24c8-4f1a-4329-8952-6c015f2ffbc0" (UID: "241f24c8-4f1a-4329-8952-6c015f2ffbc0"). InnerVolumeSpecName "kube-api-access-ns94x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Aug 12 23:45:57.351515 kubelet[2715]: I0812 23:45:57.350374 2715 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/241f24c8-4f1a-4329-8952-6c015f2ffbc0-calico-apiserver-certs\") on node \"ci-4372-1-0-9-13fe44d47a\" DevicePath \"\""
Aug 12 23:45:57.351515 kubelet[2715]: I0812 23:45:57.350412 2715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ns94x\" (UniqueName: \"kubernetes.io/projected/241f24c8-4f1a-4329-8952-6c015f2ffbc0-kube-api-access-ns94x\") on node \"ci-4372-1-0-9-13fe44d47a\" DevicePath \"\""
Aug 12 23:45:57.350506 systemd-networkd[1419]: cali4a4b5fd9304: Link UP
Aug 12 23:45:57.352540 systemd-networkd[1419]: cali4a4b5fd9304: Gained carrier
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:56.954 [INFO][5422] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0 calico-apiserver-6464b99944- calico-apiserver 68ac659b-22d5-4a57-9eea-9e7c1c84a48d 1207 0 2025-08-12 23:45:56 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6464b99944 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-9-13fe44d47a calico-apiserver-6464b99944-sd79w eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4a4b5fd9304 [] [] }} ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-sd79w" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:56.956 [INFO][5422] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-sd79w" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.045 [INFO][5441] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" HandleID="k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.049 [INFO][5441] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" HandleID="k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000103e30), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-9-13fe44d47a", "pod":"calico-apiserver-6464b99944-sd79w", "timestamp":"2025-08-12 23:45:57.045860075 +0000 UTC"}, Hostname:"ci-4372-1-0-9-13fe44d47a", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.050 [INFO][5441] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.163 [INFO][5441] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.163 [INFO][5441] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-9-13fe44d47a'
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.205 [INFO][5441] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.257 [INFO][5441] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.304 [INFO][5441] ipam/ipam.go 511: Trying affinity for 192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.307 [INFO][5441] ipam/ipam.go 158: Attempting to load block cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.312 [INFO][5441] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.3.64/26 host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.312 [INFO][5441] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.3.64/26 handle="k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.314 [INFO][5441] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.324 [INFO][5441] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.3.64/26 handle="k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.339 [INFO][5441] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.3.74/26] block=192.168.3.64/26 handle="k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.340 [INFO][5441] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.3.74/26] handle="k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" host="ci-4372-1-0-9-13fe44d47a"
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.340 [INFO][5441] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Aug 12 23:45:57.385853 containerd[1506]: 2025-08-12 23:45:57.340 [INFO][5441] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.3.74/26] IPv6=[] ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" HandleID="k8s-pod-network.d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0"
Aug 12 23:45:57.386400 containerd[1506]: 2025-08-12 23:45:57.342 [INFO][5422] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-sd79w" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0", GenerateName:"calico-apiserver-6464b99944-", Namespace:"calico-apiserver", SelfLink:"", UID:"68ac659b-22d5-4a57-9eea-9e7c1c84a48d", ResourceVersion:"1207", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 45, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6464b99944", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"", Pod:"calico-apiserver-6464b99944-sd79w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4a4b5fd9304", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Aug 12 23:45:57.386400 containerd[1506]: 2025-08-12 23:45:57.343 [INFO][5422] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.3.74/32] ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-sd79w" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0"
Aug 12 23:45:57.386400 containerd[1506]: 2025-08-12 23:45:57.343 [INFO][5422] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4a4b5fd9304 ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-sd79w" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0"
Aug 12 23:45:57.386400 containerd[1506]: 2025-08-12 23:45:57.352 [INFO][5422] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-sd79w" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0"
Aug 12 23:45:57.386400 containerd[1506]: 2025-08-12 23:45:57.353 [INFO][5422] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-sd79w" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0", GenerateName:"calico-apiserver-6464b99944-", Namespace:"calico-apiserver", SelfLink:"", UID:"68ac659b-22d5-4a57-9eea-9e7c1c84a48d", ResourceVersion:"1207", Generation:0, CreationTimestamp:time.Date(2025, time.August, 12, 23, 45, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6464b99944", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-9-13fe44d47a", ContainerID:"d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4", Pod:"calico-apiserver-6464b99944-sd79w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.3.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4a4b5fd9304", MAC:"46:3d:d3:1f:f0:cf", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Aug 12 23:45:57.386400 containerd[1506]: 2025-08-12 23:45:57.381 [INFO][5422] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" Namespace="calico-apiserver" Pod="calico-apiserver-6464b99944-sd79w" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--6464b99944--sd79w-eth0"
Aug 12 23:45:57.428920 containerd[1506]: time="2025-08-12T23:45:57.428781275Z" level=info msg="connecting to shim d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4" address="unix:///run/containerd/s/e286343466c725fd7b1f549df7bb3e8792675878406716d9eabb108e1ea2b084" namespace=k8s.io protocol=ttrpc version=3
Aug 12 23:45:57.464355 systemd[1]: Started cri-containerd-d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4.scope - libcontainer container d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4.
Aug 12 23:45:57.557635 containerd[1506]: time="2025-08-12T23:45:57.557248638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6464b99944-sd79w,Uid:68ac659b-22d5-4a57-9eea-9e7c1c84a48d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4\"" Aug 12 23:45:57.568337 containerd[1506]: time="2025-08-12T23:45:57.568293473Z" level=info msg="CreateContainer within sandbox \"d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 12 23:45:57.579817 containerd[1506]: time="2025-08-12T23:45:57.579762309Z" level=info msg="Container 05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b: CDI devices from CRI Config.CDIDevices: []" Aug 12 23:45:57.594766 containerd[1506]: time="2025-08-12T23:45:57.594686395Z" level=info msg="CreateContainer within sandbox \"d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b\"" Aug 12 23:45:57.595582 containerd[1506]: time="2025-08-12T23:45:57.595492678Z" level=info msg="StartContainer for \"05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b\"" Aug 12 23:45:57.597646 containerd[1506]: time="2025-08-12T23:45:57.597206963Z" level=info msg="connecting to shim 05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b" address="unix:///run/containerd/s/e286343466c725fd7b1f549df7bb3e8792675878406716d9eabb108e1ea2b084" protocol=ttrpc version=3 Aug 12 23:45:57.632988 systemd[1]: Started cri-containerd-05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b.scope - libcontainer container 05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b. 
Aug 12 23:45:57.658948 kubelet[2715]: I0812 23:45:57.658899 2715 scope.go:117] "RemoveContainer" containerID="17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987" Aug 12 23:45:57.662187 systemd[1]: run-netns-cni\x2dab4d1615\x2da659\x2d8215\x2dd6b6\x2d1180b2383a69.mount: Deactivated successfully. Aug 12 23:45:57.662353 systemd[1]: var-lib-kubelet-pods-241f24c8\x2d4f1a\x2d4329\x2d8952\x2d6c015f2ffbc0-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dns94x.mount: Deactivated successfully. Aug 12 23:45:57.662426 systemd[1]: var-lib-kubelet-pods-241f24c8\x2d4f1a\x2d4329\x2d8952\x2d6c015f2ffbc0-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Aug 12 23:45:57.675807 containerd[1506]: time="2025-08-12T23:45:57.675758849Z" level=info msg="RemoveContainer for \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\"" Aug 12 23:45:57.682506 systemd[1]: Removed slice kubepods-besteffort-pod241f24c8_4f1a_4329_8952_6c015f2ffbc0.slice - libcontainer container kubepods-besteffort-pod241f24c8_4f1a_4329_8952_6c015f2ffbc0.slice. Aug 12 23:45:57.682645 systemd[1]: kubepods-besteffort-pod241f24c8_4f1a_4329_8952_6c015f2ffbc0.slice: Consumed 1.181s CPU time, 51M memory peak, 2.4M read from disk. 
Aug 12 23:45:57.690661 containerd[1506]: time="2025-08-12T23:45:57.690523456Z" level=info msg="RemoveContainer for \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" returns successfully" Aug 12 23:45:57.691121 kubelet[2715]: I0812 23:45:57.691084 2715 scope.go:117] "RemoveContainer" containerID="17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987" Aug 12 23:45:57.691483 containerd[1506]: time="2025-08-12T23:45:57.691409058Z" level=error msg="ContainerStatus for \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\": not found" Aug 12 23:45:57.695938 kubelet[2715]: E0812 23:45:57.695857 2715 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\": not found" containerID="17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987" Aug 12 23:45:57.697125 kubelet[2715]: I0812 23:45:57.696986 2715 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987"} err="failed to get container status \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\": rpc error: code = NotFound desc = an error occurred when try to find container \"17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987\": not found" Aug 12 23:45:57.813706 containerd[1506]: time="2025-08-12T23:45:57.813650922Z" level=info msg="StartContainer for \"05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b\" returns successfully" Aug 12 23:45:58.006346 containerd[1506]: time="2025-08-12T23:45:58.005809126Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"7c707c829515f645ff4b8088d66f1a8ed08d184b852e162de3094e5540bd20e4\" pid:5580 exited_at:{seconds:1755042358 nanos:4823842}" Aug 12 23:45:59.090024 kubelet[2715]: I0812 23:45:59.089973 2715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="241f24c8-4f1a-4329-8952-6c015f2ffbc0" path="/var/lib/kubelet/pods/241f24c8-4f1a-4329-8952-6c015f2ffbc0/volumes" Aug 12 23:45:59.140731 systemd-networkd[1419]: cali4a4b5fd9304: Gained IPv6LL Aug 12 23:46:00.284358 kubelet[2715]: I0812 23:46:00.284280 2715 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6464b99944-sd79w" podStartSLOduration=4.283537486 podStartE2EDuration="4.283537486s" podCreationTimestamp="2025-08-12 23:45:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-12 23:45:58.706727923 +0000 UTC m=+93.766290476" watchObservedRunningTime="2025-08-12 23:46:00.283537486 +0000 UTC m=+95.343099999" Aug 12 23:46:00.318750 containerd[1506]: time="2025-08-12T23:46:00.318705711Z" level=info msg="StopContainer for \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" with timeout 30 (s)" Aug 12 23:46:00.320503 containerd[1506]: time="2025-08-12T23:46:00.320454038Z" level=info msg="Stop container \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" with signal terminated" Aug 12 23:46:00.382784 systemd[1]: cri-containerd-ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90.scope: Deactivated successfully. Aug 12 23:46:00.383113 systemd[1]: cri-containerd-ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90.scope: Consumed 2.472s CPU time, 59.4M memory peak. 
Aug 12 23:46:00.389569 containerd[1506]: time="2025-08-12T23:46:00.387823397Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" id:\"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" pid:5177 exit_status:1 exited_at:{seconds:1755042360 nanos:385866628}" Aug 12 23:46:00.389569 containerd[1506]: time="2025-08-12T23:46:00.388011197Z" level=info msg="received exit event container_id:\"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" id:\"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" pid:5177 exit_status:1 exited_at:{seconds:1755042360 nanos:385866628}" Aug 12 23:46:00.420672 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90-rootfs.mount: Deactivated successfully. Aug 12 23:46:00.441195 containerd[1506]: time="2025-08-12T23:46:00.441144017Z" level=info msg="StopContainer for \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" returns successfully" Aug 12 23:46:00.441794 containerd[1506]: time="2025-08-12T23:46:00.441734339Z" level=info msg="StopPodSandbox for \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\"" Aug 12 23:46:00.442852 containerd[1506]: time="2025-08-12T23:46:00.441937220Z" level=info msg="Container to stop \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Aug 12 23:46:00.452779 systemd[1]: cri-containerd-6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c.scope: Deactivated successfully. 
Aug 12 23:46:00.459149 containerd[1506]: time="2025-08-12T23:46:00.459001091Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" id:\"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" pid:4773 exit_status:137 exited_at:{seconds:1755042360 nanos:458621409}" Aug 12 23:46:00.504508 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c-rootfs.mount: Deactivated successfully. Aug 12 23:46:00.508058 containerd[1506]: time="2025-08-12T23:46:00.508003573Z" level=info msg="shim disconnected" id=6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c namespace=k8s.io Aug 12 23:46:00.508170 containerd[1506]: time="2025-08-12T23:46:00.508047013Z" level=warning msg="cleaning up after shim disconnected" id=6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c namespace=k8s.io Aug 12 23:46:00.508170 containerd[1506]: time="2025-08-12T23:46:00.508080054Z" level=info msg="cleaning up dead shim" namespace=k8s.io Aug 12 23:46:00.534145 containerd[1506]: time="2025-08-12T23:46:00.534039001Z" level=info msg="received exit event sandbox_id:\"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" exit_status:137 exited_at:{seconds:1755042360 nanos:458621409}" Aug 12 23:46:00.540723 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c-shm.mount: Deactivated successfully. 
Aug 12 23:46:00.613463 systemd-networkd[1419]: calicdb23de850c: Link DOWN Aug 12 23:46:00.614071 systemd-networkd[1419]: calicdb23de850c: Lost carrier Aug 12 23:46:00.699821 kubelet[2715]: I0812 23:46:00.699776 2715 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.609 [INFO][5666] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.610 [INFO][5666] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" iface="eth0" netns="/var/run/netns/cni-2b14e3f9-c29f-808b-b231-8bb118de2538" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.612 [INFO][5666] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" iface="eth0" netns="/var/run/netns/cni-2b14e3f9-c29f-808b-b231-8bb118de2538" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.629 [INFO][5666] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" after=19.2946ms iface="eth0" netns="/var/run/netns/cni-2b14e3f9-c29f-808b-b231-8bb118de2538" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.629 [INFO][5666] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.629 [INFO][5666] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.657 [INFO][5680] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.657 [INFO][5680] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.657 [INFO][5680] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.732 [INFO][5680] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.732 [INFO][5680] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.734 [INFO][5680] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:46:00.739048 containerd[1506]: 2025-08-12 23:46:00.736 [INFO][5666] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:00.742215 systemd[1]: run-netns-cni\x2d2b14e3f9\x2dc29f\x2d808b\x2db231\x2d8bb118de2538.mount: Deactivated successfully. 
Aug 12 23:46:00.743830 containerd[1506]: time="2025-08-12T23:46:00.743773948Z" level=info msg="TearDown network for sandbox \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" successfully" Aug 12 23:46:00.743830 containerd[1506]: time="2025-08-12T23:46:00.743816268Z" level=info msg="StopPodSandbox for \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" returns successfully" Aug 12 23:46:00.879131 kubelet[2715]: I0812 23:46:00.879086 2715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mzl6\" (UniqueName: \"kubernetes.io/projected/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f-kube-api-access-2mzl6\") pod \"a9e9c567-de9b-4a68-9d2c-640b2dafaf2f\" (UID: \"a9e9c567-de9b-4a68-9d2c-640b2dafaf2f\") " Aug 12 23:46:00.879330 kubelet[2715]: I0812 23:46:00.879145 2715 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f-calico-apiserver-certs\") pod \"a9e9c567-de9b-4a68-9d2c-640b2dafaf2f\" (UID: \"a9e9c567-de9b-4a68-9d2c-640b2dafaf2f\") " Aug 12 23:46:00.885873 systemd[1]: var-lib-kubelet-pods-a9e9c567\x2dde9b\x2d4a68\x2d9d2c\x2d640b2dafaf2f-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Aug 12 23:46:00.886555 kubelet[2715]: I0812 23:46:00.886493 2715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "a9e9c567-de9b-4a68-9d2c-640b2dafaf2f" (UID: "a9e9c567-de9b-4a68-9d2c-640b2dafaf2f"). InnerVolumeSpecName "calico-apiserver-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Aug 12 23:46:00.889877 kubelet[2715]: I0812 23:46:00.889823 2715 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f-kube-api-access-2mzl6" (OuterVolumeSpecName: "kube-api-access-2mzl6") pod "a9e9c567-de9b-4a68-9d2c-640b2dafaf2f" (UID: "a9e9c567-de9b-4a68-9d2c-640b2dafaf2f"). InnerVolumeSpecName "kube-api-access-2mzl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Aug 12 23:46:00.980017 kubelet[2715]: I0812 23:46:00.979937 2715 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mzl6\" (UniqueName: \"kubernetes.io/projected/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f-kube-api-access-2mzl6\") on node \"ci-4372-1-0-9-13fe44d47a\" DevicePath \"\"" Aug 12 23:46:00.980017 kubelet[2715]: I0812 23:46:00.979985 2715 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f-calico-apiserver-certs\") on node \"ci-4372-1-0-9-13fe44d47a\" DevicePath \"\"" Aug 12 23:46:01.095368 systemd[1]: Removed slice kubepods-besteffort-poda9e9c567_de9b_4a68_9d2c_640b2dafaf2f.slice - libcontainer container kubepods-besteffort-poda9e9c567_de9b_4a68_9d2c_640b2dafaf2f.slice. Aug 12 23:46:01.095488 systemd[1]: kubepods-besteffort-poda9e9c567_de9b_4a68_9d2c_640b2dafaf2f.slice: Consumed 2.498s CPU time, 59.7M memory peak. Aug 12 23:46:01.419095 systemd[1]: var-lib-kubelet-pods-a9e9c567\x2dde9b\x2d4a68\x2d9d2c\x2d640b2dafaf2f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d2mzl6.mount: Deactivated successfully. 
Aug 12 23:46:03.092290 kubelet[2715]: I0812 23:46:03.092212 2715 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e9c567-de9b-4a68-9d2c-640b2dafaf2f" path="/var/lib/kubelet/pods/a9e9c567-de9b-4a68-9d2c-640b2dafaf2f/volumes" Aug 12 23:46:04.988894 containerd[1506]: time="2025-08-12T23:46:04.988839106Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"2db8dfb154dc4ab60f6f6c16ac5cd6fe17f885282281d52d974c89c416b33732\" pid:5705 exited_at:{seconds:1755042364 nanos:987957941}" Aug 12 23:46:09.153177 containerd[1506]: time="2025-08-12T23:46:09.153126864Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"05d6590f409aff7694ecd0bfbf1ab8322087e0b1d7377fbbff38a3b5d3c02b69\" pid:5732 exited_at:{seconds:1755042369 nanos:152536060}" Aug 12 23:46:16.749087 containerd[1506]: time="2025-08-12T23:46:16.748976125Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"b65202399a679fa40d633141761f15199486006c0fcafaf05b19c968ace11a54\" pid:5760 exited_at:{seconds:1755042376 nanos:748279239}" Aug 12 23:46:25.122809 kubelet[2715]: I0812 23:46:25.122734 2715 scope.go:117] "RemoveContainer" containerID="ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90" Aug 12 23:46:25.125668 containerd[1506]: time="2025-08-12T23:46:25.125607130Z" level=info msg="RemoveContainer for \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\"" Aug 12 23:46:25.131629 containerd[1506]: time="2025-08-12T23:46:25.131348665Z" level=info msg="RemoveContainer for \"ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90\" returns successfully" Aug 12 23:46:25.133990 containerd[1506]: time="2025-08-12T23:46:25.133952890Z" level=info msg="StopPodSandbox for 
\"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\"" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.187 [WARNING][5790] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.187 [INFO][5790] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.187 [INFO][5790] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" iface="eth0" netns="" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.187 [INFO][5790] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.187 [INFO][5790] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.214 [INFO][5797] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.214 [INFO][5797] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.214 [INFO][5797] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.228 [WARNING][5797] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.229 [INFO][5797] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.233 [INFO][5797] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:46:25.239557 containerd[1506]: 2025-08-12 23:46:25.237 [INFO][5790] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Aug 12 23:46:25.240467 containerd[1506]: time="2025-08-12T23:46:25.239738827Z" level=info msg="TearDown network for sandbox \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" successfully" Aug 12 23:46:25.240467 containerd[1506]: time="2025-08-12T23:46:25.239761907Z" level=info msg="StopPodSandbox for \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" returns successfully" Aug 12 23:46:25.240467 containerd[1506]: time="2025-08-12T23:46:25.240350473Z" level=info msg="RemovePodSandbox for \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\"" Aug 12 23:46:25.240467 containerd[1506]: time="2025-08-12T23:46:25.240380873Z" level=info msg="Forcibly stopping sandbox \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\"" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.281 [WARNING][5811] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.281 [INFO][5811] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.281 [INFO][5811] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" iface="eth0" netns="" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.281 [INFO][5811] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.281 [INFO][5811] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.305 [INFO][5818] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.305 [INFO][5818] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.305 [INFO][5818] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.317 [WARNING][5818] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.317 [INFO][5818] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" HandleID="k8s-pod-network.69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--r7f9g-eth0" Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.320 [INFO][5818] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:46:25.326326 containerd[1506]: 2025-08-12 23:46:25.323 [INFO][5811] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf" Aug 12 23:46:25.327238 containerd[1506]: time="2025-08-12T23:46:25.326388139Z" level=info msg="TearDown network for sandbox \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" successfully" Aug 12 23:46:25.328718 containerd[1506]: time="2025-08-12T23:46:25.328517800Z" level=info msg="Ensure that sandbox 69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf in task-service has been cleanup successfully" Aug 12 23:46:25.332902 containerd[1506]: time="2025-08-12T23:46:25.332824601Z" level=info msg="RemovePodSandbox \"69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf\" returns successfully" Aug 12 23:46:25.333542 containerd[1506]: time="2025-08-12T23:46:25.333499008Z" level=info msg="StopPodSandbox for \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\"" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.379 [WARNING][5832] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward 
with the clean up ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.379 [INFO][5832] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.379 [INFO][5832] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" iface="eth0" netns="" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.379 [INFO][5832] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.379 [INFO][5832] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.405 [INFO][5839] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.405 [INFO][5839] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.405 [INFO][5839] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.416 [WARNING][5839] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.417 [INFO][5839] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.420 [INFO][5839] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:46:25.424767 containerd[1506]: 2025-08-12 23:46:25.421 [INFO][5832] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:25.424767 containerd[1506]: time="2025-08-12T23:46:25.424271720Z" level=info msg="TearDown network for sandbox \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" successfully" Aug 12 23:46:25.424767 containerd[1506]: time="2025-08-12T23:46:25.424301680Z" level=info msg="StopPodSandbox for \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" returns successfully" Aug 12 23:46:25.425419 containerd[1506]: time="2025-08-12T23:46:25.425106728Z" level=info msg="RemovePodSandbox for \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\"" Aug 12 23:46:25.425419 containerd[1506]: time="2025-08-12T23:46:25.425171368Z" level=info msg="Forcibly stopping sandbox \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\"" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.472 [WARNING][5853] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" WorkloadEndpoint="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.472 [INFO][5853] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.472 [INFO][5853] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" iface="eth0" netns="" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.472 [INFO][5853] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.473 [INFO][5853] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.498 [INFO][5860] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.498 [INFO][5860] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.498 [INFO][5860] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.509 [WARNING][5860] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.509 [INFO][5860] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" HandleID="k8s-pod-network.6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Workload="ci--4372--1--0--9--13fe44d47a-k8s-calico--apiserver--7f8d4ddcc7--zgq55-eth0" Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.512 [INFO][5860] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 12 23:46:25.516028 containerd[1506]: 2025-08-12 23:46:25.514 [INFO][5853] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c" Aug 12 23:46:25.516832 containerd[1506]: time="2025-08-12T23:46:25.516083442Z" level=info msg="TearDown network for sandbox \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" successfully" Aug 12 23:46:25.519260 containerd[1506]: time="2025-08-12T23:46:25.519118991Z" level=info msg="Ensure that sandbox 6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c in task-service has been cleanup successfully" Aug 12 23:46:25.524031 containerd[1506]: time="2025-08-12T23:46:25.523967158Z" level=info msg="RemovePodSandbox \"6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c\" returns successfully" Aug 12 23:46:27.013640 containerd[1506]: time="2025-08-12T23:46:27.013522814Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"9f97e5dadcfa9f583bb3af240abbbbec25b911bb466df63510e063b4ddbea264\" pid:5879 exited_at:{seconds:1755042387 nanos:13170731}" Aug 12 
23:46:28.024752 containerd[1506]: time="2025-08-12T23:46:28.024673214Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"23f3b1c577a9214b4699204cb4620b31d5ec1d759a536d3b6ccd6ed666bf0516\" pid:5900 exited_at:{seconds:1755042388 nanos:24347731}" Aug 12 23:46:39.140926 containerd[1506]: time="2025-08-12T23:46:39.140881352Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"fecf2545d92b0a0071c6993de732f698adfdd67c6416cbbb812aed3753f4a98f\" pid:5935 exited_at:{seconds:1755042399 nanos:140031062}" Aug 12 23:46:57.014648 containerd[1506]: time="2025-08-12T23:46:57.014491157Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"3a5af89c41f05e87b8aa7b2b1f4a54060280c0c0d0e0c9329f867022322c8218\" pid:5977 exited_at:{seconds:1755042417 nanos:13864229}" Aug 12 23:46:58.018463 containerd[1506]: time="2025-08-12T23:46:58.018297136Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"1a7b3755fc0e850035766820918e1cdc493f8bcad45bf366db8b1265a271484c\" pid:5997 exited_at:{seconds:1755042418 nanos:17469606}" Aug 12 23:47:04.951837 containerd[1506]: time="2025-08-12T23:47:04.951710470Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"bb263442e3b30fada49839a0efaf2f4e3f87ec3ec6e87ed65c0deab5dfc7d53e\" pid:6021 exited_at:{seconds:1755042424 nanos:951146383}" Aug 12 23:47:07.698048 systemd[1]: Started sshd@7-138.199.237.168:22-139.178.68.195:42316.service - OpenSSH per-connection server daemon (139.178.68.195:42316). 
Aug 12 23:47:08.781425 sshd[6035]: Accepted publickey for core from 139.178.68.195 port 42316 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:47:08.784067 sshd-session[6035]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:47:08.792873 systemd-logind[1481]: New session 8 of user core. Aug 12 23:47:08.798861 systemd[1]: Started session-8.scope - Session 8 of User core. Aug 12 23:47:09.141626 containerd[1506]: time="2025-08-12T23:47:09.141523354Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"ee1c11bb6f46f527a1c4107eb7f545500e2399220d5940c1a8d81855dde7eaf9\" pid:6050 exited_at:{seconds:1755042429 nanos:140970387}" Aug 12 23:47:09.667542 sshd[6037]: Connection closed by 139.178.68.195 port 42316 Aug 12 23:47:09.668103 sshd-session[6035]: pam_unix(sshd:session): session closed for user core Aug 12 23:47:09.676196 systemd[1]: sshd@7-138.199.237.168:22-139.178.68.195:42316.service: Deactivated successfully. Aug 12 23:47:09.685358 systemd[1]: session-8.scope: Deactivated successfully. Aug 12 23:47:09.690532 systemd-logind[1481]: Session 8 logged out. Waiting for processes to exit. Aug 12 23:47:09.695305 systemd-logind[1481]: Removed session 8. Aug 12 23:47:14.836840 systemd[1]: Started sshd@8-138.199.237.168:22-139.178.68.195:40674.service - OpenSSH per-connection server daemon (139.178.68.195:40674). Aug 12 23:47:15.850366 sshd[6079]: Accepted publickey for core from 139.178.68.195 port 40674 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:47:15.852681 sshd-session[6079]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:47:15.859228 systemd-logind[1481]: New session 9 of user core. Aug 12 23:47:15.866887 systemd[1]: Started session-9.scope - Session 9 of User core. 
Aug 12 23:47:16.614971 sshd[6081]: Connection closed by 139.178.68.195 port 40674 Aug 12 23:47:16.615491 sshd-session[6079]: pam_unix(sshd:session): session closed for user core Aug 12 23:47:16.620962 systemd[1]: sshd@8-138.199.237.168:22-139.178.68.195:40674.service: Deactivated successfully. Aug 12 23:47:16.624262 systemd[1]: session-9.scope: Deactivated successfully. Aug 12 23:47:16.625729 systemd-logind[1481]: Session 9 logged out. Waiting for processes to exit. Aug 12 23:47:16.627626 systemd-logind[1481]: Removed session 9. Aug 12 23:47:16.737040 containerd[1506]: time="2025-08-12T23:47:16.736977986Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"950afc85e605b2b237495177a6ffd8a73ed0e2e1724f3240f260bb6ecda16f58\" pid:6104 exited_at:{seconds:1755042436 nanos:736463899}" Aug 12 23:47:21.802974 systemd[1]: Started sshd@9-138.199.237.168:22-139.178.68.195:59270.service - OpenSSH per-connection server daemon (139.178.68.195:59270). Aug 12 23:47:22.821730 sshd[6117]: Accepted publickey for core from 139.178.68.195 port 59270 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:47:22.823928 sshd-session[6117]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:47:22.831747 systemd-logind[1481]: New session 10 of user core. Aug 12 23:47:22.838809 systemd[1]: Started session-10.scope - Session 10 of User core. Aug 12 23:47:23.597455 sshd[6119]: Connection closed by 139.178.68.195 port 59270 Aug 12 23:47:23.598466 sshd-session[6117]: pam_unix(sshd:session): session closed for user core Aug 12 23:47:23.605280 systemd[1]: sshd@9-138.199.237.168:22-139.178.68.195:59270.service: Deactivated successfully. Aug 12 23:47:23.610117 systemd[1]: session-10.scope: Deactivated successfully. Aug 12 23:47:23.614008 systemd-logind[1481]: Session 10 logged out. Waiting for processes to exit. 
Aug 12 23:47:23.615448 systemd-logind[1481]: Removed session 10. Aug 12 23:47:27.014018 containerd[1506]: time="2025-08-12T23:47:27.013977137Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"fa968ebc17bedf680d00b76eef6ec4bf1c8a12dff80a390e21fe1d65f3875474\" pid:6145 exited_at:{seconds:1755042447 nanos:13677413}" Aug 12 23:47:28.004530 containerd[1506]: time="2025-08-12T23:47:28.004468625Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"da1e6892ad940fbedd1dde4ccc91dd85ca5613c931771f7ff991ff8000b92ab1\" pid:6167 exited_at:{seconds:1755042448 nanos:2930324}" Aug 12 23:47:28.774691 systemd[1]: Started sshd@10-138.199.237.168:22-139.178.68.195:59274.service - OpenSSH per-connection server daemon (139.178.68.195:59274). Aug 12 23:47:29.797887 sshd[6178]: Accepted publickey for core from 139.178.68.195 port 59274 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:47:29.799923 sshd-session[6178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:47:29.807016 systemd-logind[1481]: New session 11 of user core. Aug 12 23:47:29.811761 systemd[1]: Started session-11.scope - Session 11 of User core. Aug 12 23:47:30.582663 sshd[6180]: Connection closed by 139.178.68.195 port 59274 Aug 12 23:47:30.583463 sshd-session[6178]: pam_unix(sshd:session): session closed for user core Aug 12 23:47:30.588502 systemd-logind[1481]: Session 11 logged out. Waiting for processes to exit. Aug 12 23:47:30.589337 systemd[1]: sshd@10-138.199.237.168:22-139.178.68.195:59274.service: Deactivated successfully. Aug 12 23:47:30.593486 systemd[1]: session-11.scope: Deactivated successfully. Aug 12 23:47:30.595855 systemd-logind[1481]: Removed session 11. 
Aug 12 23:47:35.762475 systemd[1]: Started sshd@11-138.199.237.168:22-139.178.68.195:60586.service - OpenSSH per-connection server daemon (139.178.68.195:60586). Aug 12 23:47:36.785192 sshd[6194]: Accepted publickey for core from 139.178.68.195 port 60586 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:47:36.787736 sshd-session[6194]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:47:36.794276 systemd-logind[1481]: New session 12 of user core. Aug 12 23:47:36.799852 systemd[1]: Started session-12.scope - Session 12 of User core. Aug 12 23:47:37.567577 sshd[6196]: Connection closed by 139.178.68.195 port 60586 Aug 12 23:47:37.566709 sshd-session[6194]: pam_unix(sshd:session): session closed for user core Aug 12 23:47:37.572410 systemd[1]: sshd@11-138.199.237.168:22-139.178.68.195:60586.service: Deactivated successfully. Aug 12 23:47:37.576143 systemd[1]: session-12.scope: Deactivated successfully. Aug 12 23:47:37.578810 systemd-logind[1481]: Session 12 logged out. Waiting for processes to exit. Aug 12 23:47:37.583172 systemd-logind[1481]: Removed session 12. Aug 12 23:47:39.125578 containerd[1506]: time="2025-08-12T23:47:39.125491114Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"a4bbb398a353a4e465d873ababff297f5954465dc5799b2301b8331425941268\" pid:6220 exited_at:{seconds:1755042459 nanos:125048956}" Aug 12 23:47:42.766792 systemd[1]: Started sshd@12-138.199.237.168:22-139.178.68.195:38006.service - OpenSSH per-connection server daemon (139.178.68.195:38006). Aug 12 23:47:43.853372 sshd[6233]: Accepted publickey for core from 139.178.68.195 port 38006 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:47:43.856622 sshd-session[6233]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:47:43.865524 systemd-logind[1481]: New session 13 of user core. 
Aug 12 23:47:43.873019 systemd[1]: Started session-13.scope - Session 13 of User core. Aug 12 23:47:44.672594 sshd[6235]: Connection closed by 139.178.68.195 port 38006 Aug 12 23:47:44.672879 sshd-session[6233]: pam_unix(sshd:session): session closed for user core Aug 12 23:47:44.677167 systemd[1]: sshd@12-138.199.237.168:22-139.178.68.195:38006.service: Deactivated successfully. Aug 12 23:47:44.681926 systemd[1]: session-13.scope: Deactivated successfully. Aug 12 23:47:44.685139 systemd-logind[1481]: Session 13 logged out. Waiting for processes to exit. Aug 12 23:47:44.686772 systemd-logind[1481]: Removed session 13. Aug 12 23:47:49.836893 systemd[1]: Started sshd@13-138.199.237.168:22-139.178.68.195:38018.service - OpenSSH per-connection server daemon (139.178.68.195:38018). Aug 12 23:47:50.852841 sshd[6254]: Accepted publickey for core from 139.178.68.195 port 38018 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:47:50.855756 sshd-session[6254]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:47:50.861152 systemd-logind[1481]: New session 14 of user core. Aug 12 23:47:50.869849 systemd[1]: Started session-14.scope - Session 14 of User core. Aug 12 23:47:51.629602 sshd[6256]: Connection closed by 139.178.68.195 port 38018 Aug 12 23:47:51.631598 sshd-session[6254]: pam_unix(sshd:session): session closed for user core Aug 12 23:47:51.638632 systemd[1]: sshd@13-138.199.237.168:22-139.178.68.195:38018.service: Deactivated successfully. Aug 12 23:47:51.641832 systemd[1]: session-14.scope: Deactivated successfully. Aug 12 23:47:51.643536 systemd-logind[1481]: Session 14 logged out. Waiting for processes to exit. Aug 12 23:47:51.646658 systemd-logind[1481]: Removed session 14. Aug 12 23:47:56.807855 systemd[1]: Started sshd@14-138.199.237.168:22-139.178.68.195:57512.service - OpenSSH per-connection server daemon (139.178.68.195:57512). 
Aug 12 23:47:57.019520 containerd[1506]: time="2025-08-12T23:47:57.019448149Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"3257db541ba03e43b62f30dcdd65f8156b5e6f93fcd857e2928d956417d79a21\" pid:6284 exited_at:{seconds:1755042477 nanos:18384869}" Aug 12 23:47:57.826639 sshd[6269]: Accepted publickey for core from 139.178.68.195 port 57512 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:47:57.830158 sshd-session[6269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:47:57.839250 systemd-logind[1481]: New session 15 of user core. Aug 12 23:47:57.846898 systemd[1]: Started session-15.scope - Session 15 of User core. Aug 12 23:47:58.023472 containerd[1506]: time="2025-08-12T23:47:58.023289471Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"e0ab5ee9270167b4e750bf8c9a9c5afd8455313067a776d3b20437c070e44c01\" pid:6307 exited_at:{seconds:1755042478 nanos:21758430}" Aug 12 23:47:58.614443 sshd[6294]: Connection closed by 139.178.68.195 port 57512 Aug 12 23:47:58.616845 sshd-session[6269]: pam_unix(sshd:session): session closed for user core Aug 12 23:47:58.630100 systemd[1]: sshd@14-138.199.237.168:22-139.178.68.195:57512.service: Deactivated successfully. Aug 12 23:47:58.637968 systemd[1]: session-15.scope: Deactivated successfully. Aug 12 23:47:58.644878 systemd-logind[1481]: Session 15 logged out. Waiting for processes to exit. Aug 12 23:47:58.650605 systemd-logind[1481]: Removed session 15. Aug 12 23:48:03.797840 systemd[1]: Started sshd@15-138.199.237.168:22-139.178.68.195:45666.service - OpenSSH per-connection server daemon (139.178.68.195:45666). 
Aug 12 23:48:04.818456 sshd[6332]: Accepted publickey for core from 139.178.68.195 port 45666 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:48:04.821483 sshd-session[6332]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:48:04.828625 systemd-logind[1481]: New session 16 of user core. Aug 12 23:48:04.832740 systemd[1]: Started session-16.scope - Session 16 of User core. Aug 12 23:48:04.949097 containerd[1506]: time="2025-08-12T23:48:04.949002518Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"c57bd5b47176989ce5ed92464514f498421b4c1e94f2a777abe6f460b61b212b\" pid:6347 exited_at:{seconds:1755042484 nanos:948073317}" Aug 12 23:48:05.589462 sshd[6334]: Connection closed by 139.178.68.195 port 45666 Aug 12 23:48:05.590891 sshd-session[6332]: pam_unix(sshd:session): session closed for user core Aug 12 23:48:05.597100 systemd[1]: sshd@15-138.199.237.168:22-139.178.68.195:45666.service: Deactivated successfully. Aug 12 23:48:05.601123 systemd[1]: session-16.scope: Deactivated successfully. Aug 12 23:48:05.602528 systemd-logind[1481]: Session 16 logged out. Waiting for processes to exit. Aug 12 23:48:05.605158 systemd-logind[1481]: Removed session 16. Aug 12 23:48:09.127948 containerd[1506]: time="2025-08-12T23:48:09.127884166Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"7dd7e0e10a926d1e0bd57af8a8634136fd1b6f31a025dafc9e6f70e371fda211\" pid:6379 exited_at:{seconds:1755042489 nanos:127269164}" Aug 12 23:48:10.767875 systemd[1]: Started sshd@16-138.199.237.168:22-139.178.68.195:42564.service - OpenSSH per-connection server daemon (139.178.68.195:42564). 
Aug 12 23:48:11.789445 sshd[6391]: Accepted publickey for core from 139.178.68.195 port 42564 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:48:11.792401 sshd-session[6391]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:48:11.799868 systemd-logind[1481]: New session 17 of user core. Aug 12 23:48:11.804922 systemd[1]: Started session-17.scope - Session 17 of User core. Aug 12 23:48:12.559342 sshd[6400]: Connection closed by 139.178.68.195 port 42564 Aug 12 23:48:12.562827 sshd-session[6391]: pam_unix(sshd:session): session closed for user core Aug 12 23:48:12.567074 systemd-logind[1481]: Session 17 logged out. Waiting for processes to exit. Aug 12 23:48:12.567861 systemd[1]: sshd@16-138.199.237.168:22-139.178.68.195:42564.service: Deactivated successfully. Aug 12 23:48:12.574275 systemd[1]: session-17.scope: Deactivated successfully. Aug 12 23:48:12.581768 systemd-logind[1481]: Removed session 17. Aug 12 23:48:16.740239 containerd[1506]: time="2025-08-12T23:48:16.740162305Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"35b92929c0cebf66df2179d2eb57e1e6843457e926c3e854a83277a59745f50a\" pid:6438 exited_at:{seconds:1755042496 nanos:739185022}" Aug 12 23:48:17.736061 systemd[1]: Started sshd@17-138.199.237.168:22-139.178.68.195:42574.service - OpenSSH per-connection server daemon (139.178.68.195:42574). Aug 12 23:48:18.755329 sshd[6448]: Accepted publickey for core from 139.178.68.195 port 42574 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:48:18.758168 sshd-session[6448]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:48:18.765448 systemd-logind[1481]: New session 18 of user core. Aug 12 23:48:18.775946 systemd[1]: Started session-18.scope - Session 18 of User core. 
Aug 12 23:48:19.544382 sshd[6450]: Connection closed by 139.178.68.195 port 42574 Aug 12 23:48:19.545411 sshd-session[6448]: pam_unix(sshd:session): session closed for user core Aug 12 23:48:19.552058 systemd[1]: sshd@17-138.199.237.168:22-139.178.68.195:42574.service: Deactivated successfully. Aug 12 23:48:19.557492 systemd[1]: session-18.scope: Deactivated successfully. Aug 12 23:48:19.560643 systemd-logind[1481]: Session 18 logged out. Waiting for processes to exit. Aug 12 23:48:19.562866 systemd-logind[1481]: Removed session 18. Aug 12 23:48:24.739788 systemd[1]: Started sshd@18-138.199.237.168:22-139.178.68.195:40948.service - OpenSSH per-connection server daemon (139.178.68.195:40948). Aug 12 23:48:25.819061 sshd[6464]: Accepted publickey for core from 139.178.68.195 port 40948 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:48:25.821947 sshd-session[6464]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:48:25.831210 systemd-logind[1481]: New session 19 of user core. Aug 12 23:48:25.839911 systemd[1]: Started session-19.scope - Session 19 of User core. Aug 12 23:48:26.630038 sshd[6471]: Connection closed by 139.178.68.195 port 40948 Aug 12 23:48:26.631074 sshd-session[6464]: pam_unix(sshd:session): session closed for user core Aug 12 23:48:26.637870 systemd[1]: sshd@18-138.199.237.168:22-139.178.68.195:40948.service: Deactivated successfully. Aug 12 23:48:26.642382 systemd[1]: session-19.scope: Deactivated successfully. Aug 12 23:48:26.644507 systemd-logind[1481]: Session 19 logged out. Waiting for processes to exit. Aug 12 23:48:26.647083 systemd-logind[1481]: Removed session 19. 
Aug 12 23:48:27.014050 containerd[1506]: time="2025-08-12T23:48:27.013858974Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"3bba4cf21c89979aa7fd61e07863f958ea065886e8d3656ea65321e2f0916137\" pid:6495 exited_at:{seconds:1755042507 nanos:12296447}" Aug 12 23:48:27.996003 containerd[1506]: time="2025-08-12T23:48:27.995932256Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"c0cdd32f3ab9b4f5e028862468bc8c04a08c2b09c95b3a2e6c38e43aa1a0b2c1\" pid:6518 exited_at:{seconds:1755042507 nanos:995112772}" Aug 12 23:48:31.798702 systemd[1]: Started sshd@19-138.199.237.168:22-139.178.68.195:38536.service - OpenSSH per-connection server daemon (139.178.68.195:38536). Aug 12 23:48:32.812651 sshd[6528]: Accepted publickey for core from 139.178.68.195 port 38536 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:48:32.814748 sshd-session[6528]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:48:32.820972 systemd-logind[1481]: New session 20 of user core. Aug 12 23:48:32.828004 systemd[1]: Started session-20.scope - Session 20 of User core. Aug 12 23:48:33.583497 sshd[6532]: Connection closed by 139.178.68.195 port 38536 Aug 12 23:48:33.584476 sshd-session[6528]: pam_unix(sshd:session): session closed for user core Aug 12 23:48:33.590395 systemd[1]: sshd@19-138.199.237.168:22-139.178.68.195:38536.service: Deactivated successfully. Aug 12 23:48:33.596393 systemd[1]: session-20.scope: Deactivated successfully. Aug 12 23:48:33.598740 systemd-logind[1481]: Session 20 logged out. Waiting for processes to exit. Aug 12 23:48:33.601274 systemd-logind[1481]: Removed session 20. Aug 12 23:48:38.761906 systemd[1]: Started sshd@20-138.199.237.168:22-139.178.68.195:38550.service - OpenSSH per-connection server daemon (139.178.68.195:38550). 
Aug 12 23:48:39.159813 containerd[1506]: time="2025-08-12T23:48:39.159326562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"12d9e17c8060523a22fc5386dd8e16f8b704db56a3b6921193e2257b9a49e4a9\" pid:6559 exited_at:{seconds:1755042519 nanos:157434271}" Aug 12 23:48:39.775798 sshd[6545]: Accepted publickey for core from 139.178.68.195 port 38550 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:48:39.777817 sshd-session[6545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:48:39.784841 systemd-logind[1481]: New session 21 of user core. Aug 12 23:48:39.790819 systemd[1]: Started session-21.scope - Session 21 of User core. Aug 12 23:48:40.559930 sshd[6570]: Connection closed by 139.178.68.195 port 38550 Aug 12 23:48:40.560972 sshd-session[6545]: pam_unix(sshd:session): session closed for user core Aug 12 23:48:40.569104 systemd[1]: sshd@20-138.199.237.168:22-139.178.68.195:38550.service: Deactivated successfully. Aug 12 23:48:40.574161 systemd[1]: session-21.scope: Deactivated successfully. Aug 12 23:48:40.575996 systemd-logind[1481]: Session 21 logged out. Waiting for processes to exit. Aug 12 23:48:40.579004 systemd-logind[1481]: Removed session 21. Aug 12 23:48:45.735649 systemd[1]: Started sshd@21-138.199.237.168:22-139.178.68.195:52110.service - OpenSSH per-connection server daemon (139.178.68.195:52110). Aug 12 23:48:46.748394 sshd[6585]: Accepted publickey for core from 139.178.68.195 port 52110 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:48:46.750833 sshd-session[6585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:48:46.756779 systemd-logind[1481]: New session 22 of user core. Aug 12 23:48:46.761751 systemd[1]: Started session-22.scope - Session 22 of User core. 
Aug 12 23:48:47.545302 sshd[6587]: Connection closed by 139.178.68.195 port 52110 Aug 12 23:48:47.544218 sshd-session[6585]: pam_unix(sshd:session): session closed for user core Aug 12 23:48:47.552630 systemd[1]: sshd@21-138.199.237.168:22-139.178.68.195:52110.service: Deactivated successfully. Aug 12 23:48:47.555626 systemd[1]: session-22.scope: Deactivated successfully. Aug 12 23:48:47.558071 systemd-logind[1481]: Session 22 logged out. Waiting for processes to exit. Aug 12 23:48:47.560281 systemd-logind[1481]: Removed session 22. Aug 12 23:48:52.739525 systemd[1]: Started sshd@22-138.199.237.168:22-139.178.68.195:50192.service - OpenSSH per-connection server daemon (139.178.68.195:50192). Aug 12 23:48:53.812592 sshd[6600]: Accepted publickey for core from 139.178.68.195 port 50192 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:48:53.814990 sshd-session[6600]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:48:53.824371 systemd-logind[1481]: New session 23 of user core. Aug 12 23:48:53.827866 systemd[1]: Started session-23.scope - Session 23 of User core. Aug 12 23:48:54.629526 sshd[6602]: Connection closed by 139.178.68.195 port 50192 Aug 12 23:48:54.628529 sshd-session[6600]: pam_unix(sshd:session): session closed for user core Aug 12 23:48:54.634941 systemd[1]: sshd@22-138.199.237.168:22-139.178.68.195:50192.service: Deactivated successfully. Aug 12 23:48:54.638154 systemd[1]: session-23.scope: Deactivated successfully. Aug 12 23:48:54.640825 systemd-logind[1481]: Session 23 logged out. Waiting for processes to exit. Aug 12 23:48:54.643142 systemd-logind[1481]: Removed session 23. 
Aug 12 23:48:57.008283 containerd[1506]: time="2025-08-12T23:48:57.008236742Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"35e76af53d883b32a52b0d969ad7bed2628c71dbe5e83fb7232e505487addf7a\" pid:6627 exited_at:{seconds:1755042537 nanos:6879092}" Aug 12 23:48:58.009209 containerd[1506]: time="2025-08-12T23:48:58.009160576Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"3ec469433dd5d5073f9b4ab7a4685ddc63d4328f52fa005f7dfb158812b9a85d\" pid:6648 exited_at:{seconds:1755042538 nanos:8789893}" Aug 12 23:48:59.798032 systemd[1]: Started sshd@23-138.199.237.168:22-139.178.68.195:50206.service - OpenSSH per-connection server daemon (139.178.68.195:50206). Aug 12 23:49:00.813059 sshd[6659]: Accepted publickey for core from 139.178.68.195 port 50206 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:49:00.815308 sshd-session[6659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:49:00.823655 systemd-logind[1481]: New session 24 of user core. Aug 12 23:49:00.826832 systemd[1]: Started session-24.scope - Session 24 of User core. 
Aug 12 23:49:01.091859 update_engine[1485]: I20250812 23:49:01.091712 1485 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Aug 12 23:49:01.091859 update_engine[1485]: I20250812 23:49:01.091779 1485 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Aug 12 23:49:01.092224 update_engine[1485]: I20250812 23:49:01.092080 1485 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Aug 12 23:49:01.093120 update_engine[1485]: I20250812 23:49:01.092602 1485 omaha_request_params.cc:62] Current group set to beta Aug 12 23:49:01.094592 update_engine[1485]: I20250812 23:49:01.094213 1485 update_attempter.cc:499] Already updated boot flags. Skipping. Aug 12 23:49:01.094592 update_engine[1485]: I20250812 23:49:01.094249 1485 update_attempter.cc:643] Scheduling an action processor start. Aug 12 23:49:01.094592 update_engine[1485]: I20250812 23:49:01.094272 1485 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Aug 12 23:49:01.096353 update_engine[1485]: I20250812 23:49:01.096130 1485 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Aug 12 23:49:01.096353 update_engine[1485]: I20250812 23:49:01.096293 1485 omaha_request_action.cc:271] Posting an Omaha request to disabled Aug 12 23:49:01.096353 update_engine[1485]: I20250812 23:49:01.096303 1485 omaha_request_action.cc:272] Request: Aug 12 23:49:01.096353 update_engine[1485]: Aug 12 23:49:01.096353 update_engine[1485]: Aug 12 23:49:01.096353 update_engine[1485]: Aug 12 23:49:01.096353 update_engine[1485]: Aug 12 23:49:01.096353 update_engine[1485]: Aug 12 23:49:01.096353 update_engine[1485]: Aug 12 23:49:01.096353 update_engine[1485]: Aug 12 23:49:01.096353 update_engine[1485]: Aug 12 23:49:01.096353 update_engine[1485]: I20250812 23:49:01.096309 1485 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 12 23:49:01.102880 update_engine[1485]: I20250812 23:49:01.102157 1485 
libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 12 23:49:01.103511 update_engine[1485]: I20250812 23:49:01.103430 1485 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Aug 12 23:49:01.105572 update_engine[1485]: E20250812 23:49:01.105386 1485 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 12 23:49:01.105820 update_engine[1485]: I20250812 23:49:01.105772 1485 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Aug 12 23:49:01.107793 locksmithd[1520]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Aug 12 23:49:01.584768 sshd[6661]: Connection closed by 139.178.68.195 port 50206 Aug 12 23:49:01.586145 sshd-session[6659]: pam_unix(sshd:session): session closed for user core Aug 12 23:49:01.591040 systemd[1]: sshd@23-138.199.237.168:22-139.178.68.195:50206.service: Deactivated successfully. Aug 12 23:49:01.593283 systemd[1]: session-24.scope: Deactivated successfully. Aug 12 23:49:01.595442 systemd-logind[1481]: Session 24 logged out. Waiting for processes to exit. Aug 12 23:49:01.597106 systemd-logind[1481]: Removed session 24. Aug 12 23:49:04.954742 containerd[1506]: time="2025-08-12T23:49:04.954695198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"286fab60db9568acbaef949f99fd31dd32eea26f582ee2da09a705074e766512\" pid:6686 exited_at:{seconds:1755042544 nanos:954311195}" Aug 12 23:49:06.782737 systemd[1]: Started sshd@24-138.199.237.168:22-139.178.68.195:47526.service - OpenSSH per-connection server daemon (139.178.68.195:47526). 
Aug 12 23:49:07.854935 sshd[6698]: Accepted publickey for core from 139.178.68.195 port 47526 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:49:07.858351 sshd-session[6698]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:49:07.865618 systemd-logind[1481]: New session 25 of user core. Aug 12 23:49:07.869736 systemd[1]: Started session-25.scope - Session 25 of User core. Aug 12 23:49:08.669806 sshd[6700]: Connection closed by 139.178.68.195 port 47526 Aug 12 23:49:08.670903 sshd-session[6698]: pam_unix(sshd:session): session closed for user core Aug 12 23:49:08.677513 systemd[1]: sshd@24-138.199.237.168:22-139.178.68.195:47526.service: Deactivated successfully. Aug 12 23:49:08.681998 systemd[1]: session-25.scope: Deactivated successfully. Aug 12 23:49:08.683818 systemd-logind[1481]: Session 25 logged out. Waiting for processes to exit. Aug 12 23:49:08.687154 systemd-logind[1481]: Removed session 25. Aug 12 23:49:09.141974 containerd[1506]: time="2025-08-12T23:49:09.141812905Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"750960c13ef3d824a9a2eb25dc0e6d11afdc426e5e12f9a8e92c110889584ac3\" pid:6723 exited_at:{seconds:1755042549 nanos:141436342}" Aug 12 23:49:11.091843 update_engine[1485]: I20250812 23:49:11.091758 1485 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 12 23:49:11.092227 update_engine[1485]: I20250812 23:49:11.091998 1485 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 12 23:49:11.092289 update_engine[1485]: I20250812 23:49:11.092249 1485 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Aug 12 23:49:11.092792 update_engine[1485]: E20250812 23:49:11.092744  1485 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Aug 12 23:49:11.092861 update_engine[1485]: I20250812 23:49:11.092813  1485 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
Aug 12 23:49:13.840853 systemd[1]: Started sshd@25-138.199.237.168:22-139.178.68.195:58026.service - OpenSSH per-connection server daemon (139.178.68.195:58026).
Aug 12 23:49:14.850207 sshd[6735]: Accepted publickey for core from 139.178.68.195 port 58026 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:49:14.853648 sshd-session[6735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:49:14.860463 systemd-logind[1481]: New session 26 of user core.
Aug 12 23:49:14.868971 systemd[1]: Started session-26.scope - Session 26 of User core.
Aug 12 23:49:15.625698 sshd[6737]: Connection closed by 139.178.68.195 port 58026
Aug 12 23:49:15.628775 sshd-session[6735]: pam_unix(sshd:session): session closed for user core
Aug 12 23:49:15.634179 systemd[1]: sshd@25-138.199.237.168:22-139.178.68.195:58026.service: Deactivated successfully.
Aug 12 23:49:15.639076 systemd[1]: session-26.scope: Deactivated successfully.
Aug 12 23:49:15.643639 systemd-logind[1481]: Session 26 logged out. Waiting for processes to exit.
Aug 12 23:49:15.647627 systemd-logind[1481]: Removed session 26.
Aug 12 23:49:16.766293 containerd[1506]: time="2025-08-12T23:49:16.766234811Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"f7772a5469a243498a8227ec9dbfaa576c0e1ef6303093a410c80bff4d74e5b0\" pid:6761 exited_at:{seconds:1755042556 nanos:765469725}"
Aug 12 23:49:19.088503 containerd[1506]: time="2025-08-12T23:49:19.088217518Z" level=warning msg="container event discarded" container=63cc7c1f668225095cca793da3294eacfac5cb36edf22dfd0f68a8d971374f0e type=CONTAINER_CREATED_EVENT
Aug 12 23:49:19.088503 containerd[1506]: time="2025-08-12T23:49:19.088406039Z" level=warning msg="container event discarded" container=63cc7c1f668225095cca793da3294eacfac5cb36edf22dfd0f68a8d971374f0e type=CONTAINER_STARTED_EVENT
Aug 12 23:49:19.122056 containerd[1506]: time="2025-08-12T23:49:19.121920734Z" level=warning msg="container event discarded" container=f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:19.122056 containerd[1506]: time="2025-08-12T23:49:19.122036975Z" level=warning msg="container event discarded" container=f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:19.135686 containerd[1506]: time="2025-08-12T23:49:19.135542174Z" level=warning msg="container event discarded" container=c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc type=CONTAINER_CREATED_EVENT
Aug 12 23:49:19.135686 containerd[1506]: time="2025-08-12T23:49:19.135631855Z" level=warning msg="container event discarded" container=c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc type=CONTAINER_STARTED_EVENT
Aug 12 23:49:19.146101 containerd[1506]: time="2025-08-12T23:49:19.145967106Z" level=warning msg="container event discarded" container=eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:19.158604 containerd[1506]: time="2025-08-12T23:49:19.158497536Z" level=warning msg="container event discarded" container=23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd type=CONTAINER_CREATED_EVENT
Aug 12 23:49:19.172858 containerd[1506]: time="2025-08-12T23:49:19.172760661Z" level=warning msg="container event discarded" container=0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:19.268354 containerd[1506]: time="2025-08-12T23:49:19.268253221Z" level=warning msg="container event discarded" container=eaf9a9f02e64b1376e9a1b7ff31a49b69fb20afc0b47b8a85282a680a82f61f9 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:19.296731 containerd[1506]: time="2025-08-12T23:49:19.296611191Z" level=warning msg="container event discarded" container=23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd type=CONTAINER_STARTED_EVENT
Aug 12 23:49:19.338032 containerd[1506]: time="2025-08-12T23:49:19.337922994Z" level=warning msg="container event discarded" container=0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:20.800891 systemd[1]: Started sshd@26-138.199.237.168:22-139.178.68.195:52018.service - OpenSSH per-connection server daemon (139.178.68.195:52018).
Aug 12 23:49:21.096222 update_engine[1485]: I20250812 23:49:21.096154  1485 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Aug 12 23:49:21.097412 update_engine[1485]: I20250812 23:49:21.096951  1485 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Aug 12 23:49:21.097412 update_engine[1485]: I20250812 23:49:21.097301  1485 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Aug 12 23:49:21.097817 update_engine[1485]: E20250812 23:49:21.097783  1485 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Aug 12 23:49:21.097926 update_engine[1485]: I20250812 23:49:21.097906  1485 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Aug 12 23:49:21.815164 sshd[6771]: Accepted publickey for core from 139.178.68.195 port 52018 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:49:21.817470 sshd-session[6771]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:49:21.823392 systemd-logind[1481]: New session 27 of user core.
Aug 12 23:49:21.829869 systemd[1]: Started session-27.scope - Session 27 of User core.
Aug 12 23:49:22.599727 sshd[6773]: Connection closed by 139.178.68.195 port 52018
Aug 12 23:49:22.600733 sshd-session[6771]: pam_unix(sshd:session): session closed for user core
Aug 12 23:49:22.608095 systemd[1]: sshd@26-138.199.237.168:22-139.178.68.195:52018.service: Deactivated successfully.
Aug 12 23:49:22.610896 systemd[1]: session-27.scope: Deactivated successfully.
Aug 12 23:49:22.612118 systemd-logind[1481]: Session 27 logged out. Waiting for processes to exit.
Aug 12 23:49:22.613964 systemd-logind[1481]: Removed session 27.
Aug 12 23:49:27.014404 containerd[1506]: time="2025-08-12T23:49:27.014331716Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"935e6a4fef6eca151c43a9ea07c5a4e3080357decc480aa044891b296ca664fd\" pid:6801 exited_at:{seconds:1755042567 nanos:14012753}"
Aug 12 23:49:27.778488 systemd[1]: Started sshd@27-138.199.237.168:22-139.178.68.195:52024.service - OpenSSH per-connection server daemon (139.178.68.195:52024).
Aug 12 23:49:28.013867 containerd[1506]: time="2025-08-12T23:49:28.013809658Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"344cc73c0acd9e10ec336f4eb0697e10f742a506d3b1596ce50838fc32293a18\" pid:6825 exited_at:{seconds:1755042568 nanos:13203213}"
Aug 12 23:49:28.797148 sshd[6811]: Accepted publickey for core from 139.178.68.195 port 52024 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:49:28.799302 sshd-session[6811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:49:28.805267 systemd-logind[1481]: New session 28 of user core.
Aug 12 23:49:28.811866 systemd[1]: Started session-28.scope - Session 28 of User core.
Aug 12 23:49:29.583632 sshd[6836]: Connection closed by 139.178.68.195 port 52024
Aug 12 23:49:29.584705 sshd-session[6811]: pam_unix(sshd:session): session closed for user core
Aug 12 23:49:29.592454 systemd-logind[1481]: Session 28 logged out. Waiting for processes to exit.
Aug 12 23:49:29.593340 systemd[1]: sshd@27-138.199.237.168:22-139.178.68.195:52024.service: Deactivated successfully.
Aug 12 23:49:29.597362 systemd[1]: session-28.scope: Deactivated successfully.
Aug 12 23:49:29.600981 systemd-logind[1481]: Removed session 28.
Aug 12 23:49:31.093202 update_engine[1485]: I20250812 23:49:31.092396  1485 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Aug 12 23:49:31.093202 update_engine[1485]: I20250812 23:49:31.092780  1485 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Aug 12 23:49:31.093202 update_engine[1485]: I20250812 23:49:31.093137  1485 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Aug 12 23:49:31.094095 update_engine[1485]: E20250812 23:49:31.094013  1485 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Aug 12 23:49:31.094182 update_engine[1485]: I20250812 23:49:31.094098  1485 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Aug 12 23:49:31.094182 update_engine[1485]: I20250812 23:49:31.094112  1485 omaha_request_action.cc:617] Omaha request response:
Aug 12 23:49:31.094601 update_engine[1485]: E20250812 23:49:31.094535  1485 omaha_request_action.cc:636] Omaha request network transfer failed.
Aug 12 23:49:31.094601 update_engine[1485]: I20250812 23:49:31.094594  1485 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Aug 12 23:49:31.094757 update_engine[1485]: I20250812 23:49:31.094604  1485 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Aug 12 23:49:31.094757 update_engine[1485]: I20250812 23:49:31.094611  1485 update_attempter.cc:306] Processing Done.
Aug 12 23:49:31.094757 update_engine[1485]: E20250812 23:49:31.094630  1485 update_attempter.cc:619] Update failed.
Aug 12 23:49:31.094757 update_engine[1485]: I20250812 23:49:31.094638  1485 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Aug 12 23:49:31.094757 update_engine[1485]: I20250812 23:49:31.094645  1485 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Aug 12 23:49:31.094757 update_engine[1485]: I20250812 23:49:31.094652  1485 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Aug 12 23:49:31.094757 update_engine[1485]: I20250812 23:49:31.094741  1485 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Aug 12 23:49:31.095042 update_engine[1485]: I20250812 23:49:31.094770  1485 omaha_request_action.cc:271] Posting an Omaha request to disabled
Aug 12 23:49:31.095042 update_engine[1485]: I20250812 23:49:31.094779  1485 omaha_request_action.cc:272] Request:
Aug 12 23:49:31.095042 update_engine[1485]:
Aug 12 23:49:31.095042 update_engine[1485]:
Aug 12 23:49:31.095042 update_engine[1485]:
Aug 12 23:49:31.095042 update_engine[1485]:
Aug 12 23:49:31.095042 update_engine[1485]:
Aug 12 23:49:31.095042 update_engine[1485]:
Aug 12 23:49:31.095042 update_engine[1485]: I20250812 23:49:31.094786  1485 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Aug 12 23:49:31.095398 update_engine[1485]: I20250812 23:49:31.095089  1485 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Aug 12 23:49:31.095994 update_engine[1485]: I20250812 23:49:31.095491  1485 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Aug 12 23:49:31.095994 update_engine[1485]: E20250812 23:49:31.095871  1485 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Aug 12 23:49:31.096134 locksmithd[1520]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Aug 12 23:49:31.096843 update_engine[1485]: I20250812 23:49:31.096050  1485 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Aug 12 23:49:31.096843 update_engine[1485]: I20250812 23:49:31.096066  1485 omaha_request_action.cc:617] Omaha request response:
Aug 12 23:49:31.096843 update_engine[1485]: I20250812 23:49:31.096076  1485 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Aug 12 23:49:31.096843 update_engine[1485]: I20250812 23:49:31.096082  1485 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Aug 12 23:49:31.096843 update_engine[1485]: I20250812 23:49:31.096088  1485 update_attempter.cc:306] Processing Done.
Aug 12 23:49:31.096843 update_engine[1485]: I20250812 23:49:31.096096  1485 update_attempter.cc:310] Error event sent.
Aug 12 23:49:31.096843 update_engine[1485]: I20250812 23:49:31.096108  1485 update_check_scheduler.cc:74] Next update check in 45m55s
Aug 12 23:49:31.097127 locksmithd[1520]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Aug 12 23:49:32.272877 containerd[1506]: time="2025-08-12T23:49:32.272733970Z" level=warning msg="container event discarded" container=c7fc6da222bf254f8461cb4ffcbe3651533563caf86cd74c24ae70b7d356e6b7 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:32.272877 containerd[1506]: time="2025-08-12T23:49:32.272828770Z" level=warning msg="container event discarded" container=c7fc6da222bf254f8461cb4ffcbe3651533563caf86cd74c24ae70b7d356e6b7 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:32.307216 containerd[1506]: time="2025-08-12T23:49:32.307120213Z" level=warning msg="container event discarded" container=90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:32.384889 containerd[1506]: time="2025-08-12T23:49:32.384815904Z" level=warning msg="container event discarded" container=90a5942d4d105100e8a1df193d5a37df7aef2766f3961a448da6c47d6685bc50 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:32.578029 containerd[1506]: time="2025-08-12T23:49:32.577944641Z" level=warning msg="container event discarded" container=539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc type=CONTAINER_CREATED_EVENT
Aug 12 23:49:32.578029 containerd[1506]: time="2025-08-12T23:49:32.578012761Z" level=warning msg="container event discarded" container=539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc type=CONTAINER_STARTED_EVENT
Aug 12 23:49:34.769820 systemd[1]: Started sshd@28-138.199.237.168:22-139.178.68.195:60814.service - OpenSSH per-connection server daemon (139.178.68.195:60814).
Aug 12 23:49:34.899661 containerd[1506]: time="2025-08-12T23:49:34.899503957Z" level=warning msg="container event discarded" container=057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:34.968025 containerd[1506]: time="2025-08-12T23:49:34.967926886Z" level=warning msg="container event discarded" container=057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:35.781045 sshd[6850]: Accepted publickey for core from 139.178.68.195 port 60814 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:49:35.783725 sshd-session[6850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:49:35.789618 systemd-logind[1481]: New session 29 of user core.
Aug 12 23:49:35.796861 systemd[1]: Started session-29.scope - Session 29 of User core.
Aug 12 23:49:36.555937 sshd[6852]: Connection closed by 139.178.68.195 port 60814
Aug 12 23:49:36.557019 sshd-session[6850]: pam_unix(sshd:session): session closed for user core
Aug 12 23:49:36.563300 systemd-logind[1481]: Session 29 logged out. Waiting for processes to exit.
Aug 12 23:49:36.563862 systemd[1]: sshd@28-138.199.237.168:22-139.178.68.195:60814.service: Deactivated successfully.
Aug 12 23:49:36.566508 systemd[1]: session-29.scope: Deactivated successfully.
Aug 12 23:49:36.569628 systemd-logind[1481]: Removed session 29.
Aug 12 23:49:39.137861 containerd[1506]: time="2025-08-12T23:49:39.137808729Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"fd23dbfc99402935b678fb877fcc4ea58b7603dab5b532c5940ea04ad8791eda\" pid:6876 exited_at:{seconds:1755042579 nanos:137184123}"
Aug 12 23:49:41.748842 systemd[1]: Started sshd@29-138.199.237.168:22-139.178.68.195:48698.service - OpenSSH per-connection server daemon (139.178.68.195:48698).
Aug 12 23:49:42.819212 sshd[6888]: Accepted publickey for core from 139.178.68.195 port 48698 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:49:42.822022 sshd-session[6888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:49:42.828016 systemd-logind[1481]: New session 30 of user core.
Aug 12 23:49:42.837861 systemd[1]: Started session-30.scope - Session 30 of User core.
Aug 12 23:49:43.631775 sshd[6890]: Connection closed by 139.178.68.195 port 48698
Aug 12 23:49:43.632791 sshd-session[6888]: pam_unix(sshd:session): session closed for user core
Aug 12 23:49:43.638759 systemd[1]: sshd@29-138.199.237.168:22-139.178.68.195:48698.service: Deactivated successfully.
Aug 12 23:49:43.644401 systemd[1]: session-30.scope: Deactivated successfully.
Aug 12 23:49:43.646061 systemd-logind[1481]: Session 30 logged out. Waiting for processes to exit.
Aug 12 23:49:43.648196 systemd-logind[1481]: Removed session 30.
Aug 12 23:49:48.819108 systemd[1]: Started sshd@30-138.199.237.168:22-139.178.68.195:48714.service - OpenSSH per-connection server daemon (139.178.68.195:48714).
Aug 12 23:49:49.386862 containerd[1506]: time="2025-08-12T23:49:49.386757024Z" level=warning msg="container event discarded" container=18e92bc5265e6cc58262f023836a3764eb67b50d7c035544275cfcaff56cc1ae type=CONTAINER_CREATED_EVENT
Aug 12 23:49:49.386862 containerd[1506]: time="2025-08-12T23:49:49.386837185Z" level=warning msg="container event discarded" container=18e92bc5265e6cc58262f023836a3764eb67b50d7c035544275cfcaff56cc1ae type=CONTAINER_STARTED_EVENT
Aug 12 23:49:49.485385 containerd[1506]: time="2025-08-12T23:49:49.485317534Z" level=warning msg="container event discarded" container=3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:49.485385 containerd[1506]: time="2025-08-12T23:49:49.485361614Z" level=warning msg="container event discarded" container=3f7549914de50118b9a2f08652cb62a765c74822bd924071684a5777e52a4572 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:49.886399 sshd[6903]: Accepted publickey for core from 139.178.68.195 port 48714 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:49:49.890759 sshd-session[6903]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:49:49.898486 systemd-logind[1481]: New session 31 of user core.
Aug 12 23:49:49.903920 systemd[1]: Started session-31.scope - Session 31 of User core.
Aug 12 23:49:50.696833 sshd[6912]: Connection closed by 139.178.68.195 port 48714
Aug 12 23:49:50.697705 sshd-session[6903]: pam_unix(sshd:session): session closed for user core
Aug 12 23:49:50.703137 systemd[1]: sshd@30-138.199.237.168:22-139.178.68.195:48714.service: Deactivated successfully.
Aug 12 23:49:50.706140 systemd[1]: session-31.scope: Deactivated successfully.
Aug 12 23:49:50.708893 systemd-logind[1481]: Session 31 logged out. Waiting for processes to exit.
Aug 12 23:49:50.710644 systemd-logind[1481]: Removed session 31.
Aug 12 23:49:51.263365 containerd[1506]: time="2025-08-12T23:49:51.263249272Z" level=warning msg="container event discarded" container=54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:51.363037 containerd[1506]: time="2025-08-12T23:49:51.362960240Z" level=warning msg="container event discarded" container=54a6892226e2da40b0346378f0fcecafb61a7c8291966d1228ed075725f80fe3 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:52.606282 containerd[1506]: time="2025-08-12T23:49:52.606172299Z" level=warning msg="container event discarded" container=5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2 type=CONTAINER_CREATED_EVENT
Aug 12 23:49:52.695205 containerd[1506]: time="2025-08-12T23:49:52.695077640Z" level=warning msg="container event discarded" container=5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2 type=CONTAINER_STARTED_EVENT
Aug 12 23:49:52.846807 containerd[1506]: time="2025-08-12T23:49:52.846697137Z" level=warning msg="container event discarded" container=5c04bc6c51e1adc015bfbc3d7644ae86eeef1aaa86935e4d1bc8f2843ee549c2 type=CONTAINER_STOPPED_EVENT
Aug 12 23:49:55.863367 systemd[1]: Started sshd@31-138.199.237.168:22-139.178.68.195:50904.service - OpenSSH per-connection server daemon (139.178.68.195:50904).
Aug 12 23:49:55.947975 containerd[1506]: time="2025-08-12T23:49:55.947883024Z" level=warning msg="container event discarded" container=ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b type=CONTAINER_CREATED_EVENT
Aug 12 23:49:56.035380 containerd[1506]: time="2025-08-12T23:49:56.035220118Z" level=warning msg="container event discarded" container=ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b type=CONTAINER_STARTED_EVENT
Aug 12 23:49:56.715598 containerd[1506]: time="2025-08-12T23:49:56.715456813Z" level=warning msg="container event discarded" container=ec6fc54c68dea8d0933909f85aca6e76649af579557bff9f8314a78069d9993b type=CONTAINER_STOPPED_EVENT
Aug 12 23:49:56.874997 sshd[6940]: Accepted publickey for core from 139.178.68.195 port 50904 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:49:56.877920 sshd-session[6940]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:49:56.887167 systemd-logind[1481]: New session 32 of user core.
Aug 12 23:49:56.892827 systemd[1]: Started session-32.scope - Session 32 of User core.
Aug 12 23:49:57.017815 containerd[1506]: time="2025-08-12T23:49:57.017663313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"1ea727bd8452100191a759580ca4fb47e5db027866572c78244b4823951d266d\" pid:6955 exited_at:{seconds:1755042597 nanos:16964746}"
Aug 12 23:49:57.668193 sshd[6942]: Connection closed by 139.178.68.195 port 50904
Aug 12 23:49:57.669851 sshd-session[6940]: pam_unix(sshd:session): session closed for user core
Aug 12 23:49:57.678603 systemd[1]: sshd@31-138.199.237.168:22-139.178.68.195:50904.service: Deactivated successfully.
Aug 12 23:49:57.683624 systemd[1]: session-32.scope: Deactivated successfully.
Aug 12 23:49:57.687304 systemd-logind[1481]: Session 32 logged out. Waiting for processes to exit.
Aug 12 23:49:57.689404 systemd-logind[1481]: Removed session 32.
Aug 12 23:49:58.010164 containerd[1506]: time="2025-08-12T23:49:58.010014438Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"af14ca791bf4c2d7f7ca9b692bf8fc6bdfde4f247dafd321441cfa9ca6f56ed5\" pid:6987 exited_at:{seconds:1755042598 nanos:9275630}"
Aug 12 23:50:01.947133 containerd[1506]: time="2025-08-12T23:50:01.947032957Z" level=warning msg="container event discarded" container=769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab type=CONTAINER_CREATED_EVENT
Aug 12 23:50:02.068485 containerd[1506]: time="2025-08-12T23:50:02.068401421Z" level=warning msg="container event discarded" container=769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab type=CONTAINER_STARTED_EVENT
Aug 12 23:50:02.845058 systemd[1]: Started sshd@32-138.199.237.168:22-139.178.68.195:36968.service - OpenSSH per-connection server daemon (139.178.68.195:36968).
Aug 12 23:50:03.860286 sshd[7002]: Accepted publickey for core from 139.178.68.195 port 36968 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:03.862667 sshd-session[7002]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:03.871393 systemd-logind[1481]: New session 33 of user core.
Aug 12 23:50:03.877838 systemd[1]: Started session-33.scope - Session 33 of User core.
Aug 12 23:50:04.116254 containerd[1506]: time="2025-08-12T23:50:04.116000471Z" level=warning msg="container event discarded" container=3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:04.116254 containerd[1506]: time="2025-08-12T23:50:04.116110752Z" level=warning msg="container event discarded" container=3f6ba5e93f2ad250c6321fe85c9511bbf91feb2ae037f49184d37eda2aa48ca9 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:04.628628 sshd[7004]: Connection closed by 139.178.68.195 port 36968
Aug 12 23:50:04.629595 sshd-session[7002]: pam_unix(sshd:session): session closed for user core
Aug 12 23:50:04.635955 systemd-logind[1481]: Session 33 logged out. Waiting for processes to exit.
Aug 12 23:50:04.637133 systemd[1]: sshd@32-138.199.237.168:22-139.178.68.195:36968.service: Deactivated successfully.
Aug 12 23:50:04.641269 systemd[1]: session-33.scope: Deactivated successfully.
Aug 12 23:50:04.644668 systemd-logind[1481]: Removed session 33.
Aug 12 23:50:04.955641 containerd[1506]: time="2025-08-12T23:50:04.955149059Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"989c8625a7287f86e046e72de657f619cc199fceb3698399daa969efec76dc6e\" pid:7029 exited_at:{seconds:1755042604 nanos:954625014}"
Aug 12 23:50:06.171674 containerd[1506]: time="2025-08-12T23:50:06.171515232Z" level=warning msg="container event discarded" container=f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:06.256728 containerd[1506]: time="2025-08-12T23:50:06.256644608Z" level=warning msg="container event discarded" container=f850b4fd6eb647bc70aabfcc0b122b96349d622d91e46362e1bbe8bdd7f11c01 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:09.149112 containerd[1506]: time="2025-08-12T23:50:09.148826528Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"2d84bdf9630560c969ee41277a33875346e33749bf2a3165d84ee0783f7ed5c5\" pid:7051 exited_at:{seconds:1755042609 nanos:147463994}"
Aug 12 23:50:09.722323 containerd[1506]: time="2025-08-12T23:50:09.722182404Z" level=warning msg="container event discarded" container=cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:09.722323 containerd[1506]: time="2025-08-12T23:50:09.722295805Z" level=warning msg="container event discarded" container=cec82977a073516f212a8e5f10513f085337371364c01d60b372ce9b08816804 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:09.803852 systemd[1]: Started sshd@33-138.199.237.168:22-139.178.68.195:36970.service - OpenSSH per-connection server daemon (139.178.68.195:36970).
Aug 12 23:50:09.831729 containerd[1506]: time="2025-08-12T23:50:09.831653324Z" level=warning msg="container event discarded" container=787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:09.971086 containerd[1506]: time="2025-08-12T23:50:09.970953880Z" level=warning msg="container event discarded" container=787b7d5e172dc71fe587b3f6eea6d8f9aa0ef7b0fcf02f1ef7f16967be7de288 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:10.533410 containerd[1506]: time="2025-08-12T23:50:10.533330412Z" level=warning msg="container event discarded" container=69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf type=CONTAINER_CREATED_EVENT
Aug 12 23:50:10.533410 containerd[1506]: time="2025-08-12T23:50:10.533406772Z" level=warning msg="container event discarded" container=69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf type=CONTAINER_STARTED_EVENT
Aug 12 23:50:10.587428 containerd[1506]: time="2025-08-12T23:50:10.587285505Z" level=warning msg="container event discarded" container=227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:10.587428 containerd[1506]: time="2025-08-12T23:50:10.587360985Z" level=warning msg="container event discarded" container=227f6754b8e7377cc59b160e3f58650e01ea812d4f5dff49f693ee2eca116f80 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:10.816358 sshd[7063]: Accepted publickey for core from 139.178.68.195 port 36970 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:10.819707 sshd-session[7063]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:10.828675 systemd-logind[1481]: New session 34 of user core.
Aug 12 23:50:10.833814 systemd[1]: Started session-34.scope - Session 34 of User core.
Aug 12 23:50:11.397681 containerd[1506]: time="2025-08-12T23:50:11.397594439Z" level=warning msg="container event discarded" container=776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e type=CONTAINER_CREATED_EVENT
Aug 12 23:50:11.397681 containerd[1506]: time="2025-08-12T23:50:11.397651440Z" level=warning msg="container event discarded" container=776d8d2c290f716dd77e18c400a3f3495f95261b38226df66334762c89a1a50e type=CONTAINER_STARTED_EVENT
Aug 12 23:50:11.636865 sshd[7065]: Connection closed by 139.178.68.195 port 36970
Aug 12 23:50:11.637797 sshd-session[7063]: pam_unix(sshd:session): session closed for user core
Aug 12 23:50:11.642773 systemd[1]: sshd@33-138.199.237.168:22-139.178.68.195:36970.service: Deactivated successfully.
Aug 12 23:50:11.649267 systemd[1]: session-34.scope: Deactivated successfully.
Aug 12 23:50:11.651341 systemd-logind[1481]: Session 34 logged out. Waiting for processes to exit.
Aug 12 23:50:11.654190 systemd-logind[1481]: Removed session 34.
Aug 12 23:50:12.769874 containerd[1506]: time="2025-08-12T23:50:12.769727020Z" level=warning msg="container event discarded" container=9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:12.769874 containerd[1506]: time="2025-08-12T23:50:12.769792061Z" level=warning msg="container event discarded" container=9f526967f1bd718177b55a67c1823515130ed761ceb5678c5662b78e22370777 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:12.826171 containerd[1506]: time="2025-08-12T23:50:12.826029140Z" level=warning msg="container event discarded" container=86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:12.826171 containerd[1506]: time="2025-08-12T23:50:12.826150222Z" level=warning msg="container event discarded" container=86d417c813a03bada6b8faf8d2c202e515fa78a5eee64cf7c08d139363e5f3c1 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:12.841795 containerd[1506]: time="2025-08-12T23:50:12.841717908Z" level=warning msg="container event discarded" container=7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:12.941139 containerd[1506]: time="2025-08-12T23:50:12.941063687Z" level=warning msg="container event discarded" container=7814c8ebe2bdf4ed487804be72e3dadc47b8e13b9fafc2978d2e74fb369abcc3 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:12.968741 containerd[1506]: time="2025-08-12T23:50:12.968610301Z" level=warning msg="container event discarded" container=dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:12.968741 containerd[1506]: time="2025-08-12T23:50:12.968682982Z" level=warning msg="container event discarded" container=dfd91b76898ef903999282e0bf1d11cc87df1aa60920e5c1a3693bdb348517e2 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:13.013101 containerd[1506]: time="2025-08-12T23:50:13.012982215Z" level=warning msg="container event discarded" container=f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:13.039928 containerd[1506]: time="2025-08-12T23:50:13.039295296Z" level=warning msg="container event discarded" container=6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c type=CONTAINER_CREATED_EVENT
Aug 12 23:50:13.039928 containerd[1506]: time="2025-08-12T23:50:13.039389577Z" level=warning msg="container event discarded" container=6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c type=CONTAINER_STARTED_EVENT
Aug 12 23:50:13.124038 containerd[1506]: time="2025-08-12T23:50:13.123949761Z" level=warning msg="container event discarded" container=f1de2f93b75e7804df7395a031dbafbf3852eecedc2352a716083f4f23d5ccb2 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:14.371374 containerd[1506]: time="2025-08-12T23:50:14.371275020Z" level=warning msg="container event discarded" container=1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b type=CONTAINER_CREATED_EVENT
Aug 12 23:50:14.546926 containerd[1506]: time="2025-08-12T23:50:14.546747820Z" level=warning msg="container event discarded" container=1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b type=CONTAINER_STARTED_EVENT
Aug 12 23:50:16.745302 containerd[1506]: time="2025-08-12T23:50:16.744951017Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"0c622d4ea3da4bb2760678c4148a8b737ad22a43b1a4cc2ad7cbd56c66674a8d\" pid:7089 exited_at:{seconds:1755042616 nanos:743991646}"
Aug 12 23:50:16.816064 systemd[1]: Started sshd@34-138.199.237.168:22-139.178.68.195:55178.service - OpenSSH per-connection server daemon (139.178.68.195:55178).
Aug 12 23:50:17.120782 containerd[1506]: time="2025-08-12T23:50:17.120653579Z" level=warning msg="container event discarded" container=17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:17.231208 containerd[1506]: time="2025-08-12T23:50:17.231136450Z" level=warning msg="container event discarded" container=17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:17.517677 containerd[1506]: time="2025-08-12T23:50:17.517448094Z" level=warning msg="container event discarded" container=30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa type=CONTAINER_CREATED_EVENT
Aug 12 23:50:17.690354 containerd[1506]: time="2025-08-12T23:50:17.690211916Z" level=warning msg="container event discarded" container=30fbfec031f850070ae8f0cad35064a3cbbe824223cde765357374ce1a4286fa type=CONTAINER_STARTED_EVENT
Aug 12 23:50:17.840096 sshd[7099]: Accepted publickey for core from 139.178.68.195 port 55178 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:17.843255 sshd-session[7099]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:17.850513 systemd-logind[1481]: New session 35 of user core.
Aug 12 23:50:17.855917 systemd[1]: Started session-35.scope - Session 35 of User core.
Aug 12 23:50:18.626676 sshd[7101]: Connection closed by 139.178.68.195 port 55178
Aug 12 23:50:18.628391 sshd-session[7099]: pam_unix(sshd:session): session closed for user core
Aug 12 23:50:18.635470 systemd-logind[1481]: Session 35 logged out. Waiting for processes to exit.
Aug 12 23:50:18.635784 systemd[1]: sshd@34-138.199.237.168:22-139.178.68.195:55178.service: Deactivated successfully.
Aug 12 23:50:18.641142 systemd[1]: session-35.scope: Deactivated successfully.
Aug 12 23:50:18.646136 systemd-logind[1481]: Removed session 35.
Aug 12 23:50:20.990746 containerd[1506]: time="2025-08-12T23:50:20.990664678Z" level=warning msg="container event discarded" container=4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:21.212619 containerd[1506]: time="2025-08-12T23:50:21.212540567Z" level=warning msg="container event discarded" container=4f13d6d65ec23befc45734d1407c01bb2417a2ad1880b526bcde32a862b819e7 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:23.807387 systemd[1]: Started sshd@35-138.199.237.168:22-139.178.68.195:53386.service - OpenSSH per-connection server daemon (139.178.68.195:53386).
Aug 12 23:50:24.829956 sshd[7114]: Accepted publickey for core from 139.178.68.195 port 53386 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:24.831985 sshd-session[7114]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:24.839859 systemd-logind[1481]: New session 36 of user core.
Aug 12 23:50:24.845908 systemd[1]: Started session-36.scope - Session 36 of User core.
Aug 12 23:50:25.425992 containerd[1506]: time="2025-08-12T23:50:25.425749810Z" level=warning msg="container event discarded" container=19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc type=CONTAINER_CREATED_EVENT
Aug 12 23:50:25.565527 containerd[1506]: time="2025-08-12T23:50:25.565384016Z" level=warning msg="container event discarded" container=19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc type=CONTAINER_STARTED_EVENT
Aug 12 23:50:25.594437 sshd[7122]: Connection closed by 139.178.68.195 port 53386
Aug 12 23:50:25.596758 sshd-session[7114]: pam_unix(sshd:session): session closed for user core
Aug 12 23:50:25.603343 systemd[1]: sshd@35-138.199.237.168:22-139.178.68.195:53386.service: Deactivated successfully.
Aug 12 23:50:25.608684 systemd[1]: session-36.scope: Deactivated successfully.
Aug 12 23:50:25.612725 systemd-logind[1481]: Session 36 logged out. Waiting for processes to exit.
Aug 12 23:50:25.614671 systemd-logind[1481]: Removed session 36.
Aug 12 23:50:25.800314 containerd[1506]: time="2025-08-12T23:50:25.799444415Z" level=warning msg="container event discarded" container=ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:25.919150 containerd[1506]: time="2025-08-12T23:50:25.919069282Z" level=warning msg="container event discarded" container=ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:27.012360 containerd[1506]: time="2025-08-12T23:50:27.012312371Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"6ca1e0fbd3d6597c6573913380aec2e9a73819a1b0e97723b0872bc8308493c9\" pid:7148 exited_at:{seconds:1755042627 nanos:11980408}"
Aug 12 23:50:27.869980 containerd[1506]: time="2025-08-12T23:50:27.869879457Z" level=warning msg="container event discarded" container=bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:28.001962 containerd[1506]: time="2025-08-12T23:50:28.001918425Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"0baa23570868e981595985af6e043df7d39858b87d35eeff105d9fa0e8bccb51\" pid:7169 exited_at:{seconds:1755042628 nanos:1459220}"
Aug 12 23:50:28.039842 containerd[1506]: time="2025-08-12T23:50:28.039768201Z" level=warning msg="container event discarded" container=bc22fceb1309aaf629972db6a25858a6f723c92ec218e6ce5bbeda2b84f60270 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:30.772880 systemd[1]: Started sshd@36-138.199.237.168:22-139.178.68.195:55494.service - OpenSSH per-connection server daemon (139.178.68.195:55494).
Aug 12 23:50:31.787719 sshd[7180]: Accepted publickey for core from 139.178.68.195 port 55494 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:31.790761 sshd-session[7180]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:31.797040 systemd-logind[1481]: New session 37 of user core.
Aug 12 23:50:31.808891 systemd[1]: Started session-37.scope - Session 37 of User core.
Aug 12 23:50:32.559561 sshd[7182]: Connection closed by 139.178.68.195 port 55494
Aug 12 23:50:32.560515 sshd-session[7180]: pam_unix(sshd:session): session closed for user core
Aug 12 23:50:32.565861 systemd[1]: sshd@36-138.199.237.168:22-139.178.68.195:55494.service: Deactivated successfully.
Aug 12 23:50:32.569878 systemd[1]: session-37.scope: Deactivated successfully.
Aug 12 23:50:32.571125 systemd-logind[1481]: Session 37 logged out. Waiting for processes to exit.
Aug 12 23:50:32.573286 systemd-logind[1481]: Removed session 37.
Aug 12 23:50:37.735945 systemd[1]: Started sshd@37-138.199.237.168:22-139.178.68.195:55500.service - OpenSSH per-connection server daemon (139.178.68.195:55500).
Aug 12 23:50:38.750647 sshd[7197]: Accepted publickey for core from 139.178.68.195 port 55500 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:38.752520 sshd-session[7197]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:38.758731 systemd-logind[1481]: New session 38 of user core.
Aug 12 23:50:38.766122 systemd[1]: Started session-38.scope - Session 38 of User core.
Aug 12 23:50:39.130980 containerd[1506]: time="2025-08-12T23:50:39.130838735Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"ffad4dd14742c24af214aa4dca2502296abb2770d85e83794cee1fefdb39b62f\" pid:7212 exited_at:{seconds:1755042639 nanos:130458051}"
Aug 12 23:50:39.531646 sshd[7199]: Connection closed by 139.178.68.195 port 55500
Aug 12 23:50:39.531871 sshd-session[7197]: pam_unix(sshd:session): session closed for user core
Aug 12 23:50:39.537388 systemd-logind[1481]: Session 38 logged out. Waiting for processes to exit.
Aug 12 23:50:39.539862 systemd[1]: sshd@37-138.199.237.168:22-139.178.68.195:55500.service: Deactivated successfully.
Aug 12 23:50:39.543332 systemd[1]: session-38.scope: Deactivated successfully.
Aug 12 23:50:39.545507 systemd-logind[1481]: Removed session 38.
Aug 12 23:50:44.705929 systemd[1]: Started sshd@38-138.199.237.168:22-139.178.68.195:37070.service - OpenSSH per-connection server daemon (139.178.68.195:37070).
Aug 12 23:50:45.723917 sshd[7235]: Accepted publickey for core from 139.178.68.195 port 37070 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:45.726869 sshd-session[7235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:45.732941 systemd-logind[1481]: New session 39 of user core.
Aug 12 23:50:45.738815 systemd[1]: Started session-39.scope - Session 39 of User core.
Aug 12 23:50:46.488957 sshd[7237]: Connection closed by 139.178.68.195 port 37070
Aug 12 23:50:46.489831 sshd-session[7235]: pam_unix(sshd:session): session closed for user core
Aug 12 23:50:46.494738 systemd[1]: sshd@38-138.199.237.168:22-139.178.68.195:37070.service: Deactivated successfully.
Aug 12 23:50:46.497668 systemd[1]: session-39.scope: Deactivated successfully.
Aug 12 23:50:46.499677 systemd-logind[1481]: Session 39 logged out. Waiting for processes to exit.
Aug 12 23:50:46.501718 systemd-logind[1481]: Removed session 39.
Aug 12 23:50:51.665857 systemd[1]: Started sshd@39-138.199.237.168:22-139.178.68.195:57702.service - OpenSSH per-connection server daemon (139.178.68.195:57702).
Aug 12 23:50:52.690612 sshd[7249]: Accepted publickey for core from 139.178.68.195 port 57702 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:52.693162 sshd-session[7249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:52.704122 systemd-logind[1481]: New session 40 of user core.
Aug 12 23:50:52.719859 systemd[1]: Started session-40.scope - Session 40 of User core.
Aug 12 23:50:53.495581 sshd[7251]: Connection closed by 139.178.68.195 port 57702
Aug 12 23:50:53.496532 sshd-session[7249]: pam_unix(sshd:session): session closed for user core
Aug 12 23:50:53.502370 systemd[1]: sshd@39-138.199.237.168:22-139.178.68.195:57702.service: Deactivated successfully.
Aug 12 23:50:53.506402 systemd[1]: session-40.scope: Deactivated successfully.
Aug 12 23:50:53.508563 systemd-logind[1481]: Session 40 logged out. Waiting for processes to exit.
Aug 12 23:50:53.510367 systemd-logind[1481]: Removed session 40.
Aug 12 23:50:56.730771 containerd[1506]: time="2025-08-12T23:50:56.730513357Z" level=warning msg="container event discarded" container=17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987 type=CONTAINER_STOPPED_EVENT
Aug 12 23:50:56.864215 containerd[1506]: time="2025-08-12T23:50:56.864085318Z" level=warning msg="container event discarded" container=69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf type=CONTAINER_STOPPED_EVENT
Aug 12 23:50:57.022988 containerd[1506]: time="2025-08-12T23:50:57.022796324Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"d91c008239ae5095394c886cb731ca7a2ea92b9d021dadea4934a623e366ffbe\" pid:7277 exited_at:{seconds:1755042657 nanos:22271678}"
Aug 12 23:50:57.567869 containerd[1506]: time="2025-08-12T23:50:57.567730893Z" level=warning msg="container event discarded" container=d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4 type=CONTAINER_CREATED_EVENT
Aug 12 23:50:57.567869 containerd[1506]: time="2025-08-12T23:50:57.567819654Z" level=warning msg="container event discarded" container=d5cf342a7cbd7776ecc8a031f62b0b1dee1077d30f532b136451762bb3f510d4 type=CONTAINER_STARTED_EVENT
Aug 12 23:50:57.604355 containerd[1506]: time="2025-08-12T23:50:57.604058987Z" level=warning msg="container event discarded" container=05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b type=CONTAINER_CREATED_EVENT
Aug 12 23:50:57.701119 containerd[1506]: time="2025-08-12T23:50:57.701034252Z" level=warning msg="container event discarded" container=17298248124098d02f0008112d81d4844b1cdf9fcdfd2deb6a21815dfbf58987 type=CONTAINER_DELETED_EVENT
Aug 12 23:50:57.822931 containerd[1506]: time="2025-08-12T23:50:57.822762279Z" level=warning msg="container event discarded" container=05d81955dce54c19bcda5454d9e57b54016ce2a79250e53fbbb5174fc2abd83b type=CONTAINER_STARTED_EVENT
Aug 12 23:50:58.020432 containerd[1506]: time="2025-08-12T23:50:58.020375331Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"a6b5586f3a287a2f3a8ab222a4ffa81fe800e0375058471dcab9c80f4581ac0b\" pid:7299 exited_at:{seconds:1755042658 nanos:19839005}"
Aug 12 23:50:58.677724 systemd[1]: Started sshd@40-138.199.237.168:22-139.178.68.195:57714.service - OpenSSH per-connection server daemon (139.178.68.195:57714).
Aug 12 23:50:59.710691 sshd[7310]: Accepted publickey for core from 139.178.68.195 port 57714 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:50:59.713216 sshd-session[7310]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:50:59.721053 systemd-logind[1481]: New session 41 of user core.
Aug 12 23:50:59.726033 systemd[1]: Started session-41.scope - Session 41 of User core.
Aug 12 23:51:00.451224 containerd[1506]: time="2025-08-12T23:51:00.451111434Z" level=warning msg="container event discarded" container=ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90 type=CONTAINER_STOPPED_EVENT
Aug 12 23:51:00.486277 sshd[7312]: Connection closed by 139.178.68.195 port 57714
Aug 12 23:51:00.487623 sshd-session[7310]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:00.493502 systemd-logind[1481]: Session 41 logged out. Waiting for processes to exit.
Aug 12 23:51:00.494487 systemd[1]: sshd@40-138.199.237.168:22-139.178.68.195:57714.service: Deactivated successfully.
Aug 12 23:51:00.498773 systemd[1]: session-41.scope: Deactivated successfully.
Aug 12 23:51:00.501871 systemd-logind[1481]: Removed session 41.
Aug 12 23:51:00.548682 containerd[1506]: time="2025-08-12T23:51:00.548577188Z" level=warning msg="container event discarded" container=6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c type=CONTAINER_STOPPED_EVENT
Aug 12 23:51:04.946346 containerd[1506]: time="2025-08-12T23:51:04.946282379Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"a33893025115d1c8200f3cb56d6b1fb8ccab87c844d8b540611e7f7adafd7226\" pid:7340 exited_at:{seconds:1755042664 nanos:945482970}"
Aug 12 23:51:05.686092 systemd[1]: Started sshd@41-138.199.237.168:22-139.178.68.195:49008.service - OpenSSH per-connection server daemon (139.178.68.195:49008).
Aug 12 23:51:06.761841 sshd[7351]: Accepted publickey for core from 139.178.68.195 port 49008 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:06.763933 sshd-session[7351]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:06.770221 systemd-logind[1481]: New session 42 of user core.
Aug 12 23:51:06.775795 systemd[1]: Started session-42.scope - Session 42 of User core.
Aug 12 23:51:07.570701 sshd[7353]: Connection closed by 139.178.68.195 port 49008
Aug 12 23:51:07.571755 sshd-session[7351]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:07.577002 systemd-logind[1481]: Session 42 logged out. Waiting for processes to exit.
Aug 12 23:51:07.578073 systemd[1]: sshd@41-138.199.237.168:22-139.178.68.195:49008.service: Deactivated successfully.
Aug 12 23:51:07.581826 systemd[1]: session-42.scope: Deactivated successfully.
Aug 12 23:51:07.583718 systemd-logind[1481]: Removed session 42.
Aug 12 23:51:09.145078 containerd[1506]: time="2025-08-12T23:51:09.144952832Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"98b466a02f8ebb837ff3dd6fb001339079c9393df411ab43e0a5d8d97c653f64\" pid:7377 exited_at:{seconds:1755042669 nanos:144335705}"
Aug 12 23:51:12.755009 systemd[1]: Started sshd@42-138.199.237.168:22-139.178.68.195:35382.service - OpenSSH per-connection server daemon (139.178.68.195:35382).
Aug 12 23:51:13.834585 sshd[7389]: Accepted publickey for core from 139.178.68.195 port 35382 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:13.837022 sshd-session[7389]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:13.844049 systemd-logind[1481]: New session 43 of user core.
Aug 12 23:51:13.851007 systemd[1]: Started session-43.scope - Session 43 of User core.
Aug 12 23:51:14.654727 sshd[7391]: Connection closed by 139.178.68.195 port 35382
Aug 12 23:51:14.655577 sshd-session[7389]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:14.661376 systemd[1]: sshd@42-138.199.237.168:22-139.178.68.195:35382.service: Deactivated successfully.
Aug 12 23:51:14.668882 systemd[1]: session-43.scope: Deactivated successfully.
Aug 12 23:51:14.670878 systemd-logind[1481]: Session 43 logged out. Waiting for processes to exit.
Aug 12 23:51:14.676832 systemd-logind[1481]: Removed session 43.
Aug 12 23:51:14.840990 systemd[1]: Started sshd@43-138.199.237.168:22-139.178.68.195:35398.service - OpenSSH per-connection server daemon (139.178.68.195:35398).
Aug 12 23:51:15.914704 sshd[7404]: Accepted publickey for core from 139.178.68.195 port 35398 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:15.916974 sshd-session[7404]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:15.923160 systemd-logind[1481]: New session 44 of user core.
Aug 12 23:51:15.927852 systemd[1]: Started session-44.scope - Session 44 of User core.
Aug 12 23:51:16.739938 containerd[1506]: time="2025-08-12T23:51:16.739888524Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"a722c31ba49e4840257f1f454b76891785960211f022734ea0411d8a970608c5\" pid:7425 exited_at:{seconds:1755042676 nanos:739028194}"
Aug 12 23:51:16.770717 sshd[7406]: Connection closed by 139.178.68.195 port 35398
Aug 12 23:51:16.771833 sshd-session[7404]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:16.779216 systemd[1]: sshd@43-138.199.237.168:22-139.178.68.195:35398.service: Deactivated successfully.
Aug 12 23:51:16.783008 systemd[1]: session-44.scope: Deactivated successfully.
Aug 12 23:51:16.785313 systemd-logind[1481]: Session 44 logged out. Waiting for processes to exit.
Aug 12 23:51:16.788508 systemd-logind[1481]: Removed session 44.
Aug 12 23:51:16.939098 systemd[1]: Started sshd@44-138.199.237.168:22-139.178.68.195:35414.service - OpenSSH per-connection server daemon (139.178.68.195:35414).
Aug 12 23:51:17.958820 sshd[7438]: Accepted publickey for core from 139.178.68.195 port 35414 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:17.961137 sshd-session[7438]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:17.968695 systemd-logind[1481]: New session 45 of user core.
Aug 12 23:51:17.973752 systemd[1]: Started session-45.scope - Session 45 of User core.
Aug 12 23:51:18.736749 sshd[7440]: Connection closed by 139.178.68.195 port 35414
Aug 12 23:51:18.737695 sshd-session[7438]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:18.744541 systemd[1]: sshd@44-138.199.237.168:22-139.178.68.195:35414.service: Deactivated successfully.
Aug 12 23:51:18.747492 systemd[1]: session-45.scope: Deactivated successfully.
Aug 12 23:51:18.749888 systemd-logind[1481]: Session 45 logged out. Waiting for processes to exit.
Aug 12 23:51:18.751593 systemd-logind[1481]: Removed session 45.
Aug 12 23:51:23.915875 systemd[1]: Started sshd@45-138.199.237.168:22-139.178.68.195:38794.service - OpenSSH per-connection server daemon (139.178.68.195:38794).
Aug 12 23:51:24.926875 sshd[7462]: Accepted publickey for core from 139.178.68.195 port 38794 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:24.929055 sshd-session[7462]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:24.938059 systemd-logind[1481]: New session 46 of user core.
Aug 12 23:51:24.941773 systemd[1]: Started session-46.scope - Session 46 of User core.
Aug 12 23:51:25.141952 containerd[1506]: time="2025-08-12T23:51:25.141818323Z" level=warning msg="container event discarded" container=ee7bef68afdb2544acface379516b3610fe18efeb5fb3874e38591a5bbf54b90 type=CONTAINER_DELETED_EVENT
Aug 12 23:51:25.341917 containerd[1506]: time="2025-08-12T23:51:25.341841292Z" level=warning msg="container event discarded" container=69a93367d0d4a8757028e8e0ca21952426c5ce6ce96fdba13005db0b3b6b35bf type=CONTAINER_DELETED_EVENT
Aug 12 23:51:25.533672 containerd[1506]: time="2025-08-12T23:51:25.533597605Z" level=warning msg="container event discarded" container=6a5e1ec7533ab9f8c923e3de67713cb1c6bc6ed94eeb1753dd30a384ea76360c type=CONTAINER_DELETED_EVENT
Aug 12 23:51:25.698460 sshd[7478]: Connection closed by 139.178.68.195 port 38794
Aug 12 23:51:25.699542 sshd-session[7462]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:25.706236 systemd[1]: sshd@45-138.199.237.168:22-139.178.68.195:38794.service: Deactivated successfully.
Aug 12 23:51:25.709768 systemd[1]: session-46.scope: Deactivated successfully.
Aug 12 23:51:25.711635 systemd-logind[1481]: Session 46 logged out. Waiting for processes to exit.
Aug 12 23:51:25.714747 systemd-logind[1481]: Removed session 46.
Aug 12 23:51:27.015046 containerd[1506]: time="2025-08-12T23:51:27.014990504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"c5decf00abdd36becf7f930544349f2364473ee0d2621caf89685b2813961719\" pid:7504 exited_at:{seconds:1755042687 nanos:14607220}"
Aug 12 23:51:28.004086 containerd[1506]: time="2025-08-12T23:51:28.003766193Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"9c1c975a13bf108029de52eec9ffa6b740b221c42c63ca95b5534d43d767b3e7\" pid:7525 exited_at:{seconds:1755042688 nanos:2903423}"
Aug 12 23:51:30.873656 systemd[1]: Started sshd@46-138.199.237.168:22-139.178.68.195:33562.service - OpenSSH per-connection server daemon (139.178.68.195:33562).
Aug 12 23:51:31.892821 sshd[7536]: Accepted publickey for core from 139.178.68.195 port 33562 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:31.895425 sshd-session[7536]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:31.901254 systemd-logind[1481]: New session 47 of user core.
Aug 12 23:51:31.909888 systemd[1]: Started session-47.scope - Session 47 of User core.
Aug 12 23:51:32.665443 sshd[7538]: Connection closed by 139.178.68.195 port 33562
Aug 12 23:51:32.666382 sshd-session[7536]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:32.672335 systemd[1]: sshd@46-138.199.237.168:22-139.178.68.195:33562.service: Deactivated successfully.
Aug 12 23:51:32.675152 systemd[1]: session-47.scope: Deactivated successfully.
Aug 12 23:51:32.678278 systemd-logind[1481]: Session 47 logged out. Waiting for processes to exit.
Aug 12 23:51:32.680017 systemd-logind[1481]: Removed session 47.
Aug 12 23:51:37.843375 systemd[1]: Started sshd@47-138.199.237.168:22-139.178.68.195:33572.service - OpenSSH per-connection server daemon (139.178.68.195:33572).
Aug 12 23:51:38.862472 sshd[7551]: Accepted publickey for core from 139.178.68.195 port 33572 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:38.864770 sshd-session[7551]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:38.871708 systemd-logind[1481]: New session 48 of user core.
Aug 12 23:51:38.876909 systemd[1]: Started session-48.scope - Session 48 of User core.
Aug 12 23:51:39.144865 containerd[1506]: time="2025-08-12T23:51:39.144707896Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"e0dd269b18df385a0632771d7a32f48f88184d981e3c772a810344a05f7af06c\" pid:7566 exited_at:{seconds:1755042699 nanos:144093689}"
Aug 12 23:51:39.644181 sshd[7553]: Connection closed by 139.178.68.195 port 33572
Aug 12 23:51:39.643272 sshd-session[7551]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:39.650052 systemd[1]: sshd@47-138.199.237.168:22-139.178.68.195:33572.service: Deactivated successfully.
Aug 12 23:51:39.653751 systemd[1]: session-48.scope: Deactivated successfully.
Aug 12 23:51:39.655394 systemd-logind[1481]: Session 48 logged out. Waiting for processes to exit.
Aug 12 23:51:39.657896 systemd-logind[1481]: Removed session 48.
Aug 12 23:51:44.819851 systemd[1]: Started sshd@48-138.199.237.168:22-139.178.68.195:53456.service - OpenSSH per-connection server daemon (139.178.68.195:53456).
Aug 12 23:51:45.832329 sshd[7588]: Accepted publickey for core from 139.178.68.195 port 53456 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:45.834361 sshd-session[7588]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:45.842052 systemd-logind[1481]: New session 49 of user core.
Aug 12 23:51:45.845879 systemd[1]: Started session-49.scope - Session 49 of User core.
Aug 12 23:51:46.603876 sshd[7590]: Connection closed by 139.178.68.195 port 53456
Aug 12 23:51:46.604925 sshd-session[7588]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:46.612053 systemd-logind[1481]: Session 49 logged out. Waiting for processes to exit.
Aug 12 23:51:46.613084 systemd[1]: sshd@48-138.199.237.168:22-139.178.68.195:53456.service: Deactivated successfully.
Aug 12 23:51:46.616987 systemd[1]: session-49.scope: Deactivated successfully.
Aug 12 23:51:46.619997 systemd-logind[1481]: Removed session 49.
Aug 12 23:51:51.786663 systemd[1]: Started sshd@49-138.199.237.168:22-139.178.68.195:60012.service - OpenSSH per-connection server daemon (139.178.68.195:60012).
Aug 12 23:51:52.806737 sshd[7602]: Accepted publickey for core from 139.178.68.195 port 60012 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:52.808612 sshd-session[7602]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:52.816489 systemd-logind[1481]: New session 50 of user core.
Aug 12 23:51:52.821796 systemd[1]: Started session-50.scope - Session 50 of User core.
Aug 12 23:51:53.585108 sshd[7604]: Connection closed by 139.178.68.195 port 60012
Aug 12 23:51:53.585013 sshd-session[7602]: pam_unix(sshd:session): session closed for user core
Aug 12 23:51:53.593521 systemd[1]: sshd@49-138.199.237.168:22-139.178.68.195:60012.service: Deactivated successfully.
Aug 12 23:51:53.596362 systemd[1]: session-50.scope: Deactivated successfully.
Aug 12 23:51:53.599279 systemd-logind[1481]: Session 50 logged out. Waiting for processes to exit.
Aug 12 23:51:53.602738 systemd-logind[1481]: Removed session 50.
Aug 12 23:51:57.009506 containerd[1506]: time="2025-08-12T23:51:57.009290551Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"8ee5d19c47206f4a8cca98f1829015c6d06da0d97879c3400e4d6d90e3cd7f57\" pid:7628 exited_at:{seconds:1755042717 nanos:9012028}"
Aug 12 23:51:58.012941 containerd[1506]: time="2025-08-12T23:51:58.012872273Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"1c203524d8062a96fbc02642e3c26455805d2b9527d87b3e82dcde53b0ac62bf\" pid:7650 exited_at:{seconds:1755042718 nanos:11514137}"
Aug 12 23:51:58.763391 systemd[1]: Started sshd@50-138.199.237.168:22-139.178.68.195:60018.service - OpenSSH per-connection server daemon (139.178.68.195:60018).
Aug 12 23:51:59.797958 sshd[7662]: Accepted publickey for core from 139.178.68.195 port 60018 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:51:59.800491 sshd-session[7662]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:51:59.807057 systemd-logind[1481]: New session 51 of user core.
Aug 12 23:51:59.816845 systemd[1]: Started session-51.scope - Session 51 of User core.
Aug 12 23:52:00.574861 sshd[7665]: Connection closed by 139.178.68.195 port 60018
Aug 12 23:52:00.576675 sshd-session[7662]: pam_unix(sshd:session): session closed for user core
Aug 12 23:52:00.584462 systemd[1]: sshd@50-138.199.237.168:22-139.178.68.195:60018.service: Deactivated successfully.
Aug 12 23:52:00.589339 systemd[1]: session-51.scope: Deactivated successfully.
Aug 12 23:52:00.591000 systemd-logind[1481]: Session 51 logged out. Waiting for processes to exit.
Aug 12 23:52:00.593699 systemd-logind[1481]: Removed session 51.
Aug 12 23:52:04.951707 containerd[1506]: time="2025-08-12T23:52:04.951599752Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"d343b8b4a08d12f22b557ab269bea6ad07d52503d672c72b33769a38cc40ae55\" pid:7691 exited_at:{seconds:1755042724 nanos:951007505}"
Aug 12 23:52:05.771248 systemd[1]: Started sshd@51-138.199.237.168:22-139.178.68.195:50594.service - OpenSSH per-connection server daemon (139.178.68.195:50594).
Aug 12 23:52:06.850992 sshd[7702]: Accepted publickey for core from 139.178.68.195 port 50594 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:52:06.853524 sshd-session[7702]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:52:06.859574 systemd-logind[1481]: New session 52 of user core.
Aug 12 23:52:06.866831 systemd[1]: Started session-52.scope - Session 52 of User core.
Aug 12 23:52:07.657877 sshd[7705]: Connection closed by 139.178.68.195 port 50594
Aug 12 23:52:07.657750 sshd-session[7702]: pam_unix(sshd:session): session closed for user core
Aug 12 23:52:07.663689 systemd[1]: sshd@51-138.199.237.168:22-139.178.68.195:50594.service: Deactivated successfully.
Aug 12 23:52:07.668538 systemd[1]: session-52.scope: Deactivated successfully.
Aug 12 23:52:07.670148 systemd-logind[1481]: Session 52 logged out. Waiting for processes to exit.
Aug 12 23:52:07.672146 systemd-logind[1481]: Removed session 52.
Aug 12 23:52:09.149459 containerd[1506]: time="2025-08-12T23:52:09.149358172Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"a8ce6b0606dca655ee681de1c904a96659c1e85afbf96037d4a62a6a14878514\" pid:7727 exited_at:{seconds:1755042729 nanos:148688204}"
Aug 12 23:52:12.827723 systemd[1]: Started sshd@52-138.199.237.168:22-139.178.68.195:32926.service - OpenSSH per-connection server daemon (139.178.68.195:32926).
Aug 12 23:52:13.848488 sshd[7739]: Accepted publickey for core from 139.178.68.195 port 32926 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:52:13.850146 sshd-session[7739]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:52:13.856917 systemd-logind[1481]: New session 53 of user core.
Aug 12 23:52:13.862806 systemd[1]: Started session-53.scope - Session 53 of User core.
Aug 12 23:52:14.617537 sshd[7741]: Connection closed by 139.178.68.195 port 32926
Aug 12 23:52:14.619234 sshd-session[7739]: pam_unix(sshd:session): session closed for user core
Aug 12 23:52:14.628288 systemd[1]: sshd@52-138.199.237.168:22-139.178.68.195:32926.service: Deactivated successfully.
Aug 12 23:52:14.632764 systemd[1]: session-53.scope: Deactivated successfully.
Aug 12 23:52:14.636015 systemd-logind[1481]: Session 53 logged out. Waiting for processes to exit.
Aug 12 23:52:14.639087 systemd-logind[1481]: Removed session 53.
Aug 12 23:52:16.760510 containerd[1506]: time="2025-08-12T23:52:16.760467229Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"dd822185fe0b06b3f802fb87f165645c8437ead01fc43b62f3d8dff243cb34f7\" pid:7765 exited_at:{seconds:1755042736 nanos:760204426}"
Aug 12 23:52:19.797508 systemd[1]: Started sshd@53-138.199.237.168:22-139.178.68.195:32932.service - OpenSSH per-connection server daemon (139.178.68.195:32932).
Aug 12 23:52:20.813124 sshd[7775]: Accepted publickey for core from 139.178.68.195 port 32932 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:52:20.815777 sshd-session[7775]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:52:20.823722 systemd-logind[1481]: New session 54 of user core.
Aug 12 23:52:20.828820 systemd[1]: Started session-54.scope - Session 54 of User core.
Aug 12 23:52:21.581703 sshd[7777]: Connection closed by 139.178.68.195 port 32932 Aug 12 23:52:21.582631 sshd-session[7775]: pam_unix(sshd:session): session closed for user core Aug 12 23:52:21.588268 systemd-logind[1481]: Session 54 logged out. Waiting for processes to exit. Aug 12 23:52:21.589168 systemd[1]: sshd@53-138.199.237.168:22-139.178.68.195:32932.service: Deactivated successfully. Aug 12 23:52:21.592729 systemd[1]: session-54.scope: Deactivated successfully. Aug 12 23:52:21.596297 systemd-logind[1481]: Removed session 54. Aug 12 23:52:26.757818 systemd[1]: Started sshd@54-138.199.237.168:22-139.178.68.195:37960.service - OpenSSH per-connection server daemon (139.178.68.195:37960). Aug 12 23:52:27.045170 containerd[1506]: time="2025-08-12T23:52:27.044648809Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"8880e9fbe90ce75d0ab2111a89e784451589d502ae024ec1bdde52e6a8d856d1\" pid:7804 exited_at:{seconds:1755042747 nanos:42993869}" Aug 12 23:52:27.774643 sshd[7790]: Accepted publickey for core from 139.178.68.195 port 37960 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:52:27.776585 sshd-session[7790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:52:27.783697 systemd-logind[1481]: New session 55 of user core. Aug 12 23:52:27.788824 systemd[1]: Started session-55.scope - Session 55 of User core. 
Aug 12 23:52:28.043744 containerd[1506]: time="2025-08-12T23:52:28.043579674Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"cfb666e2bd1e33ec71b63e11b44f6f6d6ae0d4c175cdb260a53c6c19f75d9429\" pid:7829 exited_at:{seconds:1755042748 nanos:42874266}" Aug 12 23:52:28.558926 sshd[7814]: Connection closed by 139.178.68.195 port 37960 Aug 12 23:52:28.559771 sshd-session[7790]: pam_unix(sshd:session): session closed for user core Aug 12 23:52:28.570598 systemd[1]: sshd@54-138.199.237.168:22-139.178.68.195:37960.service: Deactivated successfully. Aug 12 23:52:28.574678 systemd[1]: session-55.scope: Deactivated successfully. Aug 12 23:52:28.576225 systemd-logind[1481]: Session 55 logged out. Waiting for processes to exit. Aug 12 23:52:28.578904 systemd-logind[1481]: Removed session 55. Aug 12 23:52:33.741052 systemd[1]: Started sshd@55-138.199.237.168:22-139.178.68.195:45036.service - OpenSSH per-connection server daemon (139.178.68.195:45036). Aug 12 23:52:34.754480 sshd[7851]: Accepted publickey for core from 139.178.68.195 port 45036 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:52:34.756954 sshd-session[7851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:52:34.763384 systemd-logind[1481]: New session 56 of user core. Aug 12 23:52:34.777835 systemd[1]: Started session-56.scope - Session 56 of User core. Aug 12 23:52:35.535197 sshd[7853]: Connection closed by 139.178.68.195 port 45036 Aug 12 23:52:35.536291 sshd-session[7851]: pam_unix(sshd:session): session closed for user core Aug 12 23:52:35.541743 systemd[1]: sshd@55-138.199.237.168:22-139.178.68.195:45036.service: Deactivated successfully. Aug 12 23:52:35.544084 systemd[1]: session-56.scope: Deactivated successfully. Aug 12 23:52:35.545347 systemd-logind[1481]: Session 56 logged out. Waiting for processes to exit. 
Aug 12 23:52:35.548120 systemd-logind[1481]: Removed session 56. Aug 12 23:52:39.134869 containerd[1506]: time="2025-08-12T23:52:39.134807632Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"4b449857ff74d85f8f2b2308335ef307fae16d1acef09e1826cf88d93bb75c43\" pid:7877 exited_at:{seconds:1755042759 nanos:134430267}" Aug 12 23:52:40.716746 systemd[1]: Started sshd@56-138.199.237.168:22-139.178.68.195:41884.service - OpenSSH per-connection server daemon (139.178.68.195:41884). Aug 12 23:52:41.734428 sshd[7889]: Accepted publickey for core from 139.178.68.195 port 41884 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:52:41.737718 sshd-session[7889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:52:41.746033 systemd-logind[1481]: New session 57 of user core. Aug 12 23:52:41.762895 systemd[1]: Started session-57.scope - Session 57 of User core. Aug 12 23:52:42.510084 sshd[7891]: Connection closed by 139.178.68.195 port 41884 Aug 12 23:52:42.510992 sshd-session[7889]: pam_unix(sshd:session): session closed for user core Aug 12 23:52:42.520444 systemd[1]: sshd@56-138.199.237.168:22-139.178.68.195:41884.service: Deactivated successfully. Aug 12 23:52:42.524315 systemd[1]: session-57.scope: Deactivated successfully. Aug 12 23:52:42.525946 systemd-logind[1481]: Session 57 logged out. Waiting for processes to exit. Aug 12 23:52:42.529024 systemd-logind[1481]: Removed session 57. Aug 12 23:52:47.685362 systemd[1]: Started sshd@57-138.199.237.168:22-139.178.68.195:41892.service - OpenSSH per-connection server daemon (139.178.68.195:41892). 
Aug 12 23:52:48.700155 sshd[7903]: Accepted publickey for core from 139.178.68.195 port 41892 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:52:48.702954 sshd-session[7903]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:52:48.708302 systemd-logind[1481]: New session 58 of user core. Aug 12 23:52:48.715070 systemd[1]: Started session-58.scope - Session 58 of User core. Aug 12 23:52:49.497163 sshd[7905]: Connection closed by 139.178.68.195 port 41892 Aug 12 23:52:49.497931 sshd-session[7903]: pam_unix(sshd:session): session closed for user core Aug 12 23:52:49.505774 systemd[1]: sshd@57-138.199.237.168:22-139.178.68.195:41892.service: Deactivated successfully. Aug 12 23:52:49.509308 systemd[1]: session-58.scope: Deactivated successfully. Aug 12 23:52:49.510779 systemd-logind[1481]: Session 58 logged out. Waiting for processes to exit. Aug 12 23:52:49.514725 systemd-logind[1481]: Removed session 58. Aug 12 23:52:54.679828 systemd[1]: Started sshd@58-138.199.237.168:22-139.178.68.195:34910.service - OpenSSH per-connection server daemon (139.178.68.195:34910). Aug 12 23:52:55.708497 sshd[7924]: Accepted publickey for core from 139.178.68.195 port 34910 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:52:55.711097 sshd-session[7924]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:52:55.720775 systemd-logind[1481]: New session 59 of user core. Aug 12 23:52:55.726898 systemd[1]: Started session-59.scope - Session 59 of User core. Aug 12 23:52:56.485203 sshd[7926]: Connection closed by 139.178.68.195 port 34910 Aug 12 23:52:56.485740 sshd-session[7924]: pam_unix(sshd:session): session closed for user core Aug 12 23:52:56.491985 systemd[1]: sshd@58-138.199.237.168:22-139.178.68.195:34910.service: Deactivated successfully. Aug 12 23:52:56.495972 systemd[1]: session-59.scope: Deactivated successfully. 
Aug 12 23:52:56.498666 systemd-logind[1481]: Session 59 logged out. Waiting for processes to exit. Aug 12 23:52:56.500845 systemd-logind[1481]: Removed session 59. Aug 12 23:52:57.024979 containerd[1506]: time="2025-08-12T23:52:57.024898718Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"6ebc600fd49d406233e80c9697fccf4f43e546e1e8dbba58a6e0f4420724ce07\" pid:7948 exited_at:{seconds:1755042777 nanos:24611994}" Aug 12 23:52:58.084781 containerd[1506]: time="2025-08-12T23:52:58.084720924Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"ad5d459d1088a5ed1fba18ef607304aec7740bb04dd4196ae80586f94b4c0606\" pid:7970 exited_at:{seconds:1755042778 nanos:83938755}" Aug 12 23:53:01.661981 systemd[1]: Started sshd@59-138.199.237.168:22-139.178.68.195:54918.service - OpenSSH per-connection server daemon (139.178.68.195:54918). Aug 12 23:53:02.684661 sshd[7995]: Accepted publickey for core from 139.178.68.195 port 54918 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:02.686668 sshd-session[7995]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:02.692422 systemd-logind[1481]: New session 60 of user core. Aug 12 23:53:02.705286 systemd[1]: Started session-60.scope - Session 60 of User core. Aug 12 23:53:03.456455 sshd[7999]: Connection closed by 139.178.68.195 port 54918 Aug 12 23:53:03.457104 sshd-session[7995]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:03.463797 systemd-logind[1481]: Session 60 logged out. Waiting for processes to exit. Aug 12 23:53:03.464239 systemd[1]: sshd@59-138.199.237.168:22-139.178.68.195:54918.service: Deactivated successfully. Aug 12 23:53:03.466958 systemd[1]: session-60.scope: Deactivated successfully. Aug 12 23:53:03.469254 systemd-logind[1481]: Removed session 60. 
Aug 12 23:53:04.940452 containerd[1506]: time="2025-08-12T23:53:04.940400324Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"0f81dcb7181eea875ac738f2e8efa92e35c5328f0b386cac3cf31a115ffd9cb7\" pid:8024 exited_at:{seconds:1755042784 nanos:939279550}" Aug 12 23:53:08.633866 systemd[1]: Started sshd@60-138.199.237.168:22-139.178.68.195:54922.service - OpenSSH per-connection server daemon (139.178.68.195:54922). Aug 12 23:53:09.135161 containerd[1506]: time="2025-08-12T23:53:09.135093093Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"e57ae346da68c3027ff34ecd92b16b5c35765120f25e4286be6da05732516f3b\" pid:8049 exited_at:{seconds:1755042789 nanos:134715288}" Aug 12 23:53:09.653704 sshd[8035]: Accepted publickey for core from 139.178.68.195 port 54922 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:09.656301 sshd-session[8035]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:09.662963 systemd-logind[1481]: New session 61 of user core. Aug 12 23:53:09.666768 systemd[1]: Started session-61.scope - Session 61 of User core. Aug 12 23:53:10.420071 sshd[8061]: Connection closed by 139.178.68.195 port 54922 Aug 12 23:53:10.419389 sshd-session[8035]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:10.424830 systemd[1]: sshd@60-138.199.237.168:22-139.178.68.195:54922.service: Deactivated successfully. Aug 12 23:53:10.430007 systemd[1]: session-61.scope: Deactivated successfully. Aug 12 23:53:10.431320 systemd-logind[1481]: Session 61 logged out. Waiting for processes to exit. Aug 12 23:53:10.434059 systemd-logind[1481]: Removed session 61. Aug 12 23:53:15.594819 systemd[1]: Started sshd@61-138.199.237.168:22-139.178.68.195:59342.service - OpenSSH per-connection server daemon (139.178.68.195:59342). 
Aug 12 23:53:16.603950 sshd[8073]: Accepted publickey for core from 139.178.68.195 port 59342 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:16.607403 sshd-session[8073]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:16.613449 systemd-logind[1481]: New session 62 of user core. Aug 12 23:53:16.621910 systemd[1]: Started session-62.scope - Session 62 of User core. Aug 12 23:53:16.743354 containerd[1506]: time="2025-08-12T23:53:16.743304819Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"621621c75f12b857e50e8648f11b5f83f4f2c33236145b0b56f759201f26f8aa\" pid:8089 exited_at:{seconds:1755042796 nanos:743007535}" Aug 12 23:53:17.381535 sshd[8075]: Connection closed by 139.178.68.195 port 59342 Aug 12 23:53:17.382315 sshd-session[8073]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:17.387883 systemd[1]: sshd@61-138.199.237.168:22-139.178.68.195:59342.service: Deactivated successfully. Aug 12 23:53:17.391458 systemd[1]: session-62.scope: Deactivated successfully. Aug 12 23:53:17.393304 systemd-logind[1481]: Session 62 logged out. Waiting for processes to exit. Aug 12 23:53:17.395928 systemd-logind[1481]: Removed session 62. Aug 12 23:53:22.555896 systemd[1]: Started sshd@62-138.199.237.168:22-139.178.68.195:54260.service - OpenSSH per-connection server daemon (139.178.68.195:54260). Aug 12 23:53:23.571049 sshd[8109]: Accepted publickey for core from 139.178.68.195 port 54260 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:23.573532 sshd-session[8109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:23.580026 systemd-logind[1481]: New session 63 of user core. Aug 12 23:53:23.589860 systemd[1]: Started session-63.scope - Session 63 of User core. 
Aug 12 23:53:24.348607 sshd[8111]: Connection closed by 139.178.68.195 port 54260 Aug 12 23:53:24.348336 sshd-session[8109]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:24.353708 systemd-logind[1481]: Session 63 logged out. Waiting for processes to exit. Aug 12 23:53:24.354115 systemd[1]: sshd@62-138.199.237.168:22-139.178.68.195:54260.service: Deactivated successfully. Aug 12 23:53:24.359634 systemd[1]: session-63.scope: Deactivated successfully. Aug 12 23:53:24.365297 systemd-logind[1481]: Removed session 63. Aug 12 23:53:27.011745 containerd[1506]: time="2025-08-12T23:53:27.011412637Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"9765f21c2b935bac9394364d657cfd3275dbc7993e291eda90fbb6d01d830862\" pid:8145 exited_at:{seconds:1755042807 nanos:10841230}" Aug 12 23:53:28.011876 containerd[1506]: time="2025-08-12T23:53:28.011823110Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"52b15b195d31dea2f7f7db494070fc473a6074dda9a1a3b226b3d5a3446928e1\" pid:8166 exited_at:{seconds:1755042808 nanos:10498134}" Aug 12 23:53:29.550834 systemd[1]: Started sshd@63-138.199.237.168:22-139.178.68.195:54274.service - OpenSSH per-connection server daemon (139.178.68.195:54274). Aug 12 23:53:30.654524 sshd[8176]: Accepted publickey for core from 139.178.68.195 port 54274 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:30.657020 sshd-session[8176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:30.663728 systemd-logind[1481]: New session 64 of user core. Aug 12 23:53:30.669831 systemd[1]: Started session-64.scope - Session 64 of User core. 
Aug 12 23:53:31.491451 sshd[8178]: Connection closed by 139.178.68.195 port 54274 Aug 12 23:53:31.491828 sshd-session[8176]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:31.498936 systemd[1]: sshd@63-138.199.237.168:22-139.178.68.195:54274.service: Deactivated successfully. Aug 12 23:53:31.502781 systemd[1]: session-64.scope: Deactivated successfully. Aug 12 23:53:31.504054 systemd-logind[1481]: Session 64 logged out. Waiting for processes to exit. Aug 12 23:53:31.506400 systemd-logind[1481]: Removed session 64. Aug 12 23:53:36.661075 systemd[1]: Started sshd@64-138.199.237.168:22-139.178.68.195:41418.service - OpenSSH per-connection server daemon (139.178.68.195:41418). Aug 12 23:53:37.676255 sshd[8192]: Accepted publickey for core from 139.178.68.195 port 41418 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:37.678328 sshd-session[8192]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:37.683601 systemd-logind[1481]: New session 65 of user core. Aug 12 23:53:37.691845 systemd[1]: Started session-65.scope - Session 65 of User core. Aug 12 23:53:38.456249 sshd[8194]: Connection closed by 139.178.68.195 port 41418 Aug 12 23:53:38.455342 sshd-session[8192]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:38.465117 systemd[1]: sshd@64-138.199.237.168:22-139.178.68.195:41418.service: Deactivated successfully. Aug 12 23:53:38.473661 systemd[1]: session-65.scope: Deactivated successfully. Aug 12 23:53:38.482178 systemd-logind[1481]: Session 65 logged out. Waiting for processes to exit. Aug 12 23:53:38.486275 systemd-logind[1481]: Removed session 65. 
Aug 12 23:53:39.131042 containerd[1506]: time="2025-08-12T23:53:39.129535552Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"810a68b6a44b1dd17bacb62b359bd0c5e31ab1b54396f7735d71f33b82a526c6\" pid:8217 exited_at:{seconds:1755042819 nanos:128761623}" Aug 12 23:53:40.347901 systemd[1]: Started sshd@65-138.199.237.168:22-185.50.38.233:51866.service - OpenSSH per-connection server daemon (185.50.38.233:51866). Aug 12 23:53:43.634910 systemd[1]: Started sshd@66-138.199.237.168:22-139.178.68.195:57266.service - OpenSSH per-connection server daemon (139.178.68.195:57266). Aug 12 23:53:44.650945 sshd[8233]: Accepted publickey for core from 139.178.68.195 port 57266 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:44.653043 sshd-session[8233]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:44.659274 systemd-logind[1481]: New session 66 of user core. Aug 12 23:53:44.669912 systemd[1]: Started session-66.scope - Session 66 of User core. Aug 12 23:53:44.919340 sshd[8230]: Connection closed by authenticating user root 185.50.38.233 port 51866 [preauth] Aug 12 23:53:44.922736 systemd[1]: sshd@65-138.199.237.168:22-185.50.38.233:51866.service: Deactivated successfully. Aug 12 23:53:45.422642 sshd[8235]: Connection closed by 139.178.68.195 port 57266 Aug 12 23:53:45.423463 sshd-session[8233]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:45.429210 systemd[1]: sshd@66-138.199.237.168:22-139.178.68.195:57266.service: Deactivated successfully. Aug 12 23:53:45.432446 systemd[1]: session-66.scope: Deactivated successfully. Aug 12 23:53:45.434466 systemd-logind[1481]: Session 66 logged out. Waiting for processes to exit. Aug 12 23:53:45.437096 systemd-logind[1481]: Removed session 66. 
Aug 12 23:53:45.537892 systemd[1]: Started sshd@67-138.199.237.168:22-185.50.38.233:51870.service - OpenSSH per-connection server daemon (185.50.38.233:51870). Aug 12 23:53:50.602522 systemd[1]: Started sshd@68-138.199.237.168:22-139.178.68.195:34054.service - OpenSSH per-connection server daemon (139.178.68.195:34054). Aug 12 23:53:51.106518 sshd[8249]: Connection closed by authenticating user root 185.50.38.233 port 51870 [preauth] Aug 12 23:53:51.111443 systemd[1]: sshd@67-138.199.237.168:22-185.50.38.233:51870.service: Deactivated successfully. Aug 12 23:53:51.617807 sshd[8252]: Accepted publickey for core from 139.178.68.195 port 34054 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:51.620340 sshd-session[8252]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:51.635085 systemd[1]: Started sshd@69-138.199.237.168:22-185.50.38.233:36368.service - OpenSSH per-connection server daemon (185.50.38.233:36368). Aug 12 23:53:51.644640 systemd-logind[1481]: New session 67 of user core. Aug 12 23:53:51.651428 systemd[1]: Started session-67.scope - Session 67 of User core. Aug 12 23:53:52.462026 sshd[8258]: Connection closed by 139.178.68.195 port 34054 Aug 12 23:53:52.462906 sshd-session[8252]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:52.467541 systemd[1]: sshd@68-138.199.237.168:22-139.178.68.195:34054.service: Deactivated successfully. Aug 12 23:53:52.471908 systemd[1]: session-67.scope: Deactivated successfully. Aug 12 23:53:52.473974 systemd-logind[1481]: Session 67 logged out. Waiting for processes to exit. Aug 12 23:53:52.479127 systemd-logind[1481]: Removed session 67. 
Aug 12 23:53:55.828333 sshd[8257]: Invalid user scpserver from 185.50.38.233 port 36368 Aug 12 23:53:56.400423 sshd[8257]: Connection closed by invalid user scpserver 185.50.38.233 port 36368 [preauth] Aug 12 23:53:56.403115 systemd[1]: sshd@69-138.199.237.168:22-185.50.38.233:36368.service: Deactivated successfully. Aug 12 23:53:57.014349 containerd[1506]: time="2025-08-12T23:53:57.014236980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"ea40061adef817573456ce97449312a8ac9ff837b6be5af3c8a6fcc160840fa7\" pid:8284 exited_at:{seconds:1755042837 nanos:13732774}" Aug 12 23:53:57.306817 systemd[1]: Started sshd@70-138.199.237.168:22-185.50.38.233:36272.service - OpenSSH per-connection server daemon (185.50.38.233:36272). Aug 12 23:53:57.638793 systemd[1]: Started sshd@71-138.199.237.168:22-139.178.68.195:34070.service - OpenSSH per-connection server daemon (139.178.68.195:34070). Aug 12 23:53:58.009931 containerd[1506]: time="2025-08-12T23:53:58.009790434Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"13f809ed1d90e664c66af9449710ec881ae025e4921ca88fac7f7dc1e40936b9\" pid:8310 exited_at:{seconds:1755042838 nanos:9113866}" Aug 12 23:53:58.661614 sshd[8296]: Accepted publickey for core from 139.178.68.195 port 34070 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:53:58.664628 sshd-session[8296]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:53:58.672878 systemd-logind[1481]: New session 68 of user core. Aug 12 23:53:58.682255 systemd[1]: Started session-68.scope - Session 68 of User core. 
Aug 12 23:53:59.439681 sshd[8321]: Connection closed by 139.178.68.195 port 34070 Aug 12 23:53:59.440819 sshd-session[8296]: pam_unix(sshd:session): session closed for user core Aug 12 23:53:59.446055 systemd-logind[1481]: Session 68 logged out. Waiting for processes to exit. Aug 12 23:53:59.447957 systemd[1]: sshd@71-138.199.237.168:22-139.178.68.195:34070.service: Deactivated successfully. Aug 12 23:53:59.453325 systemd[1]: session-68.scope: Deactivated successfully. Aug 12 23:53:59.457346 systemd-logind[1481]: Removed session 68. Aug 12 23:54:03.673886 sshd[8294]: Connection closed by authenticating user root 185.50.38.233 port 36272 [preauth] Aug 12 23:54:03.677573 systemd[1]: sshd@70-138.199.237.168:22-185.50.38.233:36272.service: Deactivated successfully. Aug 12 23:54:04.269636 systemd[1]: Started sshd@72-138.199.237.168:22-185.50.38.233:36274.service - OpenSSH per-connection server daemon (185.50.38.233:36274). Aug 12 23:54:04.623772 systemd[1]: Started sshd@73-138.199.237.168:22-139.178.68.195:60260.service - OpenSSH per-connection server daemon (139.178.68.195:60260). Aug 12 23:54:04.959891 containerd[1506]: time="2025-08-12T23:54:04.959774257Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"f8d770db0161b54de55ade1c9da5f999f729a1ff0b2fd84f624012b26c92a352\" pid:8355 exited_at:{seconds:1755042844 nanos:959059089}" Aug 12 23:54:05.643730 sshd[8341]: Accepted publickey for core from 139.178.68.195 port 60260 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:54:05.646319 sshd-session[8341]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:54:05.654542 systemd-logind[1481]: New session 69 of user core. Aug 12 23:54:05.661844 systemd[1]: Started session-69.scope - Session 69 of User core. 
Aug 12 23:54:06.430296 sshd[8366]: Connection closed by 139.178.68.195 port 60260 Aug 12 23:54:06.431308 sshd-session[8341]: pam_unix(sshd:session): session closed for user core Aug 12 23:54:06.437186 systemd[1]: sshd@73-138.199.237.168:22-139.178.68.195:60260.service: Deactivated successfully. Aug 12 23:54:06.442008 systemd[1]: session-69.scope: Deactivated successfully. Aug 12 23:54:06.445010 systemd-logind[1481]: Session 69 logged out. Waiting for processes to exit. Aug 12 23:54:06.447445 systemd-logind[1481]: Removed session 69. Aug 12 23:54:09.138299 containerd[1506]: time="2025-08-12T23:54:09.138166132Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"47d2f5ed4ae01687dc849148f5dbf42d096a0efb153d420bc21fb8ab866b5fc9\" pid:8388 exited_at:{seconds:1755042849 nanos:137774928}" Aug 12 23:54:10.624587 sshd[8339]: Connection closed by authenticating user root 185.50.38.233 port 36274 [preauth] Aug 12 23:54:10.627603 systemd[1]: sshd@72-138.199.237.168:22-185.50.38.233:36274.service: Deactivated successfully. Aug 12 23:54:11.264605 systemd[1]: Started sshd@74-138.199.237.168:22-185.50.38.233:45294.service - OpenSSH per-connection server daemon (185.50.38.233:45294). Aug 12 23:54:11.610958 systemd[1]: Started sshd@75-138.199.237.168:22-139.178.68.195:58752.service - OpenSSH per-connection server daemon (139.178.68.195:58752). Aug 12 23:54:12.625624 sshd[8404]: Accepted publickey for core from 139.178.68.195 port 58752 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:54:12.627298 sshd-session[8404]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:54:12.634218 systemd-logind[1481]: New session 70 of user core. Aug 12 23:54:12.641845 systemd[1]: Started session-70.scope - Session 70 of User core. 
Aug 12 23:54:13.402529 sshd[8406]: Connection closed by 139.178.68.195 port 58752 Aug 12 23:54:13.403474 sshd-session[8404]: pam_unix(sshd:session): session closed for user core Aug 12 23:54:13.412592 systemd[1]: sshd@75-138.199.237.168:22-139.178.68.195:58752.service: Deactivated successfully. Aug 12 23:54:13.416882 systemd[1]: session-70.scope: Deactivated successfully. Aug 12 23:54:13.418485 systemd-logind[1481]: Session 70 logged out. Waiting for processes to exit. Aug 12 23:54:13.423322 systemd-logind[1481]: Removed session 70. Aug 12 23:54:15.741828 sshd[8402]: Invalid user ali from 185.50.38.233 port 45294 Aug 12 23:54:16.196568 sshd[8402]: Connection closed by invalid user ali 185.50.38.233 port 45294 [preauth] Aug 12 23:54:16.199896 systemd[1]: sshd@74-138.199.237.168:22-185.50.38.233:45294.service: Deactivated successfully. Aug 12 23:54:16.743375 containerd[1506]: time="2025-08-12T23:54:16.743325612Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"fcd0807f7d04da5e0da18a7a0c937f4d420abca0288fefe382e15ee9230edff3\" pid:8432 exited_at:{seconds:1755042856 nanos:742813526}" Aug 12 23:54:16.894859 systemd[1]: Started sshd@76-138.199.237.168:22-185.50.38.233:45310.service - OpenSSH per-connection server daemon (185.50.38.233:45310). Aug 12 23:54:18.577896 systemd[1]: Started sshd@77-138.199.237.168:22-139.178.68.195:58758.service - OpenSSH per-connection server daemon (139.178.68.195:58758). Aug 12 23:54:19.596398 sshd[8444]: Accepted publickey for core from 139.178.68.195 port 58758 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:54:19.598985 sshd-session[8444]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:54:19.606334 systemd-logind[1481]: New session 71 of user core. Aug 12 23:54:19.613886 systemd[1]: Started session-71.scope - Session 71 of User core. 
Aug 12 23:54:20.365236 sshd[8447]: Connection closed by 139.178.68.195 port 58758 Aug 12 23:54:20.366269 sshd-session[8444]: pam_unix(sshd:session): session closed for user core Aug 12 23:54:20.372849 systemd-logind[1481]: Session 71 logged out. Waiting for processes to exit. Aug 12 23:54:20.373040 systemd[1]: sshd@77-138.199.237.168:22-139.178.68.195:58758.service: Deactivated successfully. Aug 12 23:54:20.376136 systemd[1]: session-71.scope: Deactivated successfully. Aug 12 23:54:20.380199 systemd-logind[1481]: Removed session 71. Aug 12 23:54:23.468613 sshd[8442]: Connection closed by authenticating user root 185.50.38.233 port 45310 [preauth] Aug 12 23:54:23.472443 systemd[1]: sshd@76-138.199.237.168:22-185.50.38.233:45310.service: Deactivated successfully. Aug 12 23:54:24.248201 systemd[1]: Started sshd@78-138.199.237.168:22-185.50.38.233:54604.service - OpenSSH per-connection server daemon (185.50.38.233:54604). Aug 12 23:54:25.562101 systemd[1]: Started sshd@79-138.199.237.168:22-139.178.68.195:50650.service - OpenSSH per-connection server daemon (139.178.68.195:50650). Aug 12 23:54:26.644176 sshd[8475]: Accepted publickey for core from 139.178.68.195 port 50650 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:54:26.646651 sshd-session[8475]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:54:26.655096 systemd-logind[1481]: New session 72 of user core. Aug 12 23:54:26.660874 systemd[1]: Started session-72.scope - Session 72 of User core. 
Aug 12 23:54:27.013329 containerd[1506]: time="2025-08-12T23:54:27.013202873Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"bcca88a485399f7ad5a07023d09e8ce2475c30f23131f09a89b4e08f8d4459c2\" pid:8492 exited_at:{seconds:1755042867 nanos:12417824}" Aug 12 23:54:27.458262 sshd[8477]: Connection closed by 139.178.68.195 port 50650 Aug 12 23:54:27.459135 sshd-session[8475]: pam_unix(sshd:session): session closed for user core Aug 12 23:54:27.465030 systemd[1]: sshd@79-138.199.237.168:22-139.178.68.195:50650.service: Deactivated successfully. Aug 12 23:54:27.469091 systemd[1]: session-72.scope: Deactivated successfully. Aug 12 23:54:27.471047 systemd-logind[1481]: Session 72 logged out. Waiting for processes to exit. Aug 12 23:54:27.473986 systemd-logind[1481]: Removed session 72. Aug 12 23:54:28.008603 containerd[1506]: time="2025-08-12T23:54:28.008359153Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"8c3d1f59cc1756b9c2a6330fc9db9c952aed11963942b8b1837bad4b3c2d2562\" pid:8523 exited_at:{seconds:1755042868 nanos:7811346}" Aug 12 23:54:30.875921 sshd[8463]: Connection closed by authenticating user root 185.50.38.233 port 54604 [preauth] Aug 12 23:54:30.880394 systemd[1]: sshd@78-138.199.237.168:22-185.50.38.233:54604.service: Deactivated successfully. Aug 12 23:54:31.468420 systemd[1]: Started sshd@80-138.199.237.168:22-185.50.38.233:35520.service - OpenSSH per-connection server daemon (185.50.38.233:35520). Aug 12 23:54:32.642836 systemd[1]: Started sshd@81-138.199.237.168:22-139.178.68.195:46872.service - OpenSSH per-connection server daemon (139.178.68.195:46872). 
Aug 12 23:54:33.719572 sshd[8554]: Accepted publickey for core from 139.178.68.195 port 46872 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:54:33.722251 sshd-session[8554]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:54:33.730633 systemd-logind[1481]: New session 73 of user core. Aug 12 23:54:33.735743 systemd[1]: Started session-73.scope - Session 73 of User core. Aug 12 23:54:34.556275 sshd[8557]: Connection closed by 139.178.68.195 port 46872 Aug 12 23:54:34.559622 sshd-session[8554]: pam_unix(sshd:session): session closed for user core Aug 12 23:54:34.564602 systemd[1]: sshd@81-138.199.237.168:22-139.178.68.195:46872.service: Deactivated successfully. Aug 12 23:54:34.564699 systemd-logind[1481]: Session 73 logged out. Waiting for processes to exit. Aug 12 23:54:34.567638 systemd[1]: session-73.scope: Deactivated successfully. Aug 12 23:54:34.571100 systemd-logind[1481]: Removed session 73. Aug 12 23:54:35.930859 sshd[8549]: Invalid user ftpuser from 185.50.38.233 port 35520 Aug 12 23:54:36.495903 sshd[8549]: Connection closed by invalid user ftpuser 185.50.38.233 port 35520 [preauth] Aug 12 23:54:36.500080 systemd[1]: sshd@80-138.199.237.168:22-185.50.38.233:35520.service: Deactivated successfully. Aug 12 23:54:37.235976 systemd[1]: Started sshd@82-138.199.237.168:22-185.50.38.233:43562.service - OpenSSH per-connection server daemon (185.50.38.233:43562). Aug 12 23:54:39.139232 containerd[1506]: time="2025-08-12T23:54:39.139131163Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"98257c8a14fc2619720fc13a8495f1247437edb4ec9924ab308c5df5bd50b6e9\" pid:8586 exited_at:{seconds:1755042879 nanos:137357942}" Aug 12 23:54:39.730067 systemd[1]: Started sshd@83-138.199.237.168:22-139.178.68.195:46876.service - OpenSSH per-connection server daemon (139.178.68.195:46876). 
Aug 12 23:54:40.749175 sshd[8598]: Accepted publickey for core from 139.178.68.195 port 46876 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:54:40.752262 sshd-session[8598]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:54:40.762695 systemd-logind[1481]: New session 74 of user core. Aug 12 23:54:40.764829 systemd[1]: Started session-74.scope - Session 74 of User core. Aug 12 23:54:41.524517 sshd[8600]: Connection closed by 139.178.68.195 port 46876 Aug 12 23:54:41.524389 sshd-session[8598]: pam_unix(sshd:session): session closed for user core Aug 12 23:54:41.532308 systemd[1]: sshd@83-138.199.237.168:22-139.178.68.195:46876.service: Deactivated successfully. Aug 12 23:54:41.536758 systemd[1]: session-74.scope: Deactivated successfully. Aug 12 23:54:41.538471 systemd-logind[1481]: Session 74 logged out. Waiting for processes to exit. Aug 12 23:54:41.541345 systemd-logind[1481]: Removed session 74. Aug 12 23:54:43.720916 sshd[8571]: Connection closed by authenticating user root 185.50.38.233 port 43562 [preauth] Aug 12 23:54:43.724437 systemd[1]: sshd@82-138.199.237.168:22-185.50.38.233:43562.service: Deactivated successfully. Aug 12 23:54:44.253928 systemd[1]: Started sshd@84-138.199.237.168:22-185.50.38.233:43576.service - OpenSSH per-connection server daemon (185.50.38.233:43576). Aug 12 23:54:46.706266 systemd[1]: Started sshd@85-138.199.237.168:22-139.178.68.195:51202.service - OpenSSH per-connection server daemon (139.178.68.195:51202). Aug 12 23:54:47.717636 sshd[8617]: Accepted publickey for core from 139.178.68.195 port 51202 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:54:47.719653 sshd-session[8617]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:54:47.726667 systemd-logind[1481]: New session 75 of user core. Aug 12 23:54:47.733010 systemd[1]: Started session-75.scope - Session 75 of User core. 
Aug 12 23:54:48.494744 sshd[8619]: Connection closed by 139.178.68.195 port 51202 Aug 12 23:54:48.495754 sshd-session[8617]: pam_unix(sshd:session): session closed for user core Aug 12 23:54:48.505464 systemd[1]: sshd@85-138.199.237.168:22-139.178.68.195:51202.service: Deactivated successfully. Aug 12 23:54:48.509971 systemd[1]: session-75.scope: Deactivated successfully. Aug 12 23:54:48.512885 systemd-logind[1481]: Session 75 logged out. Waiting for processes to exit. Aug 12 23:54:48.515934 systemd-logind[1481]: Removed session 75. Aug 12 23:54:50.220991 sshd[8614]: Connection closed by authenticating user root 185.50.38.233 port 43576 [preauth] Aug 12 23:54:50.225877 systemd[1]: sshd@84-138.199.237.168:22-185.50.38.233:43576.service: Deactivated successfully. Aug 12 23:54:50.936001 systemd[1]: Started sshd@86-138.199.237.168:22-185.50.38.233:56574.service - OpenSSH per-connection server daemon (185.50.38.233:56574). Aug 12 23:54:53.670182 systemd[1]: Started sshd@87-138.199.237.168:22-139.178.68.195:45678.service - OpenSSH per-connection server daemon (139.178.68.195:45678). Aug 12 23:54:54.685579 sshd[8636]: Accepted publickey for core from 139.178.68.195 port 45678 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:54:54.687426 sshd-session[8636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:54:54.694108 systemd-logind[1481]: New session 76 of user core. Aug 12 23:54:54.701450 systemd[1]: Started session-76.scope - Session 76 of User core. Aug 12 23:54:55.457751 sshd[8638]: Connection closed by 139.178.68.195 port 45678 Aug 12 23:54:55.459416 sshd-session[8636]: pam_unix(sshd:session): session closed for user core Aug 12 23:54:55.464942 systemd-logind[1481]: Session 76 logged out. Waiting for processes to exit. Aug 12 23:54:55.465204 systemd[1]: sshd@87-138.199.237.168:22-139.178.68.195:45678.service: Deactivated successfully. 
Aug 12 23:54:55.468794 systemd[1]: session-76.scope: Deactivated successfully. Aug 12 23:54:55.473379 systemd-logind[1481]: Removed session 76. Aug 12 23:54:56.085630 sshd[8633]: Connection closed by authenticating user root 185.50.38.233 port 56574 [preauth] Aug 12 23:54:56.089483 systemd[1]: sshd@86-138.199.237.168:22-185.50.38.233:56574.service: Deactivated successfully. Aug 12 23:54:56.995811 systemd[1]: Started sshd@88-138.199.237.168:22-185.50.38.233:56588.service - OpenSSH per-connection server daemon (185.50.38.233:56588). Aug 12 23:54:57.018923 containerd[1506]: time="2025-08-12T23:54:57.018878985Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"6f0e59feb884445301d1f40a543ee71424150496e79a5a31c5aff6c6b9bc226f\" pid:8665 exited_at:{seconds:1755042897 nanos:17848253}" Aug 12 23:54:58.034774 containerd[1506]: time="2025-08-12T23:54:58.034688535Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"ae578dea2e298ef4ef1054174693a75a5f19ea3c31a155b38cec22a7af9afa9b\" pid:8687 exited_at:{seconds:1755042898 nanos:32874274}" Aug 12 23:55:00.639243 systemd[1]: Started sshd@89-138.199.237.168:22-139.178.68.195:57734.service - OpenSSH per-connection server daemon (139.178.68.195:57734). Aug 12 23:55:01.647380 sshd[8699]: Accepted publickey for core from 139.178.68.195 port 57734 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:01.649712 sshd-session[8699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:01.657237 systemd-logind[1481]: New session 77 of user core. Aug 12 23:55:01.665888 systemd[1]: Started session-77.scope - Session 77 of User core. 
Aug 12 23:55:02.425634 sshd[8701]: Connection closed by 139.178.68.195 port 57734 Aug 12 23:55:02.426539 sshd-session[8699]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:02.434004 systemd-logind[1481]: Session 77 logged out. Waiting for processes to exit. Aug 12 23:55:02.434969 systemd[1]: sshd@89-138.199.237.168:22-139.178.68.195:57734.service: Deactivated successfully. Aug 12 23:55:02.438266 systemd[1]: session-77.scope: Deactivated successfully. Aug 12 23:55:02.441646 systemd-logind[1481]: Removed session 77. Aug 12 23:55:05.002491 containerd[1506]: time="2025-08-12T23:55:05.002434059Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"471d3567a54e6bd644c6f7fc3653d4a3569a8af5fac109def16e8c32457834e1\" pid:8725 exited_at:{seconds:1755042905 nanos:2096655}" Aug 12 23:55:05.099569 sshd[8663]: Connection closed by authenticating user root 185.50.38.233 port 56588 [preauth] Aug 12 23:55:05.104125 systemd[1]: sshd@88-138.199.237.168:22-185.50.38.233:56588.service: Deactivated successfully. Aug 12 23:55:05.754933 systemd[1]: Started sshd@90-138.199.237.168:22-185.50.38.233:48642.service - OpenSSH per-connection server daemon (185.50.38.233:48642). Aug 12 23:55:07.602978 systemd[1]: Started sshd@91-138.199.237.168:22-139.178.68.195:57740.service - OpenSSH per-connection server daemon (139.178.68.195:57740). Aug 12 23:55:08.629264 sshd[8741]: Accepted publickey for core from 139.178.68.195 port 57740 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:08.632620 sshd-session[8741]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:08.640099 systemd-logind[1481]: New session 78 of user core. Aug 12 23:55:08.654871 systemd[1]: Started session-78.scope - Session 78 of User core. 
Aug 12 23:55:09.139003 containerd[1506]: time="2025-08-12T23:55:09.138960805Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"c6ea89788c873e21a6c2bdbfe0b29628a10cd6fde7b25049185402e5a915c001\" pid:8756 exited_at:{seconds:1755042909 nanos:138395759}" Aug 12 23:55:09.408486 sshd[8743]: Connection closed by 139.178.68.195 port 57740 Aug 12 23:55:09.409848 sshd-session[8741]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:09.416434 systemd[1]: sshd@91-138.199.237.168:22-139.178.68.195:57740.service: Deactivated successfully. Aug 12 23:55:09.419718 systemd[1]: session-78.scope: Deactivated successfully. Aug 12 23:55:09.422131 systemd-logind[1481]: Session 78 logged out. Waiting for processes to exit. Aug 12 23:55:09.425328 systemd-logind[1481]: Removed session 78. Aug 12 23:55:12.836885 sshd[8738]: Connection closed by authenticating user root 185.50.38.233 port 48642 [preauth] Aug 12 23:55:12.842075 systemd[1]: sshd@90-138.199.237.168:22-185.50.38.233:48642.service: Deactivated successfully. Aug 12 23:55:13.640198 systemd[1]: Started sshd@92-138.199.237.168:22-185.50.38.233:53748.service - OpenSSH per-connection server daemon (185.50.38.233:53748). Aug 12 23:55:14.594105 systemd[1]: Started sshd@93-138.199.237.168:22-139.178.68.195:44198.service - OpenSSH per-connection server daemon (139.178.68.195:44198). Aug 12 23:55:15.621302 sshd[8782]: Accepted publickey for core from 139.178.68.195 port 44198 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:15.622739 sshd-session[8782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:15.629099 systemd-logind[1481]: New session 79 of user core. Aug 12 23:55:15.639893 systemd[1]: Started session-79.scope - Session 79 of User core. 
Aug 12 23:55:16.390936 sshd[8785]: Connection closed by 139.178.68.195 port 44198 Aug 12 23:55:16.391914 sshd-session[8782]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:16.398753 systemd[1]: sshd@93-138.199.237.168:22-139.178.68.195:44198.service: Deactivated successfully. Aug 12 23:55:16.402111 systemd[1]: session-79.scope: Deactivated successfully. Aug 12 23:55:16.404023 systemd-logind[1481]: Session 79 logged out. Waiting for processes to exit. Aug 12 23:55:16.406821 systemd-logind[1481]: Removed session 79. Aug 12 23:55:16.743786 containerd[1506]: time="2025-08-12T23:55:16.743446685Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"774dfaabb528c56cd48316cb0e23b5cfc5a22991a50ece7bd5ec220920c5dc86\" pid:8808 exited_at:{seconds:1755042916 nanos:743034400}" Aug 12 23:55:20.731386 sshd[8780]: Connection closed by authenticating user root 185.50.38.233 port 53748 [preauth] Aug 12 23:55:20.734705 systemd[1]: sshd@92-138.199.237.168:22-185.50.38.233:53748.service: Deactivated successfully. Aug 12 23:55:21.417152 systemd[1]: Started sshd@94-138.199.237.168:22-185.50.38.233:48724.service - OpenSSH per-connection server daemon (185.50.38.233:48724). Aug 12 23:55:21.566250 systemd[1]: Started sshd@95-138.199.237.168:22-139.178.68.195:41840.service - OpenSSH per-connection server daemon (139.178.68.195:41840). Aug 12 23:55:22.584447 sshd[8822]: Accepted publickey for core from 139.178.68.195 port 41840 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:22.586286 sshd-session[8822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:22.593653 systemd-logind[1481]: New session 80 of user core. Aug 12 23:55:22.598050 systemd[1]: Started session-80.scope - Session 80 of User core. 
Aug 12 23:55:23.367754 sshd[8824]: Connection closed by 139.178.68.195 port 41840 Aug 12 23:55:23.368265 sshd-session[8822]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:23.374198 systemd-logind[1481]: Session 80 logged out. Waiting for processes to exit. Aug 12 23:55:23.375208 systemd[1]: sshd@95-138.199.237.168:22-139.178.68.195:41840.service: Deactivated successfully. Aug 12 23:55:23.379037 systemd[1]: session-80.scope: Deactivated successfully. Aug 12 23:55:23.382352 systemd-logind[1481]: Removed session 80. Aug 12 23:55:23.562357 systemd[1]: Started sshd@96-138.199.237.168:22-139.178.68.195:41852.service - OpenSSH per-connection server daemon (139.178.68.195:41852). Aug 12 23:55:24.639952 sshd[8837]: Accepted publickey for core from 139.178.68.195 port 41852 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:24.643848 sshd-session[8837]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:24.652031 systemd-logind[1481]: New session 81 of user core. Aug 12 23:55:24.656824 systemd[1]: Started session-81.scope - Session 81 of User core. Aug 12 23:55:25.651592 sshd[8839]: Connection closed by 139.178.68.195 port 41852 Aug 12 23:55:25.650963 sshd-session[8837]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:25.659359 systemd[1]: sshd@96-138.199.237.168:22-139.178.68.195:41852.service: Deactivated successfully. Aug 12 23:55:25.664526 systemd[1]: session-81.scope: Deactivated successfully. Aug 12 23:55:25.668100 systemd-logind[1481]: Session 81 logged out. Waiting for processes to exit. Aug 12 23:55:25.670653 systemd-logind[1481]: Removed session 81. Aug 12 23:55:25.818632 systemd[1]: Started sshd@97-138.199.237.168:22-139.178.68.195:41856.service - OpenSSH per-connection server daemon (139.178.68.195:41856). 
Aug 12 23:55:26.528573 sshd[8820]: Invalid user test1 from 185.50.38.233 port 48724 Aug 12 23:55:26.831713 sshd[8851]: Accepted publickey for core from 139.178.68.195 port 41856 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:26.834224 sshd-session[8851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:26.845994 systemd-logind[1481]: New session 82 of user core. Aug 12 23:55:26.849737 systemd[1]: Started session-82.scope - Session 82 of User core. Aug 12 23:55:27.023964 containerd[1506]: time="2025-08-12T23:55:27.023891696Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"b42063cb97ff172b1619604c20d35d57623a70499ff4742e06a05cc8876f5793\" pid:8871 exited_at:{seconds:1755042927 nanos:23058886}" Aug 12 23:55:27.084886 sshd[8820]: Connection closed by invalid user test1 185.50.38.233 port 48724 [preauth] Aug 12 23:55:27.088342 systemd[1]: sshd@94-138.199.237.168:22-185.50.38.233:48724.service: Deactivated successfully. Aug 12 23:55:28.046088 systemd[1]: Started sshd@98-138.199.237.168:22-185.50.38.233:43464.service - OpenSSH per-connection server daemon (185.50.38.233:43464). Aug 12 23:55:28.307343 containerd[1506]: time="2025-08-12T23:55:28.307233191Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"385692fa068ca4bcc1b4cfa9d05d4c43c2a52ffee5b65f4eee4e5c91e85e6abf\" pid:8904 exited_at:{seconds:1755042928 nanos:306957268}" Aug 12 23:55:29.558775 sshd[8857]: Connection closed by 139.178.68.195 port 41856 Aug 12 23:55:29.561437 sshd-session[8851]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:29.568195 systemd[1]: sshd@97-138.199.237.168:22-139.178.68.195:41856.service: Deactivated successfully. Aug 12 23:55:29.573531 systemd[1]: session-82.scope: Deactivated successfully. 
Aug 12 23:55:29.574066 systemd[1]: session-82.scope: Consumed 640ms CPU time, 72.2M memory peak. Aug 12 23:55:29.575437 systemd-logind[1481]: Session 82 logged out. Waiting for processes to exit. Aug 12 23:55:29.577539 systemd-logind[1481]: Removed session 82. Aug 12 23:55:29.753152 systemd[1]: Started sshd@99-138.199.237.168:22-139.178.68.195:41860.service - OpenSSH per-connection server daemon (139.178.68.195:41860). Aug 12 23:55:30.828858 sshd[8924]: Accepted publickey for core from 139.178.68.195 port 41860 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:30.831762 sshd-session[8924]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:30.838101 systemd-logind[1481]: New session 83 of user core. Aug 12 23:55:30.847885 systemd[1]: Started session-83.scope - Session 83 of User core. Aug 12 23:55:31.775519 sshd[8928]: Connection closed by 139.178.68.195 port 41860 Aug 12 23:55:31.776470 sshd-session[8924]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:31.781937 systemd[1]: sshd@99-138.199.237.168:22-139.178.68.195:41860.service: Deactivated successfully. Aug 12 23:55:31.784450 systemd[1]: session-83.scope: Deactivated successfully. Aug 12 23:55:31.785790 systemd-logind[1481]: Session 83 logged out. Waiting for processes to exit. Aug 12 23:55:31.788013 systemd-logind[1481]: Removed session 83. Aug 12 23:55:31.939893 systemd[1]: Started sshd@100-138.199.237.168:22-139.178.68.195:33392.service - OpenSSH per-connection server daemon (139.178.68.195:33392). Aug 12 23:55:32.960085 sshd[8938]: Accepted publickey for core from 139.178.68.195 port 33392 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:32.962185 sshd-session[8938]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:32.970603 systemd-logind[1481]: New session 84 of user core. Aug 12 23:55:32.974732 systemd[1]: Started session-84.scope - Session 84 of User core. 
Aug 12 23:55:33.733772 sshd[8942]: Connection closed by 139.178.68.195 port 33392 Aug 12 23:55:33.734601 sshd-session[8938]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:33.741089 systemd-logind[1481]: Session 84 logged out. Waiting for processes to exit. Aug 12 23:55:33.741906 systemd[1]: sshd@100-138.199.237.168:22-139.178.68.195:33392.service: Deactivated successfully. Aug 12 23:55:33.748358 systemd[1]: session-84.scope: Deactivated successfully. Aug 12 23:55:33.751725 systemd-logind[1481]: Removed session 84. Aug 12 23:55:35.057262 sshd[8915]: Connection closed by authenticating user root 185.50.38.233 port 43464 [preauth] Aug 12 23:55:35.060655 systemd[1]: sshd@98-138.199.237.168:22-185.50.38.233:43464.service: Deactivated successfully. Aug 12 23:55:35.663829 systemd[1]: Started sshd@101-138.199.237.168:22-185.50.38.233:43478.service - OpenSSH per-connection server daemon (185.50.38.233:43478). Aug 12 23:55:38.909720 systemd[1]: Started sshd@102-138.199.237.168:22-139.178.68.195:33396.service - OpenSSH per-connection server daemon (139.178.68.195:33396). Aug 12 23:55:39.147317 containerd[1506]: time="2025-08-12T23:55:39.147261737Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"f54ac682ca2307b375d2edec12463cc7094d9d5ae9cc1db220bc56c1d78d4ce0\" pid:8974 exited_at:{seconds:1755042939 nanos:146847652}" Aug 12 23:55:39.925286 sshd[8959]: Accepted publickey for core from 139.178.68.195 port 33396 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:39.927742 sshd-session[8959]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:39.934898 systemd-logind[1481]: New session 85 of user core. Aug 12 23:55:39.942104 systemd[1]: Started session-85.scope - Session 85 of User core. 
Aug 12 23:55:40.701752 sshd[8985]: Connection closed by 139.178.68.195 port 33396 Aug 12 23:55:40.702633 sshd-session[8959]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:40.708337 systemd[1]: sshd@102-138.199.237.168:22-139.178.68.195:33396.service: Deactivated successfully. Aug 12 23:55:40.712021 systemd[1]: session-85.scope: Deactivated successfully. Aug 12 23:55:40.714746 systemd-logind[1481]: Session 85 logged out. Waiting for processes to exit. Aug 12 23:55:40.718334 systemd-logind[1481]: Removed session 85. Aug 12 23:55:41.960072 sshd[8956]: Connection closed by authenticating user root 185.50.38.233 port 43478 [preauth] Aug 12 23:55:41.964150 systemd[1]: sshd@101-138.199.237.168:22-185.50.38.233:43478.service: Deactivated successfully. Aug 12 23:55:42.675066 systemd[1]: Started sshd@103-138.199.237.168:22-185.50.38.233:47980.service - OpenSSH per-connection server daemon (185.50.38.233:47980). Aug 12 23:55:45.880123 systemd[1]: Started sshd@104-138.199.237.168:22-139.178.68.195:59184.service - OpenSSH per-connection server daemon (139.178.68.195:59184). Aug 12 23:55:46.897890 sshd[9008]: Accepted publickey for core from 139.178.68.195 port 59184 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:46.900505 sshd-session[9008]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:46.907954 systemd-logind[1481]: New session 86 of user core. Aug 12 23:55:46.915879 systemd[1]: Started session-86.scope - Session 86 of User core. Aug 12 23:55:47.671581 sshd[9010]: Connection closed by 139.178.68.195 port 59184 Aug 12 23:55:47.672397 sshd-session[9008]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:47.678481 systemd[1]: sshd@104-138.199.237.168:22-139.178.68.195:59184.service: Deactivated successfully. Aug 12 23:55:47.682471 systemd[1]: session-86.scope: Deactivated successfully. Aug 12 23:55:47.686657 systemd-logind[1481]: Session 86 logged out. Waiting for processes to exit.
Aug 12 23:55:47.688242 systemd-logind[1481]: Removed session 86. Aug 12 23:55:49.439809 sshd[8999]: Invalid user cservs from 185.50.38.233 port 47980 Aug 12 23:55:50.490858 sshd[8999]: Connection closed by invalid user cservs 185.50.38.233 port 47980 [preauth] Aug 12 23:55:50.494378 systemd[1]: sshd@103-138.199.237.168:22-185.50.38.233:47980.service: Deactivated successfully. Aug 12 23:55:51.311074 systemd[1]: Started sshd@105-138.199.237.168:22-185.50.38.233:52770.service - OpenSSH per-connection server daemon (185.50.38.233:52770). Aug 12 23:55:52.871515 systemd[1]: Started sshd@106-138.199.237.168:22-139.178.68.195:47670.service - OpenSSH per-connection server daemon (139.178.68.195:47670). Aug 12 23:55:53.942957 sshd[9026]: Accepted publickey for core from 139.178.68.195 port 47670 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:55:53.946009 sshd-session[9026]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:55:53.951643 systemd-logind[1481]: New session 87 of user core. Aug 12 23:55:53.956720 systemd[1]: Started session-87.scope - Session 87 of User core. Aug 12 23:55:54.755717 sshd[9029]: Connection closed by 139.178.68.195 port 47670 Aug 12 23:55:54.756822 sshd-session[9026]: pam_unix(sshd:session): session closed for user core Aug 12 23:55:54.763411 systemd[1]: sshd@106-138.199.237.168:22-139.178.68.195:47670.service: Deactivated successfully. Aug 12 23:55:54.768009 systemd[1]: session-87.scope: Deactivated successfully. Aug 12 23:55:54.770698 systemd-logind[1481]: Session 87 logged out. Waiting for processes to exit. Aug 12 23:55:54.772997 systemd-logind[1481]: Removed session 87.
Aug 12 23:55:57.058421 containerd[1506]: time="2025-08-12T23:55:57.058372072Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"f6973be41d409a76f7d8947f42e748e27a59921e0456fe365c19109c6d1204e3\" pid:9052 exited_at:{seconds:1755042957 nanos:57967627}" Aug 12 23:55:58.026376 containerd[1506]: time="2025-08-12T23:55:58.026321163Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"48f3ae49cf7528de7a6bb3e5c5b054d65898bf34a84cf8b04ff1c6cd3b2142ec\" pid:9082 exited_at:{seconds:1755042958 nanos:24885666}" Aug 12 23:55:58.434931 sshd[9024]: Connection closed by authenticating user root 185.50.38.233 port 52770 [preauth] Aug 12 23:55:58.439665 systemd[1]: sshd@105-138.199.237.168:22-185.50.38.233:52770.service: Deactivated successfully. Aug 12 23:55:59.353758 systemd[1]: Started sshd@107-138.199.237.168:22-185.50.38.233:47948.service - OpenSSH per-connection server daemon (185.50.38.233:47948). Aug 12 23:55:59.925441 systemd[1]: Started sshd@108-138.199.237.168:22-139.178.68.195:47682.service - OpenSSH per-connection server daemon (139.178.68.195:47682). Aug 12 23:56:00.943876 sshd[9097]: Accepted publickey for core from 139.178.68.195 port 47682 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:56:00.946261 sshd-session[9097]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:56:00.952213 systemd-logind[1481]: New session 88 of user core. Aug 12 23:56:00.958773 systemd[1]: Started session-88.scope - Session 88 of User core. Aug 12 23:56:01.720650 sshd[9099]: Connection closed by 139.178.68.195 port 47682 Aug 12 23:56:01.721713 sshd-session[9097]: pam_unix(sshd:session): session closed for user core Aug 12 23:56:01.728220 systemd[1]: sshd@108-138.199.237.168:22-139.178.68.195:47682.service: Deactivated successfully. 
Aug 12 23:56:01.731225 systemd[1]: session-88.scope: Deactivated successfully. Aug 12 23:56:01.732682 systemd-logind[1481]: Session 88 logged out. Waiting for processes to exit. Aug 12 23:56:01.734816 systemd-logind[1481]: Removed session 88. Aug 12 23:56:04.953729 containerd[1506]: time="2025-08-12T23:56:04.953588832Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"58ddbefd945b7da77a5aa5fa9b14c73fd62ee758ffd06663df465c14b4e9e4b4\" pid:9138 exited_at:{seconds:1755042964 nanos:953026185}" Aug 12 23:56:06.547459 sshd[9095]: Connection closed by authenticating user root 185.50.38.233 port 47948 [preauth] Aug 12 23:56:06.555904 systemd[1]: sshd@107-138.199.237.168:22-185.50.38.233:47948.service: Deactivated successfully. Aug 12 23:56:06.893860 systemd[1]: Started sshd@109-138.199.237.168:22-139.178.68.195:42018.service - OpenSSH per-connection server daemon (139.178.68.195:42018). Aug 12 23:56:07.869927 systemd[1]: Started sshd@110-138.199.237.168:22-185.50.38.233:47254.service - OpenSSH per-connection server daemon (185.50.38.233:47254). Aug 12 23:56:07.906082 sshd[9151]: Accepted publickey for core from 139.178.68.195 port 42018 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:56:07.908193 sshd-session[9151]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:56:07.916034 systemd-logind[1481]: New session 89 of user core. Aug 12 23:56:07.923806 systemd[1]: Started session-89.scope - Session 89 of User core. Aug 12 23:56:08.689033 sshd[9155]: Connection closed by 139.178.68.195 port 42018 Aug 12 23:56:08.690415 sshd-session[9151]: pam_unix(sshd:session): session closed for user core Aug 12 23:56:08.699734 systemd-logind[1481]: Session 89 logged out. Waiting for processes to exit. Aug 12 23:56:08.702983 systemd[1]: sshd@109-138.199.237.168:22-139.178.68.195:42018.service: Deactivated successfully. 
Aug 12 23:56:08.707960 systemd[1]: session-89.scope: Deactivated successfully. Aug 12 23:56:08.712485 systemd-logind[1481]: Removed session 89. Aug 12 23:56:09.143958 containerd[1506]: time="2025-08-12T23:56:09.143900634Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"29f96df83cf55b572c577b9739cd6425fd81c7be07b6e79b6dd8c3ec01d4c205\" pid:9179 exited_at:{seconds:1755042969 nanos:143541550}" Aug 12 23:56:13.884841 systemd[1]: Started sshd@111-138.199.237.168:22-139.178.68.195:40844.service - OpenSSH per-connection server daemon (139.178.68.195:40844). Aug 12 23:56:13.954571 sshd[9154]: Invalid user guest from 185.50.38.233 port 47254 Aug 12 23:56:14.557660 sshd[9154]: Connection closed by invalid user guest 185.50.38.233 port 47254 [preauth] Aug 12 23:56:14.562875 systemd[1]: sshd@110-138.199.237.168:22-185.50.38.233:47254.service: Deactivated successfully. Aug 12 23:56:14.951769 sshd[9193]: Accepted publickey for core from 139.178.68.195 port 40844 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:56:14.955031 sshd-session[9193]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:56:14.962427 systemd-logind[1481]: New session 90 of user core. Aug 12 23:56:14.966804 systemd[1]: Started session-90.scope - Session 90 of User core. Aug 12 23:56:15.336579 systemd[1]: Started sshd@112-138.199.237.168:22-185.50.38.233:47306.service - OpenSSH per-connection server daemon (185.50.38.233:47306). Aug 12 23:56:15.764611 sshd[9197]: Connection closed by 139.178.68.195 port 40844 Aug 12 23:56:15.766085 sshd-session[9193]: pam_unix(sshd:session): session closed for user core Aug 12 23:56:15.771305 systemd[1]: sshd@111-138.199.237.168:22-139.178.68.195:40844.service: Deactivated successfully. Aug 12 23:56:15.771365 systemd-logind[1481]: Session 90 logged out. Waiting for processes to exit. 
Aug 12 23:56:15.773948 systemd[1]: session-90.scope: Deactivated successfully. Aug 12 23:56:15.776909 systemd-logind[1481]: Removed session 90. Aug 12 23:56:16.741077 containerd[1506]: time="2025-08-12T23:56:16.741029055Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"1182b3e274b7fb6931b2750b0f86a97d704aab3e9658bbb823d225231eb31df2\" pid:9223 exited_at:{seconds:1755042976 nanos:740502289}" Aug 12 23:56:20.933518 systemd[1]: Started sshd@113-138.199.237.168:22-139.178.68.195:44378.service - OpenSSH per-connection server daemon (139.178.68.195:44378). Aug 12 23:56:21.947403 sshd[9234]: Accepted publickey for core from 139.178.68.195 port 44378 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:56:21.949530 sshd-session[9234]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:56:21.957608 systemd-logind[1481]: New session 91 of user core. Aug 12 23:56:21.962821 systemd[1]: Started session-91.scope - Session 91 of User core. Aug 12 23:56:22.712603 sshd[9237]: Connection closed by 139.178.68.195 port 44378 Aug 12 23:56:22.712578 sshd-session[9234]: pam_unix(sshd:session): session closed for user core Aug 12 23:56:22.721044 systemd-logind[1481]: Session 91 logged out. Waiting for processes to exit. Aug 12 23:56:22.721764 systemd[1]: sshd@113-138.199.237.168:22-139.178.68.195:44378.service: Deactivated successfully. Aug 12 23:56:22.724658 systemd[1]: session-91.scope: Deactivated successfully. Aug 12 23:56:22.730079 systemd-logind[1481]: Removed session 91. Aug 12 23:56:23.376970 sshd[9199]: Connection closed by authenticating user root 185.50.38.233 port 47306 [preauth] Aug 12 23:56:23.382389 systemd[1]: sshd@112-138.199.237.168:22-185.50.38.233:47306.service: Deactivated successfully. 
Aug 12 23:56:24.337447 systemd[1]: Started sshd@114-138.199.237.168:22-185.50.38.233:43612.service - OpenSSH per-connection server daemon (185.50.38.233:43612). Aug 12 23:56:27.008332 containerd[1506]: time="2025-08-12T23:56:27.008287118Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"c084b1a3b3aa2b1d98bd8c5a642137674061a5aab0a7244a5f058dc6b1ccab41\" pid:9267 exited_at:{seconds:1755042987 nanos:7847433}" Aug 12 23:56:27.889683 systemd[1]: Started sshd@115-138.199.237.168:22-139.178.68.195:44392.service - OpenSSH per-connection server daemon (139.178.68.195:44392). Aug 12 23:56:28.000335 containerd[1506]: time="2025-08-12T23:56:28.000252189Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"cea2f04ec9d0bda37d068e5f2e2d74e732da07259a6b300c4cc90fef450bf930\" pid:9292 exited_at:{seconds:1755042987 nanos:999685982}" Aug 12 23:56:28.908474 sshd[9277]: Accepted publickey for core from 139.178.68.195 port 44392 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o Aug 12 23:56:28.910885 sshd-session[9277]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 12 23:56:28.920333 systemd-logind[1481]: New session 92 of user core. Aug 12 23:56:28.927642 systemd[1]: Started session-92.scope - Session 92 of User core. Aug 12 23:56:29.699584 sshd[9302]: Connection closed by 139.178.68.195 port 44392 Aug 12 23:56:29.700276 sshd-session[9277]: pam_unix(sshd:session): session closed for user core Aug 12 23:56:29.706182 systemd[1]: session-92.scope: Deactivated successfully. Aug 12 23:56:29.707865 systemd[1]: sshd@115-138.199.237.168:22-139.178.68.195:44392.service: Deactivated successfully. Aug 12 23:56:29.713721 systemd-logind[1481]: Session 92 logged out. Waiting for processes to exit. Aug 12 23:56:29.717371 systemd-logind[1481]: Removed session 92. 
Aug 12 23:56:32.432729 sshd[9251]: Connection closed by authenticating user root 185.50.38.233 port 43612 [preauth]
Aug 12 23:56:32.436950 systemd[1]: sshd@114-138.199.237.168:22-185.50.38.233:43612.service: Deactivated successfully.
Aug 12 23:56:33.148822 systemd[1]: Started sshd@116-138.199.237.168:22-185.50.38.233:35938.service - OpenSSH per-connection server daemon (185.50.38.233:35938).
Aug 12 23:56:34.873868 systemd[1]: Started sshd@117-138.199.237.168:22-139.178.68.195:53450.service - OpenSSH per-connection server daemon (139.178.68.195:53450).
Aug 12 23:56:35.899566 sshd[9320]: Accepted publickey for core from 139.178.68.195 port 53450 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:56:35.902753 sshd-session[9320]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:56:35.908404 systemd-logind[1481]: New session 93 of user core.
Aug 12 23:56:35.913768 systemd[1]: Started session-93.scope - Session 93 of User core.
Aug 12 23:56:36.667542 sshd[9323]: Connection closed by 139.178.68.195 port 53450
Aug 12 23:56:36.668306 sshd-session[9320]: pam_unix(sshd:session): session closed for user core
Aug 12 23:56:36.672826 systemd-logind[1481]: Session 93 logged out. Waiting for processes to exit.
Aug 12 23:56:36.672980 systemd[1]: sshd@117-138.199.237.168:22-139.178.68.195:53450.service: Deactivated successfully.
Aug 12 23:56:36.675934 systemd[1]: session-93.scope: Deactivated successfully.
Aug 12 23:56:36.680464 systemd-logind[1481]: Removed session 93.
Aug 12 23:56:39.201989 containerd[1506]: time="2025-08-12T23:56:39.201816955Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"a2068b69993150d176061b254642a6560104d0384c634c9b754aca4733ce1a9c\" pid:9346 exited_at:{seconds:1755042999 nanos:200701021}"
Aug 12 23:56:40.500438 sshd[9318]: Connection closed by authenticating user root 185.50.38.233 port 35938 [preauth]
Aug 12 23:56:40.505756 systemd[1]: sshd@116-138.199.237.168:22-185.50.38.233:35938.service: Deactivated successfully.
Aug 12 23:56:41.186846 systemd[1]: Started sshd@118-138.199.237.168:22-185.50.38.233:57572.service - OpenSSH per-connection server daemon (185.50.38.233:57572).
Aug 12 23:56:41.844850 systemd[1]: Started sshd@119-138.199.237.168:22-139.178.68.195:52464.service - OpenSSH per-connection server daemon (139.178.68.195:52464).
Aug 12 23:56:42.862599 sshd[9365]: Accepted publickey for core from 139.178.68.195 port 52464 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:56:42.866987 sshd-session[9365]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:56:42.873344 systemd-logind[1481]: New session 94 of user core.
Aug 12 23:56:42.883831 systemd[1]: Started session-94.scope - Session 94 of User core.
Aug 12 23:56:43.640260 sshd[9367]: Connection closed by 139.178.68.195 port 52464
Aug 12 23:56:43.640974 sshd-session[9365]: pam_unix(sshd:session): session closed for user core
Aug 12 23:56:43.647300 systemd[1]: sshd@119-138.199.237.168:22-139.178.68.195:52464.service: Deactivated successfully.
Aug 12 23:56:43.651252 systemd[1]: session-94.scope: Deactivated successfully.
Aug 12 23:56:43.652959 systemd-logind[1481]: Session 94 logged out. Waiting for processes to exit.
Aug 12 23:56:43.655163 systemd-logind[1481]: Removed session 94.
Aug 12 23:56:48.815058 systemd[1]: Started sshd@120-138.199.237.168:22-139.178.68.195:52468.service - OpenSSH per-connection server daemon (139.178.68.195:52468).
Aug 12 23:56:48.907474 sshd[9363]: Connection closed by authenticating user root 185.50.38.233 port 57572 [preauth]
Aug 12 23:56:48.910883 systemd[1]: sshd@118-138.199.237.168:22-185.50.38.233:57572.service: Deactivated successfully.
Aug 12 23:56:49.822967 sshd[9380]: Accepted publickey for core from 139.178.68.195 port 52468 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:56:49.826475 sshd-session[9380]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:56:49.834938 systemd-logind[1481]: New session 95 of user core.
Aug 12 23:56:49.843820 systemd[1]: Started session-95.scope - Session 95 of User core.
Aug 12 23:56:49.928496 systemd[1]: Started sshd@121-138.199.237.168:22-185.50.38.233:58018.service - OpenSSH per-connection server daemon (185.50.38.233:58018).
Aug 12 23:56:50.593775 sshd[9384]: Connection closed by 139.178.68.195 port 52468
Aug 12 23:56:50.594610 sshd-session[9380]: pam_unix(sshd:session): session closed for user core
Aug 12 23:56:50.599282 systemd-logind[1481]: Session 95 logged out. Waiting for processes to exit.
Aug 12 23:56:50.599957 systemd[1]: sshd@120-138.199.237.168:22-139.178.68.195:52468.service: Deactivated successfully.
Aug 12 23:56:50.603118 systemd[1]: session-95.scope: Deactivated successfully.
Aug 12 23:56:50.607984 systemd-logind[1481]: Removed session 95.
Aug 12 23:56:55.776302 systemd[1]: Started sshd@122-138.199.237.168:22-139.178.68.195:52626.service - OpenSSH per-connection server daemon (139.178.68.195:52626).
Aug 12 23:56:56.801695 sshd[9398]: Accepted publickey for core from 139.178.68.195 port 52626 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:56:56.803967 sshd-session[9398]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:56:56.812634 systemd-logind[1481]: New session 96 of user core.
Aug 12 23:56:56.819876 systemd[1]: Started session-96.scope - Session 96 of User core.
Aug 12 23:56:57.014231 containerd[1506]: time="2025-08-12T23:56:57.014135714Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"2902e4e0705cbbfa3bc57e51e69c0320679a75072b370e523744bc9279ee3f5f\" pid:9413 exited_at:{seconds:1755043017 nanos:13476026}"
Aug 12 23:56:57.576278 sshd[9400]: Connection closed by 139.178.68.195 port 52626
Aug 12 23:56:57.576912 sshd-session[9398]: pam_unix(sshd:session): session closed for user core
Aug 12 23:56:57.586468 systemd[1]: sshd@122-138.199.237.168:22-139.178.68.195:52626.service: Deactivated successfully.
Aug 12 23:56:57.591173 systemd[1]: session-96.scope: Deactivated successfully.
Aug 12 23:56:57.592714 systemd-logind[1481]: Session 96 logged out. Waiting for processes to exit.
Aug 12 23:56:57.594746 systemd-logind[1481]: Removed session 96.
Aug 12 23:56:58.018808 containerd[1506]: time="2025-08-12T23:56:58.018463223Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"cd663457fa2b607f93d968a50a071ef3a3eeb991fe18f546e5b39195510b424c\" pid:9444 exited_at:{seconds:1755043018 nanos:17105847}"
Aug 12 23:56:58.421716 sshd[9386]: Connection closed by authenticating user root 185.50.38.233 port 58018 [preauth]
Aug 12 23:56:58.426063 systemd[1]: sshd@121-138.199.237.168:22-185.50.38.233:58018.service: Deactivated successfully.
Aug 12 23:56:59.672866 systemd[1]: Started sshd@123-138.199.237.168:22-185.50.38.233:52230.service - OpenSSH per-connection server daemon (185.50.38.233:52230).
Aug 12 23:57:02.755993 systemd[1]: Started sshd@124-138.199.237.168:22-139.178.68.195:53724.service - OpenSSH per-connection server daemon (139.178.68.195:53724).
Aug 12 23:57:03.777617 sshd[9460]: Accepted publickey for core from 139.178.68.195 port 53724 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:03.779990 sshd-session[9460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:03.787891 systemd-logind[1481]: New session 97 of user core.
Aug 12 23:57:03.796946 systemd[1]: Started session-97.scope - Session 97 of User core.
Aug 12 23:57:04.558509 sshd[9463]: Connection closed by 139.178.68.195 port 53724
Aug 12 23:57:04.557665 sshd-session[9460]: pam_unix(sshd:session): session closed for user core
Aug 12 23:57:04.566455 systemd[1]: sshd@124-138.199.237.168:22-139.178.68.195:53724.service: Deactivated successfully.
Aug 12 23:57:04.571949 systemd[1]: session-97.scope: Deactivated successfully.
Aug 12 23:57:04.573489 systemd-logind[1481]: Session 97 logged out. Waiting for processes to exit.
Aug 12 23:57:04.576451 systemd-logind[1481]: Removed session 97.
Aug 12 23:57:04.953428 containerd[1506]: time="2025-08-12T23:57:04.952824000Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"cbf8e67cc2c65d96c503045c31c26ce84e2089d61afe8aef2efd3e16828a45d7\" pid:9485 exited_at:{seconds:1755043024 nanos:952385074}"
Aug 12 23:57:09.132974 containerd[1506]: time="2025-08-12T23:57:09.132920562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"b5fbb3e6d34c59459a518a0453532d7b978afc1e2db018e384b231b491514bd8\" pid:9508 exited_at:{seconds:1755043029 nanos:132394236}"
Aug 12 23:57:09.739367 systemd[1]: Started sshd@125-138.199.237.168:22-139.178.68.195:53732.service - OpenSSH per-connection server daemon (139.178.68.195:53732).
Aug 12 23:57:10.761230 sshd[9521]: Accepted publickey for core from 139.178.68.195 port 53732 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:10.764346 sshd-session[9521]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:10.773065 systemd-logind[1481]: New session 98 of user core.
Aug 12 23:57:10.781766 systemd[1]: Started session-98.scope - Session 98 of User core.
Aug 12 23:57:11.538946 sshd[9523]: Connection closed by 139.178.68.195 port 53732
Aug 12 23:57:11.541762 sshd-session[9521]: pam_unix(sshd:session): session closed for user core
Aug 12 23:57:11.547429 systemd[1]: sshd@125-138.199.237.168:22-139.178.68.195:53732.service: Deactivated successfully.
Aug 12 23:57:11.552058 systemd[1]: session-98.scope: Deactivated successfully.
Aug 12 23:57:11.556463 systemd-logind[1481]: Session 98 logged out. Waiting for processes to exit.
Aug 12 23:57:11.558991 systemd-logind[1481]: Removed session 98.
Aug 12 23:57:13.682346 sshd[9456]: Connection closed by authenticating user root 185.50.38.233 port 52230 [preauth]
Aug 12 23:57:13.688626 systemd[1]: sshd@123-138.199.237.168:22-185.50.38.233:52230.service: Deactivated successfully.
Aug 12 23:57:15.084424 systemd[1]: Started sshd@126-138.199.237.168:22-185.50.38.233:53758.service - OpenSSH per-connection server daemon (185.50.38.233:53758).
Aug 12 23:57:16.714071 systemd[1]: Started sshd@127-138.199.237.168:22-139.178.68.195:38026.service - OpenSSH per-connection server daemon (139.178.68.195:38026).
Aug 12 23:57:16.759076 containerd[1506]: time="2025-08-12T23:57:16.759034315Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"bc2df874d4518b5946c345982c9886b1901336ebe1569a53d5159192d4941384\" pid:9552 exited_at:{seconds:1755043036 nanos:758245145}"
Aug 12 23:57:17.741884 sshd[9545]: Accepted publickey for core from 139.178.68.195 port 38026 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:17.744867 sshd-session[9545]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:17.751195 systemd-logind[1481]: New session 99 of user core.
Aug 12 23:57:17.756802 systemd[1]: Started session-99.scope - Session 99 of User core.
Aug 12 23:57:18.515413 sshd[9563]: Connection closed by 139.178.68.195 port 38026
Aug 12 23:57:18.516677 sshd-session[9545]: pam_unix(sshd:session): session closed for user core
Aug 12 23:57:18.522672 systemd[1]: sshd@127-138.199.237.168:22-139.178.68.195:38026.service: Deactivated successfully.
Aug 12 23:57:18.525510 systemd[1]: session-99.scope: Deactivated successfully.
Aug 12 23:57:18.527690 systemd-logind[1481]: Session 99 logged out. Waiting for processes to exit.
Aug 12 23:57:18.530344 systemd-logind[1481]: Removed session 99.
Aug 12 23:57:23.691399 systemd[1]: Started sshd@128-138.199.237.168:22-139.178.68.195:47292.service - OpenSSH per-connection server daemon (139.178.68.195:47292).
Aug 12 23:57:24.710491 sshd[9574]: Accepted publickey for core from 139.178.68.195 port 47292 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:24.712972 sshd-session[9574]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:24.719406 systemd-logind[1481]: New session 100 of user core.
Aug 12 23:57:24.727828 systemd[1]: Started session-100.scope - Session 100 of User core.
Aug 12 23:57:25.482177 sshd[9576]: Connection closed by 139.178.68.195 port 47292
Aug 12 23:57:25.482635 sshd-session[9574]: pam_unix(sshd:session): session closed for user core
Aug 12 23:57:25.487477 systemd[1]: sshd@128-138.199.237.168:22-139.178.68.195:47292.service: Deactivated successfully.
Aug 12 23:57:25.490531 systemd[1]: session-100.scope: Deactivated successfully.
Aug 12 23:57:25.492361 systemd-logind[1481]: Session 100 logged out. Waiting for processes to exit.
Aug 12 23:57:25.495166 systemd-logind[1481]: Removed session 100.
Aug 12 23:57:27.022604 containerd[1506]: time="2025-08-12T23:57:27.022530313Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"a5ac5ff012b4bac8f13c189a97bf559cd0ebeca3ce27e86fb165901c4d812cf5\" pid:9600 exited_at:{seconds:1755043047 nanos:21221217}"
Aug 12 23:57:28.016831 containerd[1506]: time="2025-08-12T23:57:28.016773188Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"dd05a771f203366bf84c7d9cdeae53cf3c637784f68f0c87abe852a26ffee58e\" pid:9622 exited_at:{seconds:1755043048 nanos:15904378}"
Aug 12 23:57:29.030410 sshd[9537]: Invalid user ubuntu from 185.50.38.233 port 53758
Aug 12 23:57:30.659838 systemd[1]: Started sshd@129-138.199.237.168:22-139.178.68.195:50092.service - OpenSSH per-connection server daemon (139.178.68.195:50092).
Aug 12 23:57:31.177358 sshd[9537]: Connection closed by invalid user ubuntu 185.50.38.233 port 53758 [preauth]
Aug 12 23:57:31.180101 systemd[1]: sshd@126-138.199.237.168:22-185.50.38.233:53758.service: Deactivated successfully.
Aug 12 23:57:31.673694 sshd[9632]: Accepted publickey for core from 139.178.68.195 port 50092 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:31.675889 sshd-session[9632]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:31.682927 systemd-logind[1481]: New session 101 of user core.
Aug 12 23:57:31.688751 systemd[1]: Started session-101.scope - Session 101 of User core.
Aug 12 23:57:32.452842 sshd[9643]: Connection closed by 139.178.68.195 port 50092
Aug 12 23:57:32.453923 sshd-session[9632]: pam_unix(sshd:session): session closed for user core
Aug 12 23:57:32.461444 systemd[1]: sshd@129-138.199.237.168:22-139.178.68.195:50092.service: Deactivated successfully.
Aug 12 23:57:32.465390 systemd[1]: session-101.scope: Deactivated successfully.
Aug 12 23:57:32.466897 systemd-logind[1481]: Session 101 logged out. Waiting for processes to exit.
Aug 12 23:57:32.469027 systemd-logind[1481]: Removed session 101.
Aug 12 23:57:32.770824 systemd[1]: Started sshd@130-138.199.237.168:22-185.50.38.233:52706.service - OpenSSH per-connection server daemon (185.50.38.233:52706).
Aug 12 23:57:37.628208 systemd[1]: Started sshd@131-138.199.237.168:22-139.178.68.195:50104.service - OpenSSH per-connection server daemon (139.178.68.195:50104).
Aug 12 23:57:38.648590 sshd[9674]: Accepted publickey for core from 139.178.68.195 port 50104 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:38.651062 sshd-session[9674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:38.658379 systemd-logind[1481]: New session 102 of user core.
Aug 12 23:57:38.666906 systemd[1]: Started session-102.scope - Session 102 of User core.
Aug 12 23:57:39.148509 containerd[1506]: time="2025-08-12T23:57:39.148355012Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"3e1ece115014cde6da7a374e4460e79869d3cdf91495aa8efad8402fe2a3617c\" pid:9689 exited_at:{seconds:1755043059 nanos:148019048}"
Aug 12 23:57:39.428282 sshd[9676]: Connection closed by 139.178.68.195 port 50104
Aug 12 23:57:39.429006 sshd-session[9674]: pam_unix(sshd:session): session closed for user core
Aug 12 23:57:39.436842 systemd[1]: sshd@131-138.199.237.168:22-139.178.68.195:50104.service: Deactivated successfully.
Aug 12 23:57:39.440096 systemd[1]: session-102.scope: Deactivated successfully.
Aug 12 23:57:39.441845 systemd-logind[1481]: Session 102 logged out. Waiting for processes to exit.
Aug 12 23:57:39.446023 systemd-logind[1481]: Removed session 102.
Aug 12 23:57:44.629874 systemd[1]: Started sshd@132-138.199.237.168:22-139.178.68.195:56012.service - OpenSSH per-connection server daemon (139.178.68.195:56012).
Aug 12 23:57:45.706290 sshd[9709]: Accepted publickey for core from 139.178.68.195 port 56012 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:45.708878 sshd-session[9709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:45.716423 systemd-logind[1481]: New session 103 of user core.
Aug 12 23:57:45.727941 systemd[1]: Started session-103.scope - Session 103 of User core.
Aug 12 23:57:46.511533 sshd[9711]: Connection closed by 139.178.68.195 port 56012
Aug 12 23:57:46.512614 sshd-session[9709]: pam_unix(sshd:session): session closed for user core
Aug 12 23:57:46.518747 systemd[1]: sshd@132-138.199.237.168:22-139.178.68.195:56012.service: Deactivated successfully.
Aug 12 23:57:46.524078 systemd[1]: session-103.scope: Deactivated successfully.
Aug 12 23:57:46.525305 systemd-logind[1481]: Session 103 logged out. Waiting for processes to exit.
Aug 12 23:57:46.528292 systemd-logind[1481]: Removed session 103.
Aug 12 23:57:49.941026 sshd[9657]: Connection closed by authenticating user root 185.50.38.233 port 52706 [preauth]
Aug 12 23:57:49.944135 systemd[1]: sshd@130-138.199.237.168:22-185.50.38.233:52706.service: Deactivated successfully.
Aug 12 23:57:51.657494 systemd[1]: Started sshd@133-138.199.237.168:22-185.50.38.233:44272.service - OpenSSH per-connection server daemon (185.50.38.233:44272).
Aug 12 23:57:51.697606 systemd[1]: Started sshd@134-138.199.237.168:22-139.178.68.195:39138.service - OpenSSH per-connection server daemon (139.178.68.195:39138).
Aug 12 23:57:52.766250 sshd[9726]: Accepted publickey for core from 139.178.68.195 port 39138 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:52.768621 sshd-session[9726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:52.774988 systemd-logind[1481]: New session 104 of user core.
Aug 12 23:57:52.786942 systemd[1]: Started session-104.scope - Session 104 of User core.
Aug 12 23:57:53.579292 sshd[9728]: Connection closed by 139.178.68.195 port 39138
Aug 12 23:57:53.579857 sshd-session[9726]: pam_unix(sshd:session): session closed for user core
Aug 12 23:57:53.584840 systemd-logind[1481]: Session 104 logged out. Waiting for processes to exit.
Aug 12 23:57:53.584968 systemd[1]: sshd@134-138.199.237.168:22-139.178.68.195:39138.service: Deactivated successfully.
Aug 12 23:57:53.587626 systemd[1]: session-104.scope: Deactivated successfully.
Aug 12 23:57:53.592924 systemd-logind[1481]: Removed session 104.
Aug 12 23:57:57.012433 containerd[1506]: time="2025-08-12T23:57:57.012383769Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"f0215ff188b4336f94a66d921d954f500c5e0be5b1d95268f4c0275f9f3fcfb5\" pid:9753 exited_at:{seconds:1755043077 nanos:11155954}"
Aug 12 23:57:58.009103 containerd[1506]: time="2025-08-12T23:57:58.009040439Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"4d92b6a9ced5e3fb03df24d52a9fb63662f58b6e16adc4e15ce78fc05abaf4d9\" pid:9774 exited_at:{seconds:1755043078 nanos:8594794}"
Aug 12 23:57:58.743776 systemd[1]: Started sshd@135-138.199.237.168:22-139.178.68.195:39152.service - OpenSSH per-connection server daemon (139.178.68.195:39152).
Aug 12 23:57:58.747849 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories...
Aug 12 23:57:58.792284 systemd-tmpfiles[9786]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Aug 12 23:57:58.792300 systemd-tmpfiles[9786]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Aug 12 23:57:58.793759 systemd-tmpfiles[9786]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Aug 12 23:57:58.794498 systemd-tmpfiles[9786]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Aug 12 23:57:58.796758 systemd-tmpfiles[9786]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Aug 12 23:57:58.797127 systemd-tmpfiles[9786]: ACLs are not supported, ignoring.
Aug 12 23:57:58.797868 systemd-tmpfiles[9786]: ACLs are not supported, ignoring.
Aug 12 23:57:58.805993 systemd-tmpfiles[9786]: Detected autofs mount point /boot during canonicalization of boot.
Aug 12 23:57:58.806011 systemd-tmpfiles[9786]: Skipping /boot
Aug 12 23:57:58.818643 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Aug 12 23:57:58.819134 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories.
Aug 12 23:57:58.828310 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Aug 12 23:57:59.768711 sshd[9785]: Accepted publickey for core from 139.178.68.195 port 39152 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:57:59.771321 sshd-session[9785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:57:59.778200 systemd-logind[1481]: New session 105 of user core.
Aug 12 23:57:59.786877 systemd[1]: Started session-105.scope - Session 105 of User core.
Aug 12 23:58:00.531718 sshd[9790]: Connection closed by 139.178.68.195 port 39152
Aug 12 23:58:00.532792 sshd-session[9785]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:00.538724 systemd[1]: sshd@135-138.199.237.168:22-139.178.68.195:39152.service: Deactivated successfully.
Aug 12 23:58:00.541330 systemd[1]: session-105.scope: Deactivated successfully.
Aug 12 23:58:00.543376 systemd-logind[1481]: Session 105 logged out. Waiting for processes to exit.
Aug 12 23:58:00.546946 systemd-logind[1481]: Removed session 105.
Aug 12 23:58:04.955081 containerd[1506]: time="2025-08-12T23:58:04.954882923Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"477528eb2861a854317138be4d5b1df8392d464a68f1c591f0bd3b5348ed5c61\" pid:9816 exited_at:{seconds:1755043084 nanos:954436878}"
Aug 12 23:58:05.727324 systemd[1]: Started sshd@136-138.199.237.168:22-139.178.68.195:44154.service - OpenSSH per-connection server daemon (139.178.68.195:44154).
Aug 12 23:58:06.801000 sshd[9827]: Accepted publickey for core from 139.178.68.195 port 44154 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:58:06.803671 sshd-session[9827]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:58:06.810409 systemd-logind[1481]: New session 106 of user core.
Aug 12 23:58:06.816073 systemd[1]: Started session-106.scope - Session 106 of User core.
Aug 12 23:58:07.612434 sshd[9829]: Connection closed by 139.178.68.195 port 44154
Aug 12 23:58:07.613331 sshd-session[9827]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:07.619823 systemd[1]: sshd@136-138.199.237.168:22-139.178.68.195:44154.service: Deactivated successfully.
Aug 12 23:58:07.623971 systemd[1]: session-106.scope: Deactivated successfully.
Aug 12 23:58:07.625741 systemd-logind[1481]: Session 106 logged out. Waiting for processes to exit.
Aug 12 23:58:07.627498 systemd-logind[1481]: Removed session 106.
Aug 12 23:58:08.374594 sshd[9724]: Connection closed by authenticating user root 185.50.38.233 port 44272 [preauth]
Aug 12 23:58:08.378649 systemd[1]: sshd@133-138.199.237.168:22-185.50.38.233:44272.service: Deactivated successfully.
Aug 12 23:58:09.136205 containerd[1506]: time="2025-08-12T23:58:09.136130511Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"1d05d7355e4d64d20eb0d8f5bddbdf949e7dc0ca341346a508edf4f94d87e40e\" pid:9854 exited_at:{seconds:1755043089 nanos:135833908}"
Aug 12 23:58:10.344131 systemd[1]: Started sshd@137-138.199.237.168:22-185.50.38.233:51240.service - OpenSSH per-connection server daemon (185.50.38.233:51240).
Aug 12 23:58:12.773905 systemd[1]: Started sshd@138-138.199.237.168:22-139.178.68.195:47640.service - OpenSSH per-connection server daemon (139.178.68.195:47640).
Aug 12 23:58:13.790863 sshd[9868]: Accepted publickey for core from 139.178.68.195 port 47640 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:58:13.793969 sshd-session[9868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:58:13.802275 systemd-logind[1481]: New session 107 of user core.
Aug 12 23:58:13.809983 systemd[1]: Started session-107.scope - Session 107 of User core.
Aug 12 23:58:14.616563 sshd[9870]: Connection closed by 139.178.68.195 port 47640
Aug 12 23:58:14.618484 sshd-session[9868]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:14.626111 systemd-logind[1481]: Session 107 logged out. Waiting for processes to exit.
Aug 12 23:58:14.627445 systemd[1]: sshd@138-138.199.237.168:22-139.178.68.195:47640.service: Deactivated successfully.
Aug 12 23:58:14.634868 systemd[1]: session-107.scope: Deactivated successfully.
Aug 12 23:58:14.640810 systemd-logind[1481]: Removed session 107.
Aug 12 23:58:16.741626 containerd[1506]: time="2025-08-12T23:58:16.741503505Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"5e30f3f3d98a3cd9dfb9d7f6efbcdfde66a15ab7bfbbbff9ac2728ac63641c49\" pid:9894 exited_at:{seconds:1755043096 nanos:741100340}"
Aug 12 23:58:19.794347 systemd[1]: Started sshd@139-138.199.237.168:22-139.178.68.195:47654.service - OpenSSH per-connection server daemon (139.178.68.195:47654).
Aug 12 23:58:20.806647 sshd[9904]: Accepted publickey for core from 139.178.68.195 port 47654 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:58:20.809072 sshd-session[9904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:58:20.814060 systemd-logind[1481]: New session 108 of user core.
Aug 12 23:58:20.820821 systemd[1]: Started session-108.scope - Session 108 of User core.
Aug 12 23:58:21.577573 sshd[9906]: Connection closed by 139.178.68.195 port 47654
Aug 12 23:58:21.578514 sshd-session[9904]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:21.583874 systemd-logind[1481]: Session 108 logged out. Waiting for processes to exit.
Aug 12 23:58:21.585049 systemd[1]: sshd@139-138.199.237.168:22-139.178.68.195:47654.service: Deactivated successfully.
Aug 12 23:58:21.589861 systemd[1]: session-108.scope: Deactivated successfully.
Aug 12 23:58:21.591727 systemd-logind[1481]: Removed session 108.
Aug 12 23:58:26.041574 sshd[9866]: Invalid user debian from 185.50.38.233 port 51240
Aug 12 23:58:26.756059 systemd[1]: Started sshd@140-138.199.237.168:22-139.178.68.195:45872.service - OpenSSH per-connection server daemon (139.178.68.195:45872).
Aug 12 23:58:27.018721 containerd[1506]: time="2025-08-12T23:58:27.018436457Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"0c81b39dbf3b315de7a65836d17a168114fb456b6f1867abb178b85d46a52548\" pid:9935 exited_at:{seconds:1755043107 nanos:18055453}"
Aug 12 23:58:27.774893 sshd[9920]: Accepted publickey for core from 139.178.68.195 port 45872 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:58:27.777720 sshd-session[9920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:58:27.785977 systemd-logind[1481]: New session 109 of user core.
Aug 12 23:58:27.792007 systemd[1]: Started session-109.scope - Session 109 of User core.
Aug 12 23:58:28.009685 containerd[1506]: time="2025-08-12T23:58:28.009397704Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"7fe46636240be3bfdbed4fb03b19b9d79b9b6be9254c159a081824c85aed7a52\" pid:9959 exited_at:{seconds:1755043108 nanos:8339851}"
Aug 12 23:58:28.301590 sshd[9866]: Connection closed by invalid user debian 185.50.38.233 port 51240 [preauth]
Aug 12 23:58:28.305383 systemd[1]: sshd@137-138.199.237.168:22-185.50.38.233:51240.service: Deactivated successfully.
Aug 12 23:58:28.552864 sshd[9946]: Connection closed by 139.178.68.195 port 45872
Aug 12 23:58:28.555122 sshd-session[9920]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:28.563995 systemd[1]: sshd@140-138.199.237.168:22-139.178.68.195:45872.service: Deactivated successfully.
Aug 12 23:58:28.567390 systemd[1]: session-109.scope: Deactivated successfully.
Aug 12 23:58:28.569910 systemd-logind[1481]: Session 109 logged out. Waiting for processes to exit.
Aug 12 23:58:28.573199 systemd-logind[1481]: Removed session 109.
Aug 12 23:58:30.565823 systemd[1]: Started sshd@141-138.199.237.168:22-185.50.38.233:33930.service - OpenSSH per-connection server daemon (185.50.38.233:33930).
Aug 12 23:58:33.728750 systemd[1]: Started sshd@142-138.199.237.168:22-139.178.68.195:36586.service - OpenSSH per-connection server daemon (139.178.68.195:36586).
Aug 12 23:58:34.754417 sshd[9987]: Accepted publickey for core from 139.178.68.195 port 36586 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:58:34.757324 sshd-session[9987]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:58:34.766417 systemd-logind[1481]: New session 110 of user core.
Aug 12 23:58:34.769813 systemd[1]: Started session-110.scope - Session 110 of User core.
Aug 12 23:58:35.554907 sshd[9989]: Connection closed by 139.178.68.195 port 36586
Aug 12 23:58:35.553953 sshd-session[9987]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:35.559452 systemd[1]: sshd@142-138.199.237.168:22-139.178.68.195:36586.service: Deactivated successfully.
Aug 12 23:58:35.566280 systemd[1]: session-110.scope: Deactivated successfully.
Aug 12 23:58:35.571451 systemd-logind[1481]: Session 110 logged out. Waiting for processes to exit.
Aug 12 23:58:35.576759 systemd-logind[1481]: Removed session 110.
Aug 12 23:58:39.133899 containerd[1506]: time="2025-08-12T23:58:39.133830793Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"0c0ac9ff188ee5a4cf177d21ed535b56e15442ac856acb1267276108823d4a42\" pid:10012 exited_at:{seconds:1755043119 nanos:133039264}"
Aug 12 23:58:40.729220 systemd[1]: Started sshd@143-138.199.237.168:22-139.178.68.195:44344.service - OpenSSH per-connection server daemon (139.178.68.195:44344).
Aug 12 23:58:41.746075 sshd[10024]: Accepted publickey for core from 139.178.68.195 port 44344 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:58:41.749201 sshd-session[10024]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:58:41.755628 systemd-logind[1481]: New session 111 of user core.
Aug 12 23:58:41.761797 systemd[1]: Started session-111.scope - Session 111 of User core.
Aug 12 23:58:42.517938 sshd[10026]: Connection closed by 139.178.68.195 port 44344
Aug 12 23:58:42.519075 sshd-session[10024]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:42.524310 systemd-logind[1481]: Session 111 logged out. Waiting for processes to exit.
Aug 12 23:58:42.526494 systemd[1]: sshd@143-138.199.237.168:22-139.178.68.195:44344.service: Deactivated successfully.
Aug 12 23:58:42.529336 systemd[1]: session-111.scope: Deactivated successfully.
Aug 12 23:58:42.533610 systemd-logind[1481]: Removed session 111.
Aug 12 23:58:45.862481 sshd[9982]: Connection closed by authenticating user root 185.50.38.233 port 33930 [preauth]
Aug 12 23:58:45.867280 systemd[1]: sshd@141-138.199.237.168:22-185.50.38.233:33930.service: Deactivated successfully.
Aug 12 23:58:47.727895 systemd[1]: Started sshd@144-138.199.237.168:22-139.178.68.195:44360.service - OpenSSH per-connection server daemon (139.178.68.195:44360).
Aug 12 23:58:47.816700 systemd[1]: Started sshd@145-138.199.237.168:22-185.50.38.233:49226.service - OpenSSH per-connection server daemon (185.50.38.233:49226).
Aug 12 23:58:48.808177 sshd[10045]: Accepted publickey for core from 139.178.68.195 port 44360 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:58:48.810463 sshd-session[10045]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:58:48.818614 systemd-logind[1481]: New session 112 of user core.
Aug 12 23:58:48.823749 systemd[1]: Started session-112.scope - Session 112 of User core.
Aug 12 23:58:49.617921 sshd[10050]: Connection closed by 139.178.68.195 port 44360
Aug 12 23:58:49.618959 sshd-session[10045]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:49.625457 systemd[1]: sshd@144-138.199.237.168:22-139.178.68.195:44360.service: Deactivated successfully.
Aug 12 23:58:49.628002 systemd[1]: session-112.scope: Deactivated successfully.
Aug 12 23:58:49.629458 systemd-logind[1481]: Session 112 logged out. Waiting for processes to exit.
Aug 12 23:58:49.631922 systemd-logind[1481]: Removed session 112.
Aug 12 23:58:54.785539 systemd[1]: Started sshd@146-138.199.237.168:22-139.178.68.195:52670.service - OpenSSH per-connection server daemon (139.178.68.195:52670).
Aug 12 23:58:55.802440 sshd[10063]: Accepted publickey for core from 139.178.68.195 port 52670 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:58:55.804847 sshd-session[10063]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:58:55.811516 systemd-logind[1481]: New session 113 of user core.
Aug 12 23:58:55.816830 systemd[1]: Started session-113.scope - Session 113 of User core.
Aug 12 23:58:56.587225 sshd[10065]: Connection closed by 139.178.68.195 port 52670
Aug 12 23:58:56.588118 sshd-session[10063]: pam_unix(sshd:session): session closed for user core
Aug 12 23:58:56.594134 systemd-logind[1481]: Session 113 logged out. Waiting for processes to exit.
Aug 12 23:58:56.596281 systemd[1]: sshd@146-138.199.237.168:22-139.178.68.195:52670.service: Deactivated successfully.
Aug 12 23:58:56.600990 systemd[1]: session-113.scope: Deactivated successfully.
Aug 12 23:58:56.603606 systemd-logind[1481]: Removed session 113.
Aug 12 23:58:57.011961 containerd[1506]: time="2025-08-12T23:58:57.011370713Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"0802006ad93e8fda61b251617b8a4d3f083d10655a68d4810d79c6760e386c1f\" pid:10089 exited_at:{seconds:1755043137 nanos:10430341}"
Aug 12 23:58:58.012447 containerd[1506]: time="2025-08-12T23:58:58.012317042Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"943bd8062df7d2ce82962cd4760f65bfd92a9b03e0ba3ffe9a07b0df0b09df6e\" pid:10110 exited_at:{seconds:1755043138 nanos:11835357}"
Aug 12 23:59:01.770707 systemd[1]: Started sshd@147-138.199.237.168:22-139.178.68.195:48932.service - OpenSSH per-connection server daemon (139.178.68.195:48932).
Aug 12 23:59:02.794647 sshd[10120]: Accepted publickey for core from 139.178.68.195 port 48932 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:59:02.797911 sshd-session[10120]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:59:02.804333 systemd-logind[1481]: New session 114 of user core.
Aug 12 23:59:02.813954 systemd[1]: Started session-114.scope - Session 114 of User core.
Aug 12 23:59:03.569658 sshd[10125]: Connection closed by 139.178.68.195 port 48932
Aug 12 23:59:03.570283 sshd-session[10120]: pam_unix(sshd:session): session closed for user core
Aug 12 23:59:03.578933 systemd[1]: sshd@147-138.199.237.168:22-139.178.68.195:48932.service: Deactivated successfully.
Aug 12 23:59:03.579000 systemd-logind[1481]: Session 114 logged out. Waiting for processes to exit.
Aug 12 23:59:03.582773 systemd[1]: session-114.scope: Deactivated successfully.
Aug 12 23:59:03.588505 systemd-logind[1481]: Removed session 114.
Aug 12 23:59:04.944656 containerd[1506]: time="2025-08-12T23:59:04.944574059Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"e50e3ea1d1fa4d096b7a60376de34cca270657a3172fee3d5a2cb4245baea2d6\" pid:10150 exited_at:{seconds:1755043144 nanos:944177015}"
Aug 12 23:59:07.893239 sshd[10048]: Connection closed by authenticating user root 185.50.38.233 port 49226 [preauth]
Aug 12 23:59:07.897675 systemd[1]: sshd@145-138.199.237.168:22-185.50.38.233:49226.service: Deactivated successfully.
Aug 12 23:59:08.744339 systemd[1]: Started sshd@148-138.199.237.168:22-139.178.68.195:48934.service - OpenSSH per-connection server daemon (139.178.68.195:48934).
Aug 12 23:59:09.188296 containerd[1506]: time="2025-08-12T23:59:09.188238831Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"d9b4ef43fbf20a823df22a62f80c158371590ee875f702ae0e9fc57a2beab4b0\" pid:10177 exited_at:{seconds:1755043149 nanos:187666344}"
Aug 12 23:59:09.763099 sshd[10163]: Accepted publickey for core from 139.178.68.195 port 48934 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:59:09.766136 sshd-session[10163]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:59:09.773026 systemd-logind[1481]: New session 115 of user core.
Aug 12 23:59:09.779433 systemd[1]: Started session-115.scope - Session 115 of User core.
Aug 12 23:59:09.997854 systemd[1]: Started sshd@149-138.199.237.168:22-185.50.38.233:40346.service - OpenSSH per-connection server daemon (185.50.38.233:40346).
Aug 12 23:59:10.537892 sshd[10189]: Connection closed by 139.178.68.195 port 48934
Aug 12 23:59:10.538887 sshd-session[10163]: pam_unix(sshd:session): session closed for user core
Aug 12 23:59:10.545122 systemd-logind[1481]: Session 115 logged out. Waiting for processes to exit.
Aug 12 23:59:10.545439 systemd[1]: sshd@148-138.199.237.168:22-139.178.68.195:48934.service: Deactivated successfully.
Aug 12 23:59:10.548844 systemd[1]: session-115.scope: Deactivated successfully.
Aug 12 23:59:10.554595 systemd-logind[1481]: Removed session 115.
Aug 12 23:59:15.716368 systemd[1]: Started sshd@150-138.199.237.168:22-139.178.68.195:60206.service - OpenSSH per-connection server daemon (139.178.68.195:60206).
Aug 12 23:59:16.737130 sshd[10224]: Accepted publickey for core from 139.178.68.195 port 60206 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:59:16.738875 sshd-session[10224]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:59:16.746816 systemd-logind[1481]: New session 116 of user core.
Aug 12 23:59:16.752793 systemd[1]: Started session-116.scope - Session 116 of User core.
Aug 12 23:59:16.762591 containerd[1506]: time="2025-08-12T23:59:16.762352504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"990ad811bdcd2eee7550df57519265db58f4e0fc3a7e1ecc9f5391a08edc0cf4\" pid:10238 exited_at:{seconds:1755043156 nanos:761864218}"
Aug 12 23:59:17.526591 sshd[10244]: Connection closed by 139.178.68.195 port 60206
Aug 12 23:59:17.527759 sshd-session[10224]: pam_unix(sshd:session): session closed for user core
Aug 12 23:59:17.535198 systemd[1]: sshd@150-138.199.237.168:22-139.178.68.195:60206.service: Deactivated successfully.
Aug 12 23:59:17.539249 systemd[1]: session-116.scope: Deactivated successfully.
Aug 12 23:59:17.544491 systemd-logind[1481]: Session 116 logged out. Waiting for processes to exit.
Aug 12 23:59:17.546020 systemd-logind[1481]: Removed session 116.
Aug 12 23:59:22.702495 systemd[1]: Started sshd@151-138.199.237.168:22-139.178.68.195:41158.service - OpenSSH per-connection server daemon (139.178.68.195:41158).
Aug 12 23:59:23.734291 sshd[10259]: Accepted publickey for core from 139.178.68.195 port 41158 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:59:23.736975 sshd-session[10259]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:59:23.745130 systemd-logind[1481]: New session 117 of user core.
Aug 12 23:59:23.750832 systemd[1]: Started session-117.scope - Session 117 of User core.
Aug 12 23:59:24.510793 sshd[10261]: Connection closed by 139.178.68.195 port 41158
Aug 12 23:59:24.509989 sshd-session[10259]: pam_unix(sshd:session): session closed for user core
Aug 12 23:59:24.515640 systemd[1]: sshd@151-138.199.237.168:22-139.178.68.195:41158.service: Deactivated successfully.
Aug 12 23:59:24.518806 systemd[1]: session-117.scope: Deactivated successfully.
Aug 12 23:59:24.522032 systemd-logind[1481]: Session 117 logged out. Waiting for processes to exit.
Aug 12 23:59:24.524369 systemd-logind[1481]: Removed session 117.
Aug 12 23:59:26.563991 sshd[10191]: Invalid user jenkins from 185.50.38.233 port 40346
Aug 12 23:59:27.012283 containerd[1506]: time="2025-08-12T23:59:27.011875117Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"8a012e31daefc2303fe98b5b0646ba7c93610c88d6cd932d2677047421fb9722\" pid:10286 exited_at:{seconds:1755043167 nanos:10126256}"
Aug 12 23:59:28.010112 containerd[1506]: time="2025-08-12T23:59:28.010058577Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"980eec6ab5607da6ccd1e95992124ab81626c56f8860ecae166548c94e06fafd\" pid:10308 exited_at:{seconds:1755043168 nanos:9758733}"
Aug 12 23:59:28.848394 sshd[10191]: Connection closed by invalid user jenkins 185.50.38.233 port 40346 [preauth]
Aug 12 23:59:28.853742 systemd[1]: sshd@149-138.199.237.168:22-185.50.38.233:40346.service: Deactivated successfully.
Aug 12 23:59:29.684362 systemd[1]: Started sshd@152-138.199.237.168:22-139.178.68.195:41168.service - OpenSSH per-connection server daemon (139.178.68.195:41168).
Aug 12 23:59:30.702609 sshd[10321]: Accepted publickey for core from 139.178.68.195 port 41168 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:59:30.705290 sshd-session[10321]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:59:30.714418 systemd-logind[1481]: New session 118 of user core.
Aug 12 23:59:30.719838 systemd[1]: Started session-118.scope - Session 118 of User core.
Aug 12 23:59:30.754967 systemd[1]: Started sshd@153-138.199.237.168:22-185.50.38.233:40952.service - OpenSSH per-connection server daemon (185.50.38.233:40952).
Aug 12 23:59:31.473133 sshd[10323]: Connection closed by 139.178.68.195 port 41168
Aug 12 23:59:31.471907 sshd-session[10321]: pam_unix(sshd:session): session closed for user core
Aug 12 23:59:31.479069 systemd[1]: sshd@152-138.199.237.168:22-139.178.68.195:41168.service: Deactivated successfully.
Aug 12 23:59:31.485325 systemd[1]: session-118.scope: Deactivated successfully.
Aug 12 23:59:31.487442 systemd-logind[1481]: Session 118 logged out. Waiting for processes to exit.
Aug 12 23:59:31.489871 systemd-logind[1481]: Removed session 118.
Aug 12 23:59:36.660023 systemd[1]: Started sshd@154-138.199.237.168:22-139.178.68.195:46696.service - OpenSSH per-connection server daemon (139.178.68.195:46696).
Aug 12 23:59:37.684312 sshd[10340]: Accepted publickey for core from 139.178.68.195 port 46696 ssh2: RSA SHA256:mYYevtwiboZ0WAYUecJRosD7TZEB8o4Se1lqZszh41o
Aug 12 23:59:37.686938 sshd-session[10340]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 12 23:59:37.693614 systemd-logind[1481]: New session 119 of user core.
Aug 12 23:59:37.699840 systemd[1]: Started session-119.scope - Session 119 of User core.
Aug 12 23:59:38.460273 sshd[10342]: Connection closed by 139.178.68.195 port 46696
Aug 12 23:59:38.461377 sshd-session[10340]: pam_unix(sshd:session): session closed for user core
Aug 12 23:59:38.469836 systemd-logind[1481]: Session 119 logged out. Waiting for processes to exit.
Aug 12 23:59:38.470807 systemd[1]: session-119.scope: Deactivated successfully.
Aug 12 23:59:38.472170 systemd[1]: sshd@154-138.199.237.168:22-139.178.68.195:46696.service: Deactivated successfully.
Aug 12 23:59:38.480226 systemd-logind[1481]: Removed session 119.
Aug 12 23:59:39.139128 containerd[1506]: time="2025-08-12T23:59:39.138980150Z" level=info msg="TaskExit event in podsandbox handler container_id:\"769dd1b877f8b6c3cdfcd850a1bfce2f7623183d7b5ca6c22e3baaa0871e87ab\" id:\"386c9a283aee38a2240452e8949605bb25a961980bbb4002135af026579f0a6e\" pid:10366 exited_at:{seconds:1755043179 nanos:138658826}"
Aug 12 23:59:49.915124 sshd[10325]: Connection closed by authenticating user root 185.50.38.233 port 40952 [preauth]
Aug 12 23:59:49.918090 systemd[1]: sshd@153-138.199.237.168:22-185.50.38.233:40952.service: Deactivated successfully.
Aug 12 23:59:52.277713 systemd[1]: Started sshd@155-138.199.237.168:22-185.50.38.233:46780.service - OpenSSH per-connection server daemon (185.50.38.233:46780).
Aug 12 23:59:54.218755 kubelet[2715]: E0812 23:59:54.218687 2715 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:53362->10.0.0.2:2379: read: connection timed out"
Aug 12 23:59:54.224132 systemd[1]: cri-containerd-0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140.scope: Deactivated successfully.
Aug 12 23:59:54.224641 systemd[1]: cri-containerd-0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140.scope: Consumed 4.259s CPU time, 26.5M memory peak, 4.7M read from disk.
Aug 12 23:59:54.228183 containerd[1506]: time="2025-08-12T23:59:54.227933385Z" level=info msg="received exit event container_id:\"0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140\" id:\"0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140\" pid:2581 exit_status:1 exited_at:{seconds:1755043194 nanos:227439819}"
Aug 12 23:59:54.228980 containerd[1506]: time="2025-08-12T23:59:54.228121588Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140\" id:\"0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140\" pid:2581 exit_status:1 exited_at:{seconds:1755043194 nanos:227439819}"
Aug 12 23:59:54.260180 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140-rootfs.mount: Deactivated successfully.
Aug 12 23:59:54.509478 kubelet[2715]: I0812 23:59:54.508785 2715 scope.go:117] "RemoveContainer" containerID="0e65af796fa343169e6c8d278f1e1cadf8ed15c036c122f9e9d5a019957cf140"
Aug 12 23:59:54.512736 containerd[1506]: time="2025-08-12T23:59:54.512699095Z" level=info msg="CreateContainer within sandbox \"c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Aug 12 23:59:54.525697 containerd[1506]: time="2025-08-12T23:59:54.525654171Z" level=info msg="Container 43238aaea654376a5c7a329e475cc8eef7ca2de12d7782ac97a43a93990d92b4: CDI devices from CRI Config.CDIDevices: []"
Aug 12 23:59:54.533912 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3399600981.mount: Deactivated successfully.
Aug 12 23:59:54.539980 containerd[1506]: time="2025-08-12T23:59:54.539617979Z" level=info msg="CreateContainer within sandbox \"c5d6af8ec2d30f5a063448bdcfc55cd2cf73946a87dffbe65a2bcf4266afe4bc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"43238aaea654376a5c7a329e475cc8eef7ca2de12d7782ac97a43a93990d92b4\""
Aug 12 23:59:54.541583 containerd[1506]: time="2025-08-12T23:59:54.540503190Z" level=info msg="StartContainer for \"43238aaea654376a5c7a329e475cc8eef7ca2de12d7782ac97a43a93990d92b4\""
Aug 12 23:59:54.542117 containerd[1506]: time="2025-08-12T23:59:54.542050408Z" level=info msg="connecting to shim 43238aaea654376a5c7a329e475cc8eef7ca2de12d7782ac97a43a93990d92b4" address="unix:///run/containerd/s/0d20e9facbc3d0dc8dfda71bd807d29995c4d24802c38827b7a650a57bbe53a8" protocol=ttrpc version=3
Aug 12 23:59:54.580806 systemd[1]: Started cri-containerd-43238aaea654376a5c7a329e475cc8eef7ca2de12d7782ac97a43a93990d92b4.scope - libcontainer container 43238aaea654376a5c7a329e475cc8eef7ca2de12d7782ac97a43a93990d92b4.
Aug 12 23:59:54.645418 containerd[1506]: time="2025-08-12T23:59:54.645346452Z" level=info msg="StartContainer for \"43238aaea654376a5c7a329e475cc8eef7ca2de12d7782ac97a43a93990d92b4\" returns successfully"
Aug 12 23:59:55.227740 systemd[1]: cri-containerd-057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4.scope: Deactivated successfully.
Aug 12 23:59:55.228335 systemd[1]: cri-containerd-057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4.scope: Consumed 46.061s CPU time, 106.7M memory peak, 5M read from disk.
Aug 12 23:59:55.231901 containerd[1506]: time="2025-08-12T23:59:55.231857116Z" level=info msg="received exit event container_id:\"057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4\" id:\"057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4\" pid:3035 exit_status:1 exited_at:{seconds:1755043195 nanos:230655942}"
Aug 12 23:59:55.232313 containerd[1506]: time="2025-08-12T23:59:55.232278681Z" level=info msg="TaskExit event in podsandbox handler container_id:\"057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4\" id:\"057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4\" pid:3035 exit_status:1 exited_at:{seconds:1755043195 nanos:230655942}"
Aug 12 23:59:55.259893 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4-rootfs.mount: Deactivated successfully.
Aug 12 23:59:55.517701 kubelet[2715]: I0812 23:59:55.517593 2715 scope.go:117] "RemoveContainer" containerID="057eb732b14b4064d51cfd604eb745895a7dbdae4e79b01dd59efa7fa4d14ea4"
Aug 12 23:59:55.534055 containerd[1506]: time="2025-08-12T23:59:55.533985315Z" level=info msg="CreateContainer within sandbox \"539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Aug 12 23:59:55.548643 containerd[1506]: time="2025-08-12T23:59:55.548039364Z" level=info msg="Container ac0d6c16f21dcf222e252ee28259c7c505e1ba31fd4619944e39616fd7a5c1e8: CDI devices from CRI Config.CDIDevices: []"
Aug 12 23:59:55.552347 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1162416501.mount: Deactivated successfully.
Aug 12 23:59:55.558843 containerd[1506]: time="2025-08-12T23:59:55.558798454Z" level=info msg="CreateContainer within sandbox \"539174614838505008cd3414578437b2b7acb93a8cca5204397d97a299e63fcc\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"ac0d6c16f21dcf222e252ee28259c7c505e1ba31fd4619944e39616fd7a5c1e8\""
Aug 12 23:59:55.559499 containerd[1506]: time="2025-08-12T23:59:55.559465062Z" level=info msg="StartContainer for \"ac0d6c16f21dcf222e252ee28259c7c505e1ba31fd4619944e39616fd7a5c1e8\""
Aug 12 23:59:55.561038 containerd[1506]: time="2025-08-12T23:59:55.560999360Z" level=info msg="connecting to shim ac0d6c16f21dcf222e252ee28259c7c505e1ba31fd4619944e39616fd7a5c1e8" address="unix:///run/containerd/s/38ecee549daf5f0eae0042259ae5a5e7d5db771c4d5c7bb2b30d71c86a574f99" protocol=ttrpc version=3
Aug 12 23:59:55.591000 systemd[1]: Started cri-containerd-ac0d6c16f21dcf222e252ee28259c7c505e1ba31fd4619944e39616fd7a5c1e8.scope - libcontainer container ac0d6c16f21dcf222e252ee28259c7c505e1ba31fd4619944e39616fd7a5c1e8.
Aug 12 23:59:55.612283 systemd[1]: cri-containerd-23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd.scope: Deactivated successfully.
Aug 12 23:59:55.612713 systemd[1]: cri-containerd-23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd.scope: Consumed 11.531s CPU time, 68.7M memory peak, 3.6M read from disk.
Aug 12 23:59:55.619575 containerd[1506]: time="2025-08-12T23:59:55.618810056Z" level=info msg="TaskExit event in podsandbox handler container_id:\"23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd\" id:\"23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd\" pid:2572 exit_status:1 exited_at:{seconds:1755043195 nanos:616866673}"
Aug 12 23:59:55.619850 containerd[1506]: time="2025-08-12T23:59:55.619072099Z" level=info msg="received exit event container_id:\"23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd\" id:\"23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd\" pid:2572 exit_status:1 exited_at:{seconds:1755043195 nanos:616866673}"
Aug 12 23:59:55.684193 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd-rootfs.mount: Deactivated successfully.
Aug 12 23:59:55.702126 containerd[1506]: time="2025-08-12T23:59:55.702004778Z" level=info msg="StartContainer for \"ac0d6c16f21dcf222e252ee28259c7c505e1ba31fd4619944e39616fd7a5c1e8\" returns successfully"
Aug 12 23:59:56.463504 kubelet[2715]: E0812 23:59:56.460376 2715 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:53148->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4372-1-0-9-13fe44d47a.185b2a7b3b995ad7 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4372-1-0-9-13fe44d47a,UID:a544c3f042ca3d854dd7922b10173e2a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-9-13fe44d47a,},FirstTimestamp:2025-08-12 23:59:48.560190167 +0000 UTC m=+923.619752720,LastTimestamp:2025-08-12 23:59:48.560190167 +0000 UTC m=+923.619752720,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-9-13fe44d47a,}"
Aug 12 23:59:56.531412 kubelet[2715]: I0812 23:59:56.531376 2715 scope.go:117] "RemoveContainer" containerID="23bb8c20a2225635a3b99a1166afdfa83d83099a06c33aa14b79ca02b22a47dd"
Aug 12 23:59:56.534160 containerd[1506]: time="2025-08-12T23:59:56.534089360Z" level=info msg="CreateContainer within sandbox \"f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Aug 12 23:59:56.547502 containerd[1506]: time="2025-08-12T23:59:56.547454801Z" level=info msg="Container c696d35c1c87b082817c4e2c1f6c75b8e8bd2db138d43b4e23609d6ae709e247: CDI devices from CRI Config.CDIDevices: []"
Aug 12 23:59:56.558567 containerd[1506]: time="2025-08-12T23:59:56.558511094Z" level=info msg="CreateContainer within sandbox \"f03a57ded5400be9b1f484a9c3198b7e388b992cae5d0a96b0ae57a0271f7629\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"c696d35c1c87b082817c4e2c1f6c75b8e8bd2db138d43b4e23609d6ae709e247\""
Aug 12 23:59:56.559607 containerd[1506]: time="2025-08-12T23:59:56.559037100Z" level=info msg="StartContainer for \"c696d35c1c87b082817c4e2c1f6c75b8e8bd2db138d43b4e23609d6ae709e247\""
Aug 12 23:59:56.561162 containerd[1506]: time="2025-08-12T23:59:56.561114205Z" level=info msg="connecting to shim c696d35c1c87b082817c4e2c1f6c75b8e8bd2db138d43b4e23609d6ae709e247" address="unix:///run/containerd/s/a8520bbacd5295a36db3101ab2017144523800f93aa489e6d09253939f80adc4" protocol=ttrpc version=3
Aug 12 23:59:56.587769 systemd[1]: Started cri-containerd-c696d35c1c87b082817c4e2c1f6c75b8e8bd2db138d43b4e23609d6ae709e247.scope - libcontainer container c696d35c1c87b082817c4e2c1f6c75b8e8bd2db138d43b4e23609d6ae709e247.
Aug 12 23:59:56.662563 containerd[1506]: time="2025-08-12T23:59:56.662368144Z" level=info msg="StartContainer for \"c696d35c1c87b082817c4e2c1f6c75b8e8bd2db138d43b4e23609d6ae709e247\" returns successfully"
Aug 12 23:59:57.022759 containerd[1506]: time="2025-08-12T23:59:57.022693044Z" level=info msg="TaskExit event in podsandbox handler container_id:\"19dd4802aef9a2278a4c8001b706649c2528bbfafc2b91c7d2bde8cc49cbafdc\" id:\"f188016750d9acf37494d6db8581ad67775114bf43c5fe6ee493e5ae200512be\" pid:10529 exit_status:1 exited_at:{seconds:1755043197 nanos:21449469}"
Aug 12 23:59:58.019473 containerd[1506]: time="2025-08-12T23:59:58.019309287Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1cccb6bc47c00d5e1d4ee67f317e7ac919f9017670873c5bb6119b773b41640b\" id:\"008da68fb29eaff8f5723ad3fc34d442c740f7af12a537a472cdbd7045c5c240\" pid:10550 exited_at:{seconds:1755043198 nanos:18613559}"