Sep 16 04:22:19.809268 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 16 04:22:19.809298 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 16 03:05:48 -00 2025
Sep 16 04:22:19.809309 kernel: KASLR enabled
Sep 16 04:22:19.809315 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Sep 16 04:22:19.809321 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Sep 16 04:22:19.809326 kernel: random: crng init done
Sep 16 04:22:19.809333 kernel: secureboot: Secure boot disabled
Sep 16 04:22:19.809339 kernel: ACPI: Early table checksum verification disabled
Sep 16 04:22:19.809345 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Sep 16 04:22:19.809350 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Sep 16 04:22:19.809358 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809364 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809370 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809376 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809383 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809390 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809396 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809402 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809408 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 16 04:22:19.809414 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 16 04:22:19.809421 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Sep 16 04:22:19.809427 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 16 04:22:19.809433 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Sep 16 04:22:19.809439 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Sep 16 04:22:19.809445 kernel: Zone ranges:
Sep 16 04:22:19.809451 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Sep 16 04:22:19.809458 kernel: DMA32 empty
Sep 16 04:22:19.809465 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Sep 16 04:22:19.809470 kernel: Device empty
Sep 16 04:22:19.809476 kernel: Movable zone start for each node
Sep 16 04:22:19.809482 kernel: Early memory node ranges
Sep 16 04:22:19.809488 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Sep 16 04:22:19.809508 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Sep 16 04:22:19.809515 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Sep 16 04:22:19.809521 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Sep 16 04:22:19.809527 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Sep 16 04:22:19.809533 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Sep 16 04:22:19.809539 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Sep 16 04:22:19.809547 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Sep 16 04:22:19.809554 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Sep 16 04:22:19.809563 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Sep 16 04:22:19.809569 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Sep 16 04:22:19.809576 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Sep 16 04:22:19.809594 kernel: psci: probing for conduit method from ACPI.
Sep 16 04:22:19.809601 kernel: psci: PSCIv1.1 detected in firmware.
Sep 16 04:22:19.809607 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 16 04:22:19.809614 kernel: psci: Trusted OS migration not required
Sep 16 04:22:19.809620 kernel: psci: SMC Calling Convention v1.1
Sep 16 04:22:19.809627 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 16 04:22:19.809633 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 16 04:22:19.809639 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 16 04:22:19.809646 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 16 04:22:19.809653 kernel: Detected PIPT I-cache on CPU0
Sep 16 04:22:19.809659 kernel: CPU features: detected: GIC system register CPU interface
Sep 16 04:22:19.809668 kernel: CPU features: detected: Spectre-v4
Sep 16 04:22:19.809674 kernel: CPU features: detected: Spectre-BHB
Sep 16 04:22:19.809681 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 16 04:22:19.809687 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 16 04:22:19.809693 kernel: CPU features: detected: ARM erratum 1418040
Sep 16 04:22:19.809700 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 16 04:22:19.809706 kernel: alternatives: applying boot alternatives
Sep 16 04:22:19.809714 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=eff5cc3c399cf6fc52e3071751a09276871b099078da6d1b1a498405d04a9313
Sep 16 04:22:19.809721 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 16 04:22:19.809727 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 16 04:22:19.809735 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 16 04:22:19.809741 kernel: Fallback order for Node 0: 0
Sep 16 04:22:19.809748 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Sep 16 04:22:19.809754 kernel: Policy zone: Normal
Sep 16 04:22:19.809760 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 16 04:22:19.809767 kernel: software IO TLB: area num 2.
Sep 16 04:22:19.809773 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Sep 16 04:22:19.809780 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 16 04:22:19.809786 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 16 04:22:19.809794 kernel: rcu: RCU event tracing is enabled.
Sep 16 04:22:19.809800 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 16 04:22:19.809807 kernel: Trampoline variant of Tasks RCU enabled.
Sep 16 04:22:19.809814 kernel: Tracing variant of Tasks RCU enabled.
Sep 16 04:22:19.809821 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 16 04:22:19.809827 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 16 04:22:19.809834 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 16 04:22:19.809840 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 16 04:22:19.809846 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 16 04:22:19.809853 kernel: GICv3: 256 SPIs implemented
Sep 16 04:22:19.809859 kernel: GICv3: 0 Extended SPIs implemented
Sep 16 04:22:19.809865 kernel: Root IRQ handler: gic_handle_irq
Sep 16 04:22:19.809872 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 16 04:22:19.809878 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 16 04:22:19.809885 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 16 04:22:19.809892 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 16 04:22:19.809899 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Sep 16 04:22:19.809905 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Sep 16 04:22:19.809912 kernel: GICv3: using LPI property table @0x0000000100120000
Sep 16 04:22:19.809919 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Sep 16 04:22:19.809925 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 16 04:22:19.809932 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 16 04:22:19.809938 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 16 04:22:19.809945 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 16 04:22:19.809951 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 16 04:22:19.809958 kernel: Console: colour dummy device 80x25
Sep 16 04:22:19.809966 kernel: ACPI: Core revision 20240827
Sep 16 04:22:19.809973 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 16 04:22:19.809979 kernel: pid_max: default: 32768 minimum: 301
Sep 16 04:22:19.809986 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 16 04:22:19.809993 kernel: landlock: Up and running.
Sep 16 04:22:19.809999 kernel: SELinux: Initializing.
Sep 16 04:22:19.810006 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 16 04:22:19.810013 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 16 04:22:19.810019 kernel: rcu: Hierarchical SRCU implementation.
Sep 16 04:22:19.810027 kernel: rcu: Max phase no-delay instances is 400.
Sep 16 04:22:19.810034 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 16 04:22:19.810041 kernel: Remapping and enabling EFI services.
Sep 16 04:22:19.810048 kernel: smp: Bringing up secondary CPUs ...
Sep 16 04:22:19.810054 kernel: Detected PIPT I-cache on CPU1
Sep 16 04:22:19.810061 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 16 04:22:19.810068 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Sep 16 04:22:19.810075 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 16 04:22:19.810081 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 16 04:22:19.810090 kernel: smp: Brought up 1 node, 2 CPUs
Sep 16 04:22:19.810102 kernel: SMP: Total of 2 processors activated.
Sep 16 04:22:19.810109 kernel: CPU: All CPU(s) started at EL1
Sep 16 04:22:19.810117 kernel: CPU features: detected: 32-bit EL0 Support
Sep 16 04:22:19.810124 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 16 04:22:19.810131 kernel: CPU features: detected: Common not Private translations
Sep 16 04:22:19.810138 kernel: CPU features: detected: CRC32 instructions
Sep 16 04:22:19.810145 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 16 04:22:19.810154 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 16 04:22:19.810161 kernel: CPU features: detected: LSE atomic instructions
Sep 16 04:22:19.810168 kernel: CPU features: detected: Privileged Access Never
Sep 16 04:22:19.810175 kernel: CPU features: detected: RAS Extension Support
Sep 16 04:22:19.810182 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 16 04:22:19.810189 kernel: alternatives: applying system-wide alternatives
Sep 16 04:22:19.810196 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 16 04:22:19.810203 kernel: Memory: 3859556K/4096000K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38976K init, 1038K bss, 214964K reserved, 16384K cma-reserved)
Sep 16 04:22:19.810211 kernel: devtmpfs: initialized
Sep 16 04:22:19.810219 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 16 04:22:19.810226 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 16 04:22:19.810233 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 16 04:22:19.810240 kernel: 0 pages in range for non-PLT usage
Sep 16 04:22:19.810247 kernel: 508560 pages in range for PLT usage
Sep 16 04:22:19.810254 kernel: pinctrl core: initialized pinctrl subsystem
Sep 16 04:22:19.810261 kernel: SMBIOS 3.0.0 present.
Sep 16 04:22:19.810269 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Sep 16 04:22:19.810276 kernel: DMI: Memory slots populated: 1/1
Sep 16 04:22:19.810284 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 16 04:22:19.810291 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 16 04:22:19.810298 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 16 04:22:19.810305 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 16 04:22:19.810313 kernel: audit: initializing netlink subsys (disabled)
Sep 16 04:22:19.810320 kernel: audit: type=2000 audit(0.018:1): state=initialized audit_enabled=0 res=1
Sep 16 04:22:19.810327 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 16 04:22:19.810334 kernel: cpuidle: using governor menu
Sep 16 04:22:19.810341 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 16 04:22:19.810350 kernel: ASID allocator initialised with 32768 entries
Sep 16 04:22:19.810357 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 16 04:22:19.810364 kernel: Serial: AMBA PL011 UART driver
Sep 16 04:22:19.810371 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 16 04:22:19.810378 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 16 04:22:19.810385 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 16 04:22:19.810392 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 16 04:22:19.810399 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 16 04:22:19.810407 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 16 04:22:19.810415 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 16 04:22:19.810423 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 16 04:22:19.810430 kernel: ACPI: Added _OSI(Module Device)
Sep 16 04:22:19.810437 kernel: ACPI: Added _OSI(Processor Device)
Sep 16 04:22:19.810444 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 16 04:22:19.810451 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 16 04:22:19.810458 kernel: ACPI: Interpreter enabled
Sep 16 04:22:19.810465 kernel: ACPI: Using GIC for interrupt routing
Sep 16 04:22:19.810472 kernel: ACPI: MCFG table detected, 1 entries
Sep 16 04:22:19.810487 kernel: ACPI: CPU0 has been hot-added
Sep 16 04:22:19.810527 kernel: ACPI: CPU1 has been hot-added
Sep 16 04:22:19.810535 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 16 04:22:19.810546 kernel: printk: legacy console [ttyAMA0] enabled
Sep 16 04:22:19.810558 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 16 04:22:19.810725 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 16 04:22:19.810793 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 16 04:22:19.810855 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 16 04:22:19.810913 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 16 04:22:19.810969 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 16 04:22:19.810978 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 16 04:22:19.810986 kernel: PCI host bridge to bus 0000:00
Sep 16 04:22:19.811050 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 16 04:22:19.811104 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 16 04:22:19.811157 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 16 04:22:19.811211 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 16 04:22:19.811293 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 16 04:22:19.811365 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Sep 16 04:22:19.811426 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Sep 16 04:22:19.811501 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 16 04:22:19.813625 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.813769 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Sep 16 04:22:19.813834 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 16 04:22:19.813893 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Sep 16 04:22:19.813951 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Sep 16 04:22:19.814021 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.814080 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Sep 16 04:22:19.814165 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 16 04:22:19.814236 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Sep 16 04:22:19.814310 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.814379 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Sep 16 04:22:19.814438 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 16 04:22:19.814507 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Sep 16 04:22:19.814572 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Sep 16 04:22:19.814991 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.815088 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Sep 16 04:22:19.815151 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 16 04:22:19.815209 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Sep 16 04:22:19.815266 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Sep 16 04:22:19.815331 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.815389 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Sep 16 04:22:19.815449 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 16 04:22:19.815557 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 16 04:22:19.816697 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Sep 16 04:22:19.816783 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.816843 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Sep 16 04:22:19.816914 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 16 04:22:19.816973 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Sep 16 04:22:19.817031 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Sep 16 04:22:19.817108 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.817167 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Sep 16 04:22:19.817224 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 16 04:22:19.817282 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Sep 16 04:22:19.817339 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Sep 16 04:22:19.817404 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.817463 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Sep 16 04:22:19.817541 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 16 04:22:19.817739 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Sep 16 04:22:19.817815 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 16 04:22:19.817874 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Sep 16 04:22:19.817932 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 16 04:22:19.817991 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Sep 16 04:22:19.818061 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Sep 16 04:22:19.818119 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Sep 16 04:22:19.818197 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 16 04:22:19.818258 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Sep 16 04:22:19.818318 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 16 04:22:19.818377 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Sep 16 04:22:19.818445 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Sep 16 04:22:19.818525 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Sep 16 04:22:19.819658 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Sep 16 04:22:19.819745 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Sep 16 04:22:19.819807 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Sep 16 04:22:19.819877 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Sep 16 04:22:19.819937 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Sep 16 04:22:19.820012 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Sep 16 04:22:19.820073 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Sep 16 04:22:19.820147 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Sep 16 04:22:19.820207 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Sep 16 04:22:19.820266 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 16 04:22:19.820333 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 16 04:22:19.820394 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Sep 16 04:22:19.820456 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Sep 16 04:22:19.820537 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Sep 16 04:22:19.820624 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Sep 16 04:22:19.820685 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Sep 16 04:22:19.820742 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Sep 16 04:22:19.820804 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Sep 16 04:22:19.820862 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Sep 16 04:22:19.820924 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Sep 16 04:22:19.820985 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 16 04:22:19.821043 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Sep 16 04:22:19.821099 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Sep 16 04:22:19.821160 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 16 04:22:19.821219 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Sep 16 04:22:19.821277 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Sep 16 04:22:19.821337 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 16 04:22:19.821397 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Sep 16 04:22:19.821454 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Sep 16 04:22:19.821526 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 16 04:22:19.822321 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Sep 16 04:22:19.822409 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Sep 16 04:22:19.822479 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 16 04:22:19.822639 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Sep 16 04:22:19.822709 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Sep 16 04:22:19.822773 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 16 04:22:19.822831 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Sep 16 04:22:19.822889 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Sep 16 04:22:19.822951 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 16 04:22:19.823014 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Sep 16 04:22:19.823071 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Sep 16 04:22:19.823131 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Sep 16 04:22:19.823189 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Sep 16 04:22:19.823248 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Sep 16 04:22:19.823305 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Sep 16 04:22:19.823365 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Sep 16 04:22:19.823424 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Sep 16 04:22:19.823484 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Sep 16 04:22:19.823562 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Sep 16 04:22:19.823648 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Sep 16 04:22:19.823710 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Sep 16 04:22:19.823769 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Sep 16 04:22:19.823827 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Sep 16 04:22:19.823884 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Sep 16 04:22:19.823945 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Sep 16 04:22:19.825345 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Sep 16 04:22:19.825422 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Sep 16 04:22:19.825485 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Sep 16 04:22:19.825626 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Sep 16 04:22:19.825698 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Sep 16 04:22:19.825758 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Sep 16 04:22:19.825818 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Sep 16 04:22:19.825883 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Sep 16 04:22:19.825944 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Sep 16 04:22:19.826002 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Sep 16 04:22:19.826062 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Sep 16 04:22:19.826123 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Sep 16 04:22:19.826187 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Sep 16 04:22:19.826245 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Sep 16 04:22:19.826306 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Sep 16 04:22:19.826364 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Sep 16 04:22:19.827734 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Sep 16 04:22:19.827801 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Sep 16 04:22:19.827863 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Sep 16 04:22:19.827938 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Sep 16 04:22:19.828023 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Sep 16 04:22:19.828086 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Sep 16 04:22:19.828147 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Sep 16 04:22:19.828205 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Sep 16 04:22:19.828268 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Sep 16 04:22:19.828335 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Sep 16 04:22:19.828395 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 16 04:22:19.828465 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Sep 16 04:22:19.828574 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 16 04:22:19.828974 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 16 04:22:19.829042 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Sep 16 04:22:19.829101 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 16 04:22:19.829168 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Sep 16 04:22:19.829231 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 16 04:22:19.829295 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 16 04:22:19.829353 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Sep 16 04:22:19.829412 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 16 04:22:19.829478 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Sep 16 04:22:19.829561 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Sep 16 04:22:19.829684 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 16 04:22:19.829750 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 16 04:22:19.829814 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Sep 16 04:22:19.829873 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 16 04:22:19.829940 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Sep 16 04:22:19.830001 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 16 04:22:19.830059 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 16 04:22:19.830116 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Sep 16 04:22:19.830175 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 16 04:22:19.830242 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Sep 16 04:22:19.830303 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 16 04:22:19.830361 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 16 04:22:19.830419 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 16 04:22:19.830477 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 16 04:22:19.830611 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Sep 16 04:22:19.832270 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Sep 16 04:22:19.832344 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 16 04:22:19.832410 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 16 04:22:19.832479 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Sep 16 04:22:19.832623 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 16 04:22:19.832706 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Sep 16 04:22:19.832768 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Sep 16 04:22:19.832829 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Sep 16 04:22:19.832894 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 16 04:22:19.832954 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 16 04:22:19.833015 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Sep 16 04:22:19.833073 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 16 04:22:19.833137 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 16 04:22:19.833195 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 16 04:22:19.833252 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Sep 16 04:22:19.833309 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 16 04:22:19.833370 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 16 04:22:19.833429 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Sep 16 04:22:19.833488 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Sep 16 04:22:19.833571 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 16 04:22:19.834349 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 16 04:22:19.834480 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 16 04:22:19.834638 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 16 04:22:19.834714 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 16 04:22:19.834770 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Sep 16 04:22:19.834832 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 16 04:22:19.834895 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Sep 16 04:22:19.834950 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Sep 16 04:22:19.835004 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 16 04:22:19.835065 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Sep 16 04:22:19.835119 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Sep 16 04:22:19.835174 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 16 04:22:19.835245 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Sep 16 04:22:19.835300 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Sep 16 04:22:19.835356 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 16 04:22:19.835419 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Sep 16 04:22:19.835474 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Sep 16 04:22:19.835545 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 16 04:22:19.835650 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Sep 16 04:22:19.835716 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Sep 16 04:22:19.835771 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 16 04:22:19.835832 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Sep 16 04:22:19.835886 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Sep 16 04:22:19.835940 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 16 04:22:19.836001 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Sep 16 04:22:19.836057 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff]
Sep 16 04:22:19.836110 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 16 04:22:19.836175 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff]
Sep 16 04:22:19.836228 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff]
Sep 16 04:22:19.836281 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 16 04:22:19.836291 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35
Sep 16 04:22:19.836299 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36
Sep 16 04:22:19.836307 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37
Sep 16 04:22:19.836316 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38
Sep 16 04:22:19.836323 kernel: iommu: Default domain type: Translated
Sep 16 04:22:19.836331 kernel: iommu: DMA domain TLB invalidation policy: strict mode
Sep 16 04:22:19.836338 kernel: efivars: Registered efivars operations
Sep 16 04:22:19.836345 kernel: vgaarb: loaded
Sep 16 04:22:19.836353 kernel: clocksource: Switched to clocksource arch_sys_counter
Sep 16 04:22:19.836360 kernel: VFS: Disk quotas dquot_6.6.0
Sep 16 04:22:19.836368 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Sep 16 04:22:19.836375 kernel: pnp: PnP ACPI init
Sep 16 04:22:19.836446 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved
Sep 16 04:22:19.836457 kernel: pnp: PnP ACPI: found 1 devices
Sep 16 04:22:19.836465 kernel: NET: Registered PF_INET protocol family
Sep 16 04:22:19.836472 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
Sep 16 04:22:19.836480 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
Sep 16 04:22:19.836487 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Sep 16 04:22:19.836537 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
Sep 16 04:22:19.836546 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
Sep 16 04:22:19.836556 kernel: TCP: Hash tables configured (established 32768 bind 32768)
Sep 16 04:22:19.836563 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 16 04:22:19.836571 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
Sep 16 04:22:19.836578 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Sep 16 04:22:19.836675 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002)
Sep 16 04:22:19.836688 kernel: PCI: CLS 0 bytes, default 64
Sep 16 04:22:19.836695 kernel: kvm [1]: HYP mode not available
Sep 16 04:22:19.836703 kernel: Initialise system trusted keyrings
Sep 16 04:22:19.836710 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
Sep 16 04:22:19.836720 kernel: Key type asymmetric registered
Sep 16 04:22:19.836727 kernel: Asymmetric key parser 'x509' registered
Sep 16 04:22:19.836734 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249)
Sep 16 04:22:19.836742 kernel: io scheduler mq-deadline registered
Sep 16 04:22:19.836749 kernel: io scheduler kyber registered
Sep 16 04:22:19.836756 kernel: io scheduler bfq registered
Sep 16 04:22:19.836765 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37
Sep 16 04:22:19.836831 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50
Sep 16 04:22:19.836891 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50
Sep 16 04:22:19.836954 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.837016 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51
Sep 16 04:22:19.837076 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51
Sep 16 04:22:19.837134 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.837197 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52
Sep 16 04:22:19.837256 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52
Sep 16 04:22:19.837315 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.837380 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53
Sep 16 04:22:19.837443 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53
Sep 16 04:22:19.837514 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.837590 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54
Sep 16 04:22:19.837659 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54
Sep 16 04:22:19.837719 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.837784 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55
Sep 16 04:22:19.837842 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55
Sep 16 04:22:19.837903 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.837964 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56
Sep 16 04:22:19.838023 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56
Sep 16 04:22:19.838081 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.838142 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57
Sep 16 04:22:19.838201 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57
Sep 16 04:22:19.838258 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.838268 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38
Sep 16 04:22:19.838330 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58
Sep 16 04:22:19.838388 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58
Sep 16 04:22:19.838445 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+
Sep 16 04:22:19.838455 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0
Sep 16 04:22:19.838463 kernel: ACPI: button: Power Button [PWRB]
Sep 16 04:22:19.838470 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36
Sep 16 04:22:19.838549 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002)
Sep 16 04:22:19.838628 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002)
Sep 16 04:22:19.838644 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Sep 16 04:22:19.838652 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35
Sep 16 04:22:19.838713 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001)
Sep 16 04:22:19.838723 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A
Sep 16 04:22:19.838732 kernel: thunder_xcv, ver 1.0
Sep 16 04:22:19.838739 kernel: thunder_bgx, ver 1.0
Sep 16 04:22:19.838747 kernel: nicpf, ver 1.0
Sep 16 04:22:19.838754 kernel: nicvf, ver 1.0
Sep 16 04:22:19.838829 kernel: rtc-efi rtc-efi.0: registered as rtc0
Sep 16 04:22:19.838889 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-16T04:22:19 UTC (1757996539)
Sep 16 04:22:19.838898 kernel: hid: raw HID events driver (C) Jiri Kosina
Sep 16 04:22:19.838906 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available
Sep 16 04:22:19.838913 kernel: watchdog: NMI not fully supported
Sep 16 04:22:19.838921 kernel: watchdog: Hard watchdog permanently disabled
Sep 16 04:22:19.838928 kernel: NET: Registered PF_INET6 protocol family
Sep 16 04:22:19.838936 kernel: Segment Routing with IPv6
Sep 16 04:22:19.838943 kernel: In-situ OAM (IOAM) with IPv6
Sep 16 04:22:19.838952 kernel: NET: Registered PF_PACKET protocol family
Sep 16 04:22:19.838960 kernel: Key type dns_resolver registered
Sep 16 04:22:19.838967 kernel: registered taskstats version 1
Sep 16 04:22:19.838974 kernel: Loading compiled-in X.509 certificates
Sep 16 04:22:19.838982 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 99eb88579c3d58869b2224a85ec8efa5647af805'
Sep 16 04:22:19.838989 kernel: Demotion targets for Node 0: null
Sep 16 04:22:19.838998 kernel: Key type .fscrypt registered
Sep 16 04:22:19.839007 kernel: Key type fscrypt-provisioning registered
Sep 16 04:22:19.839015 kernel: ima: No TPM chip found, activating TPM-bypass!
Sep 16 04:22:19.839025 kernel: ima: Allocated hash algorithm: sha1
Sep 16 04:22:19.839033 kernel: ima: No architecture policies found
Sep 16 04:22:19.839041 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Sep 16 04:22:19.839049 kernel: clk: Disabling unused clocks
Sep 16 04:22:19.839057 kernel: PM: genpd: Disabling unused power domains
Sep 16 04:22:19.839065 kernel: Warning: unable to open an initial console.
Sep 16 04:22:19.839073 kernel: Freeing unused kernel memory: 38976K
Sep 16 04:22:19.839081 kernel: Run /init as init process
Sep 16 04:22:19.839089 kernel: with arguments:
Sep 16 04:22:19.839097 kernel: /init
Sep 16 04:22:19.839104 kernel: with environment:
Sep 16 04:22:19.839112 kernel: HOME=/
Sep 16 04:22:19.839119 kernel: TERM=linux
Sep 16 04:22:19.839126 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Sep 16 04:22:19.839134 systemd[1]: Successfully made /usr/ read-only.
Sep 16 04:22:19.839145 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 16 04:22:19.839153 systemd[1]: Detected virtualization kvm.
Sep 16 04:22:19.839162 systemd[1]: Detected architecture arm64.
Sep 16 04:22:19.839170 systemd[1]: Running in initrd.
Sep 16 04:22:19.839177 systemd[1]: No hostname configured, using default hostname.
Sep 16 04:22:19.839185 systemd[1]: Hostname set to .
Sep 16 04:22:19.839195 systemd[1]: Initializing machine ID from VM UUID.
Sep 16 04:22:19.839203 systemd[1]: Queued start job for default target initrd.target.
Sep 16 04:22:19.839210 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 16 04:22:19.839218 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 16 04:22:19.839229 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 16 04:22:19.839237 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 16 04:22:19.839245 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 16 04:22:19.839253 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 16 04:22:19.839262 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 16 04:22:19.839270 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 16 04:22:19.839278 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 16 04:22:19.839287 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 16 04:22:19.839295 systemd[1]: Reached target paths.target - Path Units.
Sep 16 04:22:19.839303 systemd[1]: Reached target slices.target - Slice Units.
Sep 16 04:22:19.839311 systemd[1]: Reached target swap.target - Swaps.
Sep 16 04:22:19.839318 systemd[1]: Reached target timers.target - Timer Units.
Sep 16 04:22:19.839326 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 16 04:22:19.839334 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 16 04:22:19.839342 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 16 04:22:19.839351 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 16 04:22:19.839359 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 16 04:22:19.839367 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 16 04:22:19.839375 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 16 04:22:19.839383 systemd[1]: Reached target sockets.target - Socket Units.
Sep 16 04:22:19.839391 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 16 04:22:19.839398 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 16 04:22:19.839406 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 16 04:22:19.839414 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 16 04:22:19.839424 systemd[1]: Starting systemd-fsck-usr.service...
Sep 16 04:22:19.839432 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 16 04:22:19.839440 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 16 04:22:19.839448 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 16 04:22:19.839456 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 16 04:22:19.839465 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 16 04:22:19.839474 systemd[1]: Finished systemd-fsck-usr.service.
Sep 16 04:22:19.839538 systemd-journald[245]: Collecting audit messages is disabled.
Sep 16 04:22:19.839567 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 16 04:22:19.839576 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 16 04:22:19.839605 kernel: Bridge firewalling registered
Sep 16 04:22:19.839615 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 16 04:22:19.839623 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 16 04:22:19.839631 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 16 04:22:19.839639 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 16 04:22:19.839648 systemd-journald[245]: Journal started
Sep 16 04:22:19.839670 systemd-journald[245]: Runtime Journal (/run/log/journal/b57b5e16e7aa4590be54f3c4063c0fab) is 8M, max 76.5M, 68.5M free.
Sep 16 04:22:19.812987 systemd-modules-load[246]: Inserted module 'overlay'
Sep 16 04:22:19.828408 systemd-modules-load[246]: Inserted module 'br_netfilter'
Sep 16 04:22:19.846610 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 16 04:22:19.848761 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 16 04:22:19.850104 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 16 04:22:19.853571 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 16 04:22:19.867139 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 16 04:22:19.869881 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 16 04:22:19.874846 systemd-tmpfiles[267]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 16 04:22:19.879332 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 16 04:22:19.881749 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 16 04:22:19.882524 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 16 04:22:19.884225 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 16 04:22:19.913663 dracut-cmdline[284]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=eff5cc3c399cf6fc52e3071751a09276871b099078da6d1b1a498405d04a9313
Sep 16 04:22:19.927718 systemd-resolved[283]: Positive Trust Anchors:
Sep 16 04:22:19.927741 systemd-resolved[283]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 16 04:22:19.927772 systemd-resolved[283]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 16 04:22:19.933056 systemd-resolved[283]: Defaulting to hostname 'linux'.
Sep 16 04:22:19.934057 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 16 04:22:19.935941 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 16 04:22:20.010653 kernel: SCSI subsystem initialized
Sep 16 04:22:20.015620 kernel: Loading iSCSI transport class v2.0-870.
Sep 16 04:22:20.023642 kernel: iscsi: registered transport (tcp)
Sep 16 04:22:20.036634 kernel: iscsi: registered transport (qla4xxx)
Sep 16 04:22:20.036685 kernel: QLogic iSCSI HBA Driver
Sep 16 04:22:20.059461 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 16 04:22:20.090153 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 16 04:22:20.093928 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 16 04:22:20.150332 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 16 04:22:20.153647 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 16 04:22:20.217643 kernel: raid6: neonx8 gen() 15644 MB/s
Sep 16 04:22:20.234619 kernel: raid6: neonx4 gen() 15710 MB/s
Sep 16 04:22:20.252621 kernel: raid6: neonx2 gen() 13116 MB/s
Sep 16 04:22:20.268633 kernel: raid6: neonx1 gen() 8969 MB/s
Sep 16 04:22:20.285655 kernel: raid6: int64x8 gen() 6868 MB/s
Sep 16 04:22:20.302631 kernel: raid6: int64x4 gen() 7322 MB/s
Sep 16 04:22:20.319644 kernel: raid6: int64x2 gen() 6077 MB/s
Sep 16 04:22:20.336637 kernel: raid6: int64x1 gen() 5028 MB/s
Sep 16 04:22:20.336693 kernel: raid6: using algorithm neonx4 gen() 15710 MB/s
Sep 16 04:22:20.353645 kernel: raid6: .... xor() 12303 MB/s, rmw enabled
Sep 16 04:22:20.353688 kernel: raid6: using neon recovery algorithm
Sep 16 04:22:20.358931 kernel: xor: measuring software checksum speed
Sep 16 04:22:20.358968 kernel: 8regs : 21607 MB/sec
Sep 16 04:22:20.358988 kernel: 32regs : 20812 MB/sec
Sep 16 04:22:20.359757 kernel: arm64_neon : 28061 MB/sec
Sep 16 04:22:20.359792 kernel: xor: using function: arm64_neon (28061 MB/sec)
Sep 16 04:22:20.413657 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 16 04:22:20.422421 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 16 04:22:20.425055 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 16 04:22:20.454694 systemd-udevd[492]: Using default interface naming scheme 'v255'.
Sep 16 04:22:20.458952 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 16 04:22:20.463095 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 16 04:22:20.492960 dracut-pre-trigger[502]: rd.md=0: removing MD RAID activation
Sep 16 04:22:20.520649 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 16 04:22:20.522987 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 16 04:22:20.589296 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 16 04:22:20.593055 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 16 04:22:20.677612 kernel: ACPI: bus type USB registered
Sep 16 04:22:20.679049 kernel: usbcore: registered new interface driver usbfs
Sep 16 04:22:20.679087 kernel: usbcore: registered new interface driver hub
Sep 16 04:22:20.679097 kernel: usbcore: registered new device driver usb
Sep 16 04:22:20.679612 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Sep 16 04:22:20.680716 kernel: scsi host0: Virtio SCSI HBA
Sep 16 04:22:20.692713 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 16 04:22:20.692794 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 16 04:22:20.723209 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 16 04:22:20.723283 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 16 04:22:20.727322 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 16 04:22:20.731435 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 16 04:22:20.731661 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 16 04:22:20.730709 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 16 04:22:20.733670 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Sep 16 04:22:20.733852 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Sep 16 04:22:20.737606 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Sep 16 04:22:20.740676 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Sep 16 04:22:20.740850 kernel: hub 1-0:1.0: USB hub found Sep 16 04:22:20.742116 kernel: hub 1-0:1.0: 4 ports detected Sep 16 04:22:20.742269 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Sep 16 04:22:20.743664 kernel: hub 2-0:1.0: USB hub found Sep 16 04:22:20.746617 kernel: hub 2-0:1.0: 4 ports detected Sep 16 04:22:20.746799 kernel: sd 0:0:0:1: Power-on or device reset occurred Sep 16 04:22:20.746906 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Sep 16 04:22:20.746983 kernel: sd 0:0:0:1: [sda] Write Protect is off Sep 16 04:22:20.747052 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Sep 16 04:22:20.747121 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Sep 16 04:22:20.758384 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Sep 16 04:22:20.758518 kernel: GPT:17805311 != 80003071 Sep 16 04:22:20.758550 kernel: GPT:Alternate GPT header not at the end of the disk. Sep 16 04:22:20.760375 kernel: GPT:17805311 != 80003071 Sep 16 04:22:20.760436 kernel: GPT: Use GNU Parted to correct GPT errors. Sep 16 04:22:20.760463 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:22:20.760512 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Sep 16 04:22:20.767900 kernel: sr 0:0:0:0: Power-on or device reset occurred Sep 16 04:22:20.770230 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Sep 16 04:22:20.770427 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Sep 16 04:22:20.770830 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:22:20.774620 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Sep 16 04:22:20.824356 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Sep 16 04:22:20.836009 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Sep 16 04:22:20.850526 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 16 04:22:20.859706 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Sep 16 04:22:20.860435 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Sep 16 04:22:20.863797 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Sep 16 04:22:20.883551 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Sep 16 04:22:20.884945 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:22:20.888703 disk-uuid[597]: Primary Header is updated. Sep 16 04:22:20.888703 disk-uuid[597]: Secondary Entries is updated. Sep 16 04:22:20.888703 disk-uuid[597]: Secondary Header is updated. Sep 16 04:22:20.886522 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:22:20.889340 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 04:22:20.894155 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... 
Sep 16 04:22:20.899653 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:22:20.929609 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Sep 16 04:22:20.984644 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Sep 16 04:22:21.116529 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Sep 16 04:22:21.116608 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Sep 16 04:22:21.117609 kernel: usbcore: registered new interface driver usbhid Sep 16 04:22:21.117647 kernel: usbhid: USB HID core driver Sep 16 04:22:21.218682 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Sep 16 04:22:21.344825 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Sep 16 04:22:21.399130 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Sep 16 04:22:21.924625 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Sep 16 04:22:21.924912 disk-uuid[601]: The operation has completed successfully. Sep 16 04:22:21.977035 systemd[1]: disk-uuid.service: Deactivated successfully. Sep 16 04:22:21.977150 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Sep 16 04:22:22.007929 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Sep 16 04:22:22.026645 sh[626]: Success Sep 16 04:22:22.043286 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Sep 16 04:22:22.043344 kernel: device-mapper: uevent: version 1.0.3 Sep 16 04:22:22.043355 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Sep 16 04:22:22.055628 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Sep 16 04:22:22.105905 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Sep 16 04:22:22.107475 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Sep 16 04:22:22.116737 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Sep 16 04:22:22.132616 kernel: BTRFS: device fsid 782b6948-7aaa-439e-9946-c8fdb4d8f287 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (638) Sep 16 04:22:22.134623 kernel: BTRFS info (device dm-0): first mount of filesystem 782b6948-7aaa-439e-9946-c8fdb4d8f287 Sep 16 04:22:22.134679 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:22:22.142888 kernel: BTRFS info (device dm-0): enabling ssd optimizations Sep 16 04:22:22.142968 kernel: BTRFS info (device dm-0): disabling log replay at mount time Sep 16 04:22:22.142992 kernel: BTRFS info (device dm-0): enabling free space tree Sep 16 04:22:22.145156 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Sep 16 04:22:22.147429 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:22:22.149605 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Sep 16 04:22:22.150829 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Sep 16 04:22:22.154722 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Sep 16 04:22:22.184661 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (669) Sep 16 04:22:22.186918 kernel: BTRFS info (device sda6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:22:22.186975 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:22:22.192797 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 04:22:22.192866 kernel: BTRFS info (device sda6): turning on async discard Sep 16 04:22:22.192879 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 04:22:22.200681 kernel: BTRFS info (device sda6): last unmount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:22:22.202866 systemd[1]: Finished ignition-setup.service - Ignition (setup). Sep 16 04:22:22.205126 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Sep 16 04:22:22.280923 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 04:22:22.283828 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 04:22:22.318417 systemd-networkd[810]: lo: Link UP Sep 16 04:22:22.318431 systemd-networkd[810]: lo: Gained carrier Sep 16 04:22:22.319977 systemd-networkd[810]: Enumeration completed Sep 16 04:22:22.320075 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 04:22:22.320816 systemd[1]: Reached target network.target - Network. Sep 16 04:22:22.322305 systemd-networkd[810]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:22:22.322308 systemd-networkd[810]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:22:22.324950 systemd-networkd[810]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:22:22.324956 systemd-networkd[810]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:22:22.328261 systemd-networkd[810]: eth0: Link UP Sep 16 04:22:22.328502 systemd-networkd[810]: eth1: Link UP Sep 16 04:22:22.328771 systemd-networkd[810]: eth0: Gained carrier Sep 16 04:22:22.328790 systemd-networkd[810]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:22:22.334882 systemd-networkd[810]: eth1: Gained carrier Sep 16 04:22:22.334905 systemd-networkd[810]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:22:22.356607 ignition[722]: Ignition 2.22.0 Sep 16 04:22:22.356617 ignition[722]: Stage: fetch-offline Sep 16 04:22:22.356668 ignition[722]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:22:22.356675 ignition[722]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 16 04:22:22.356972 ignition[722]: parsed url from cmdline: "" Sep 16 04:22:22.356975 ignition[722]: no config URL provided Sep 16 04:22:22.356980 ignition[722]: reading system config file "/usr/lib/ignition/user.ign" Sep 16 04:22:22.356986 ignition[722]: no config at "/usr/lib/ignition/user.ign" Sep 16 04:22:22.356992 ignition[722]: failed to fetch config: resource requires networking Sep 16 04:22:22.357269 ignition[722]: Ignition finished successfully Sep 16 04:22:22.364360 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). 
Sep 16 04:22:22.366713 systemd-networkd[810]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 16 04:22:22.367856 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Sep 16 04:22:22.387702 systemd-networkd[810]: eth0: DHCPv4 address 138.201.119.17/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 16 04:22:22.406097 ignition[818]: Ignition 2.22.0 Sep 16 04:22:22.406119 ignition[818]: Stage: fetch Sep 16 04:22:22.406267 ignition[818]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:22:22.406282 ignition[818]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 16 04:22:22.406365 ignition[818]: parsed url from cmdline: "" Sep 16 04:22:22.406368 ignition[818]: no config URL provided Sep 16 04:22:22.406373 ignition[818]: reading system config file "/usr/lib/ignition/user.ign" Sep 16 04:22:22.406381 ignition[818]: no config at "/usr/lib/ignition/user.ign" Sep 16 04:22:22.406410 ignition[818]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Sep 16 04:22:22.413658 ignition[818]: GET result: OK Sep 16 04:22:22.413800 ignition[818]: parsing config with SHA512: 97a43708178448f66c90d66b477a63b1c6ff7042f46d3632369b30c6ed93b2048f8016ce5b97eb7564c0adec63458dd85b4c2e0a70d8a6784919643adbf2b822 Sep 16 04:22:22.420569 unknown[818]: fetched base config from "system" Sep 16 04:22:22.421468 unknown[818]: fetched base config from "system" Sep 16 04:22:22.421478 unknown[818]: fetched user config from "hetzner" Sep 16 04:22:22.422341 ignition[818]: fetch: fetch complete Sep 16 04:22:22.422347 ignition[818]: fetch: fetch passed Sep 16 04:22:22.422405 ignition[818]: Ignition finished successfully Sep 16 04:22:22.425705 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Sep 16 04:22:22.428302 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Sep 16 04:22:22.467927 ignition[826]: Ignition 2.22.0 Sep 16 04:22:22.467940 ignition[826]: Stage: kargs Sep 16 04:22:22.468082 ignition[826]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:22:22.468090 ignition[826]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 16 04:22:22.471252 ignition[826]: kargs: kargs passed Sep 16 04:22:22.471310 ignition[826]: Ignition finished successfully Sep 16 04:22:22.474657 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Sep 16 04:22:22.477435 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Sep 16 04:22:22.507662 ignition[832]: Ignition 2.22.0 Sep 16 04:22:22.507681 ignition[832]: Stage: disks Sep 16 04:22:22.507848 ignition[832]: no configs at "/usr/lib/ignition/base.d" Sep 16 04:22:22.507858 ignition[832]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 16 04:22:22.508784 ignition[832]: disks: disks passed Sep 16 04:22:22.511352 systemd[1]: Finished ignition-disks.service - Ignition (disks). Sep 16 04:22:22.508850 ignition[832]: Ignition finished successfully Sep 16 04:22:22.513240 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Sep 16 04:22:22.514622 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Sep 16 04:22:22.515919 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 04:22:22.517120 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 04:22:22.517743 systemd[1]: Reached target basic.target - Basic System. Sep 16 04:22:22.519779 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... 
Sep 16 04:22:22.545971 systemd-fsck[841]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Sep 16 04:22:22.550928 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Sep 16 04:22:22.554483 systemd[1]: Mounting sysroot.mount - /sysroot... Sep 16 04:22:22.632660 kernel: EXT4-fs (sda9): mounted filesystem a00d22d9-68b1-4a84-acfc-9fae1fca53dd r/w with ordered data mode. Quota mode: none. Sep 16 04:22:22.634051 systemd[1]: Mounted sysroot.mount - /sysroot. Sep 16 04:22:22.636625 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Sep 16 04:22:22.641299 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 04:22:22.643066 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Sep 16 04:22:22.647125 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Sep 16 04:22:22.649682 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Sep 16 04:22:22.649720 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 04:22:22.661235 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Sep 16 04:22:22.664483 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Sep 16 04:22:22.676655 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (849) Sep 16 04:22:22.679956 kernel: BTRFS info (device sda6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:22:22.680012 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:22:22.690887 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 04:22:22.690967 kernel: BTRFS info (device sda6): turning on async discard Sep 16 04:22:22.690988 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 04:22:22.694211 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 16 04:22:22.716609 coreos-metadata[851]: Sep 16 04:22:22.716 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Sep 16 04:22:22.719292 coreos-metadata[851]: Sep 16 04:22:22.719 INFO Fetch successful Sep 16 04:22:22.721863 coreos-metadata[851]: Sep 16 04:22:22.720 INFO wrote hostname ci-4459-0-0-n-21eb3e8385 to /sysroot/etc/hostname Sep 16 04:22:22.725342 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 16 04:22:22.727960 initrd-setup-root[877]: cut: /sysroot/etc/passwd: No such file or directory Sep 16 04:22:22.733450 initrd-setup-root[884]: cut: /sysroot/etc/group: No such file or directory Sep 16 04:22:22.738244 initrd-setup-root[891]: cut: /sysroot/etc/shadow: No such file or directory Sep 16 04:22:22.742897 initrd-setup-root[898]: cut: /sysroot/etc/gshadow: No such file or directory Sep 16 04:22:22.841746 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Sep 16 04:22:22.844390 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Sep 16 04:22:22.846410 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Sep 16 04:22:22.860622 kernel: BTRFS info (device sda6): last unmount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:22:22.876386 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
Sep 16 04:22:22.890808 ignition[966]: INFO : Ignition 2.22.0 Sep 16 04:22:22.890808 ignition[966]: INFO : Stage: mount Sep 16 04:22:22.892358 ignition[966]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:22:22.892358 ignition[966]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 16 04:22:22.894391 ignition[966]: INFO : mount: mount passed Sep 16 04:22:22.894391 ignition[966]: INFO : Ignition finished successfully Sep 16 04:22:22.894741 systemd[1]: Finished ignition-mount.service - Ignition (mount). Sep 16 04:22:22.896344 systemd[1]: Starting ignition-files.service - Ignition (files)... Sep 16 04:22:23.134990 systemd[1]: sysroot-oem.mount: Deactivated successfully. Sep 16 04:22:23.139800 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Sep 16 04:22:23.163624 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (978) Sep 16 04:22:23.165639 kernel: BTRFS info (device sda6): first mount of filesystem a546938e-7af2-44ea-b88d-218d567c463b Sep 16 04:22:23.165700 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Sep 16 04:22:23.169918 kernel: BTRFS info (device sda6): enabling ssd optimizations Sep 16 04:22:23.169965 kernel: BTRFS info (device sda6): turning on async discard Sep 16 04:22:23.169984 kernel: BTRFS info (device sda6): enabling free space tree Sep 16 04:22:23.174143 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Sep 16 04:22:23.219784 ignition[995]: INFO : Ignition 2.22.0 Sep 16 04:22:23.219784 ignition[995]: INFO : Stage: files Sep 16 04:22:23.220990 ignition[995]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:22:23.220990 ignition[995]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 16 04:22:23.223570 ignition[995]: DEBUG : files: compiled without relabeling support, skipping Sep 16 04:22:23.223570 ignition[995]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Sep 16 04:22:23.223570 ignition[995]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Sep 16 04:22:23.226779 ignition[995]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Sep 16 04:22:23.226779 ignition[995]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Sep 16 04:22:23.226779 ignition[995]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Sep 16 04:22:23.226644 unknown[995]: wrote ssh authorized keys file for user: core Sep 16 04:22:23.235948 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Sep 16 04:22:23.235948 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Sep 16 04:22:23.395998 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Sep 16 04:22:24.011777 systemd-networkd[810]: eth1: Gained IPv6LL Sep 16 04:22:24.139865 systemd-networkd[810]: eth0: Gained IPv6LL Sep 16 04:22:26.799065 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Sep 16 04:22:26.802997 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 04:22:26.813014 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Sep 16 04:22:26.813014 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 16 04:22:26.813014 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 16 04:22:26.813014 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 16 04:22:26.813014 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Sep 16 04:22:27.173039 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Sep 16 04:22:29.434552 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Sep 16 04:22:29.434552 ignition[995]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Sep 16 04:22:29.437681 ignition[995]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 04:22:29.439196 ignition[995]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Sep 16 04:22:29.439196 ignition[995]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Sep 16 04:22:29.439196 ignition[995]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Sep 16 04:22:29.439196 ignition[995]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 16 04:22:29.439196 ignition[995]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Sep 16 04:22:29.439196 ignition[995]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 16 04:22:29.439196 ignition[995]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Sep 16 04:22:29.454705 ignition[995]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Sep 16 04:22:29.454705 ignition[995]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Sep 16 04:22:29.454705 ignition[995]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Sep 16 04:22:29.454705 ignition[995]: INFO : files: files passed Sep 16 04:22:29.454705 ignition[995]: INFO : Ignition finished successfully Sep 16 04:22:29.442052 systemd[1]: Finished ignition-files.service - Ignition (files). Sep 16 04:22:29.446976 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Sep 16 04:22:29.452713 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Sep 16 04:22:29.465488 systemd[1]: ignition-quench.service: Deactivated successfully. Sep 16 04:22:29.466810 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Sep 16 04:22:29.476359 initrd-setup-root-after-ignition[1024]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:22:29.476359 initrd-setup-root-after-ignition[1024]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:22:29.480646 initrd-setup-root-after-ignition[1028]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Sep 16 04:22:29.483645 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 04:22:29.484703 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Sep 16 04:22:29.487042 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Sep 16 04:22:29.554801 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Sep 16 04:22:29.554980 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Sep 16 04:22:29.557823 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Sep 16 04:22:29.558947 systemd[1]: Reached target initrd.target - Initrd Default Target. Sep 16 04:22:29.560231 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Sep 16 04:22:29.561113 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Sep 16 04:22:29.587381 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 04:22:29.589975 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Sep 16 04:22:29.614189 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:22:29.615902 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:22:29.617421 systemd[1]: Stopped target timers.target - Timer Units. Sep 16 04:22:29.618671 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Sep 16 04:22:29.618811 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Sep 16 04:22:29.621059 systemd[1]: Stopped target initrd.target - Initrd Default Target. Sep 16 04:22:29.621733 systemd[1]: Stopped target basic.target - Basic System. Sep 16 04:22:29.623628 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 16 04:22:29.625999 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Sep 16 04:22:29.627435 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Sep 16 04:22:29.628788 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Sep 16 04:22:29.631001 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Sep 16 04:22:29.631683 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Sep 16 04:22:29.633261 systemd[1]: Stopped target sysinit.target - System Initialization. Sep 16 04:22:29.634824 systemd[1]: Stopped target local-fs.target - Local File Systems. Sep 16 04:22:29.636219 systemd[1]: Stopped target swap.target - Swaps. Sep 16 04:22:29.637440 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Sep 16 04:22:29.637566 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Sep 16 04:22:29.639408 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:22:29.640249 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:22:29.641065 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Sep 16 04:22:29.642605 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:22:29.643383 systemd[1]: dracut-initqueue.service: Deactivated successfully. Sep 16 04:22:29.643504 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Sep 16 04:22:29.645238 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Sep 16 04:22:29.645392 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Sep 16 04:22:29.647027 systemd[1]: ignition-files.service: Deactivated successfully. Sep 16 04:22:29.647126 systemd[1]: Stopped ignition-files.service - Ignition (files). Sep 16 04:22:29.648226 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Sep 16 04:22:29.648318 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Sep 16 04:22:29.650271 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Sep 16 04:22:29.654919 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Sep 16 04:22:29.657167 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Sep 16 04:22:29.657354 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:22:29.659848 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Sep 16 04:22:29.659947 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Sep 16 04:22:29.667832 systemd[1]: initrd-cleanup.service: Deactivated successfully. Sep 16 04:22:29.670657 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Sep 16 04:22:29.679362 systemd[1]: sysroot-boot.mount: Deactivated successfully. Sep 16 04:22:29.684341 systemd[1]: sysroot-boot.service: Deactivated successfully. Sep 16 04:22:29.685142 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. 
Sep 16 04:22:29.690655 ignition[1048]: INFO : Ignition 2.22.0 Sep 16 04:22:29.690655 ignition[1048]: INFO : Stage: umount Sep 16 04:22:29.693157 ignition[1048]: INFO : no configs at "/usr/lib/ignition/base.d" Sep 16 04:22:29.693157 ignition[1048]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Sep 16 04:22:29.693157 ignition[1048]: INFO : umount: umount passed Sep 16 04:22:29.693157 ignition[1048]: INFO : Ignition finished successfully Sep 16 04:22:29.694979 systemd[1]: ignition-mount.service: Deactivated successfully. Sep 16 04:22:29.696664 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Sep 16 04:22:29.698982 systemd[1]: ignition-disks.service: Deactivated successfully. Sep 16 04:22:29.699116 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Sep 16 04:22:29.701223 systemd[1]: ignition-kargs.service: Deactivated successfully. Sep 16 04:22:29.701305 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Sep 16 04:22:29.703027 systemd[1]: ignition-fetch.service: Deactivated successfully. Sep 16 04:22:29.703102 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Sep 16 04:22:29.704905 systemd[1]: Stopped target network.target - Network. Sep 16 04:22:29.706563 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Sep 16 04:22:29.706686 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Sep 16 04:22:29.708830 systemd[1]: Stopped target paths.target - Path Units. Sep 16 04:22:29.710494 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Sep 16 04:22:29.713732 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:22:29.716041 systemd[1]: Stopped target slices.target - Slice Units. Sep 16 04:22:29.717865 systemd[1]: Stopped target sockets.target - Socket Units. Sep 16 04:22:29.719681 systemd[1]: iscsid.socket: Deactivated successfully. Sep 16 04:22:29.719758 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Sep 16 04:22:29.721693 systemd[1]: iscsiuio.socket: Deactivated successfully. Sep 16 04:22:29.721759 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Sep 16 04:22:29.724030 systemd[1]: ignition-setup.service: Deactivated successfully. Sep 16 04:22:29.724129 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Sep 16 04:22:29.725909 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Sep 16 04:22:29.725979 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Sep 16 04:22:29.727825 systemd[1]: initrd-setup-root.service: Deactivated successfully. Sep 16 04:22:29.727924 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Sep 16 04:22:29.730168 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Sep 16 04:22:29.730962 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Sep 16 04:22:29.740061 systemd[1]: systemd-resolved.service: Deactivated successfully. Sep 16 04:22:29.740183 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Sep 16 04:22:29.744716 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Sep 16 04:22:29.745003 systemd[1]: systemd-networkd.service: Deactivated successfully. Sep 16 04:22:29.745118 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Sep 16 04:22:29.748942 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. 
Sep 16 04:22:29.749842 systemd[1]: Stopped target network-pre.target - Preparation for Network. Sep 16 04:22:29.751204 systemd[1]: systemd-networkd.socket: Deactivated successfully. Sep 16 04:22:29.751245 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:22:29.754087 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Sep 16 04:22:29.755069 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Sep 16 04:22:29.755128 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Sep 16 04:22:29.759793 systemd[1]: systemd-sysctl.service: Deactivated successfully. Sep 16 04:22:29.759856 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:22:29.763276 systemd[1]: systemd-modules-load.service: Deactivated successfully. Sep 16 04:22:29.763344 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Sep 16 04:22:29.764730 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Sep 16 04:22:29.764789 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:22:29.768024 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:22:29.774907 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Sep 16 04:22:29.774982 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:22:29.779726 systemd[1]: systemd-udevd.service: Deactivated successfully. Sep 16 04:22:29.781630 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:22:29.783704 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Sep 16 04:22:29.783749 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Sep 16 04:22:29.785551 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Sep 16 04:22:29.785602 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:22:29.786675 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Sep 16 04:22:29.786725 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Sep 16 04:22:29.787456 systemd[1]: dracut-cmdline.service: Deactivated successfully. Sep 16 04:22:29.787499 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Sep 16 04:22:29.789391 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Sep 16 04:22:29.789448 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Sep 16 04:22:29.793393 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Sep 16 04:22:29.794200 systemd[1]: systemd-network-generator.service: Deactivated successfully. Sep 16 04:22:29.794257 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:22:29.800077 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Sep 16 04:22:29.800137 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:22:29.801067 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Sep 16 04:22:29.801108 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 04:22:29.803355 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Sep 16 04:22:29.803400 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:22:29.804252 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:22:29.804294 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:22:29.809168 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully. Sep 16 04:22:29.809220 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully. Sep 16 04:22:29.809247 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Sep 16 04:22:29.809277 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:22:29.809662 systemd[1]: network-cleanup.service: Deactivated successfully. Sep 16 04:22:29.809752 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Sep 16 04:22:29.818560 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Sep 16 04:22:29.818664 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Sep 16 04:22:29.822103 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Sep 16 04:22:29.823815 systemd[1]: Starting initrd-switch-root.service - Switch Root... Sep 16 04:22:29.846905 systemd[1]: Switching root. Sep 16 04:22:29.878183 systemd-journald[245]: Journal stopped Sep 16 04:22:30.856731 systemd-journald[245]: Received SIGTERM from PID 1 (systemd). Sep 16 04:22:30.856803 kernel: SELinux: policy capability network_peer_controls=1 Sep 16 04:22:30.856815 kernel: SELinux: policy capability open_perms=1 Sep 16 04:22:30.856824 kernel: SELinux: policy capability extended_socket_class=1 Sep 16 04:22:30.856833 kernel: SELinux: policy capability always_check_network=0 Sep 16 04:22:30.856842 kernel: SELinux: policy capability cgroup_seclabel=1 Sep 16 04:22:30.856851 kernel: SELinux: policy capability nnp_nosuid_transition=1 Sep 16 04:22:30.856864 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Sep 16 04:22:30.856873 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Sep 16 04:22:30.856881 kernel: SELinux: policy capability userspace_initial_context=0 Sep 16 04:22:30.856893 kernel: audit: type=1403 audit(1757996550.067:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Sep 16 04:22:30.856906 systemd[1]: Successfully loaded SELinux policy in 72.432ms. Sep 16 04:22:30.856922 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.129ms. Sep 16 04:22:30.856933 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 16 04:22:30.856944 systemd[1]: Detected virtualization kvm. Sep 16 04:22:30.856955 systemd[1]: Detected architecture arm64. Sep 16 04:22:30.856964 systemd[1]: Detected first boot. Sep 16 04:22:30.856974 systemd[1]: Hostname set to <ci-4459-0-0-n-21eb3e8385>. Sep 16 04:22:30.856984 systemd[1]: Initializing machine ID from VM UUID. Sep 16 04:22:30.856993 kernel: NET: Registered PF_VSOCK protocol family Sep 16 04:22:30.857002 zram_generator::config[1091]: No configuration found. Sep 16 04:22:30.857013 systemd[1]: Populated /etc with preset unit settings.
Sep 16 04:22:30.857028 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Sep 16 04:22:30.857039 systemd[1]: initrd-switch-root.service: Deactivated successfully. Sep 16 04:22:30.857050 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Sep 16 04:22:30.857060 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Sep 16 04:22:30.857069 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Sep 16 04:22:30.857080 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Sep 16 04:22:30.857089 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Sep 16 04:22:30.857100 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Sep 16 04:22:30.857109 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Sep 16 04:22:30.857119 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Sep 16 04:22:30.857129 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Sep 16 04:22:30.857139 systemd[1]: Created slice user.slice - User and Session Slice. Sep 16 04:22:30.857148 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Sep 16 04:22:30.857158 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Sep 16 04:22:30.857169 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Sep 16 04:22:30.857179 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Sep 16 04:22:30.857189 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Sep 16 04:22:30.857200 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Sep 16 04:22:30.857209 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Sep 16 04:22:30.857219 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Sep 16 04:22:30.857229 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Sep 16 04:22:30.857239 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Sep 16 04:22:30.857250 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Sep 16 04:22:30.857260 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Sep 16 04:22:30.857270 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Sep 16 04:22:30.857281 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Sep 16 04:22:30.857290 systemd[1]: Reached target remote-fs.target - Remote File Systems. Sep 16 04:22:30.857314 systemd[1]: Reached target slices.target - Slice Units. Sep 16 04:22:30.857327 systemd[1]: Reached target swap.target - Swaps. Sep 16 04:22:30.857338 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Sep 16 04:22:30.857347 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Sep 16 04:22:30.857360 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Sep 16 04:22:30.857370 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Sep 16 04:22:30.857379 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. 
Sep 16 04:22:30.857389 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Sep 16 04:22:30.857399 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Sep 16 04:22:30.857408 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Sep 16 04:22:30.857418 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Sep 16 04:22:30.857428 systemd[1]: Mounting media.mount - External Media Directory... Sep 16 04:22:30.857438 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Sep 16 04:22:30.857449 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Sep 16 04:22:30.857459 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Sep 16 04:22:30.857469 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Sep 16 04:22:30.857479 systemd[1]: Reached target machines.target - Containers. Sep 16 04:22:30.857489 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Sep 16 04:22:30.857498 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:22:30.857508 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Sep 16 04:22:30.857518 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Sep 16 04:22:30.857530 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:22:30.857540 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 04:22:30.857550 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:22:30.857560 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Sep 16 04:22:30.857569 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:22:30.857683 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Sep 16 04:22:30.857704 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Sep 16 04:22:30.857718 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Sep 16 04:22:30.857730 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Sep 16 04:22:30.857740 systemd[1]: Stopped systemd-fsck-usr.service. Sep 16 04:22:30.857750 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:22:30.857761 systemd[1]: Starting systemd-journald.service - Journal Service... Sep 16 04:22:30.857772 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Sep 16 04:22:30.857783 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Sep 16 04:22:30.857794 kernel: fuse: init (API version 7.41) Sep 16 04:22:30.857804 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Sep 16 04:22:30.857814 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Sep 16 04:22:30.857824 kernel: loop: module loaded Sep 16 04:22:30.857835 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... 
Sep 16 04:22:30.857848 systemd[1]: verity-setup.service: Deactivated successfully. Sep 16 04:22:30.857858 systemd[1]: Stopped verity-setup.service. Sep 16 04:22:30.857868 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Sep 16 04:22:30.857878 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Sep 16 04:22:30.857888 systemd[1]: Mounted media.mount - External Media Directory. Sep 16 04:22:30.857898 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Sep 16 04:22:30.857908 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Sep 16 04:22:30.857918 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Sep 16 04:22:30.857929 kernel: ACPI: bus type drm_connector registered Sep 16 04:22:30.857938 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Sep 16 04:22:30.857948 systemd[1]: modprobe@configfs.service: Deactivated successfully. Sep 16 04:22:30.857958 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Sep 16 04:22:30.857967 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:22:30.857977 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:22:30.857987 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 04:22:30.857997 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 04:22:30.858008 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:22:30.858020 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:22:30.858031 systemd[1]: modprobe@fuse.service: Deactivated successfully. Sep 16 04:22:30.858040 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Sep 16 04:22:30.858050 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:22:30.858059 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:22:30.858069 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Sep 16 04:22:30.858080 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Sep 16 04:22:30.858090 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Sep 16 04:22:30.858130 systemd-journald[1155]: Collecting audit messages is disabled. Sep 16 04:22:30.858156 systemd[1]: Reached target network-pre.target - Preparation for Network. Sep 16 04:22:30.858166 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Sep 16 04:22:30.858177 systemd-journald[1155]: Journal started Sep 16 04:22:30.858200 systemd-journald[1155]: Runtime Journal (/run/log/journal/b57b5e16e7aa4590be54f3c4063c0fab) is 8M, max 76.5M, 68.5M free. Sep 16 04:22:30.861628 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Sep 16 04:22:30.576253 systemd[1]: Queued start job for default target multi-user.target. Sep 16 04:22:30.581693 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Sep 16 04:22:30.582242 systemd[1]: systemd-journald.service: Deactivated successfully. Sep 16 04:22:30.866500 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Sep 16 04:22:30.866557 systemd[1]: Reached target local-fs.target - Local File Systems. Sep 16 04:22:30.866571 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. 
Sep 16 04:22:30.874697 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Sep 16 04:22:30.876630 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:22:30.882515 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Sep 16 04:22:30.882576 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 04:22:30.889607 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Sep 16 04:22:30.891618 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 04:22:30.894941 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Sep 16 04:22:30.897777 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Sep 16 04:22:30.903275 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Sep 16 04:22:30.910004 systemd[1]: Started systemd-journald.service - Journal Service. Sep 16 04:22:30.908146 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Sep 16 04:22:30.912006 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Sep 16 04:22:30.913891 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Sep 16 04:22:30.914838 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Sep 16 04:22:30.930557 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Sep 16 04:22:30.939652 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Sep 16 04:22:30.940743 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Sep 16 04:22:30.943790 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Sep 16 04:22:30.950923 kernel: loop0: detected capacity change from 0 to 211168 Sep 16 04:22:30.969401 systemd-journald[1155]: Time spent on flushing to /var/log/journal/b57b5e16e7aa4590be54f3c4063c0fab is 73.154ms for 1177 entries. Sep 16 04:22:30.969401 systemd-journald[1155]: System Journal (/var/log/journal/b57b5e16e7aa4590be54f3c4063c0fab) is 8M, max 584.8M, 576.8M free. Sep 16 04:22:31.061550 systemd-journald[1155]: Received client request to flush runtime journal. Sep 16 04:22:31.061640 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Sep 16 04:22:31.061656 kernel: loop1: detected capacity change from 0 to 8 Sep 16 04:22:31.061802 kernel: loop2: detected capacity change from 0 to 119368 Sep 16 04:22:30.988837 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Sep 16 04:22:31.007423 systemd-tmpfiles[1193]: ACLs are not supported, ignoring. Sep 16 04:22:31.007434 systemd-tmpfiles[1193]: ACLs are not supported, ignoring. Sep 16 04:22:31.022560 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Sep 16 04:22:31.029259 systemd[1]: Starting systemd-sysusers.service - Create System Users... Sep 16 04:22:31.042084 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Sep 16 04:22:31.051829 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. 
Sep 16 04:22:31.066861 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Sep 16 04:22:31.095655 kernel: loop3: detected capacity change from 0 to 100632 Sep 16 04:22:31.097717 systemd[1]: Finished systemd-sysusers.service - Create System Users. Sep 16 04:22:31.107862 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Sep 16 04:22:31.136347 systemd-tmpfiles[1236]: ACLs are not supported, ignoring. Sep 16 04:22:31.136374 systemd-tmpfiles[1236]: ACLs are not supported, ignoring. Sep 16 04:22:31.140992 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Sep 16 04:22:31.158613 kernel: loop4: detected capacity change from 0 to 211168 Sep 16 04:22:31.182750 kernel: loop5: detected capacity change from 0 to 8 Sep 16 04:22:31.188008 kernel: loop6: detected capacity change from 0 to 119368 Sep 16 04:22:31.202619 kernel: loop7: detected capacity change from 0 to 100632 Sep 16 04:22:31.221433 (sd-merge)[1240]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Sep 16 04:22:31.222227 (sd-merge)[1240]: Merged extensions into '/usr'. Sep 16 04:22:31.228063 systemd[1]: Reload requested from client PID 1192 ('systemd-sysext') (unit systemd-sysext.service)... Sep 16 04:22:31.228097 systemd[1]: Reloading... Sep 16 04:22:31.344076 zram_generator::config[1266]: No configuration found. Sep 16 04:22:31.464253 ldconfig[1185]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Sep 16 04:22:31.565992 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Sep 16 04:22:31.566102 systemd[1]: Reloading finished in 337 ms. Sep 16 04:22:31.583200 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Sep 16 04:22:31.586546 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Sep 16 04:22:31.599829 systemd[1]: Starting ensure-sysext.service... Sep 16 04:22:31.604834 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Sep 16 04:22:31.620222 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Sep 16 04:22:31.627176 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Sep 16 04:22:31.635775 systemd[1]: Reload requested from client PID 1303 ('systemctl') (unit ensure-sysext.service)... Sep 16 04:22:31.635796 systemd[1]: Reloading... Sep 16 04:22:31.645382 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 16 04:22:31.645843 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Sep 16 04:22:31.646226 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 16 04:22:31.646657 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 16 04:22:31.647474 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 16 04:22:31.647809 systemd-tmpfiles[1304]: ACLs are not supported, ignoring. Sep 16 04:22:31.647910 systemd-tmpfiles[1304]: ACLs are not supported, ignoring. Sep 16 04:22:31.651972 systemd-tmpfiles[1304]: Detected autofs mount point /boot during canonicalization of boot. 
Sep 16 04:22:31.652077 systemd-tmpfiles[1304]: Skipping /boot Sep 16 04:22:31.659772 systemd-tmpfiles[1304]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 04:22:31.659954 systemd-tmpfiles[1304]: Skipping /boot Sep 16 04:22:31.697542 systemd-udevd[1307]: Using default interface naming scheme 'v255'. Sep 16 04:22:31.703715 zram_generator::config[1333]: No configuration found. Sep 16 04:22:31.993124 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Sep 16 04:22:31.993603 systemd[1]: Reloading finished in 357 ms. Sep 16 04:22:32.002739 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Sep 16 04:22:32.009722 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Sep 16 04:22:32.017608 kernel: mousedev: PS/2 mouse device common for all mice Sep 16 04:22:32.031187 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 04:22:32.036838 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Sep 16 04:22:32.039835 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Sep 16 04:22:32.044838 systemd[1]: Starting systemd-networkd.service - Network Configuration... Sep 16 04:22:32.053790 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Sep 16 04:22:32.071923 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Sep 16 04:22:32.078656 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:22:32.081949 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:22:32.084795 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:22:32.089046 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Sep 16 04:22:32.089134 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Sep 16 04:22:32.089147 kernel: [drm] features: -context_init Sep 16 04:22:32.094992 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:22:32.096745 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:22:32.096896 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:22:32.099796 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Sep 16 04:22:32.106297 kernel: [drm] number of scanouts: 1 Sep 16 04:22:32.106363 kernel: [drm] number of cap sets: 0 Sep 16 04:22:32.105424 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:22:32.106454 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:22:32.106575 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:22:32.109158 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. 
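The "Duplicate line for path ..., ignoring" warnings above mean the same path is declared by more than one tmpfiles.d fragment; systemd-tmpfiles keeps the first declaration it parses and ignores the rest. A minimal duplicate finder (Python; illustrative only, real tmpfiles parsing also handles quoting and specifiers like %h):

import glob, os
from collections import defaultdict

# Collect which tmpfiles.d fragments declare each path (field 2 of a line).
declared = defaultdict(list)
for conf in sorted(glob.glob("/usr/lib/tmpfiles.d/*.conf")):
    with open(conf) as fh:
        for raw in fh:
            line = raw.strip()
            if not line or line.startswith("#"):
                continue
            fields = line.split()
            if len(fields) >= 2:
                declared[fields[1]].append(os.path.basename(conf))

for path, sources in declared.items():
    if len(sources) > 1:
        print(f"duplicate entry for {path}: {', '.join(sources)}")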
Sep 16 04:22:32.112894 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Sep 16 04:22:32.114783 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Sep 16 04:22:32.114922 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:22:32.120663 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Sep 16 04:22:32.121635 systemd[1]: Finished ensure-sysext.service. Sep 16 04:22:32.123658 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Sep 16 04:22:32.131118 kernel: Console: switching to colour frame buffer device 160x50 Sep 16 04:22:32.172596 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Sep 16 04:22:32.173637 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Sep 16 04:22:32.184081 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Sep 16 04:22:32.185815 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:22:32.188682 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:22:32.189771 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:22:32.192275 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:22:32.193396 systemd[1]: modprobe@drm.service: Deactivated successfully. Sep 16 04:22:32.193546 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Sep 16 04:22:32.200160 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 04:22:32.205193 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Sep 16 04:22:32.209944 systemd[1]: Starting systemd-update-done.service - Update is Completed... Sep 16 04:22:32.211652 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 16 04:22:32.211981 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:22:32.212184 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:22:32.217230 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Sep 16 04:22:32.217371 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Sep 16 04:22:32.221538 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Sep 16 04:22:32.232707 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Sep 16 04:22:32.235295 augenrules[1458]: No rules Sep 16 04:22:32.239641 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Sep 16 04:22:32.240916 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
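The modprobe@ instances above (dm_mod, efi_pstore, loop, drm) are one-shot units that load the kernel module named by the instance. Whether a module actually ended up loaded can be read back from /proc/modules; note that built-in modules never appear there, so "absent" is ambiguous (a sketch, assuming a Linux host):

# Check whether the modules loaded by the modprobe@ units above are present.
with open("/proc/modules") as fh:
    loaded = {line.split()[0] for line in fh}
for mod in ("dm_mod", "efi_pstore", "loop", "drm"):
    print(mod, "loaded" if mod in loaded else "absent or built-in")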
Sep 16 04:22:32.240965 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Sep 16 04:22:32.240990 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Sep 16 04:22:32.241271 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 04:22:32.241987 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 04:22:32.275612 systemd[1]: Finished systemd-update-done.service - Update is Completed. Sep 16 04:22:32.294172 systemd[1]: modprobe@loop.service: Deactivated successfully. Sep 16 04:22:32.294505 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Sep 16 04:22:32.299036 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Sep 16 04:22:32.300632 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Sep 16 04:22:32.301656 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Sep 16 04:22:32.301809 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Sep 16 04:22:32.306348 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Sep 16 04:22:32.306429 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Sep 16 04:22:32.320313 systemd[1]: Started systemd-userdbd.service - User Database Manager. Sep 16 04:22:32.339414 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Sep 16 04:22:32.343745 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Sep 16 04:22:32.376792 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:22:32.381288 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Sep 16 04:22:32.417032 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Sep 16 04:22:32.417611 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:22:32.423964 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Sep 16 04:22:32.428398 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Sep 16 04:22:32.528774 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Sep 16 04:22:32.540801 systemd-networkd[1418]: lo: Link UP Sep 16 04:22:32.541099 systemd-networkd[1418]: lo: Gained carrier Sep 16 04:22:32.543857 systemd-networkd[1418]: Enumeration completed Sep 16 04:22:32.544398 systemd[1]: Started systemd-networkd.service - Network Configuration. Sep 16 04:22:32.544954 systemd-networkd[1418]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:22:32.545628 systemd-networkd[1418]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:22:32.546479 systemd-networkd[1418]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Sep 16 04:22:32.546483 systemd-networkd[1418]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Sep 16 04:22:32.547038 systemd-networkd[1418]: eth0: Link UP Sep 16 04:22:32.547427 systemd-networkd[1418]: eth0: Gained carrier Sep 16 04:22:32.547506 systemd-networkd[1418]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:22:32.548864 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Sep 16 04:22:32.551855 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Sep 16 04:22:32.552862 systemd-networkd[1418]: eth1: Link UP Sep 16 04:22:32.553491 systemd-networkd[1418]: eth1: Gained carrier Sep 16 04:22:32.553515 systemd-networkd[1418]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Sep 16 04:22:32.573266 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Sep 16 04:22:32.575758 systemd[1]: Reached target time-set.target - System Time Set. Sep 16 04:22:32.589691 systemd-networkd[1418]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Sep 16 04:22:32.590524 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Sep 16 04:22:32.592681 systemd-resolved[1420]: Positive Trust Anchors: Sep 16 04:22:32.592699 systemd-resolved[1420]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Sep 16 04:22:32.592731 systemd-resolved[1420]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Sep 16 04:22:32.596872 systemd-resolved[1420]: Using system hostname 'ci-4459-0-0-n-21eb3e8385'. Sep 16 04:22:32.598721 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Sep 16 04:22:32.601006 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Sep 16 04:22:32.601936 systemd[1]: Reached target network.target - Network. Sep 16 04:22:32.602537 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Sep 16 04:22:32.602714 systemd-networkd[1418]: eth0: DHCPv4 address 138.201.119.17/32, gateway 172.31.1.1 acquired from 172.31.1.1 Sep 16 04:22:32.603206 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Sep 16 04:22:32.603602 systemd[1]: Reached target sysinit.target - System Initialization. Sep 16 04:22:32.604403 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Sep 16 04:22:32.605378 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Sep 16 04:22:32.606014 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Sep 16 04:22:32.606523 systemd[1]: Started logrotate.timer - Daily rotation of log files. 
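Note the two DHCPv4 leases above: eth1 gets 10.0.0.3/32 and eth0 gets 138.201.119.17/32 with gateway 172.31.1.1. With a /32 lease the gateway necessarily lies outside the assigned prefix, so networkd must reach it via an on-link host route before the default route can be installed. The relationship is easy to confirm with the standard library (addresses taken from the log):

import ipaddress

iface = ipaddress.ip_interface("138.201.119.17/32")   # eth0 lease from the log
gateway = ipaddress.ip_address("172.31.1.1")

print(iface.network.num_addresses)   # 1: the /32 contains only the host itself
print(gateway in iface.network)      # False: gateway must be reached on-link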
Sep 16 04:22:32.607523 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Sep 16 04:22:32.608577 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Sep 16 04:22:32.609413 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Sep 16 04:22:32.609453 systemd[1]: Reached target paths.target - Path Units. Sep 16 04:22:32.610203 systemd[1]: Reached target timers.target - Timer Units. Sep 16 04:22:32.612647 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Sep 16 04:22:32.614980 systemd[1]: Starting docker.socket - Docker Socket for the API... Sep 16 04:22:32.617771 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Sep 16 04:22:32.618752 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Sep 16 04:22:32.619545 systemd[1]: Reached target ssh-access.target - SSH Access Available. Sep 16 04:22:32.622411 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Sep 16 04:22:32.623578 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Sep 16 04:22:32.625104 systemd[1]: Listening on docker.socket - Docker Socket for the API. Sep 16 04:22:32.625950 systemd[1]: Reached target sockets.target - Socket Units. Sep 16 04:22:32.626577 systemd[1]: Reached target basic.target - Basic System. Sep 16 04:22:32.627220 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Sep 16 04:22:32.627260 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Sep 16 04:22:32.628443 systemd[1]: Starting containerd.service - containerd container runtime... Sep 16 04:22:32.630304 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Sep 16 04:22:32.634402 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Sep 16 04:22:32.636905 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Sep 16 04:22:32.644905 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Sep 16 04:22:32.649132 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Sep 16 04:22:32.649867 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Sep 16 04:22:32.651033 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Sep 16 04:22:32.655857 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Sep 16 04:22:32.666244 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Sep 16 04:22:32.672105 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Sep 16 04:22:32.677829 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Sep 16 04:22:32.685894 jq[1517]: false Sep 16 04:22:32.686364 systemd[1]: Starting systemd-logind.service - User Login Management... Sep 16 04:22:32.688105 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). 
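docker.socket above is socket activation: systemd listens on the API socket and starts docker.service on the first client connection. A sketch of querying that API with only the standard library; /version is a stock Docker Engine API endpoint, and /var/run/docker.sock is the default socket path (adjust if your layout differs):

import http.client
import socket

class UnixHTTPConnection(http.client.HTTPConnection):
    """http.client over a unix-domain socket, the Docker API transport."""
    def __init__(self, sock_path):
        super().__init__("localhost")
        self.sock_path = sock_path

    def connect(self):
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(self.sock_path)

conn = UnixHTTPConnection("/var/run/docker.sock")
conn.request("GET", "/version")   # connecting also triggers socket activation
print(conn.getresponse().read().decode())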
Sep 16 04:22:32.690082 coreos-metadata[1514]: Sep 16 04:22:32.689 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Sep 16 04:22:32.690894 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Sep 16 04:22:32.693085 coreos-metadata[1514]: Sep 16 04:22:32.692 INFO Fetch successful Sep 16 04:22:32.693085 coreos-metadata[1514]: Sep 16 04:22:32.693 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Sep 16 04:22:32.694001 systemd[1]: Starting update-engine.service - Update Engine... Sep 16 04:22:32.702917 coreos-metadata[1514]: Sep 16 04:22:32.693 INFO Fetch successful Sep 16 04:22:32.699308 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Sep 16 04:22:32.717636 jq[1532]: true Sep 16 04:22:32.717881 extend-filesystems[1518]: Found /dev/sda6 Sep 16 04:22:32.714257 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Sep 16 04:22:32.715351 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Sep 16 04:22:32.715528 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Sep 16 04:22:32.727624 extend-filesystems[1518]: Found /dev/sda9 Sep 16 04:22:32.727624 extend-filesystems[1518]: Checking size of /dev/sda9 Sep 16 04:22:32.730081 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Sep 16 04:22:32.741092 extend-filesystems[1518]: Resized partition /dev/sda9 Sep 16 04:22:32.742189 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Sep 16 04:22:32.730317 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Sep 16 04:22:32.742322 extend-filesystems[1553]: resize2fs 1.47.3 (8-Jul-2025) Sep 16 04:22:32.764685 tar[1539]: linux-arm64/LICENSE Sep 16 04:22:32.765148 tar[1539]: linux-arm64/helm Sep 16 04:22:32.768613 jq[1542]: true Sep 16 04:22:32.775395 systemd[1]: motdgen.service: Deactivated successfully. Sep 16 04:22:32.776678 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Sep 16 04:22:32.807636 (ntainerd)[1558]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Sep 16 04:22:32.834472 dbus-daemon[1515]: [system] SELinux support is enabled Sep 16 04:22:32.834699 systemd[1]: Started dbus.service - D-Bus System Message Bus. Sep 16 04:22:32.838902 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Sep 16 04:22:32.838950 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Sep 16 04:22:32.842074 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Sep 16 04:22:32.842104 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Sep 16 04:22:32.845154 update_engine[1531]: I20250916 04:22:32.843979 1531 main.cc:92] Flatcar Update Engine starting Sep 16 04:22:32.860837 systemd[1]: Started update-engine.service - Update Engine. 
Sep 16 04:22:32.864429 update_engine[1531]: I20250916 04:22:32.864009 1531 update_check_scheduler.cc:74] Next update check in 5m45s Sep 16 04:22:32.870393 systemd[1]: Started locksmithd.service - Cluster reboot manager. Sep 16 04:22:32.887363 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Sep 16 04:22:32.903567 extend-filesystems[1553]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Sep 16 04:22:32.903567 extend-filesystems[1553]: old_desc_blocks = 1, new_desc_blocks = 5 Sep 16 04:22:32.903567 extend-filesystems[1553]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Sep 16 04:22:32.914693 extend-filesystems[1518]: Resized filesystem in /dev/sda9 Sep 16 04:22:32.907166 systemd[1]: extend-filesystems.service: Deactivated successfully. Sep 16 04:22:32.917421 bash[1584]: Updated "/home/core/.ssh/authorized_keys" Sep 16 04:22:32.922138 systemd-logind[1529]: New seat seat0. Sep 16 04:22:32.926759 systemd-logind[1529]: Watching system buttons on /dev/input/event0 (Power Button) Sep 16 04:22:32.926782 systemd-logind[1529]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Sep 16 04:22:32.971641 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Sep 16 04:22:32.973350 systemd[1]: Started systemd-logind.service - User Login Management. Sep 16 04:22:32.980634 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Sep 16 04:22:33.011918 systemd[1]: Starting sshkeys.service... Sep 16 04:22:33.020456 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Sep 16 04:22:33.023595 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Sep 16 04:22:33.038216 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Sep 16 04:22:33.041451 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Sep 16 04:22:33.113773 coreos-metadata[1596]: Sep 16 04:22:33.113 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Sep 16 04:22:33.115520 coreos-metadata[1596]: Sep 16 04:22:33.115 INFO Fetch successful Sep 16 04:22:33.117569 unknown[1596]: wrote ssh authorized keys file for user: core Sep 16 04:22:33.138608 update-ssh-keys[1607]: Updated "/home/core/.ssh/authorized_keys" Sep 16 04:22:33.139954 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Sep 16 04:22:33.149634 systemd[1]: Finished sshkeys.service. 
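The root filesystem resize above grows /dev/sda9 from 1,617,920 to 9,393,147 blocks at 4 KiB per block (the "(4k) blocks" figure in the resize2fs output), i.e. from about 6.2 GiB to about 35.8 GiB:

# Growth reported by the EXT4-fs and resize2fs lines above.
BLOCK = 4096
old_blocks, new_blocks = 1_617_920, 9_393_147
print(f"{old_blocks * BLOCK / 2**30:.2f} GiB -> {new_blocks * BLOCK / 2**30:.2f} GiB")
# 6.17 GiB -> 35.83 GiB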
Sep 16 04:22:33.176044 locksmithd[1582]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Sep 16 04:22:33.286911 containerd[1558]: time="2025-09-16T04:22:33Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Sep 16 04:22:33.289373 containerd[1558]: time="2025-09-16T04:22:33.289326120Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Sep 16 04:22:33.300507 containerd[1558]: time="2025-09-16T04:22:33.300457280Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="58.4µs" Sep 16 04:22:33.300754 containerd[1558]: time="2025-09-16T04:22:33.300691800Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Sep 16 04:22:33.300817 containerd[1558]: time="2025-09-16T04:22:33.300803480Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Sep 16 04:22:33.301025 containerd[1558]: time="2025-09-16T04:22:33.301005240Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Sep 16 04:22:33.301085 containerd[1558]: time="2025-09-16T04:22:33.301071640Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Sep 16 04:22:33.301173 containerd[1558]: time="2025-09-16T04:22:33.301159920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 04:22:33.301313 containerd[1558]: time="2025-09-16T04:22:33.301292400Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Sep 16 04:22:33.301365 containerd[1558]: time="2025-09-16T04:22:33.301353320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 04:22:33.301683 containerd[1558]: time="2025-09-16T04:22:33.301658760Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Sep 16 04:22:33.301743 containerd[1558]: time="2025-09-16T04:22:33.301730360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 04:22:33.301806 containerd[1558]: time="2025-09-16T04:22:33.301792240Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Sep 16 04:22:33.301859 containerd[1558]: time="2025-09-16T04:22:33.301846920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Sep 16 04:22:33.301998 containerd[1558]: time="2025-09-16T04:22:33.301979960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Sep 16 04:22:33.302246 containerd[1558]: time="2025-09-16T04:22:33.302225920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 04:22:33.302401 containerd[1558]: time="2025-09-16T04:22:33.302382920Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Sep 16 04:22:33.302452 containerd[1558]: time="2025-09-16T04:22:33.302440080Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Sep 16 04:22:33.302528 containerd[1558]: time="2025-09-16T04:22:33.302514160Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Sep 16 04:22:33.302950 containerd[1558]: time="2025-09-16T04:22:33.302929880Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Sep 16 04:22:33.303279 containerd[1558]: time="2025-09-16T04:22:33.303125080Z" level=info msg="metadata content store policy set" policy=shared Sep 16 04:22:33.308641 containerd[1558]: time="2025-09-16T04:22:33.308573400Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Sep 16 04:22:33.308850 containerd[1558]: time="2025-09-16T04:22:33.308830640Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Sep 16 04:22:33.308919 containerd[1558]: time="2025-09-16T04:22:33.308902840Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Sep 16 04:22:33.308982 containerd[1558]: time="2025-09-16T04:22:33.308966400Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Sep 16 04:22:33.309060 containerd[1558]: time="2025-09-16T04:22:33.309032080Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Sep 16 04:22:33.309216 containerd[1558]: time="2025-09-16T04:22:33.309195560Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Sep 16 04:22:33.309368 containerd[1558]: time="2025-09-16T04:22:33.309347600Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309422000Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309445520Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309460080Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309484320Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309513560Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309711880Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309741960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309761600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Sep 16 
04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309776040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309794240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309807680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309821160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309835040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309848720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Sep 16 04:22:33.310611 containerd[1558]: time="2025-09-16T04:22:33.309862520Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Sep 16 04:22:33.310970 containerd[1558]: time="2025-09-16T04:22:33.309875760Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Sep 16 04:22:33.310970 containerd[1558]: time="2025-09-16T04:22:33.310074880Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Sep 16 04:22:33.310970 containerd[1558]: time="2025-09-16T04:22:33.310093040Z" level=info msg="Start snapshots syncer" Sep 16 04:22:33.310970 containerd[1558]: time="2025-09-16T04:22:33.310128680Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Sep 16 04:22:33.311054 containerd[1558]: time="2025-09-16T04:22:33.310409920Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Sep 16 04:22:33.311054 containerd[1558]: time="2025-09-16T04:22:33.310469200Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Sep 16 04:22:33.311403 containerd[1558]: time="2025-09-16T04:22:33.310573640Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Sep 16 04:22:33.311757 containerd[1558]: time="2025-09-16T04:22:33.311731200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Sep 16 04:22:33.311911 containerd[1558]: time="2025-09-16T04:22:33.311891320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Sep 16 04:22:33.311977 containerd[1558]: time="2025-09-16T04:22:33.311962680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Sep 16 04:22:33.312106 containerd[1558]: time="2025-09-16T04:22:33.312086880Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Sep 16 04:22:33.312227 containerd[1558]: time="2025-09-16T04:22:33.312158640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Sep 16 04:22:33.312489 containerd[1558]: time="2025-09-16T04:22:33.312466200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Sep 16 04:22:33.312568 containerd[1558]: time="2025-09-16T04:22:33.312551920Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Sep 16 04:22:33.312767 containerd[1558]: time="2025-09-16T04:22:33.312746800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Sep 16 04:22:33.312893 containerd[1558]: 
time="2025-09-16T04:22:33.312875000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Sep 16 04:22:33.313010 containerd[1558]: time="2025-09-16T04:22:33.312945960Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Sep 16 04:22:33.313133 containerd[1558]: time="2025-09-16T04:22:33.313113200Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 04:22:33.313207 containerd[1558]: time="2025-09-16T04:22:33.313189760Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Sep 16 04:22:33.313328 containerd[1558]: time="2025-09-16T04:22:33.313306760Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 04:22:33.313417 containerd[1558]: time="2025-09-16T04:22:33.313398800Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Sep 16 04:22:33.313478 containerd[1558]: time="2025-09-16T04:22:33.313464120Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Sep 16 04:22:33.313539 containerd[1558]: time="2025-09-16T04:22:33.313522840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Sep 16 04:22:33.313627 containerd[1558]: time="2025-09-16T04:22:33.313611240Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Sep 16 04:22:33.313785 containerd[1558]: time="2025-09-16T04:22:33.313771200Z" level=info msg="runtime interface created" Sep 16 04:22:33.313843 containerd[1558]: time="2025-09-16T04:22:33.313831320Z" level=info msg="created NRI interface" Sep 16 04:22:33.313890 containerd[1558]: time="2025-09-16T04:22:33.313878200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Sep 16 04:22:33.313945 containerd[1558]: time="2025-09-16T04:22:33.313932840Z" level=info msg="Connect containerd service" Sep 16 04:22:33.314025 containerd[1558]: time="2025-09-16T04:22:33.314011360Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Sep 16 04:22:33.315049 containerd[1558]: time="2025-09-16T04:22:33.314981840Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 16 04:22:33.450218 tar[1539]: linux-arm64/README.md Sep 16 04:22:33.472998 containerd[1558]: time="2025-09-16T04:22:33.472907480Z" level=info msg="Start subscribing containerd event" Sep 16 04:22:33.472998 containerd[1558]: time="2025-09-16T04:22:33.472989600Z" level=info msg="Start recovering state" Sep 16 04:22:33.473135 containerd[1558]: time="2025-09-16T04:22:33.473086600Z" level=info msg="Start event monitor" Sep 16 04:22:33.473135 containerd[1558]: time="2025-09-16T04:22:33.473098800Z" level=info msg="Start cni network conf syncer for default" Sep 16 04:22:33.473135 containerd[1558]: time="2025-09-16T04:22:33.473112400Z" level=info msg="Start streaming server" Sep 16 04:22:33.473135 containerd[1558]: time="2025-09-16T04:22:33.473128760Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Sep 16 
04:22:33.473200 containerd[1558]: time="2025-09-16T04:22:33.473136000Z" level=info msg="runtime interface starting up..." Sep 16 04:22:33.473200 containerd[1558]: time="2025-09-16T04:22:33.473145760Z" level=info msg="starting plugins..." Sep 16 04:22:33.473200 containerd[1558]: time="2025-09-16T04:22:33.473159280Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Sep 16 04:22:33.473717 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Sep 16 04:22:33.473858 containerd[1558]: time="2025-09-16T04:22:33.473719680Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Sep 16 04:22:33.473961 containerd[1558]: time="2025-09-16T04:22:33.473941400Z" level=info msg=serving... address=/run/containerd/containerd.sock Sep 16 04:22:33.474057 containerd[1558]: time="2025-09-16T04:22:33.474044280Z" level=info msg="containerd successfully booted in 0.187527s" Sep 16 04:22:33.475950 systemd[1]: Started containerd.service - containerd container runtime. Sep 16 04:22:33.733081 sshd_keygen[1565]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Sep 16 04:22:33.758723 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Sep 16 04:22:33.763872 systemd[1]: Starting issuegen.service - Generate /run/issue... Sep 16 04:22:33.792368 systemd[1]: issuegen.service: Deactivated successfully. Sep 16 04:22:33.792594 systemd[1]: Finished issuegen.service - Generate /run/issue. Sep 16 04:22:33.797690 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Sep 16 04:22:33.804818 systemd-networkd[1418]: eth0: Gained IPv6LL Sep 16 04:22:33.806093 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Sep 16 04:22:33.809349 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Sep 16 04:22:33.814703 systemd[1]: Reached target network-online.target - Network is Online. Sep 16 04:22:33.817732 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:22:33.820837 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Sep 16 04:22:33.822092 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Sep 16 04:22:33.830561 systemd[1]: Started getty@tty1.service - Getty on tty1. Sep 16 04:22:33.835101 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Sep 16 04:22:33.837477 systemd[1]: Reached target getty.target - Login Prompts. Sep 16 04:22:33.858845 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Sep 16 04:22:34.251932 systemd-networkd[1418]: eth1: Gained IPv6LL Sep 16 04:22:34.252709 systemd-timesyncd[1451]: Network configuration changed, trying to establish connection. Sep 16 04:22:34.618123 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:22:34.620754 systemd[1]: Reached target multi-user.target - Multi-User System. Sep 16 04:22:34.622160 systemd[1]: Startup finished in 2.364s (kernel) + 10.437s (initrd) + 4.624s (userspace) = 17.426s. 
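The "Startup finished" summary is worth a second look: 2.364 + 10.437 + 4.624 comes to 17.425, while the total prints as 17.426s, presumably because each microsecond counter is rounded for display independently of the total. A parsing sketch over the line as logged:

import re

line = ("Startup finished in 2.364s (kernel) + 10.437s (initrd) "
        "+ 4.624s (userspace) = 17.426s")
parts = {name: float(sec) for sec, name in re.findall(r"([\d.]+)s \((\w+)\)", line)}
total = float(re.search(r"= ([\d.]+)s", line).group(1))
print(f"parts sum to {sum(parts.values()):.3f}s, reported total {total:.3f}s")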
Sep 16 04:22:34.628008 (kubelet)[1664]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:22:35.137706 kubelet[1664]: E0916 04:22:35.137613 1664 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:22:35.140699 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:22:35.140982 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:22:35.141464 systemd[1]: kubelet.service: Consumed 878ms CPU time, 256.8M memory peak. Sep 16 04:22:45.391329 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Sep 16 04:22:45.393487 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:22:45.537713 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:22:45.557271 (kubelet)[1682]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:22:45.613814 kubelet[1682]: E0916 04:22:45.613748 1682 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:22:45.618928 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:22:45.619286 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:22:45.620834 systemd[1]: kubelet.service: Consumed 174ms CPU time, 105.6M memory peak. Sep 16 04:22:55.622331 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Sep 16 04:22:55.624820 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:22:55.775915 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:22:55.789209 (kubelet)[1695]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:22:55.838800 kubelet[1695]: E0916 04:22:55.838741 1695 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:22:55.842354 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:22:55.842526 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:22:55.843239 systemd[1]: kubelet.service: Consumed 168ms CPU time, 104.6M memory peak. Sep 16 04:23:04.567113 systemd-timesyncd[1451]: Contacted time server 141.144.241.16:123 (2.flatcar.pool.ntp.org). Sep 16 04:23:04.567226 systemd-timesyncd[1451]: Initial clock synchronization to Tue 2025-09-16 04:23:04.278584 UTC. Sep 16 04:23:05.872632 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Sep 16 04:23:05.876809 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Sep 16 04:23:06.054061 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:23:06.067172 (kubelet)[1711]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:23:06.111776 kubelet[1711]: E0916 04:23:06.111697 1711 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:23:06.115341 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:23:06.115477 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:23:06.116035 systemd[1]: kubelet.service: Consumed 167ms CPU time, 104.9M memory peak. Sep 16 04:23:08.059315 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Sep 16 04:23:08.061943 systemd[1]: Started sshd@0-138.201.119.17:22-139.178.89.65:55540.service - OpenSSH per-connection server daemon (139.178.89.65:55540). Sep 16 04:23:09.052894 sshd[1718]: Accepted publickey for core from 139.178.89.65 port 55540 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:23:09.055247 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:23:09.063744 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Sep 16 04:23:09.065214 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Sep 16 04:23:09.073647 systemd-logind[1529]: New session 1 of user core. Sep 16 04:23:09.095680 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Sep 16 04:23:09.100002 systemd[1]: Starting user@500.service - User Manager for UID 500... Sep 16 04:23:09.120405 (systemd)[1723]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Sep 16 04:23:09.123831 systemd-logind[1529]: New session c1 of user core. Sep 16 04:23:09.258750 systemd[1723]: Queued start job for default target default.target. Sep 16 04:23:09.269333 systemd[1723]: Created slice app.slice - User Application Slice. Sep 16 04:23:09.269384 systemd[1723]: Reached target paths.target - Paths. Sep 16 04:23:09.269449 systemd[1723]: Reached target timers.target - Timers. Sep 16 04:23:09.271406 systemd[1723]: Starting dbus.socket - D-Bus User Message Bus Socket... Sep 16 04:23:09.286248 systemd[1723]: Listening on dbus.socket - D-Bus User Message Bus Socket. Sep 16 04:23:09.286739 systemd[1723]: Reached target sockets.target - Sockets. Sep 16 04:23:09.286981 systemd[1723]: Reached target basic.target - Basic System. Sep 16 04:23:09.287211 systemd[1723]: Reached target default.target - Main User Target. Sep 16 04:23:09.287235 systemd[1]: Started user@500.service - User Manager for UID 500. Sep 16 04:23:09.287540 systemd[1723]: Startup finished in 156ms. Sep 16 04:23:09.294890 systemd[1]: Started session-1.scope - Session 1 of User core. Sep 16 04:23:09.979193 systemd[1]: Started sshd@1-138.201.119.17:22-139.178.89.65:55542.service - OpenSSH per-connection server daemon (139.178.89.65:55542). 
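kubelet is crash-looping here because /var/lib/kubelet/config.yaml has not been provisioned yet, and systemd keeps rescheduling the unit. The journald timestamps of the successive "command failed" lines are a little over 10 s apart, consistent with a RestartSec= of about 10 in the unit (an inference from the spacing; the unit file itself is not shown in this log):

from datetime import datetime

# journald timestamps of successive kubelet "command failed" lines above
fails = ["04:22:35.137706", "04:22:45.613814", "04:22:55.838800", "04:23:06.111776"]
ts = [datetime.strptime(t, "%H:%M:%S.%f") for t in fails]
print([round((b - a).total_seconds(), 1) for a, b in zip(ts, ts[1:])])
# [10.5, 10.2, 10.3]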
Sep 16 04:23:10.954234 sshd[1734]: Accepted publickey for core from 139.178.89.65 port 55542 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:23:10.956508 sshd-session[1734]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:23:10.963673 systemd-logind[1529]: New session 2 of user core. Sep 16 04:23:10.969897 systemd[1]: Started session-2.scope - Session 2 of User core. Sep 16 04:23:11.617656 sshd[1737]: Connection closed by 139.178.89.65 port 55542 Sep 16 04:23:11.618550 sshd-session[1734]: pam_unix(sshd:session): session closed for user core Sep 16 04:23:11.625876 systemd-logind[1529]: Session 2 logged out. Waiting for processes to exit. Sep 16 04:23:11.625919 systemd[1]: sshd@1-138.201.119.17:22-139.178.89.65:55542.service: Deactivated successfully. Sep 16 04:23:11.628011 systemd[1]: session-2.scope: Deactivated successfully. Sep 16 04:23:11.629683 systemd-logind[1529]: Removed session 2. Sep 16 04:23:11.790541 systemd[1]: Started sshd@2-138.201.119.17:22-139.178.89.65:60496.service - OpenSSH per-connection server daemon (139.178.89.65:60496). Sep 16 04:23:12.778794 sshd[1743]: Accepted publickey for core from 139.178.89.65 port 60496 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:23:12.780919 sshd-session[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:23:12.786444 systemd-logind[1529]: New session 3 of user core. Sep 16 04:23:12.793943 systemd[1]: Started session-3.scope - Session 3 of User core. Sep 16 04:23:13.449760 sshd[1746]: Connection closed by 139.178.89.65 port 60496 Sep 16 04:23:13.449633 sshd-session[1743]: pam_unix(sshd:session): session closed for user core Sep 16 04:23:13.456667 systemd-logind[1529]: Session 3 logged out. Waiting for processes to exit. Sep 16 04:23:13.456844 systemd[1]: sshd@2-138.201.119.17:22-139.178.89.65:60496.service: Deactivated successfully. Sep 16 04:23:13.458500 systemd[1]: session-3.scope: Deactivated successfully. Sep 16 04:23:13.461226 systemd-logind[1529]: Removed session 3. Sep 16 04:23:13.619937 systemd[1]: Started sshd@3-138.201.119.17:22-139.178.89.65:60502.service - OpenSSH per-connection server daemon (139.178.89.65:60502). Sep 16 04:23:14.630836 sshd[1752]: Accepted publickey for core from 139.178.89.65 port 60502 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:23:14.632421 sshd-session[1752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:23:14.638917 systemd-logind[1529]: New session 4 of user core. Sep 16 04:23:14.649873 systemd[1]: Started session-4.scope - Session 4 of User core. Sep 16 04:23:15.306685 sshd[1755]: Connection closed by 139.178.89.65 port 60502 Sep 16 04:23:15.307439 sshd-session[1752]: pam_unix(sshd:session): session closed for user core Sep 16 04:23:15.312347 systemd[1]: sshd@3-138.201.119.17:22-139.178.89.65:60502.service: Deactivated successfully. Sep 16 04:23:15.314117 systemd[1]: session-4.scope: Deactivated successfully. Sep 16 04:23:15.315260 systemd-logind[1529]: Session 4 logged out. Waiting for processes to exit. Sep 16 04:23:15.317203 systemd-logind[1529]: Removed session 4. Sep 16 04:23:15.480395 systemd[1]: Started sshd@4-138.201.119.17:22-139.178.89.65:60510.service - OpenSSH per-connection server daemon (139.178.89.65:60510). Sep 16 04:23:16.122203 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
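The "SHA256:hnZQ..." string in the Accepted publickey lines is OpenSSH's key fingerprint: the unpadded base64 of the SHA-256 digest of the raw public-key blob. A sketch that reproduces the format from an authorized_keys-style line (the commented-out key below is a placeholder, not the key from this host):

import base64, hashlib

def ssh_fingerprint(authorized_keys_line: str) -> str:
    # Field 2 of an authorized_keys line is the base64-encoded key blob.
    blob = base64.b64decode(authorized_keys_line.split()[1])
    digest = hashlib.sha256(blob).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

# print(ssh_fingerprint("ssh-ed25519 AAAAC3NzaC1lZDI1NTE5... core@host"))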
Sep 16 04:23:16.124247 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:23:16.294126 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:23:16.305759 (kubelet)[1772]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:23:16.349965 kubelet[1772]: E0916 04:23:16.349905 1772 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:23:16.352923 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:23:16.353445 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:23:16.354163 systemd[1]: kubelet.service: Consumed 165ms CPU time, 106.4M memory peak. Sep 16 04:23:16.463095 sshd[1761]: Accepted publickey for core from 139.178.89.65 port 60510 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:23:16.465002 sshd-session[1761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:23:16.470676 systemd-logind[1529]: New session 5 of user core. Sep 16 04:23:16.476189 systemd[1]: Started session-5.scope - Session 5 of User core. Sep 16 04:23:16.986624 sudo[1780]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Sep 16 04:23:16.986917 sudo[1780]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:23:17.006095 sudo[1780]: pam_unix(sudo:session): session closed for user root Sep 16 04:23:17.165612 sshd[1779]: Connection closed by 139.178.89.65 port 60510 Sep 16 04:23:17.164607 sshd-session[1761]: pam_unix(sshd:session): session closed for user core Sep 16 04:23:17.170134 systemd-logind[1529]: Session 5 logged out. Waiting for processes to exit. Sep 16 04:23:17.171130 systemd[1]: sshd@4-138.201.119.17:22-139.178.89.65:60510.service: Deactivated successfully. Sep 16 04:23:17.173377 systemd[1]: session-5.scope: Deactivated successfully. Sep 16 04:23:17.175565 systemd-logind[1529]: Removed session 5. Sep 16 04:23:17.337121 systemd[1]: Started sshd@5-138.201.119.17:22-139.178.89.65:60524.service - OpenSSH per-connection server daemon (139.178.89.65:60524). Sep 16 04:23:18.197251 update_engine[1531]: I20250916 04:23:18.196614 1531 update_attempter.cc:509] Updating boot flags... Sep 16 04:23:18.331138 sshd[1786]: Accepted publickey for core from 139.178.89.65 port 60524 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:23:18.336986 sshd-session[1786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:23:18.352010 systemd-logind[1529]: New session 6 of user core. Sep 16 04:23:18.357849 systemd[1]: Started session-6.scope - Session 6 of User core. 
Sep 16 04:23:18.848036 sudo[1811]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Sep 16 04:23:18.848307 sudo[1811]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:23:18.853783 sudo[1811]: pam_unix(sudo:session): session closed for user root Sep 16 04:23:18.859877 sudo[1810]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Sep 16 04:23:18.860181 sudo[1810]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:23:18.871776 systemd[1]: Starting audit-rules.service - Load Audit Rules... Sep 16 04:23:18.919352 augenrules[1833]: No rules Sep 16 04:23:18.920820 systemd[1]: audit-rules.service: Deactivated successfully. Sep 16 04:23:18.921016 systemd[1]: Finished audit-rules.service - Load Audit Rules. Sep 16 04:23:18.922449 sudo[1810]: pam_unix(sudo:session): session closed for user root Sep 16 04:23:19.079376 sshd[1809]: Connection closed by 139.178.89.65 port 60524 Sep 16 04:23:19.080150 sshd-session[1786]: pam_unix(sshd:session): session closed for user core Sep 16 04:23:19.085549 systemd[1]: sshd@5-138.201.119.17:22-139.178.89.65:60524.service: Deactivated successfully. Sep 16 04:23:19.088021 systemd[1]: session-6.scope: Deactivated successfully. Sep 16 04:23:19.090681 systemd-logind[1529]: Session 6 logged out. Waiting for processes to exit. Sep 16 04:23:19.091930 systemd-logind[1529]: Removed session 6. Sep 16 04:23:19.249765 systemd[1]: Started sshd@6-138.201.119.17:22-139.178.89.65:60526.service - OpenSSH per-connection server daemon (139.178.89.65:60526). Sep 16 04:23:20.247056 sshd[1842]: Accepted publickey for core from 139.178.89.65 port 60526 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:23:20.248807 sshd-session[1842]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:23:20.253930 systemd-logind[1529]: New session 7 of user core. Sep 16 04:23:20.262212 systemd[1]: Started session-7.scope - Session 7 of User core. Sep 16 04:23:20.764071 sudo[1846]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Sep 16 04:23:20.764759 sudo[1846]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Sep 16 04:23:21.072934 systemd[1]: Starting docker.service - Docker Application Container Engine... Sep 16 04:23:21.088375 (dockerd)[1863]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Sep 16 04:23:21.318030 dockerd[1863]: time="2025-09-16T04:23:21.317946063Z" level=info msg="Starting up" Sep 16 04:23:21.319431 dockerd[1863]: time="2025-09-16T04:23:21.319368585Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Sep 16 04:23:21.331248 dockerd[1863]: time="2025-09-16T04:23:21.330867372Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Sep 16 04:23:21.357189 systemd[1]: var-lib-docker-metacopy\x2dcheck964458563-merged.mount: Deactivated successfully. Sep 16 04:23:21.366753 dockerd[1863]: time="2025-09-16T04:23:21.366699386Z" level=info msg="Loading containers: start." 
Sep 16 04:23:21.380642 kernel: Initializing XFRM netlink socket Sep 16 04:23:21.618551 systemd-networkd[1418]: docker0: Link UP Sep 16 04:23:21.625467 dockerd[1863]: time="2025-09-16T04:23:21.625238123Z" level=info msg="Loading containers: done." Sep 16 04:23:21.643435 dockerd[1863]: time="2025-09-16T04:23:21.643350691Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Sep 16 04:23:21.643701 dockerd[1863]: time="2025-09-16T04:23:21.643494038Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Sep 16 04:23:21.643701 dockerd[1863]: time="2025-09-16T04:23:21.643623764Z" level=info msg="Initializing buildkit" Sep 16 04:23:21.676935 dockerd[1863]: time="2025-09-16T04:23:21.676885202Z" level=info msg="Completed buildkit initialization" Sep 16 04:23:21.688174 dockerd[1863]: time="2025-09-16T04:23:21.687828523Z" level=info msg="Daemon has completed initialization" Sep 16 04:23:21.688174 dockerd[1863]: time="2025-09-16T04:23:21.687933156Z" level=info msg="API listen on /run/docker.sock" Sep 16 04:23:21.689857 systemd[1]: Started docker.service - Docker Application Container Engine. Sep 16 04:23:22.797101 containerd[1558]: time="2025-09-16T04:23:22.796639272Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\"" Sep 16 04:23:23.360303 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount699696616.mount: Deactivated successfully. Sep 16 04:23:25.373346 containerd[1558]: time="2025-09-16T04:23:25.373267558Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:25.375561 containerd[1558]: time="2025-09-16T04:23:25.375507558Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.5: active requests=0, bytes read=27390326" Sep 16 04:23:25.376347 containerd[1558]: time="2025-09-16T04:23:25.376297024Z" level=info msg="ImageCreate event name:\"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:25.380614 containerd[1558]: time="2025-09-16T04:23:25.379712883Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:25.381874 containerd[1558]: time="2025-09-16T04:23:25.381501851Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.5\" with image id \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:1b9c6c00bc1fe86860e72efb8e4148f9e436a132eba4ca636ca4f48d61d6dfb4\", size \"27386827\" in 2.584818849s" Sep 16 04:23:25.381874 containerd[1558]: time="2025-09-16T04:23:25.381555117Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.5\" returns image reference \"sha256:6a7fd297b49102b08dc3d8d4fd7f1538bcf21d3131eae8bf62ba26ce3283237f\"" Sep 16 04:23:25.384128 containerd[1558]: time="2025-09-16T04:23:25.384096244Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\"" Sep 16 04:23:26.372602 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. 
Sep 16 04:23:26.375521 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:23:26.538499 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:23:26.551164 (kubelet)[2140]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:23:26.590404 kubelet[2140]: E0916 04:23:26.590330 2140 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:23:26.593937 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:23:26.594279 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:23:26.594909 systemd[1]: kubelet.service: Consumed 154ms CPU time, 104.7M memory peak. Sep 16 04:23:27.566619 containerd[1558]: time="2025-09-16T04:23:27.564912550Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:27.567286 containerd[1558]: time="2025-09-16T04:23:27.567086231Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.5: active requests=0, bytes read=23547937" Sep 16 04:23:27.567702 containerd[1558]: time="2025-09-16T04:23:27.567672390Z" level=info msg="ImageCreate event name:\"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:27.570484 containerd[1558]: time="2025-09-16T04:23:27.570449397Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:27.571681 containerd[1558]: time="2025-09-16T04:23:27.571639041Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.5\" with image id \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:1082a6ab67fb46397314dd36b36cb197ba4a4c5365033e9ad22bc7edaaaabd5c\", size \"25135832\" in 2.187348955s" Sep 16 04:23:27.571756 containerd[1558]: time="2025-09-16T04:23:27.571683237Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.5\" returns image reference \"sha256:2dd4c25a937008b7b8a6cdca70d816403b5078b51550926721b7a7762139cd23\"" Sep 16 04:23:27.572359 containerd[1558]: time="2025-09-16T04:23:27.572303171Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\"" Sep 16 04:23:29.026887 containerd[1558]: time="2025-09-16T04:23:29.026825469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:29.028314 containerd[1558]: time="2025-09-16T04:23:29.027978540Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.5: active requests=0, bytes read=18295997" Sep 16 04:23:29.029405 containerd[1558]: time="2025-09-16T04:23:29.029349174Z" level=info msg="ImageCreate event name:\"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Sep 16 04:23:29.032724 containerd[1558]: time="2025-09-16T04:23:29.032631288Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:29.033861 containerd[1558]: time="2025-09-16T04:23:29.033828575Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.5\" with image id \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:3e7b57c9d9f06b77f0064e5be7f3df61e0151101160acd5fdecce911df28a189\", size \"19883910\" in 1.460561245s" Sep 16 04:23:29.033967 containerd[1558]: time="2025-09-16T04:23:29.033951835Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.5\" returns image reference \"sha256:5e600beaed8620718e0650dd2721266869ce1d737488c004a869333273e6ec15\"" Sep 16 04:23:29.035079 containerd[1558]: time="2025-09-16T04:23:29.035023945Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\"" Sep 16 04:23:30.231465 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1398021589.mount: Deactivated successfully. Sep 16 04:23:30.931880 containerd[1558]: time="2025-09-16T04:23:30.931814374Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:30.933395 containerd[1558]: time="2025-09-16T04:23:30.933364188Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.5: active requests=0, bytes read=28240132" Sep 16 04:23:30.934601 containerd[1558]: time="2025-09-16T04:23:30.933936855Z" level=info msg="ImageCreate event name:\"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:30.936054 containerd[1558]: time="2025-09-16T04:23:30.936028776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:30.937269 containerd[1558]: time="2025-09-16T04:23:30.937247055Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.5\" with image id \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\", repo tag \"registry.k8s.io/kube-proxy:v1.33.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:71445ec84ad98bd52a7784865a9d31b1b50b56092d3f7699edc39eefd71befe1\", size \"28239125\" in 1.902026996s" Sep 16 04:23:30.937447 containerd[1558]: time="2025-09-16T04:23:30.937433776Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.5\" returns image reference \"sha256:021a8d45ab0c346664e47d95595ff5180ce90a22a681ea27904c65ae90788e70\"" Sep 16 04:23:30.938551 containerd[1558]: time="2025-09-16T04:23:30.938416078Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Sep 16 04:23:31.562519 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1698404848.mount: Deactivated successfully. 
Sep 16 04:23:32.702993 containerd[1558]: time="2025-09-16T04:23:32.702903485Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:32.704904 containerd[1558]: time="2025-09-16T04:23:32.704842744Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" Sep 16 04:23:32.706603 containerd[1558]: time="2025-09-16T04:23:32.705845800Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:32.709070 containerd[1558]: time="2025-09-16T04:23:32.709018969Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:32.710717 containerd[1558]: time="2025-09-16T04:23:32.710664996Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.771622476s" Sep 16 04:23:32.710717 containerd[1558]: time="2025-09-16T04:23:32.710717824Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Sep 16 04:23:32.711208 containerd[1558]: time="2025-09-16T04:23:32.711165505Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Sep 16 04:23:33.237134 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount608883217.mount: Deactivated successfully. 
Sep 16 04:23:33.244036 containerd[1558]: time="2025-09-16T04:23:33.243958668Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:23:33.245741 containerd[1558]: time="2025-09-16T04:23:33.245669041Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Sep 16 04:23:33.247609 containerd[1558]: time="2025-09-16T04:23:33.246351216Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:23:33.251066 containerd[1558]: time="2025-09-16T04:23:33.251017813Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Sep 16 04:23:33.252043 containerd[1558]: time="2025-09-16T04:23:33.251994176Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 540.787669ms" Sep 16 04:23:33.252227 containerd[1558]: time="2025-09-16T04:23:33.252195163Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Sep 16 04:23:33.252943 containerd[1558]: time="2025-09-16T04:23:33.252898600Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Sep 16 04:23:33.790166 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1400132077.mount: Deactivated successfully. Sep 16 04:23:36.622108 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Sep 16 04:23:36.624783 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:23:36.791321 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:23:36.804155 (kubelet)[2279]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Sep 16 04:23:36.849597 kubelet[2279]: E0916 04:23:36.849530 2279 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Sep 16 04:23:36.854695 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Sep 16 04:23:36.854819 systemd[1]: kubelet.service: Failed with result 'exit-code'. Sep 16 04:23:36.855193 systemd[1]: kubelet.service: Consumed 158ms CPU time, 106.9M memory peak. 
Sep 16 04:23:37.127055 containerd[1558]: time="2025-09-16T04:23:37.126944591Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:37.129336 containerd[1558]: time="2025-09-16T04:23:37.129261127Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:37.129502 containerd[1558]: time="2025-09-16T04:23:37.129456189Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465913" Sep 16 04:23:37.132626 containerd[1558]: time="2025-09-16T04:23:37.132387594Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:23:37.133726 containerd[1558]: time="2025-09-16T04:23:37.133526160Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 3.880588025s" Sep 16 04:23:37.133726 containerd[1558]: time="2025-09-16T04:23:37.133565525Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Sep 16 04:23:45.148190 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:23:45.148337 systemd[1]: kubelet.service: Consumed 158ms CPU time, 106.9M memory peak. Sep 16 04:23:45.151142 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:23:45.180393 systemd[1]: Reload requested from client PID 2317 ('systemctl') (unit session-7.scope)... Sep 16 04:23:45.180413 systemd[1]: Reloading... Sep 16 04:23:45.286604 zram_generator::config[2360]: No configuration found. Sep 16 04:23:45.486922 systemd[1]: Reloading finished in 306 ms. Sep 16 04:23:45.548368 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Sep 16 04:23:45.548465 systemd[1]: kubelet.service: Failed with result 'signal'. Sep 16 04:23:45.548818 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:23:45.548861 systemd[1]: kubelet.service: Consumed 107ms CPU time, 94.9M memory peak. Sep 16 04:23:45.550553 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:23:45.703699 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:23:45.720773 (kubelet)[2409]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 04:23:45.768637 kubelet[2409]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:23:45.768637 kubelet[2409]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 16 04:23:45.768637 kubelet[2409]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:23:45.768637 kubelet[2409]: I0916 04:23:45.766973 2409 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 04:23:47.035195 kubelet[2409]: I0916 04:23:47.035128 2409 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 16 04:23:47.035195 kubelet[2409]: I0916 04:23:47.035177 2409 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 04:23:47.035714 kubelet[2409]: I0916 04:23:47.035508 2409 server.go:956] "Client rotation is on, will bootstrap in background" Sep 16 04:23:47.059795 kubelet[2409]: E0916 04:23:47.059744 2409 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://138.201.119.17:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 138.201.119.17:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Sep 16 04:23:47.061439 kubelet[2409]: I0916 04:23:47.061311 2409 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 04:23:47.078442 kubelet[2409]: I0916 04:23:47.078412 2409 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 04:23:47.081626 kubelet[2409]: I0916 04:23:47.081372 2409 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 16 04:23:47.081740 kubelet[2409]: I0916 04:23:47.081710 2409 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 04:23:47.081899 kubelet[2409]: I0916 04:23:47.081742 2409 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-0-0-n-21eb3e8385","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 04:23:47.082004 kubelet[2409]: I0916 04:23:47.081967 2409 
topology_manager.go:138] "Creating topology manager with none policy" Sep 16 04:23:47.082004 kubelet[2409]: I0916 04:23:47.081976 2409 container_manager_linux.go:303] "Creating device plugin manager" Sep 16 04:23:47.082206 kubelet[2409]: I0916 04:23:47.082189 2409 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:23:47.086272 kubelet[2409]: I0916 04:23:47.086227 2409 kubelet.go:480] "Attempting to sync node with API server" Sep 16 04:23:47.086353 kubelet[2409]: I0916 04:23:47.086272 2409 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 04:23:47.086353 kubelet[2409]: I0916 04:23:47.086312 2409 kubelet.go:386] "Adding apiserver pod source" Sep 16 04:23:47.086353 kubelet[2409]: I0916 04:23:47.086331 2409 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 04:23:47.090683 kubelet[2409]: E0916 04:23:47.090642 2409 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://138.201.119.17:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 138.201.119.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Sep 16 04:23:47.091069 kubelet[2409]: E0916 04:23:47.091014 2409 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://138.201.119.17:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-n-21eb3e8385&limit=500&resourceVersion=0\": dial tcp 138.201.119.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Sep 16 04:23:47.091610 kubelet[2409]: I0916 04:23:47.091569 2409 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 04:23:47.092351 kubelet[2409]: I0916 04:23:47.092324 2409 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 16 04:23:47.092471 kubelet[2409]: W0916 04:23:47.092456 2409 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Sep 16 04:23:47.095331 kubelet[2409]: I0916 04:23:47.095305 2409 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 16 04:23:47.095414 kubelet[2409]: I0916 04:23:47.095345 2409 server.go:1289] "Started kubelet" Sep 16 04:23:47.096821 kubelet[2409]: I0916 04:23:47.096783 2409 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 04:23:47.097954 kubelet[2409]: I0916 04:23:47.097931 2409 server.go:317] "Adding debug handlers to kubelet server" Sep 16 04:23:47.100256 kubelet[2409]: I0916 04:23:47.100176 2409 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 04:23:47.100657 kubelet[2409]: I0916 04:23:47.100642 2409 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 04:23:47.102861 kubelet[2409]: E0916 04:23:47.100839 2409 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://138.201.119.17:6443/api/v1/namespaces/default/events\": dial tcp 138.201.119.17:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-0-0-n-21eb3e8385.1865a89e3a59fe05 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-0-0-n-21eb3e8385,UID:ci-4459-0-0-n-21eb3e8385,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-n-21eb3e8385,},FirstTimestamp:2025-09-16 04:23:47.095322117 +0000 UTC m=+1.369250011,LastTimestamp:2025-09-16 04:23:47.095322117 +0000 UTC m=+1.369250011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-n-21eb3e8385,}" Sep 16 04:23:47.106109 kubelet[2409]: E0916 04:23:47.106065 2409 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 16 04:23:47.107044 kubelet[2409]: I0916 04:23:47.107009 2409 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 04:23:47.109623 kubelet[2409]: I0916 04:23:47.107293 2409 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 04:23:47.110229 kubelet[2409]: I0916 04:23:47.110212 2409 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 16 04:23:47.110377 kubelet[2409]: I0916 04:23:47.110364 2409 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 16 04:23:47.110434 kubelet[2409]: I0916 04:23:47.110422 2409 reconciler.go:26] "Reconciler: start to sync state" Sep 16 04:23:47.111919 kubelet[2409]: E0916 04:23:47.111623 2409 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://138.201.119.17:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 138.201.119.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Sep 16 04:23:47.111992 kubelet[2409]: I0916 04:23:47.111955 2409 factory.go:223] Registration of the systemd container factory successfully Sep 16 04:23:47.112078 kubelet[2409]: I0916 04:23:47.112047 2409 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 04:23:47.114501 kubelet[2409]: I0916 04:23:47.114473 2409 factory.go:223] Registration of the containerd container factory successfully Sep 16 04:23:47.118843 kubelet[2409]: E0916 04:23:47.115103 2409 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" Sep 16 04:23:47.125617 kubelet[2409]: E0916 04:23:47.124886 2409 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.201.119.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-n-21eb3e8385?timeout=10s\": dial tcp 138.201.119.17:6443: connect: connection refused" interval="200ms" Sep 16 04:23:47.126135 kubelet[2409]: I0916 04:23:47.126092 2409 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 16 04:23:47.142243 kubelet[2409]: I0916 04:23:47.142215 2409 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 16 04:23:47.142402 kubelet[2409]: I0916 04:23:47.142387 2409 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 16 04:23:47.142461 kubelet[2409]: I0916 04:23:47.142453 2409 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:23:47.145468 kubelet[2409]: I0916 04:23:47.145436 2409 policy_none.go:49] "None policy: Start" Sep 16 04:23:47.145865 kubelet[2409]: I0916 04:23:47.145849 2409 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 16 04:23:47.145939 kubelet[2409]: I0916 04:23:47.145930 2409 state_mem.go:35] "Initializing new in-memory state store" Sep 16 04:23:47.155100 kubelet[2409]: I0916 04:23:47.155069 2409 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Sep 16 04:23:47.155260 kubelet[2409]: I0916 04:23:47.155248 2409 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 16 04:23:47.155360 kubelet[2409]: I0916 04:23:47.155333 2409 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 16 04:23:47.156623 kubelet[2409]: I0916 04:23:47.156434 2409 kubelet.go:2436] "Starting kubelet main sync loop" Sep 16 04:23:47.156623 kubelet[2409]: E0916 04:23:47.156517 2409 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 04:23:47.158037 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 16 04:23:47.159244 kubelet[2409]: E0916 04:23:47.159178 2409 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://138.201.119.17:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 138.201.119.17:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Sep 16 04:23:47.174302 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 16 04:23:47.181422 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 16 04:23:47.191183 kubelet[2409]: E0916 04:23:47.190700 2409 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 16 04:23:47.191183 kubelet[2409]: I0916 04:23:47.190943 2409 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 04:23:47.191183 kubelet[2409]: I0916 04:23:47.190954 2409 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 04:23:47.191419 kubelet[2409]: I0916 04:23:47.191261 2409 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 04:23:47.192980 kubelet[2409]: E0916 04:23:47.192919 2409 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 16 04:23:47.192980 kubelet[2409]: E0916 04:23:47.192967 2409 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-0-0-n-21eb3e8385\" not found" Sep 16 04:23:47.274911 systemd[1]: Created slice kubepods-burstable-pod4ce1c21b63bdba21869870828f75fbac.slice - libcontainer container kubepods-burstable-pod4ce1c21b63bdba21869870828f75fbac.slice. Sep 16 04:23:47.288918 kubelet[2409]: E0916 04:23:47.287910 2409 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.294029 kubelet[2409]: I0916 04:23:47.293637 2409 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.294170 systemd[1]: Created slice kubepods-burstable-pod7c36fccf1720ba6cac05fc8271f81652.slice - libcontainer container kubepods-burstable-pod7c36fccf1720ba6cac05fc8271f81652.slice. 
Sep 16 04:23:47.294617 kubelet[2409]: E0916 04:23:47.294216 2409 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://138.201.119.17:6443/api/v1/nodes\": dial tcp 138.201.119.17:6443: connect: connection refused" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.309507 kubelet[2409]: E0916 04:23:47.309090 2409 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.311513 systemd[1]: Created slice kubepods-burstable-pod7d4a01f5e8540fe2b3198bc7c49f082d.slice - libcontainer container kubepods-burstable-pod7d4a01f5e8540fe2b3198bc7c49f082d.slice. Sep 16 04:23:47.312140 kubelet[2409]: I0916 04:23:47.311768 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7d4a01f5e8540fe2b3198bc7c49f082d-k8s-certs\") pod \"kube-apiserver-ci-4459-0-0-n-21eb3e8385\" (UID: \"7d4a01f5e8540fe2b3198bc7c49f082d\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.312140 kubelet[2409]: I0916 04:23:47.311810 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-k8s-certs\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.312140 kubelet[2409]: I0916 04:23:47.311842 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7d4a01f5e8540fe2b3198bc7c49f082d-ca-certs\") pod \"kube-apiserver-ci-4459-0-0-n-21eb3e8385\" (UID: \"7d4a01f5e8540fe2b3198bc7c49f082d\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.312140 kubelet[2409]: I0916 04:23:47.311867 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7d4a01f5e8540fe2b3198bc7c49f082d-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-0-0-n-21eb3e8385\" (UID: \"7d4a01f5e8540fe2b3198bc7c49f082d\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.312140 kubelet[2409]: I0916 04:23:47.311895 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-ca-certs\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.312397 kubelet[2409]: I0916 04:23:47.311923 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.312397 kubelet[2409]: I0916 04:23:47.311950 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-kubeconfig\") pod 
\"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.312397 kubelet[2409]: I0916 04:23:47.311975 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.312397 kubelet[2409]: I0916 04:23:47.312002 2409 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7c36fccf1720ba6cac05fc8271f81652-kubeconfig\") pod \"kube-scheduler-ci-4459-0-0-n-21eb3e8385\" (UID: \"7c36fccf1720ba6cac05fc8271f81652\") " pod="kube-system/kube-scheduler-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.314477 kubelet[2409]: E0916 04:23:47.314438 2409 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.326156 kubelet[2409]: E0916 04:23:47.326105 2409 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.201.119.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-n-21eb3e8385?timeout=10s\": dial tcp 138.201.119.17:6443: connect: connection refused" interval="400ms" Sep 16 04:23:47.496868 kubelet[2409]: I0916 04:23:47.496823 2409 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.497455 kubelet[2409]: E0916 04:23:47.497413 2409 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://138.201.119.17:6443/api/v1/nodes\": dial tcp 138.201.119.17:6443: connect: connection refused" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.592979 containerd[1558]: time="2025-09-16T04:23:47.592808628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-0-0-n-21eb3e8385,Uid:4ce1c21b63bdba21869870828f75fbac,Namespace:kube-system,Attempt:0,}" Sep 16 04:23:47.610960 containerd[1558]: time="2025-09-16T04:23:47.610895026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-0-0-n-21eb3e8385,Uid:7c36fccf1720ba6cac05fc8271f81652,Namespace:kube-system,Attempt:0,}" Sep 16 04:23:47.621859 containerd[1558]: time="2025-09-16T04:23:47.621709798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-0-0-n-21eb3e8385,Uid:7d4a01f5e8540fe2b3198bc7c49f082d,Namespace:kube-system,Attempt:0,}" Sep 16 04:23:47.623351 containerd[1558]: time="2025-09-16T04:23:47.623317820Z" level=info msg="connecting to shim 25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16" address="unix:///run/containerd/s/009a445214b32cd95b63504b1d68350db14d5bf1fe2c2bf56ca99ff8190e3ff1" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:23:47.644840 containerd[1558]: time="2025-09-16T04:23:47.644794715Z" level=info msg="connecting to shim 35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a" address="unix:///run/containerd/s/d09735b2fa9581fe7380e0596dc34a489bc4a91955e9d9321c9d3103713fedc2" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:23:47.666862 containerd[1558]: time="2025-09-16T04:23:47.666816164Z" 
level=info msg="connecting to shim c41d732404fb27be368f03904e6fc1d598d68e0f76f231d0bbdc764cc7014ccd" address="unix:///run/containerd/s/3c4b48858f82cfac32e6964e6cd25ef19c6236a36d0ac98fe49458b837ab95f9" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:23:47.672793 systemd[1]: Started cri-containerd-25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16.scope - libcontainer container 25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16. Sep 16 04:23:47.710824 systemd[1]: Started cri-containerd-35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a.scope - libcontainer container 35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a. Sep 16 04:23:47.718744 systemd[1]: Started cri-containerd-c41d732404fb27be368f03904e6fc1d598d68e0f76f231d0bbdc764cc7014ccd.scope - libcontainer container c41d732404fb27be368f03904e6fc1d598d68e0f76f231d0bbdc764cc7014ccd. Sep 16 04:23:47.727826 kubelet[2409]: E0916 04:23:47.727675 2409 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://138.201.119.17:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-n-21eb3e8385?timeout=10s\": dial tcp 138.201.119.17:6443: connect: connection refused" interval="800ms" Sep 16 04:23:47.746314 containerd[1558]: time="2025-09-16T04:23:47.746269567Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-0-0-n-21eb3e8385,Uid:4ce1c21b63bdba21869870828f75fbac,Namespace:kube-system,Attempt:0,} returns sandbox id \"25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16\"" Sep 16 04:23:47.761262 containerd[1558]: time="2025-09-16T04:23:47.761218084Z" level=info msg="CreateContainer within sandbox \"25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 16 04:23:47.793806 containerd[1558]: time="2025-09-16T04:23:47.793755366Z" level=info msg="Container ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:23:47.807860 containerd[1558]: time="2025-09-16T04:23:47.807803145Z" level=info msg="CreateContainer within sandbox \"25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3\"" Sep 16 04:23:47.808844 containerd[1558]: time="2025-09-16T04:23:47.808672240Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-0-0-n-21eb3e8385,Uid:7c36fccf1720ba6cac05fc8271f81652,Namespace:kube-system,Attempt:0,} returns sandbox id \"35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a\"" Sep 16 04:23:47.809689 containerd[1558]: time="2025-09-16T04:23:47.809575578Z" level=info msg="StartContainer for \"ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3\"" Sep 16 04:23:47.812374 containerd[1558]: time="2025-09-16T04:23:47.812305753Z" level=info msg="connecting to shim ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3" address="unix:///run/containerd/s/009a445214b32cd95b63504b1d68350db14d5bf1fe2c2bf56ca99ff8190e3ff1" protocol=ttrpc version=3 Sep 16 04:23:47.813504 containerd[1558]: time="2025-09-16T04:23:47.813458267Z" level=info msg="CreateContainer within sandbox \"35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 16 04:23:47.817706 
containerd[1558]: time="2025-09-16T04:23:47.817551808Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-0-0-n-21eb3e8385,Uid:7d4a01f5e8540fe2b3198bc7c49f082d,Namespace:kube-system,Attempt:0,} returns sandbox id \"c41d732404fb27be368f03904e6fc1d598d68e0f76f231d0bbdc764cc7014ccd\"" Sep 16 04:23:47.824488 containerd[1558]: time="2025-09-16T04:23:47.824445050Z" level=info msg="CreateContainer within sandbox \"c41d732404fb27be368f03904e6fc1d598d68e0f76f231d0bbdc764cc7014ccd\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 16 04:23:47.830372 containerd[1558]: time="2025-09-16T04:23:47.830113692Z" level=info msg="Container d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:23:47.840188 containerd[1558]: time="2025-09-16T04:23:47.839841555Z" level=info msg="Container f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:23:47.846310 containerd[1558]: time="2025-09-16T04:23:47.845703890Z" level=info msg="CreateContainer within sandbox \"35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e\"" Sep 16 04:23:47.846310 containerd[1558]: time="2025-09-16T04:23:47.846134717Z" level=info msg="StartContainer for \"d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e\"" Sep 16 04:23:47.845833 systemd[1]: Started cri-containerd-ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3.scope - libcontainer container ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3. Sep 16 04:23:47.849092 containerd[1558]: time="2025-09-16T04:23:47.849046344Z" level=info msg="connecting to shim d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e" address="unix:///run/containerd/s/d09735b2fa9581fe7380e0596dc34a489bc4a91955e9d9321c9d3103713fedc2" protocol=ttrpc version=3 Sep 16 04:23:47.857235 containerd[1558]: time="2025-09-16T04:23:47.857174064Z" level=info msg="CreateContainer within sandbox \"c41d732404fb27be368f03904e6fc1d598d68e0f76f231d0bbdc764cc7014ccd\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381\"" Sep 16 04:23:47.859655 containerd[1558]: time="2025-09-16T04:23:47.859605499Z" level=info msg="StartContainer for \"f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381\"" Sep 16 04:23:47.861940 containerd[1558]: time="2025-09-16T04:23:47.861875845Z" level=info msg="connecting to shim f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381" address="unix:///run/containerd/s/3c4b48858f82cfac32e6964e6cd25ef19c6236a36d0ac98fe49458b837ab95f9" protocol=ttrpc version=3 Sep 16 04:23:47.889086 systemd[1]: Started cri-containerd-d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e.scope - libcontainer container d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e. 
Sep 16 04:23:47.905073 kubelet[2409]: I0916 04:23:47.904895 2409 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.907652 kubelet[2409]: E0916 04:23:47.907611 2409 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://138.201.119.17:6443/api/v1/nodes\": dial tcp 138.201.119.17:6443: connect: connection refused" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:47.911106 systemd[1]: Started cri-containerd-f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381.scope - libcontainer container f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381. Sep 16 04:23:47.930942 containerd[1558]: time="2025-09-16T04:23:47.930870339Z" level=info msg="StartContainer for \"ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3\" returns successfully" Sep 16 04:23:47.985534 containerd[1558]: time="2025-09-16T04:23:47.985488034Z" level=info msg="StartContainer for \"d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e\" returns successfully" Sep 16 04:23:47.992670 containerd[1558]: time="2025-09-16T04:23:47.991735954Z" level=info msg="StartContainer for \"f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381\" returns successfully" Sep 16 04:23:48.078949 kubelet[2409]: E0916 04:23:48.078818 2409 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://138.201.119.17:6443/api/v1/namespaces/default/events\": dial tcp 138.201.119.17:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-0-0-n-21eb3e8385.1865a89e3a59fe05 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-0-0-n-21eb3e8385,UID:ci-4459-0-0-n-21eb3e8385,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-n-21eb3e8385,},FirstTimestamp:2025-09-16 04:23:47.095322117 +0000 UTC m=+1.369250011,LastTimestamp:2025-09-16 04:23:47.095322117 +0000 UTC m=+1.369250011,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-n-21eb3e8385,}" Sep 16 04:23:48.168967 kubelet[2409]: E0916 04:23:48.168938 2409 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:48.173176 kubelet[2409]: E0916 04:23:48.172985 2409 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:48.177168 kubelet[2409]: E0916 04:23:48.177139 2409 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:48.710333 kubelet[2409]: I0916 04:23:48.709414 2409 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:49.178599 kubelet[2409]: E0916 04:23:49.177525 2409 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:49.184720 kubelet[2409]: E0916 04:23:49.184688 2409 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node 
\"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:50.361485 kubelet[2409]: E0916 04:23:50.361434 2409 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4459-0-0-n-21eb3e8385\" not found" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:50.442365 kubelet[2409]: I0916 04:23:50.442324 2409 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:50.514499 kubelet[2409]: I0916 04:23:50.514459 2409 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:50.534850 kubelet[2409]: E0916 04:23:50.534801 2409 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:50.534850 kubelet[2409]: I0916 04:23:50.534839 2409 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:50.539598 kubelet[2409]: E0916 04:23:50.538305 2409 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-0-0-n-21eb3e8385\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:50.539598 kubelet[2409]: I0916 04:23:50.538354 2409 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:50.541474 kubelet[2409]: E0916 04:23:50.541435 2409 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-0-0-n-21eb3e8385\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:51.089833 kubelet[2409]: I0916 04:23:51.089789 2409 apiserver.go:52] "Watching apiserver" Sep 16 04:23:51.111596 kubelet[2409]: I0916 04:23:51.111436 2409 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 16 04:23:52.206422 kubelet[2409]: I0916 04:23:52.206377 2409 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:53.087661 systemd[1]: Reload requested from client PID 2694 ('systemctl') (unit session-7.scope)... Sep 16 04:23:53.087687 systemd[1]: Reloading... Sep 16 04:23:53.196609 zram_generator::config[2735]: No configuration found. Sep 16 04:23:53.397855 systemd[1]: Reloading finished in 309 ms. Sep 16 04:23:53.431971 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:23:53.432282 kubelet[2409]: I0916 04:23:53.432126 2409 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 04:23:53.446456 systemd[1]: kubelet.service: Deactivated successfully. Sep 16 04:23:53.447184 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 16 04:23:53.447281 systemd[1]: kubelet.service: Consumed 1.810s CPU time, 129.7M memory peak. Sep 16 04:23:53.450124 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 16 04:23:53.607554 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 16 04:23:53.621938 (kubelet)[2783]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 16 04:23:53.686192 kubelet[2783]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:23:53.686192 kubelet[2783]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 16 04:23:53.686192 kubelet[2783]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 16 04:23:53.686192 kubelet[2783]: I0916 04:23:53.682601 2783 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 16 04:23:53.694638 kubelet[2783]: I0916 04:23:53.694420 2783 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Sep 16 04:23:53.694638 kubelet[2783]: I0916 04:23:53.694454 2783 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 16 04:23:53.694859 kubelet[2783]: I0916 04:23:53.694832 2783 server.go:956] "Client rotation is on, will bootstrap in background" Sep 16 04:23:53.696499 kubelet[2783]: I0916 04:23:53.696475 2783 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Sep 16 04:23:53.699420 kubelet[2783]: I0916 04:23:53.698975 2783 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 16 04:23:53.707811 kubelet[2783]: I0916 04:23:53.707696 2783 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 16 04:23:53.711608 kubelet[2783]: I0916 04:23:53.710934 2783 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 16 04:23:53.711608 kubelet[2783]: I0916 04:23:53.711114 2783 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 16 04:23:53.711608 kubelet[2783]: I0916 04:23:53.711140 2783 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-0-0-n-21eb3e8385","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 16 04:23:53.711608 kubelet[2783]: I0916 04:23:53.711295 2783 topology_manager.go:138] "Creating topology manager with none policy" Sep 16 04:23:53.711846 kubelet[2783]: I0916 04:23:53.711305 2783 container_manager_linux.go:303] "Creating device plugin manager" Sep 16 04:23:53.711846 kubelet[2783]: I0916 04:23:53.711343 2783 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:23:53.711846 kubelet[2783]: I0916 04:23:53.711473 2783 kubelet.go:480] "Attempting to sync node with API server" Sep 16 04:23:53.711846 kubelet[2783]: I0916 04:23:53.711483 2783 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 16 04:23:53.711846 kubelet[2783]: I0916 04:23:53.711507 2783 kubelet.go:386] "Adding apiserver pod source" Sep 16 04:23:53.711846 kubelet[2783]: I0916 04:23:53.711520 2783 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 16 04:23:53.718645 kubelet[2783]: I0916 04:23:53.716992 2783 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 16 04:23:53.718645 kubelet[2783]: I0916 04:23:53.718043 2783 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Sep 16 04:23:53.728628 kubelet[2783]: I0916 04:23:53.728198 2783 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 16 04:23:53.728628 kubelet[2783]: I0916 04:23:53.728248 2783 server.go:1289] "Started kubelet" Sep 16 04:23:53.730045 kubelet[2783]: I0916 04:23:53.730020 2783 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 16 04:23:53.737777 kubelet[2783]: I0916 
04:23:53.737378 2783 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Sep 16 04:23:53.738370 kubelet[2783]: I0916 04:23:53.738340 2783 server.go:317] "Adding debug handlers to kubelet server" Sep 16 04:23:53.741475 kubelet[2783]: I0916 04:23:53.741390 2783 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 16 04:23:53.741777 kubelet[2783]: I0916 04:23:53.741751 2783 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 16 04:23:53.742268 kubelet[2783]: I0916 04:23:53.741980 2783 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 16 04:23:53.746593 kubelet[2783]: I0916 04:23:53.743558 2783 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 16 04:23:53.746593 kubelet[2783]: E0916 04:23:53.743804 2783 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4459-0-0-n-21eb3e8385\" not found" Sep 16 04:23:53.746593 kubelet[2783]: I0916 04:23:53.745419 2783 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 16 04:23:53.746593 kubelet[2783]: I0916 04:23:53.745515 2783 reconciler.go:26] "Reconciler: start to sync state" Sep 16 04:23:53.747076 kubelet[2783]: I0916 04:23:53.747027 2783 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Sep 16 04:23:53.747990 kubelet[2783]: I0916 04:23:53.747963 2783 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Sep 16 04:23:53.748035 kubelet[2783]: I0916 04:23:53.748001 2783 status_manager.go:230] "Starting to sync pod status with apiserver" Sep 16 04:23:53.748035 kubelet[2783]: I0916 04:23:53.748020 2783 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
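
The deprecation warnings above say that --container-runtime-endpoint and --volume-plugin-dir should move into the file given by the kubelet's --config flag, and the node-config dump shows the effective values (systemd cgroup driver, cgroup root "/", static pods from /etc/kubernetes/manifests, and the hard-eviction thresholds). A minimal KubeletConfiguration carrying those same settings might look like the sketch below; the file path and the containerd socket address are assumptions, since neither appears verbatim in this log:

```yaml
# Hypothetical kubelet config file mirroring the settings logged above.
# The actual path and full contents on this host are not shown in the log.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces the deprecated --container-runtime-endpoint flag
# (socket address assumed; containerd v2.0.5 is the runtime per the log).
containerRuntimeEndpoint: "unix:///run/containerd/containerd.sock"
# Replaces the deprecated --volume-plugin-dir flag; this directory matches
# the FlexVolume probe path that appears later in the log.
volumePluginDir: "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/"
cgroupDriver: systemd            # "Using cgroup driver setting received from the CRI runtime"
cgroupsPerQOS: true              # --cgroups-per-qos enabled, cgroup root defaulting to "/"
staticPodPath: /etc/kubernetes/manifests
evictionHard:                    # mirrors the HardEvictionThresholds in the node config dump
  memory.available: "100Mi"
  nodefs.available: "10%"
  nodefs.inodesFree: "5%"
  imagefs.available: "15%"
  imagefs.inodesFree: "5%"
```
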
Sep 16 04:23:53.748035 kubelet[2783]: I0916 04:23:53.748026 2783 kubelet.go:2436] "Starting kubelet main sync loop" Sep 16 04:23:53.748095 kubelet[2783]: E0916 04:23:53.748065 2783 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 16 04:23:53.756594 kubelet[2783]: I0916 04:23:53.755749 2783 factory.go:223] Registration of the systemd container factory successfully Sep 16 04:23:53.756594 kubelet[2783]: I0916 04:23:53.755857 2783 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 16 04:23:53.759833 kubelet[2783]: I0916 04:23:53.759775 2783 factory.go:223] Registration of the containerd container factory successfully Sep 16 04:23:53.821020 kubelet[2783]: I0916 04:23:53.820983 2783 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 16 04:23:53.821020 kubelet[2783]: I0916 04:23:53.821003 2783 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 16 04:23:53.821020 kubelet[2783]: I0916 04:23:53.821026 2783 state_mem.go:36] "Initialized new in-memory state store" Sep 16 04:23:53.821327 kubelet[2783]: I0916 04:23:53.821159 2783 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 16 04:23:53.821327 kubelet[2783]: I0916 04:23:53.821168 2783 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 16 04:23:53.821327 kubelet[2783]: I0916 04:23:53.821185 2783 policy_none.go:49] "None policy: Start" Sep 16 04:23:53.821327 kubelet[2783]: I0916 04:23:53.821194 2783 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 16 04:23:53.821327 kubelet[2783]: I0916 04:23:53.821202 2783 state_mem.go:35] "Initializing new in-memory state store" Sep 16 04:23:53.821327 kubelet[2783]: I0916 04:23:53.821280 2783 state_mem.go:75] "Updated machine memory state" Sep 16 04:23:53.826546 kubelet[2783]: E0916 04:23:53.826152 2783 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Sep 16 04:23:53.827076 kubelet[2783]: I0916 04:23:53.827050 2783 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 16 04:23:53.827162 kubelet[2783]: I0916 04:23:53.827083 2783 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 16 04:23:53.829643 kubelet[2783]: I0916 04:23:53.827491 2783 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 16 04:23:53.829820 kubelet[2783]: E0916 04:23:53.829046 2783 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Sep 16 04:23:53.850689 kubelet[2783]: I0916 04:23:53.850660 2783 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:53.851464 kubelet[2783]: I0916 04:23:53.851015 2783 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:53.852205 kubelet[2783]: I0916 04:23:53.852040 2783 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:53.865026 kubelet[2783]: E0916 04:23:53.864947 2783 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4459-0-0-n-21eb3e8385\" already exists" pod="kube-system/kube-scheduler-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:53.933248 kubelet[2783]: I0916 04:23:53.932932 2783 kubelet_node_status.go:75] "Attempting to register node" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:53.945469 kubelet[2783]: I0916 04:23:53.945327 2783 kubelet_node_status.go:124] "Node was previously registered" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:53.945469 kubelet[2783]: I0916 04:23:53.945412 2783 kubelet_node_status.go:78] "Successfully registered node" node="ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.046293 kubelet[2783]: I0916 04:23:54.046231 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7d4a01f5e8540fe2b3198bc7c49f082d-ca-certs\") pod \"kube-apiserver-ci-4459-0-0-n-21eb3e8385\" (UID: \"7d4a01f5e8540fe2b3198bc7c49f082d\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.046721 kubelet[2783]: I0916 04:23:54.046578 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7d4a01f5e8540fe2b3198bc7c49f082d-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-0-0-n-21eb3e8385\" (UID: \"7d4a01f5e8540fe2b3198bc7c49f082d\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.047074 kubelet[2783]: I0916 04:23:54.046900 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-ca-certs\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.047354 kubelet[2783]: I0916 04:23:54.047326 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-kubeconfig\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.047610 kubelet[2783]: I0916 04:23:54.047462 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.047764 kubelet[2783]: I0916 04:23:54.047736 2783 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7d4a01f5e8540fe2b3198bc7c49f082d-k8s-certs\") pod \"kube-apiserver-ci-4459-0-0-n-21eb3e8385\" (UID: \"7d4a01f5e8540fe2b3198bc7c49f082d\") " pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.048069 kubelet[2783]: I0916 04:23:54.047954 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.048242 kubelet[2783]: I0916 04:23:54.048047 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/4ce1c21b63bdba21869870828f75fbac-k8s-certs\") pod \"kube-controller-manager-ci-4459-0-0-n-21eb3e8385\" (UID: \"4ce1c21b63bdba21869870828f75fbac\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.048352 kubelet[2783]: I0916 04:23:54.048217 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7c36fccf1720ba6cac05fc8271f81652-kubeconfig\") pod \"kube-scheduler-ci-4459-0-0-n-21eb3e8385\" (UID: \"7c36fccf1720ba6cac05fc8271f81652\") " pod="kube-system/kube-scheduler-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.715176 kubelet[2783]: I0916 04:23:54.715139 2783 apiserver.go:52] "Watching apiserver" Sep 16 04:23:54.745773 kubelet[2783]: I0916 04:23:54.745734 2783 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 16 04:23:54.777075 kubelet[2783]: I0916 04:23:54.777013 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-0-0-n-21eb3e8385" podStartSLOduration=2.776997577 podStartE2EDuration="2.776997577s" podCreationTimestamp="2025-09-16 04:23:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:23:54.776721404 +0000 UTC m=+1.145362426" watchObservedRunningTime="2025-09-16 04:23:54.776997577 +0000 UTC m=+1.145638599" Sep 16 04:23:54.797305 kubelet[2783]: I0916 04:23:54.797240 2783 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.807785 kubelet[2783]: I0916 04:23:54.807725 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-0-0-n-21eb3e8385" podStartSLOduration=1.807706805 podStartE2EDuration="1.807706805s" podCreationTimestamp="2025-09-16 04:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:23:54.805398901 +0000 UTC m=+1.174039883" watchObservedRunningTime="2025-09-16 04:23:54.807706805 +0000 UTC m=+1.176347827" Sep 16 04:23:54.810455 kubelet[2783]: E0916 04:23:54.810404 2783 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4459-0-0-n-21eb3e8385\" already exists" pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" Sep 16 04:23:54.833515 kubelet[2783]: I0916 04:23:54.833453 2783 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-0-0-n-21eb3e8385" podStartSLOduration=1.833433888 podStartE2EDuration="1.833433888s" podCreationTimestamp="2025-09-16 04:23:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:23:54.820401579 +0000 UTC m=+1.189042601" watchObservedRunningTime="2025-09-16 04:23:54.833433888 +0000 UTC m=+1.202074910" Sep 16 04:23:59.490825 kubelet[2783]: I0916 04:23:59.490779 2783 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 16 04:23:59.491538 containerd[1558]: time="2025-09-16T04:23:59.491446289Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 16 04:23:59.492020 kubelet[2783]: I0916 04:23:59.491665 2783 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 16 04:23:59.977231 systemd[1]: Created slice kubepods-besteffort-pod4a443492_c25f_4b6f_8600_f4ea6a8c09bd.slice - libcontainer container kubepods-besteffort-pod4a443492_c25f_4b6f_8600_f4ea6a8c09bd.slice. Sep 16 04:23:59.985242 kubelet[2783]: I0916 04:23:59.985204 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/4a443492-c25f-4b6f-8600-f4ea6a8c09bd-kube-proxy\") pod \"kube-proxy-gp2n6\" (UID: \"4a443492-c25f-4b6f-8600-f4ea6a8c09bd\") " pod="kube-system/kube-proxy-gp2n6" Sep 16 04:23:59.985463 kubelet[2783]: I0916 04:23:59.985434 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a443492-c25f-4b6f-8600-f4ea6a8c09bd-lib-modules\") pod \"kube-proxy-gp2n6\" (UID: \"4a443492-c25f-4b6f-8600-f4ea6a8c09bd\") " pod="kube-system/kube-proxy-gp2n6" Sep 16 04:23:59.985648 kubelet[2783]: I0916 04:23:59.985576 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc5gq\" (UniqueName: \"kubernetes.io/projected/4a443492-c25f-4b6f-8600-f4ea6a8c09bd-kube-api-access-nc5gq\") pod \"kube-proxy-gp2n6\" (UID: \"4a443492-c25f-4b6f-8600-f4ea6a8c09bd\") " pod="kube-system/kube-proxy-gp2n6" Sep 16 04:23:59.985807 kubelet[2783]: I0916 04:23:59.985758 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/4a443492-c25f-4b6f-8600-f4ea6a8c09bd-xtables-lock\") pod \"kube-proxy-gp2n6\" (UID: \"4a443492-c25f-4b6f-8600-f4ea6a8c09bd\") " pod="kube-system/kube-proxy-gp2n6" Sep 16 04:24:00.097846 kubelet[2783]: E0916 04:24:00.097812 2783 projected.go:289] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Sep 16 04:24:00.098121 kubelet[2783]: E0916 04:24:00.098055 2783 projected.go:194] Error preparing data for projected volume kube-api-access-nc5gq for pod kube-system/kube-proxy-gp2n6: configmap "kube-root-ca.crt" not found Sep 16 04:24:00.098469 kubelet[2783]: E0916 04:24:00.098405 2783 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4a443492-c25f-4b6f-8600-f4ea6a8c09bd-kube-api-access-nc5gq podName:4a443492-c25f-4b6f-8600-f4ea6a8c09bd nodeName:}" failed. No retries permitted until 2025-09-16 04:24:00.598228309 +0000 UTC m=+6.966869331 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-nc5gq" (UniqueName: "kubernetes.io/projected/4a443492-c25f-4b6f-8600-f4ea6a8c09bd-kube-api-access-nc5gq") pod "kube-proxy-gp2n6" (UID: "4a443492-c25f-4b6f-8600-f4ea6a8c09bd") : configmap "kube-root-ca.crt" not found Sep 16 04:24:00.645635 systemd[1]: Created slice kubepods-besteffort-podeb3a7502_4eae_4e12_92ea_d165b7b4a355.slice - libcontainer container kubepods-besteffort-podeb3a7502_4eae_4e12_92ea_d165b7b4a355.slice. Sep 16 04:24:00.688880 kubelet[2783]: I0916 04:24:00.688814 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/eb3a7502-4eae-4e12-92ea-d165b7b4a355-var-lib-calico\") pod \"tigera-operator-755d956888-p588p\" (UID: \"eb3a7502-4eae-4e12-92ea-d165b7b4a355\") " pod="tigera-operator/tigera-operator-755d956888-p588p" Sep 16 04:24:00.689282 kubelet[2783]: I0916 04:24:00.688908 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnltp\" (UniqueName: \"kubernetes.io/projected/eb3a7502-4eae-4e12-92ea-d165b7b4a355-kube-api-access-wnltp\") pod \"tigera-operator-755d956888-p588p\" (UID: \"eb3a7502-4eae-4e12-92ea-d165b7b4a355\") " pod="tigera-operator/tigera-operator-755d956888-p588p" Sep 16 04:24:00.889814 containerd[1558]: time="2025-09-16T04:24:00.889551924Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gp2n6,Uid:4a443492-c25f-4b6f-8600-f4ea6a8c09bd,Namespace:kube-system,Attempt:0,}" Sep 16 04:24:00.911729 containerd[1558]: time="2025-09-16T04:24:00.911463003Z" level=info msg="connecting to shim 38ce084cf76605850da2aab9ea80c8bb75d199bc5f3c2390d8b6f75a68533aab" address="unix:///run/containerd/s/615836ee27ac41ac5abe4bc5237fa515e2708954da6643a35e8b84f56d01aa0f" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:00.935784 systemd[1]: Started cri-containerd-38ce084cf76605850da2aab9ea80c8bb75d199bc5f3c2390d8b6f75a68533aab.scope - libcontainer container 38ce084cf76605850da2aab9ea80c8bb75d199bc5f3c2390d8b6f75a68533aab. 
Sep 16 04:24:00.950500 containerd[1558]: time="2025-09-16T04:24:00.950460314Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-p588p,Uid:eb3a7502-4eae-4e12-92ea-d165b7b4a355,Namespace:tigera-operator,Attempt:0,}" Sep 16 04:24:00.965551 containerd[1558]: time="2025-09-16T04:24:00.965514875Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gp2n6,Uid:4a443492-c25f-4b6f-8600-f4ea6a8c09bd,Namespace:kube-system,Attempt:0,} returns sandbox id \"38ce084cf76605850da2aab9ea80c8bb75d199bc5f3c2390d8b6f75a68533aab\"" Sep 16 04:24:00.974761 containerd[1558]: time="2025-09-16T04:24:00.974727595Z" level=info msg="CreateContainer within sandbox \"38ce084cf76605850da2aab9ea80c8bb75d199bc5f3c2390d8b6f75a68533aab\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 16 04:24:00.981317 containerd[1558]: time="2025-09-16T04:24:00.981279862Z" level=info msg="connecting to shim a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304" address="unix:///run/containerd/s/a24b3ee215d638acb2ac861b8db52a92b34871befd8ffd2c366186494f3a0212" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:00.987345 containerd[1558]: time="2025-09-16T04:24:00.987278229Z" level=info msg="Container b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:00.996631 containerd[1558]: time="2025-09-16T04:24:00.996541310Z" level=info msg="CreateContainer within sandbox \"38ce084cf76605850da2aab9ea80c8bb75d199bc5f3c2390d8b6f75a68533aab\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60\"" Sep 16 04:24:01.000219 containerd[1558]: time="2025-09-16T04:24:01.000132555Z" level=info msg="StartContainer for \"b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60\"" Sep 16 04:24:01.004810 containerd[1558]: time="2025-09-16T04:24:01.004185210Z" level=info msg="connecting to shim b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60" address="unix:///run/containerd/s/615836ee27ac41ac5abe4bc5237fa515e2708954da6643a35e8b84f56d01aa0f" protocol=ttrpc version=3 Sep 16 04:24:01.014809 systemd[1]: Started cri-containerd-a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304.scope - libcontainer container a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304. Sep 16 04:24:01.030806 systemd[1]: Started cri-containerd-b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60.scope - libcontainer container b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60. 
Sep 16 04:24:01.079123 containerd[1558]: time="2025-09-16T04:24:01.078512762Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-p588p,Uid:eb3a7502-4eae-4e12-92ea-d165b7b4a355,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304\"" Sep 16 04:24:01.080276 containerd[1558]: time="2025-09-16T04:24:01.080244899Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 16 04:24:01.097185 containerd[1558]: time="2025-09-16T04:24:01.097124540Z" level=info msg="StartContainer for \"b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60\" returns successfully" Sep 16 04:24:03.393131 kubelet[2783]: I0916 04:24:03.392849 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gp2n6" podStartSLOduration=4.392827768 podStartE2EDuration="4.392827768s" podCreationTimestamp="2025-09-16 04:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:24:01.832302424 +0000 UTC m=+8.200943486" watchObservedRunningTime="2025-09-16 04:24:03.392827768 +0000 UTC m=+9.761468790" Sep 16 04:24:03.651633 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1813622703.mount: Deactivated successfully. Sep 16 04:24:04.152939 containerd[1558]: time="2025-09-16T04:24:04.152855457Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:04.154710 containerd[1558]: time="2025-09-16T04:24:04.154656630Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 16 04:24:04.155873 containerd[1558]: time="2025-09-16T04:24:04.155805104Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:04.159884 containerd[1558]: time="2025-09-16T04:24:04.159562815Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:04.160851 containerd[1558]: time="2025-09-16T04:24:04.160797852Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 3.080404987s" Sep 16 04:24:04.160914 containerd[1558]: time="2025-09-16T04:24:04.160855613Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 16 04:24:04.170774 containerd[1558]: time="2025-09-16T04:24:04.167517250Z" level=info msg="CreateContainer within sandbox \"a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 16 04:24:04.180625 containerd[1558]: time="2025-09-16T04:24:04.180438832Z" level=info msg="Container 6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:04.186304 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3254462524.mount: 
Deactivated successfully. Sep 16 04:24:04.198382 containerd[1558]: time="2025-09-16T04:24:04.198320321Z" level=info msg="CreateContainer within sandbox \"a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2\"" Sep 16 04:24:04.199956 containerd[1558]: time="2025-09-16T04:24:04.199914128Z" level=info msg="StartContainer for \"6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2\"" Sep 16 04:24:04.202212 containerd[1558]: time="2025-09-16T04:24:04.202173915Z" level=info msg="connecting to shim 6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2" address="unix:///run/containerd/s/a24b3ee215d638acb2ac861b8db52a92b34871befd8ffd2c366186494f3a0212" protocol=ttrpc version=3 Sep 16 04:24:04.225094 systemd[1]: Started cri-containerd-6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2.scope - libcontainer container 6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2. Sep 16 04:24:04.267878 containerd[1558]: time="2025-09-16T04:24:04.267834056Z" level=info msg="StartContainer for \"6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2\" returns successfully" Sep 16 04:24:10.556667 sudo[1846]: pam_unix(sudo:session): session closed for user root Sep 16 04:24:10.714860 sshd[1845]: Connection closed by 139.178.89.65 port 60526 Sep 16 04:24:10.716853 sshd-session[1842]: pam_unix(sshd:session): session closed for user core Sep 16 04:24:10.724032 systemd[1]: sshd@6-138.201.119.17:22-139.178.89.65:60526.service: Deactivated successfully. Sep 16 04:24:10.728491 systemd[1]: session-7.scope: Deactivated successfully. Sep 16 04:24:10.729925 systemd[1]: session-7.scope: Consumed 9.908s CPU time, 221.7M memory peak. Sep 16 04:24:10.734326 systemd-logind[1529]: Session 7 logged out. Waiting for processes to exit. Sep 16 04:24:10.736636 systemd-logind[1529]: Removed session 7. Sep 16 04:24:17.678008 kubelet[2783]: I0916 04:24:17.677932 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-p588p" podStartSLOduration=14.595436158 podStartE2EDuration="17.677914569s" podCreationTimestamp="2025-09-16 04:24:00 +0000 UTC" firstStartedPulling="2025-09-16 04:24:01.079840766 +0000 UTC m=+7.448481788" lastFinishedPulling="2025-09-16 04:24:04.162319177 +0000 UTC m=+10.530960199" observedRunningTime="2025-09-16 04:24:04.845081522 +0000 UTC m=+11.213722584" watchObservedRunningTime="2025-09-16 04:24:17.677914569 +0000 UTC m=+24.046555551" Sep 16 04:24:17.691822 systemd[1]: Created slice kubepods-besteffort-podf0c202d5_4e80_4334_86a8_85bd588b4f07.slice - libcontainer container kubepods-besteffort-podf0c202d5_4e80_4334_86a8_85bd588b4f07.slice. 
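
The calico-typha slice just created, and the calico-node and csi-node-driver pods that follow, are reconciled by the tigera-operator started above from an Installation custom resource; the manifest actually applied to this cluster is not in the log. A minimal sketch, assuming the conventional resource name "default" and a 192.168.0.0/16 cluster pod CIDR (the log only shows the node's per-node /24):

```yaml
# Hypothetical Installation resource of the kind the tigera-operator watches;
# the real manifest applied to this cluster does not appear in the log.
apiVersion: operator.tigera.io/v1
kind: Installation
metadata:
  name: default
spec:
  calicoNetwork:
    ipPools:
      - cidr: 192.168.0.0/16   # assumed cluster CIDR; only 192.168.0.0/24 (this node) is logged
```
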
Sep 16 04:24:17.706810 kubelet[2783]: I0916 04:24:17.706726 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0c202d5-4e80-4334-86a8-85bd588b4f07-tigera-ca-bundle\") pod \"calico-typha-cc4bf644f-69wk9\" (UID: \"f0c202d5-4e80-4334-86a8-85bd588b4f07\") " pod="calico-system/calico-typha-cc4bf644f-69wk9" Sep 16 04:24:17.706810 kubelet[2783]: I0916 04:24:17.706811 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5lfr\" (UniqueName: \"kubernetes.io/projected/f0c202d5-4e80-4334-86a8-85bd588b4f07-kube-api-access-j5lfr\") pod \"calico-typha-cc4bf644f-69wk9\" (UID: \"f0c202d5-4e80-4334-86a8-85bd588b4f07\") " pod="calico-system/calico-typha-cc4bf644f-69wk9" Sep 16 04:24:17.707032 kubelet[2783]: I0916 04:24:17.706832 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f0c202d5-4e80-4334-86a8-85bd588b4f07-typha-certs\") pod \"calico-typha-cc4bf644f-69wk9\" (UID: \"f0c202d5-4e80-4334-86a8-85bd588b4f07\") " pod="calico-system/calico-typha-cc4bf644f-69wk9" Sep 16 04:24:17.866653 systemd[1]: Created slice kubepods-besteffort-podfdb84d65_cb89_4607_9837_210202652ed0.slice - libcontainer container kubepods-besteffort-podfdb84d65_cb89_4607_9837_210202652ed0.slice. Sep 16 04:24:17.907951 kubelet[2783]: I0916 04:24:17.907901 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-var-run-calico\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908291 kubelet[2783]: I0916 04:24:17.908165 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-cni-bin-dir\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908291 kubelet[2783]: I0916 04:24:17.908191 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-cni-net-dir\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908291 kubelet[2783]: I0916 04:24:17.908208 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-flexvol-driver-host\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908291 kubelet[2783]: I0916 04:24:17.908252 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-lib-modules\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908291 kubelet[2783]: I0916 04:24:17.908274 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-qdrnp\" (UniqueName: \"kubernetes.io/projected/fdb84d65-cb89-4607-9837-210202652ed0-kube-api-access-qdrnp\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908665 kubelet[2783]: I0916 04:24:17.908479 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-cni-log-dir\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908665 kubelet[2783]: I0916 04:24:17.908537 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-policysync\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908665 kubelet[2783]: I0916 04:24:17.908555 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-var-lib-calico\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908665 kubelet[2783]: I0916 04:24:17.908714 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fdb84d65-cb89-4607-9837-210202652ed0-node-certs\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.908665 kubelet[2783]: I0916 04:24:17.908739 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fdb84d65-cb89-4607-9837-210202652ed0-xtables-lock\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.909020 kubelet[2783]: I0916 04:24:17.908756 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fdb84d65-cb89-4607-9837-210202652ed0-tigera-ca-bundle\") pod \"calico-node-gdfbn\" (UID: \"fdb84d65-cb89-4607-9837-210202652ed0\") " pod="calico-system/calico-node-gdfbn" Sep 16 04:24:17.959915 kubelet[2783]: E0916 04:24:17.959576 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c85z6" podUID="6e749571-a39a-485f-bcfb-603d4a5b22ed" Sep 16 04:24:17.996373 containerd[1558]: time="2025-09-16T04:24:17.996294104Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-cc4bf644f-69wk9,Uid:f0c202d5-4e80-4334-86a8-85bd588b4f07,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:18.010846 kubelet[2783]: I0916 04:24:18.009994 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6e749571-a39a-485f-bcfb-603d4a5b22ed-registration-dir\") pod \"csi-node-driver-c85z6\" (UID: \"6e749571-a39a-485f-bcfb-603d4a5b22ed\") " 
pod="calico-system/csi-node-driver-c85z6" Sep 16 04:24:18.020883 kubelet[2783]: E0916 04:24:18.018578 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.020883 kubelet[2783]: W0916 04:24:18.020615 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.020883 kubelet[2783]: E0916 04:24:18.020652 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.021655 kubelet[2783]: E0916 04:24:18.021633 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.022539 kubelet[2783]: W0916 04:24:18.021746 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.022539 kubelet[2783]: E0916 04:24:18.021769 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.022755 kubelet[2783]: E0916 04:24:18.022740 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.022820 kubelet[2783]: W0916 04:24:18.022808 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.022884 kubelet[2783]: E0916 04:24:18.022871 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.026943 kubelet[2783]: E0916 04:24:18.025316 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.026943 kubelet[2783]: W0916 04:24:18.025344 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.026943 kubelet[2783]: E0916 04:24:18.025360 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.026943 kubelet[2783]: E0916 04:24:18.026758 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.026943 kubelet[2783]: W0916 04:24:18.026770 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.026943 kubelet[2783]: E0916 04:24:18.026783 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:24:18.026943 kubelet[2783]: I0916 04:24:18.026814 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6e749571-a39a-485f-bcfb-603d4a5b22ed-varrun\") pod \"csi-node-driver-c85z6\" (UID: \"6e749571-a39a-485f-bcfb-603d4a5b22ed\") " pod="calico-system/csi-node-driver-c85z6" Sep 16 04:24:18.027475 kubelet[2783]: E0916 04:24:18.027293 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.027475 kubelet[2783]: W0916 04:24:18.027309 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.027475 kubelet[2783]: E0916 04:24:18.027321 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.028021 kubelet[2783]: E0916 04:24:18.027630 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.028021 kubelet[2783]: W0916 04:24:18.027643 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.028021 kubelet[2783]: E0916 04:24:18.027654 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.028712 kubelet[2783]: E0916 04:24:18.028696 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.028811 kubelet[2783]: W0916 04:24:18.028798 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.028876 kubelet[2783]: E0916 04:24:18.028855 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.031811 kubelet[2783]: E0916 04:24:18.031669 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.031811 kubelet[2783]: W0916 04:24:18.031686 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.031811 kubelet[2783]: E0916 04:24:18.031700 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:24:18.032807 kubelet[2783]: E0916 04:24:18.032649 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.032807 kubelet[2783]: W0916 04:24:18.032668 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.032807 kubelet[2783]: E0916 04:24:18.032680 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.032807 kubelet[2783]: I0916 04:24:18.032704 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6e749571-a39a-485f-bcfb-603d4a5b22ed-socket-dir\") pod \"csi-node-driver-c85z6\" (UID: \"6e749571-a39a-485f-bcfb-603d4a5b22ed\") " pod="calico-system/csi-node-driver-c85z6" Sep 16 04:24:18.033238 kubelet[2783]: E0916 04:24:18.033212 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.038995 kubelet[2783]: W0916 04:24:18.038959 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.039498 kubelet[2783]: E0916 04:24:18.039264 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.042497 kubelet[2783]: E0916 04:24:18.042468 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.044344 kubelet[2783]: W0916 04:24:18.044235 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.044635 kubelet[2783]: E0916 04:24:18.044616 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.050060 kubelet[2783]: E0916 04:24:18.049795 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.050060 kubelet[2783]: W0916 04:24:18.049817 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.050060 kubelet[2783]: E0916 04:24:18.049837 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:24:18.059620 kubelet[2783]: E0916 04:24:18.059143 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.059620 kubelet[2783]: W0916 04:24:18.059166 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.059620 kubelet[2783]: E0916 04:24:18.059185 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.060131 kubelet[2783]: E0916 04:24:18.060072 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.060752 kubelet[2783]: W0916 04:24:18.060646 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.060752 kubelet[2783]: E0916 04:24:18.060673 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.060843 kubelet[2783]: I0916 04:24:18.060770 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6e749571-a39a-485f-bcfb-603d4a5b22ed-kubelet-dir\") pod \"csi-node-driver-c85z6\" (UID: \"6e749571-a39a-485f-bcfb-603d4a5b22ed\") " pod="calico-system/csi-node-driver-c85z6" Sep 16 04:24:18.061966 kubelet[2783]: E0916 04:24:18.061946 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.062233 kubelet[2783]: W0916 04:24:18.062145 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.062233 kubelet[2783]: E0916 04:24:18.062170 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.064240 kubelet[2783]: E0916 04:24:18.064012 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.064240 kubelet[2783]: W0916 04:24:18.064037 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.064240 kubelet[2783]: E0916 04:24:18.064054 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:24:18.065187 kubelet[2783]: E0916 04:24:18.065161 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.065388 kubelet[2783]: W0916 04:24:18.065273 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.065482 kubelet[2783]: E0916 04:24:18.065451 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.066443 kubelet[2783]: E0916 04:24:18.065812 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.066443 kubelet[2783]: W0916 04:24:18.066303 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.066443 kubelet[2783]: E0916 04:24:18.066326 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.066813 kubelet[2783]: E0916 04:24:18.066737 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.066813 kubelet[2783]: W0916 04:24:18.066752 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.066813 kubelet[2783]: E0916 04:24:18.066763 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.067070 kubelet[2783]: E0916 04:24:18.067058 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.067291 kubelet[2783]: W0916 04:24:18.067194 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.067291 kubelet[2783]: E0916 04:24:18.067213 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:24:18.067355 kubelet[2783]: I0916 04:24:18.067285 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptct9\" (UniqueName: \"kubernetes.io/projected/6e749571-a39a-485f-bcfb-603d4a5b22ed-kube-api-access-ptct9\") pod \"csi-node-driver-c85z6\" (UID: \"6e749571-a39a-485f-bcfb-603d4a5b22ed\") " pod="calico-system/csi-node-driver-c85z6" Sep 16 04:24:18.068994 kubelet[2783]: E0916 04:24:18.068944 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.068994 kubelet[2783]: W0916 04:24:18.068962 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.068994 kubelet[2783]: E0916 04:24:18.068978 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.071398 kubelet[2783]: E0916 04:24:18.070438 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.071398 kubelet[2783]: W0916 04:24:18.070456 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.071398 kubelet[2783]: E0916 04:24:18.070471 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.071514 containerd[1558]: time="2025-09-16T04:24:18.071144751Z" level=info msg="connecting to shim bb700e25270771d109442898fb786f24e9082587621305ec0c13586c76e6b01c" address="unix:///run/containerd/s/de2211d7e0379bcbd01de0eec0b7888d78e7f89c3b37b716a092562cfb96d58a" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:18.072804 kubelet[2783]: E0916 04:24:18.072781 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.073142 kubelet[2783]: W0916 04:24:18.072956 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.073142 kubelet[2783]: E0916 04:24:18.072982 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.073843 kubelet[2783]: E0916 04:24:18.073716 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.073843 kubelet[2783]: W0916 04:24:18.073735 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.073843 kubelet[2783]: E0916 04:24:18.073748 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:24:18.074819 kubelet[2783]: E0916 04:24:18.074681 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.074819 kubelet[2783]: W0916 04:24:18.074698 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.074819 kubelet[2783]: E0916 04:24:18.074711 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.075976 kubelet[2783]: E0916 04:24:18.075225 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.075976 kubelet[2783]: W0916 04:24:18.075241 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.075976 kubelet[2783]: E0916 04:24:18.075254 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.076435 kubelet[2783]: E0916 04:24:18.076417 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.077661 kubelet[2783]: W0916 04:24:18.077633 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.077773 kubelet[2783]: E0916 04:24:18.077759 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.078808 kubelet[2783]: E0916 04:24:18.078650 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.078808 kubelet[2783]: W0916 04:24:18.078668 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.078808 kubelet[2783]: E0916 04:24:18.078682 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.079147 kubelet[2783]: E0916 04:24:18.078977 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.079147 kubelet[2783]: W0916 04:24:18.078990 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.079147 kubelet[2783]: E0916 04:24:18.079001 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 16 04:24:18.079570 kubelet[2783]: E0916 04:24:18.079494 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.079753 kubelet[2783]: W0916 04:24:18.079676 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.079753 kubelet[2783]: E0916 04:24:18.079698 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.081231 kubelet[2783]: E0916 04:24:18.081197 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.081437 kubelet[2783]: W0916 04:24:18.081319 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.081437 kubelet[2783]: E0916 04:24:18.081339 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.082311 kubelet[2783]: E0916 04:24:18.082293 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.082542 kubelet[2783]: W0916 04:24:18.082393 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.082542 kubelet[2783]: E0916 04:24:18.082411 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.083173 kubelet[2783]: E0916 04:24:18.083158 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.083269 kubelet[2783]: W0916 04:24:18.083247 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.083327 kubelet[2783]: E0916 04:24:18.083314 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 16 04:24:18.084945 kubelet[2783]: E0916 04:24:18.084859 2783 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 16 04:24:18.084945 kubelet[2783]: W0916 04:24:18.084879 2783 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 16 04:24:18.084945 kubelet[2783]: E0916 04:24:18.084892 2783 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Sep 16 04:24:18.115654 systemd[1]: Started cri-containerd-bb700e25270771d109442898fb786f24e9082587621305ec0c13586c76e6b01c.scope - libcontainer container bb700e25270771d109442898fb786f24e9082587621305ec0c13586c76e6b01c.
Sep 16 04:24:18.171404 containerd[1558]: time="2025-09-16T04:24:18.171351285Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gdfbn,Uid:fdb84d65-cb89-4607-9837-210202652ed0,Namespace:calico-system,Attempt:0,}"
[identical FlexVolume probe-failure triplets resume at Sep 16 04:24:18.178000 and repeat through 04:24:18.190357; duplicates elided]
Sep 16 04:24:18.205613 containerd[1558]: time="2025-09-16T04:24:18.205523745Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-cc4bf644f-69wk9,Uid:f0c202d5-4e80-4334-86a8-85bd588b4f07,Namespace:calico-system,Attempt:0,} returns sandbox id \"bb700e25270771d109442898fb786f24e9082587621305ec0c13586c76e6b01c\""
Sep 16 04:24:18.208334 containerd[1558]: time="2025-09-16T04:24:18.208235597Z" level=info msg="connecting to shim f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80" address="unix:///run/containerd/s/e12e1fde57a77c49dcd1ab8b5c66e5d2509e78308dbb9912b659009b55267cb5" namespace=k8s.io protocol=ttrpc version=3
Sep 16 04:24:18.210733 containerd[1558]: time="2025-09-16T04:24:18.210546242Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\""
[one further identical FlexVolume triplet at Sep 16 04:24:18.211045; elided]
Sep 16 04:24:18.252815 systemd[1]: Started cri-containerd-f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80.scope - libcontainer container f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80.
Sep 16 04:24:18.293069 containerd[1558]: time="2025-09-16T04:24:18.292642587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gdfbn,Uid:fdb84d65-cb89-4607-9837-210202652ed0,Namespace:calico-system,Attempt:0,} returns sandbox id \"f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80\""
Sep 16 04:24:19.748984 kubelet[2783]: E0916 04:24:19.748938 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c85z6" podUID="6e749571-a39a-485f-bcfb-603d4a5b22ed"
Sep 16 04:24:19.762396 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount286832699.mount: Deactivated successfully.
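The pod_workers error above is the kubelet relaying the runtime's readiness conditions: containerd reports NetworkReady=false because nothing has written a CNI config yet, so pods that need pod networking (here csi-node-driver-c85z6) cannot be synced. The conditions the kubelet acts on can be read straight off the CRI socket; a sketch, assuming the stock containerd socket path (crictl info surfaces the same data without code):

    // cristatus.go - read the runtime conditions behind the
    // "NetworkReady=false" errors in this log, over the CRI API.
    package main

    import (
        "context"
        "fmt"
        "log"

        "google.golang.org/grpc"
        "google.golang.org/grpc/credentials/insecure"
        runtimeapi "k8s.io/cri-api/pkg/apis/runtime/v1"
    )

    func main() {
        // containerd's CRI endpoint; crictl defaults to the same socket.
        conn, err := grpc.NewClient("unix:///run/containerd/containerd.sock",
            grpc.WithTransportCredentials(insecure.NewCredentials()))
        if err != nil {
            log.Fatal(err)
        }
        defer conn.Close()

        rt := runtimeapi.NewRuntimeServiceClient(conn)
        resp, err := rt.Status(context.Background(), &runtimeapi.StatusRequest{})
        if err != nil {
            log.Fatal(err)
        }
        // Until install-cni drops a config into /etc/cni/net.d, this
        // prints NetworkReady=false reason=NetworkPluginNotReady,
        // matching the kubelet errors quoted above.
        for _, c := range resp.Status.Conditions {
            fmt.Printf("%s=%v reason=%s\n", c.Type, c.Status, c.Reason)
        }
    }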
Sep 16 04:24:20.768762 containerd[1558]: time="2025-09-16T04:24:20.768667737Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:20.770032 containerd[1558]: time="2025-09-16T04:24:20.769781598Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775"
Sep 16 04:24:20.771058 containerd[1558]: time="2025-09-16T04:24:20.771004740Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:20.775236 containerd[1558]: time="2025-09-16T04:24:20.775165017Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:20.777316 containerd[1558]: time="2025-09-16T04:24:20.777263296Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.566631813s"
Sep 16 04:24:20.777565 containerd[1558]: time="2025-09-16T04:24:20.777404779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\""
Sep 16 04:24:20.780269 containerd[1558]: time="2025-09-16T04:24:20.779884984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\""
Sep 16 04:24:20.801916 containerd[1558]: time="2025-09-16T04:24:20.801858270Z" level=info msg="CreateContainer within sandbox \"bb700e25270771d109442898fb786f24e9082587621305ec0c13586c76e6b01c\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Sep 16 04:24:20.811628 containerd[1558]: time="2025-09-16T04:24:20.810958118Z" level=info msg="Container b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:24:20.822960 containerd[1558]: time="2025-09-16T04:24:20.822865258Z" level=info msg="CreateContainer within sandbox \"bb700e25270771d109442898fb786f24e9082587621305ec0c13586c76e6b01c\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52\""
Sep 16 04:24:20.826291 containerd[1558]: time="2025-09-16T04:24:20.825009417Z" level=info msg="StartContainer for \"b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52\""
Sep 16 04:24:20.827895 containerd[1558]: time="2025-09-16T04:24:20.827810309Z" level=info msg="connecting to shim b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52" address="unix:///run/containerd/s/de2211d7e0379bcbd01de0eec0b7888d78e7f89c3b37b716a092562cfb96d58a" protocol=ttrpc version=3
Sep 16 04:24:20.857960 systemd[1]: Started cri-containerd-b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52.scope - libcontainer container b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52.
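For scale, the typha pull above moved 33,105,629 bytes in 2.566631813 s, i.e. about 33105629 / 2.5666 ≈ 12.9 MB/s from ghcr.io; the bytes read=33105775 counter exceeds the reported image size by only ~150 bytes, presumably registry manifest overhead.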
Sep 16 04:24:20.918745 containerd[1558]: time="2025-09-16T04:24:20.918697826Z" level=info msg="StartContainer for \"b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52\" returns successfully"
Sep 16 04:24:21.750618 kubelet[2783]: E0916 04:24:21.749288 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c85z6" podUID="6e749571-a39a-485f-bcfb-603d4a5b22ed"
[identical FlexVolume probe-failure triplets resume at Sep 16 04:24:21.918440 and repeat through 04:24:22.013714; duplicates elided]
Sep 16 04:24:22.569810 containerd[1558]: time="2025-09-16T04:24:22.569758535Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:22.571892 containerd[1558]: time="2025-09-16T04:24:22.571456565Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814"
Sep 16 04:24:22.572980 containerd[1558]: time="2025-09-16T04:24:22.572940471Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:22.575216 containerd[1558]: time="2025-09-16T04:24:22.575162110Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:22.576065 containerd[1558]: time="2025-09-16T04:24:22.575949924Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.795412768s"
Sep 16 04:24:22.576065 containerd[1558]: time="2025-09-16T04:24:22.575984725Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\""
Sep 16 04:24:22.580928 containerd[1558]: time="2025-09-16T04:24:22.580893692Z" level=info msg="CreateContainer within sandbox \"f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}"
Sep 16 04:24:22.591616 containerd[1558]: time="2025-09-16T04:24:22.590071174Z" level=info msg="Container 27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:24:22.605409 containerd[1558]: time="2025-09-16T04:24:22.605306404Z" level=info msg="CreateContainer within sandbox \"f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd\""
Sep 16 04:24:22.607081 containerd[1558]: time="2025-09-16T04:24:22.607027435Z" level=info msg="StartContainer for \"27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd\""
Sep 16 04:24:22.609856 containerd[1558]: time="2025-09-16T04:24:22.609806524Z" level=info msg="connecting to shim 27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd" address="unix:///run/containerd/s/e12e1fde57a77c49dcd1ab8b5c66e5d2509e78308dbb9912b659009b55267cb5" protocol=ttrpc version=3
Sep 16 04:24:22.639819 systemd[1]: Started cri-containerd-27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd.scope - libcontainer container 27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd.
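flexvol-driver, created here from the pod2daemon-flexvol image inside the calico-node pod sandbox, is the init container that copies Calico's uds FlexVolume driver onto the host under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/ — the exact path the kubelet probe has been failing on. Consistent with that, no further driver-call.go triplets appear in this log once the container has run.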
Sep 16 04:24:22.692143 containerd[1558]: time="2025-09-16T04:24:22.692096782Z" level=info msg="StartContainer for \"27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd\" returns successfully"
Sep 16 04:24:22.707664 systemd[1]: cri-containerd-27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd.scope: Deactivated successfully.
Sep 16 04:24:22.714005 containerd[1558]: time="2025-09-16T04:24:22.713943529Z" level=info msg="TaskExit event in podsandbox handler container_id:\"27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd\" id:\"27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd\" pid:3434 exited_at:{seconds:1757996662 nanos:713357398}"
Sep 16 04:24:22.714170 containerd[1558]: time="2025-09-16T04:24:22.714052851Z" level=info msg="received exit event container_id:\"27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd\" id:\"27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd\" pid:3434 exited_at:{seconds:1757996662 nanos:713357398}"
Sep 16 04:24:22.738425 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd-rootfs.mount: Deactivated successfully.
Sep 16 04:24:22.891692 kubelet[2783]: I0916 04:24:22.891664 2783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 16 04:24:22.895078 containerd[1558]: time="2025-09-16T04:24:22.894777012Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\""
Sep 16 04:24:22.915336 kubelet[2783]: I0916 04:24:22.915194 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-cc4bf644f-69wk9" podStartSLOduration=3.345338661 podStartE2EDuration="5.915175293s" podCreationTimestamp="2025-09-16 04:24:17 +0000 UTC" firstStartedPulling="2025-09-16 04:24:18.209661825 +0000 UTC m=+24.578302847" lastFinishedPulling="2025-09-16 04:24:20.779498457 +0000 UTC m=+27.148139479" observedRunningTime="2025-09-16 04:24:21.902166753 +0000 UTC m=+28.270807775" watchObservedRunningTime="2025-09-16 04:24:22.915175293 +0000 UTC m=+29.283816315"
Sep 16 04:24:23.750242 kubelet[2783]: E0916 04:24:23.750067 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c85z6" podUID="6e749571-a39a-485f-bcfb-603d4a5b22ed"
Sep 16 04:24:25.750193 kubelet[2783]: E0916 04:24:25.749523 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-c85z6" podUID="6e749571-a39a-485f-bcfb-603d4a5b22ed"
Sep 16 04:24:26.601349 containerd[1558]: time="2025-09-16T04:24:26.601265030Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:26.602484 containerd[1558]: time="2025-09-16T04:24:26.602406649Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477"
Sep 16 04:24:26.603675 containerd[1558]: time="2025-09-16T04:24:26.603498787Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:26.608823 containerd[1558]: time="2025-09-16T04:24:26.608777914Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 16 04:24:26.609978 containerd[1558]: time="2025-09-16T04:24:26.609519886Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.714668353s"
Sep 16 04:24:26.609978 containerd[1558]: time="2025-09-16T04:24:26.609558647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\""
Sep 16 04:24:26.614998 containerd[1558]: time="2025-09-16T04:24:26.614905695Z" level=info msg="CreateContainer within sandbox \"f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Sep 16 04:24:26.626996 containerd[1558]: time="2025-09-16T04:24:26.625820075Z" level=info msg="Container 281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:24:26.631902 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3530961711.mount: Deactivated successfully.
Sep 16 04:24:26.636612 containerd[1558]: time="2025-09-16T04:24:26.636537371Z" level=info msg="CreateContainer within sandbox \"f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e\""
Sep 16 04:24:26.637478 containerd[1558]: time="2025-09-16T04:24:26.637439306Z" level=info msg="StartContainer for \"281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e\""
Sep 16 04:24:26.641394 containerd[1558]: time="2025-09-16T04:24:26.641365491Z" level=info msg="connecting to shim 281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e" address="unix:///run/containerd/s/e12e1fde57a77c49dcd1ab8b5c66e5d2509e78308dbb9912b659009b55267cb5" protocol=ttrpc version=3
Sep 16 04:24:26.666982 systemd[1]: Started cri-containerd-281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e.scope - libcontainer container 281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e.
Sep 16 04:24:26.716328 containerd[1558]: time="2025-09-16T04:24:26.716199565Z" level=info msg="StartContainer for \"281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e\" returns successfully"
Sep 16 04:24:27.272209 containerd[1558]: time="2025-09-16T04:24:27.270555954Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 16 04:24:27.273118 systemd[1]: cri-containerd-281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e.scope: Deactivated successfully.
Sep 16 04:24:27.273819 systemd[1]: cri-containerd-281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e.scope: Consumed 532ms CPU time, 188.3M memory peak, 165.8M written to disk.
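The single level=error at 04:24:27.272209 is bring-up noise rather than a failure of install-cni: the container writes /etc/cni/net.d/calico-kubeconfig first, containerd's file watcher fires on that write and attempts a reload, and the reload fails because the network config proper (conventionally a 10-calico.conflist alongside the kubeconfig) has not been written yet. Once the conflist lands, the reload succeeds and NetworkReady flips to true; the node-status update at 04:24:27.379820 below is consistent with exactly that.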
Sep 16 04:24:27.278796 containerd[1558]: time="2025-09-16T04:24:27.278762847Z" level=info msg="received exit event container_id:\"281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e\" id:\"281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e\" pid:3495 exited_at:{seconds:1757996667 nanos:276639892}"
Sep 16 04:24:27.279410 containerd[1558]: time="2025-09-16T04:24:27.279283935Z" level=info msg="TaskExit event in podsandbox handler container_id:\"281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e\" id:\"281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e\" pid:3495 exited_at:{seconds:1757996667 nanos:276639892}"
Sep 16 04:24:27.301097 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e-rootfs.mount: Deactivated successfully.
Sep 16 04:24:27.379859 kubelet[2783]: I0916 04:24:27.379820 2783 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Sep 16 04:24:27.431908 systemd[1]: Created slice kubepods-burstable-pod315abdbf_a1e5_4c09_b409_2dd38a2cfeeb.slice - libcontainer container kubepods-burstable-pod315abdbf_a1e5_4c09_b409_2dd38a2cfeeb.slice.
Sep 16 04:24:27.445303 systemd[1]: Created slice kubepods-burstable-pod038e6674_1975_4134_b752_06e86fdb41a9.slice - libcontainer container kubepods-burstable-pod038e6674_1975_4134_b752_06e86fdb41a9.slice.
Sep 16 04:24:27.461383 systemd[1]: Created slice kubepods-besteffort-pod60cfb0b5_050e_42a6_8c96_a4c5067b5655.slice - libcontainer container kubepods-besteffort-pod60cfb0b5_050e_42a6_8c96_a4c5067b5655.slice.
Sep 16 04:24:27.469738 systemd[1]: Created slice kubepods-besteffort-pod5ac97367_6d73_46ba_bb67_eceba0a1415b.slice - libcontainer container kubepods-besteffort-pod5ac97367_6d73_46ba_bb67_eceba0a1415b.slice.
Sep 16 04:24:27.487204 systemd[1]: Created slice kubepods-besteffort-pod49803f3a_c196_489e_944f_acc9e7819ab4.slice - libcontainer container kubepods-besteffort-pod49803f3a_c196_489e_944f_acc9e7819ab4.slice.
Sep 16 04:24:27.497399 systemd[1]: Created slice kubepods-besteffort-pod34d6afd2_952d_4990_a809_94238195a22f.slice - libcontainer container kubepods-besteffort-pod34d6afd2_952d_4990_a809_94238195a22f.slice.
Sep 16 04:24:27.508437 systemd[1]: Created slice kubepods-besteffort-pod5e16adeb_bdda_4f9a_bad5_38ec719a7ad7.slice - libcontainer container kubepods-besteffort-pod5e16adeb_bdda_4f9a_bad5_38ec719a7ad7.slice.
Sep 16 04:24:27.515566 systemd[1]: Created slice kubepods-besteffort-podbf6d1753_4f29_4a36_94fb_7d6f48d24a1e.slice - libcontainer container kubepods-besteffort-podbf6d1753_4f29_4a36_94fb_7d6f48d24a1e.slice.
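Each Created slice line is the kubelet's systemd cgroup driver materialising one admitted pod: the unit name encodes the QoS class and pod UID, so kubepods-burstable-pod315abdbf_a1e5_4c09_b409_2dd38a2cfeeb.slice sits under kubepods.slice/kubepods-burstable.slice in the cgroup tree, and the besteffort pods under kubepods-besteffort.slice. Eight slices appear in one burst because the node just went Ready and the pending workloads (two coredns pods, three calico-apiserver pods, goldmane, whisker, and calico-kube-controllers, all named in the volume lines below) are admitted together.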
Sep 16 04:24:27.549956 kubelet[2783]: I0916 04:24:27.548984 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-whisker-ca-bundle\") pod \"whisker-5ff7c486ff-56vzc\" (UID: \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\") " pod="calico-system/whisker-5ff7c486ff-56vzc" Sep 16 04:24:27.549956 kubelet[2783]: I0916 04:24:27.549074 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-whisker-backend-key-pair\") pod \"whisker-5ff7c486ff-56vzc\" (UID: \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\") " pod="calico-system/whisker-5ff7c486ff-56vzc" Sep 16 04:24:27.549956 kubelet[2783]: I0916 04:24:27.549143 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/60cfb0b5-050e-42a6-8c96-a4c5067b5655-calico-apiserver-certs\") pod \"calico-apiserver-6bd74b9558-fz99l\" (UID: \"60cfb0b5-050e-42a6-8c96-a4c5067b5655\") " pod="calico-apiserver/calico-apiserver-6bd74b9558-fz99l" Sep 16 04:24:27.549956 kubelet[2783]: I0916 04:24:27.549189 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zksc4\" (UniqueName: \"kubernetes.io/projected/34d6afd2-952d-4990-a809-94238195a22f-kube-api-access-zksc4\") pod \"goldmane-54d579b49d-xdvqq\" (UID: \"34d6afd2-952d-4990-a809-94238195a22f\") " pod="calico-system/goldmane-54d579b49d-xdvqq" Sep 16 04:24:27.549956 kubelet[2783]: I0916 04:24:27.549243 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e-calico-apiserver-certs\") pod \"calico-apiserver-6bd74b9558-ljjkj\" (UID: \"bf6d1753-4f29-4a36-94fb-7d6f48d24a1e\") " pod="calico-apiserver/calico-apiserver-6bd74b9558-ljjkj" Sep 16 04:24:27.550748 kubelet[2783]: I0916 04:24:27.549284 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/315abdbf-a1e5-4c09-b409-2dd38a2cfeeb-config-volume\") pod \"coredns-674b8bbfcf-4l959\" (UID: \"315abdbf-a1e5-4c09-b409-2dd38a2cfeeb\") " pod="kube-system/coredns-674b8bbfcf-4l959" Sep 16 04:24:27.550748 kubelet[2783]: I0916 04:24:27.549321 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmmhj\" (UniqueName: \"kubernetes.io/projected/315abdbf-a1e5-4c09-b409-2dd38a2cfeeb-kube-api-access-hmmhj\") pod \"coredns-674b8bbfcf-4l959\" (UID: \"315abdbf-a1e5-4c09-b409-2dd38a2cfeeb\") " pod="kube-system/coredns-674b8bbfcf-4l959" Sep 16 04:24:27.550748 kubelet[2783]: I0916 04:24:27.549359 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2dw2\" (UniqueName: \"kubernetes.io/projected/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e-kube-api-access-d2dw2\") pod \"calico-apiserver-6bd74b9558-ljjkj\" (UID: \"bf6d1753-4f29-4a36-94fb-7d6f48d24a1e\") " pod="calico-apiserver/calico-apiserver-6bd74b9558-ljjkj" Sep 16 04:24:27.550748 kubelet[2783]: I0916 04:24:27.549411 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/34d6afd2-952d-4990-a809-94238195a22f-config\") pod \"goldmane-54d579b49d-xdvqq\" (UID: \"34d6afd2-952d-4990-a809-94238195a22f\") " pod="calico-system/goldmane-54d579b49d-xdvqq" Sep 16 04:24:27.550748 kubelet[2783]: I0916 04:24:27.549449 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34d6afd2-952d-4990-a809-94238195a22f-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-xdvqq\" (UID: \"34d6afd2-952d-4990-a809-94238195a22f\") " pod="calico-system/goldmane-54d579b49d-xdvqq" Sep 16 04:24:27.552182 kubelet[2783]: I0916 04:24:27.549496 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v58jd\" (UniqueName: \"kubernetes.io/projected/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-kube-api-access-v58jd\") pod \"whisker-5ff7c486ff-56vzc\" (UID: \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\") " pod="calico-system/whisker-5ff7c486ff-56vzc" Sep 16 04:24:27.552182 kubelet[2783]: I0916 04:24:27.549534 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vznz\" (UniqueName: \"kubernetes.io/projected/49803f3a-c196-489e-944f-acc9e7819ab4-kube-api-access-2vznz\") pod \"calico-kube-controllers-5fb8d7647c-wmjlp\" (UID: \"49803f3a-c196-489e-944f-acc9e7819ab4\") " pod="calico-system/calico-kube-controllers-5fb8d7647c-wmjlp" Sep 16 04:24:27.552182 kubelet[2783]: I0916 04:24:27.549575 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/5ac97367-6d73-46ba-bb67-eceba0a1415b-calico-apiserver-certs\") pod \"calico-apiserver-6bd6d5f94d-b8px6\" (UID: \"5ac97367-6d73-46ba-bb67-eceba0a1415b\") " pod="calico-apiserver/calico-apiserver-6bd6d5f94d-b8px6" Sep 16 04:24:27.552182 kubelet[2783]: I0916 04:24:27.549657 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tkv\" (UniqueName: \"kubernetes.io/projected/60cfb0b5-050e-42a6-8c96-a4c5067b5655-kube-api-access-f7tkv\") pod \"calico-apiserver-6bd74b9558-fz99l\" (UID: \"60cfb0b5-050e-42a6-8c96-a4c5067b5655\") " pod="calico-apiserver/calico-apiserver-6bd74b9558-fz99l" Sep 16 04:24:27.552182 kubelet[2783]: I0916 04:24:27.549692 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/34d6afd2-952d-4990-a809-94238195a22f-goldmane-key-pair\") pod \"goldmane-54d579b49d-xdvqq\" (UID: \"34d6afd2-952d-4990-a809-94238195a22f\") " pod="calico-system/goldmane-54d579b49d-xdvqq" Sep 16 04:24:27.552646 kubelet[2783]: I0916 04:24:27.549736 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49803f3a-c196-489e-944f-acc9e7819ab4-tigera-ca-bundle\") pod \"calico-kube-controllers-5fb8d7647c-wmjlp\" (UID: \"49803f3a-c196-489e-944f-acc9e7819ab4\") " pod="calico-system/calico-kube-controllers-5fb8d7647c-wmjlp" Sep 16 04:24:27.552646 kubelet[2783]: I0916 04:24:27.549775 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/038e6674-1975-4134-b752-06e86fdb41a9-config-volume\") pod \"coredns-674b8bbfcf-xxw8g\" (UID: \"038e6674-1975-4134-b752-06e86fdb41a9\") " 
pod="kube-system/coredns-674b8bbfcf-xxw8g" Sep 16 04:24:27.552646 kubelet[2783]: I0916 04:24:27.549814 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qphm\" (UniqueName: \"kubernetes.io/projected/038e6674-1975-4134-b752-06e86fdb41a9-kube-api-access-6qphm\") pod \"coredns-674b8bbfcf-xxw8g\" (UID: \"038e6674-1975-4134-b752-06e86fdb41a9\") " pod="kube-system/coredns-674b8bbfcf-xxw8g" Sep 16 04:24:27.552646 kubelet[2783]: I0916 04:24:27.549850 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntmq\" (UniqueName: \"kubernetes.io/projected/5ac97367-6d73-46ba-bb67-eceba0a1415b-kube-api-access-xntmq\") pod \"calico-apiserver-6bd6d5f94d-b8px6\" (UID: \"5ac97367-6d73-46ba-bb67-eceba0a1415b\") " pod="calico-apiserver/calico-apiserver-6bd6d5f94d-b8px6" Sep 16 04:24:27.742344 containerd[1558]: time="2025-09-16T04:24:27.742202686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4l959,Uid:315abdbf-a1e5-4c09-b409-2dd38a2cfeeb,Namespace:kube-system,Attempt:0,}" Sep 16 04:24:27.757903 systemd[1]: Created slice kubepods-besteffort-pod6e749571_a39a_485f_bcfb_603d4a5b22ed.slice - libcontainer container kubepods-besteffort-pod6e749571_a39a_485f_bcfb_603d4a5b22ed.slice. Sep 16 04:24:27.759262 containerd[1558]: time="2025-09-16T04:24:27.758610872Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xxw8g,Uid:038e6674-1975-4134-b752-06e86fdb41a9,Namespace:kube-system,Attempt:0,}" Sep 16 04:24:27.767572 containerd[1558]: time="2025-09-16T04:24:27.767405575Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c85z6,Uid:6e749571-a39a-485f-bcfb-603d4a5b22ed,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:27.767773 containerd[1558]: time="2025-09-16T04:24:27.767698660Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd74b9558-fz99l,Uid:60cfb0b5-050e-42a6-8c96-a4c5067b5655,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:24:27.782525 containerd[1558]: time="2025-09-16T04:24:27.782473139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd6d5f94d-b8px6,Uid:5ac97367-6d73-46ba-bb67-eceba0a1415b,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:24:27.797638 containerd[1558]: time="2025-09-16T04:24:27.797572144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fb8d7647c-wmjlp,Uid:49803f3a-c196-489e-944f-acc9e7819ab4,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:27.807189 containerd[1558]: time="2025-09-16T04:24:27.807060458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xdvqq,Uid:34d6afd2-952d-4990-a809-94238195a22f,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:27.816622 containerd[1558]: time="2025-09-16T04:24:27.816238607Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5ff7c486ff-56vzc,Uid:5e16adeb-bdda-4f9a-bad5-38ec719a7ad7,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:27.824435 containerd[1558]: time="2025-09-16T04:24:27.824395180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd74b9558-ljjkj,Uid:bf6d1753-4f29-4a36-94fb-7d6f48d24a1e,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:24:27.955705 containerd[1558]: time="2025-09-16T04:24:27.955665709Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 16 04:24:28.004833 containerd[1558]: time="2025-09-16T04:24:28.004711184Z" level=error 
msg="Failed to destroy network for sandbox \"584946498b085b45c0f35f2fbd9aab8b5ae7c1340321f31db0de2deca1291d38\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.011448 containerd[1558]: time="2025-09-16T04:24:28.011371251Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4l959,Uid:315abdbf-a1e5-4c09-b409-2dd38a2cfeeb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"584946498b085b45c0f35f2fbd9aab8b5ae7c1340321f31db0de2deca1291d38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.011858 kubelet[2783]: E0916 04:24:28.011799 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"584946498b085b45c0f35f2fbd9aab8b5ae7c1340321f31db0de2deca1291d38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.012388 kubelet[2783]: E0916 04:24:28.012334 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"584946498b085b45c0f35f2fbd9aab8b5ae7c1340321f31db0de2deca1291d38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-4l959" Sep 16 04:24:28.012456 kubelet[2783]: E0916 04:24:28.012393 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"584946498b085b45c0f35f2fbd9aab8b5ae7c1340321f31db0de2deca1291d38\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-4l959" Sep 16 04:24:28.012483 kubelet[2783]: E0916 04:24:28.012457 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-4l959_kube-system(315abdbf-a1e5-4c09-b409-2dd38a2cfeeb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-4l959_kube-system(315abdbf-a1e5-4c09-b409-2dd38a2cfeeb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"584946498b085b45c0f35f2fbd9aab8b5ae7c1340321f31db0de2deca1291d38\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-4l959" podUID="315abdbf-a1e5-4c09-b409-2dd38a2cfeeb" Sep 16 04:24:28.051098 containerd[1558]: time="2025-09-16T04:24:28.051030405Z" level=error msg="Failed to destroy network for sandbox \"b05321c46787b15d3f7ca142dcc6b086cdb8162c4127f9849d4be13c53ffe6e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.055009 containerd[1558]: 
time="2025-09-16T04:24:28.054951067Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd6d5f94d-b8px6,Uid:5ac97367-6d73-46ba-bb67-eceba0a1415b,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b05321c46787b15d3f7ca142dcc6b086cdb8162c4127f9849d4be13c53ffe6e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.055519 kubelet[2783]: E0916 04:24:28.055488 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b05321c46787b15d3f7ca142dcc6b086cdb8162c4127f9849d4be13c53ffe6e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.056151 kubelet[2783]: E0916 04:24:28.055840 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b05321c46787b15d3f7ca142dcc6b086cdb8162c4127f9849d4be13c53ffe6e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bd6d5f94d-b8px6" Sep 16 04:24:28.057356 kubelet[2783]: E0916 04:24:28.055868 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b05321c46787b15d3f7ca142dcc6b086cdb8162c4127f9849d4be13c53ffe6e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bd6d5f94d-b8px6" Sep 16 04:24:28.057356 kubelet[2783]: E0916 04:24:28.056327 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6bd6d5f94d-b8px6_calico-apiserver(5ac97367-6d73-46ba-bb67-eceba0a1415b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6bd6d5f94d-b8px6_calico-apiserver(5ac97367-6d73-46ba-bb67-eceba0a1415b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b05321c46787b15d3f7ca142dcc6b086cdb8162c4127f9849d4be13c53ffe6e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6bd6d5f94d-b8px6" podUID="5ac97367-6d73-46ba-bb67-eceba0a1415b" Sep 16 04:24:28.076771 containerd[1558]: time="2025-09-16T04:24:28.076694255Z" level=error msg="Failed to destroy network for sandbox \"3688c2636aad9d1d691f4bf73c0badf46357cef767163bb0f629345d4d0d2034\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.080838 containerd[1558]: time="2025-09-16T04:24:28.080713879Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xxw8g,Uid:038e6674-1975-4134-b752-06e86fdb41a9,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"3688c2636aad9d1d691f4bf73c0badf46357cef767163bb0f629345d4d0d2034\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.081005 kubelet[2783]: E0916 04:24:28.080960 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3688c2636aad9d1d691f4bf73c0badf46357cef767163bb0f629345d4d0d2034\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.081118 kubelet[2783]: E0916 04:24:28.081020 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3688c2636aad9d1d691f4bf73c0badf46357cef767163bb0f629345d4d0d2034\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-xxw8g" Sep 16 04:24:28.081118 kubelet[2783]: E0916 04:24:28.081040 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3688c2636aad9d1d691f4bf73c0badf46357cef767163bb0f629345d4d0d2034\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-xxw8g" Sep 16 04:24:28.081118 kubelet[2783]: E0916 04:24:28.081099 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-xxw8g_kube-system(038e6674-1975-4134-b752-06e86fdb41a9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-xxw8g_kube-system(038e6674-1975-4134-b752-06e86fdb41a9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3688c2636aad9d1d691f4bf73c0badf46357cef767163bb0f629345d4d0d2034\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-xxw8g" podUID="038e6674-1975-4134-b752-06e86fdb41a9" Sep 16 04:24:28.084054 containerd[1558]: time="2025-09-16T04:24:28.083933050Z" level=error msg="Failed to destroy network for sandbox \"4a152f5310582d9cd7c0b623bddef90154a3fd5c5f2fb32e9e7c96a27da96302\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.088415 containerd[1558]: time="2025-09-16T04:24:28.088293600Z" level=error msg="Failed to destroy network for sandbox \"6a26e386ff591889f7fe960f6491a6920c0a39bbbea749718d28e21a47fd1cda\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.088872 containerd[1558]: time="2025-09-16T04:24:28.088696247Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c85z6,Uid:6e749571-a39a-485f-bcfb-603d4a5b22ed,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"4a152f5310582d9cd7c0b623bddef90154a3fd5c5f2fb32e9e7c96a27da96302\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.089358 kubelet[2783]: E0916 04:24:28.089324 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a152f5310582d9cd7c0b623bddef90154a3fd5c5f2fb32e9e7c96a27da96302\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.089633 kubelet[2783]: E0916 04:24:28.089479 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a152f5310582d9cd7c0b623bddef90154a3fd5c5f2fb32e9e7c96a27da96302\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-c85z6" Sep 16 04:24:28.089633 kubelet[2783]: E0916 04:24:28.089504 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4a152f5310582d9cd7c0b623bddef90154a3fd5c5f2fb32e9e7c96a27da96302\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-c85z6" Sep 16 04:24:28.089892 kubelet[2783]: E0916 04:24:28.089567 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-c85z6_calico-system(6e749571-a39a-485f-bcfb-603d4a5b22ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-c85z6_calico-system(6e749571-a39a-485f-bcfb-603d4a5b22ed)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4a152f5310582d9cd7c0b623bddef90154a3fd5c5f2fb32e9e7c96a27da96302\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-c85z6" podUID="6e749571-a39a-485f-bcfb-603d4a5b22ed" Sep 16 04:24:28.090664 containerd[1558]: time="2025-09-16T04:24:28.090478115Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fb8d7647c-wmjlp,Uid:49803f3a-c196-489e-944f-acc9e7819ab4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a26e386ff591889f7fe960f6491a6920c0a39bbbea749718d28e21a47fd1cda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.090973 kubelet[2783]: E0916 04:24:28.090937 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a26e386ff591889f7fe960f6491a6920c0a39bbbea749718d28e21a47fd1cda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 
04:24:28.091395 kubelet[2783]: E0916 04:24:28.091139 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a26e386ff591889f7fe960f6491a6920c0a39bbbea749718d28e21a47fd1cda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5fb8d7647c-wmjlp" Sep 16 04:24:28.091395 kubelet[2783]: E0916 04:24:28.091357 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a26e386ff591889f7fe960f6491a6920c0a39bbbea749718d28e21a47fd1cda\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5fb8d7647c-wmjlp" Sep 16 04:24:28.091656 kubelet[2783]: E0916 04:24:28.091522 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5fb8d7647c-wmjlp_calico-system(49803f3a-c196-489e-944f-acc9e7819ab4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5fb8d7647c-wmjlp_calico-system(49803f3a-c196-489e-944f-acc9e7819ab4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a26e386ff591889f7fe960f6491a6920c0a39bbbea749718d28e21a47fd1cda\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5fb8d7647c-wmjlp" podUID="49803f3a-c196-489e-944f-acc9e7819ab4" Sep 16 04:24:28.097459 containerd[1558]: time="2025-09-16T04:24:28.097415706Z" level=error msg="Failed to destroy network for sandbox \"96a5a1d844c3c4191231e913f6f1dcc1ca28ee48c74b3ca91b950b9c542c8478\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.099487 containerd[1558]: time="2025-09-16T04:24:28.099291016Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd74b9558-fz99l,Uid:60cfb0b5-050e-42a6-8c96-a4c5067b5655,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"96a5a1d844c3c4191231e913f6f1dcc1ca28ee48c74b3ca91b950b9c542c8478\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.101128 kubelet[2783]: E0916 04:24:28.100121 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96a5a1d844c3c4191231e913f6f1dcc1ca28ee48c74b3ca91b950b9c542c8478\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.101128 kubelet[2783]: E0916 04:24:28.100185 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"96a5a1d844c3c4191231e913f6f1dcc1ca28ee48c74b3ca91b950b9c542c8478\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bd74b9558-fz99l" Sep 16 04:24:28.101128 kubelet[2783]: E0916 04:24:28.100213 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"96a5a1d844c3c4191231e913f6f1dcc1ca28ee48c74b3ca91b950b9c542c8478\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bd74b9558-fz99l" Sep 16 04:24:28.101293 kubelet[2783]: E0916 04:24:28.100265 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6bd74b9558-fz99l_calico-apiserver(60cfb0b5-050e-42a6-8c96-a4c5067b5655)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6bd74b9558-fz99l_calico-apiserver(60cfb0b5-050e-42a6-8c96-a4c5067b5655)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"96a5a1d844c3c4191231e913f6f1dcc1ca28ee48c74b3ca91b950b9c542c8478\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6bd74b9558-fz99l" podUID="60cfb0b5-050e-42a6-8c96-a4c5067b5655" Sep 16 04:24:28.103738 containerd[1558]: time="2025-09-16T04:24:28.103698766Z" level=error msg="Failed to destroy network for sandbox \"9020cf5374d3f3c9bc78f61eb448fa3c3c156f6687c1d8a14cc4925486f8cda1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.105933 containerd[1558]: time="2025-09-16T04:24:28.105668998Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xdvqq,Uid:34d6afd2-952d-4990-a809-94238195a22f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9020cf5374d3f3c9bc78f61eb448fa3c3c156f6687c1d8a14cc4925486f8cda1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.106579 kubelet[2783]: E0916 04:24:28.106257 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9020cf5374d3f3c9bc78f61eb448fa3c3c156f6687c1d8a14cc4925486f8cda1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.106579 kubelet[2783]: E0916 04:24:28.106311 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9020cf5374d3f3c9bc78f61eb448fa3c3c156f6687c1d8a14cc4925486f8cda1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xdvqq" 
Sep 16 04:24:28.106579 kubelet[2783]: E0916 04:24:28.106332 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9020cf5374d3f3c9bc78f61eb448fa3c3c156f6687c1d8a14cc4925486f8cda1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-xdvqq" Sep 16 04:24:28.106772 kubelet[2783]: E0916 04:24:28.106385 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-xdvqq_calico-system(34d6afd2-952d-4990-a809-94238195a22f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-xdvqq_calico-system(34d6afd2-952d-4990-a809-94238195a22f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9020cf5374d3f3c9bc78f61eb448fa3c3c156f6687c1d8a14cc4925486f8cda1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-xdvqq" podUID="34d6afd2-952d-4990-a809-94238195a22f" Sep 16 04:24:28.111467 containerd[1558]: time="2025-09-16T04:24:28.111420770Z" level=error msg="Failed to destroy network for sandbox \"6c26c073f8b2551d2c2afd4958659b5b7670b0e600a8269127652b2266ee2d9c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.113803 containerd[1558]: time="2025-09-16T04:24:28.113726407Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd74b9558-ljjkj,Uid:bf6d1753-4f29-4a36-94fb-7d6f48d24a1e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c26c073f8b2551d2c2afd4958659b5b7670b0e600a8269127652b2266ee2d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.114222 kubelet[2783]: E0916 04:24:28.114157 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c26c073f8b2551d2c2afd4958659b5b7670b0e600a8269127652b2266ee2d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.114288 kubelet[2783]: E0916 04:24:28.114226 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c26c073f8b2551d2c2afd4958659b5b7670b0e600a8269127652b2266ee2d9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bd74b9558-ljjkj" Sep 16 04:24:28.114288 kubelet[2783]: E0916 04:24:28.114247 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c26c073f8b2551d2c2afd4958659b5b7670b0e600a8269127652b2266ee2d9c\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6bd74b9558-ljjkj" Sep 16 04:24:28.114346 kubelet[2783]: E0916 04:24:28.114301 2783 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6bd74b9558-ljjkj_calico-apiserver(bf6d1753-4f29-4a36-94fb-7d6f48d24a1e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6bd74b9558-ljjkj_calico-apiserver(bf6d1753-4f29-4a36-94fb-7d6f48d24a1e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c26c073f8b2551d2c2afd4958659b5b7670b0e600a8269127652b2266ee2d9c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6bd74b9558-ljjkj" podUID="bf6d1753-4f29-4a36-94fb-7d6f48d24a1e" Sep 16 04:24:28.117446 containerd[1558]: time="2025-09-16T04:24:28.117402785Z" level=error msg="Failed to destroy network for sandbox \"7a83e2e20c0f73b951e8301518c31b59f08222a1e46e4cc5070d257ac7084417\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.119108 containerd[1558]: time="2025-09-16T04:24:28.119044732Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5ff7c486ff-56vzc,Uid:5e16adeb-bdda-4f9a-bad5-38ec719a7ad7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a83e2e20c0f73b951e8301518c31b59f08222a1e46e4cc5070d257ac7084417\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.119365 kubelet[2783]: E0916 04:24:28.119327 2783 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a83e2e20c0f73b951e8301518c31b59f08222a1e46e4cc5070d257ac7084417\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 16 04:24:28.119443 kubelet[2783]: E0916 04:24:28.119384 2783 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a83e2e20c0f73b951e8301518c31b59f08222a1e46e4cc5070d257ac7084417\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5ff7c486ff-56vzc" Sep 16 04:24:28.119443 kubelet[2783]: E0916 04:24:28.119408 2783 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a83e2e20c0f73b951e8301518c31b59f08222a1e46e4cc5070d257ac7084417\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5ff7c486ff-56vzc" Sep 16 04:24:28.119532 kubelet[2783]: E0916 04:24:28.119505 2783 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"whisker-5ff7c486ff-56vzc_calico-system(5e16adeb-bdda-4f9a-bad5-38ec719a7ad7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5ff7c486ff-56vzc_calico-system(5e16adeb-bdda-4f9a-bad5-38ec719a7ad7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a83e2e20c0f73b951e8301518c31b59f08222a1e46e4cc5070d257ac7084417\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5ff7c486ff-56vzc" podUID="5e16adeb-bdda-4f9a-bad5-38ec719a7ad7" Sep 16 04:24:35.230760 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount118836783.mount: Deactivated successfully. Sep 16 04:24:35.262556 containerd[1558]: time="2025-09-16T04:24:35.262485601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 16 04:24:35.266540 containerd[1558]: time="2025-09-16T04:24:35.266403419Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 7.310363583s" Sep 16 04:24:35.266540 containerd[1558]: time="2025-09-16T04:24:35.266447619Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 16 04:24:35.283206 containerd[1558]: time="2025-09-16T04:24:35.279728854Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:35.283206 containerd[1558]: time="2025-09-16T04:24:35.282326532Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:35.283206 containerd[1558]: time="2025-09-16T04:24:35.283054903Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:35.290230 containerd[1558]: time="2025-09-16T04:24:35.290166967Z" level=info msg="CreateContainer within sandbox \"f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 16 04:24:35.312610 containerd[1558]: time="2025-09-16T04:24:35.311684842Z" level=info msg="Container 29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:35.316470 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3913017512.mount: Deactivated successfully. 
Sep 16 04:24:35.325434 containerd[1558]: time="2025-09-16T04:24:35.325343042Z" level=info msg="CreateContainer within sandbox \"f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\"" Sep 16 04:24:35.326739 containerd[1558]: time="2025-09-16T04:24:35.326672581Z" level=info msg="StartContainer for \"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\"" Sep 16 04:24:35.329435 containerd[1558]: time="2025-09-16T04:24:35.329401781Z" level=info msg="connecting to shim 29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994" address="unix:///run/containerd/s/e12e1fde57a77c49dcd1ab8b5c66e5d2509e78308dbb9912b659009b55267cb5" protocol=ttrpc version=3 Sep 16 04:24:35.376889 systemd[1]: Started cri-containerd-29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994.scope - libcontainer container 29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994. Sep 16 04:24:35.428603 containerd[1558]: time="2025-09-16T04:24:35.428535114Z" level=info msg="StartContainer for \"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" returns successfully" Sep 16 04:24:35.575052 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 16 04:24:35.575263 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Sep 16 04:24:35.915779 kubelet[2783]: I0916 04:24:35.915715 2783 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-whisker-backend-key-pair\") pod \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\" (UID: \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\") " Sep 16 04:24:35.917278 kubelet[2783]: I0916 04:24:35.915915 2783 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-whisker-ca-bundle\") pod \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\" (UID: \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\") " Sep 16 04:24:35.917278 kubelet[2783]: I0916 04:24:35.916061 2783 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v58jd\" (UniqueName: \"kubernetes.io/projected/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-kube-api-access-v58jd\") pod \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\" (UID: \"5e16adeb-bdda-4f9a-bad5-38ec719a7ad7\") " Sep 16 04:24:35.922876 kubelet[2783]: I0916 04:24:35.922827 2783 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "5e16adeb-bdda-4f9a-bad5-38ec719a7ad7" (UID: "5e16adeb-bdda-4f9a-bad5-38ec719a7ad7"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 16 04:24:35.924573 kubelet[2783]: I0916 04:24:35.924519 2783 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "5e16adeb-bdda-4f9a-bad5-38ec719a7ad7" (UID: "5e16adeb-bdda-4f9a-bad5-38ec719a7ad7"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 16 04:24:35.925384 kubelet[2783]: I0916 04:24:35.925354 2783 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-kube-api-access-v58jd" (OuterVolumeSpecName: "kube-api-access-v58jd") pod "5e16adeb-bdda-4f9a-bad5-38ec719a7ad7" (UID: "5e16adeb-bdda-4f9a-bad5-38ec719a7ad7"). InnerVolumeSpecName "kube-api-access-v58jd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 16 04:24:35.979490 systemd[1]: Removed slice kubepods-besteffort-pod5e16adeb_bdda_4f9a_bad5_38ec719a7ad7.slice - libcontainer container kubepods-besteffort-pod5e16adeb_bdda_4f9a_bad5_38ec719a7ad7.slice. Sep 16 04:24:36.015823 kubelet[2783]: I0916 04:24:36.015748 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-gdfbn" podStartSLOduration=2.041526086 podStartE2EDuration="19.015654032s" podCreationTimestamp="2025-09-16 04:24:17 +0000 UTC" firstStartedPulling="2025-09-16 04:24:18.294968832 +0000 UTC m=+24.663609814" lastFinishedPulling="2025-09-16 04:24:35.269096778 +0000 UTC m=+41.637737760" observedRunningTime="2025-09-16 04:24:35.99966408 +0000 UTC m=+42.368305102" watchObservedRunningTime="2025-09-16 04:24:36.015654032 +0000 UTC m=+42.384295054" Sep 16 04:24:36.017134 kubelet[2783]: I0916 04:24:36.017094 2783 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-whisker-ca-bundle\") on node \"ci-4459-0-0-n-21eb3e8385\" DevicePath \"\"" Sep 16 04:24:36.017261 kubelet[2783]: I0916 04:24:36.017144 2783 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v58jd\" (UniqueName: \"kubernetes.io/projected/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-kube-api-access-v58jd\") on node \"ci-4459-0-0-n-21eb3e8385\" DevicePath \"\"" Sep 16 04:24:36.017261 kubelet[2783]: I0916 04:24:36.017165 2783 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7-whisker-backend-key-pair\") on node \"ci-4459-0-0-n-21eb3e8385\" DevicePath \"\"" Sep 16 04:24:36.082311 systemd[1]: Created slice kubepods-besteffort-pod413d26b0_3b5a_451e_9f89_1507a866ee05.slice - libcontainer container kubepods-besteffort-pod413d26b0_3b5a_451e_9f89_1507a866ee05.slice. 
Sep 16 04:24:36.219277 kubelet[2783]: I0916 04:24:36.219116 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/413d26b0-3b5a-451e-9f89-1507a866ee05-whisker-backend-key-pair\") pod \"whisker-85b9fd968b-4j4vh\" (UID: \"413d26b0-3b5a-451e-9f89-1507a866ee05\") " pod="calico-system/whisker-85b9fd968b-4j4vh" Sep 16 04:24:36.220118 kubelet[2783]: I0916 04:24:36.219834 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/413d26b0-3b5a-451e-9f89-1507a866ee05-whisker-ca-bundle\") pod \"whisker-85b9fd968b-4j4vh\" (UID: \"413d26b0-3b5a-451e-9f89-1507a866ee05\") " pod="calico-system/whisker-85b9fd968b-4j4vh" Sep 16 04:24:36.220118 kubelet[2783]: I0916 04:24:36.219887 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psqbc\" (UniqueName: \"kubernetes.io/projected/413d26b0-3b5a-451e-9f89-1507a866ee05-kube-api-access-psqbc\") pod \"whisker-85b9fd968b-4j4vh\" (UID: \"413d26b0-3b5a-451e-9f89-1507a866ee05\") " pod="calico-system/whisker-85b9fd968b-4j4vh" Sep 16 04:24:36.232314 systemd[1]: var-lib-kubelet-pods-5e16adeb\x2dbdda\x2d4f9a\x2dbad5\x2d38ec719a7ad7-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dv58jd.mount: Deactivated successfully. Sep 16 04:24:36.232417 systemd[1]: var-lib-kubelet-pods-5e16adeb\x2dbdda\x2d4f9a\x2dbad5\x2d38ec719a7ad7-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 16 04:24:36.387778 containerd[1558]: time="2025-09-16T04:24:36.387712228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85b9fd968b-4j4vh,Uid:413d26b0-3b5a-451e-9f89-1507a866ee05,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:36.582681 systemd-networkd[1418]: cali9c69bd2c5f9: Link UP Sep 16 04:24:36.583414 systemd-networkd[1418]: cali9c69bd2c5f9: Gained carrier Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.415 [INFO][3851] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.462 [INFO][3851] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0 whisker-85b9fd968b- calico-system 413d26b0-3b5a-451e-9f89-1507a866ee05 905 0 2025-09-16 04:24:36 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:85b9fd968b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 whisker-85b9fd968b-4j4vh eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9c69bd2c5f9 [] [] }} ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Namespace="calico-system" Pod="whisker-85b9fd968b-4j4vh" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.462 [INFO][3851] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Namespace="calico-system" Pod="whisker-85b9fd968b-4j4vh" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.513 [INFO][3864] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" HandleID="k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Workload="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.513 [INFO][3864] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" HandleID="k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Workload="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b100), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"whisker-85b9fd968b-4j4vh", "timestamp":"2025-09-16 04:24:36.513011685 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.513 [INFO][3864] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.513 [INFO][3864] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.513 [INFO][3864] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.527 [INFO][3864] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.536 [INFO][3864] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.544 [INFO][3864] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.547 [INFO][3864] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.550 [INFO][3864] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.550 [INFO][3864] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.552 [INFO][3864] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516 Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.557 [INFO][3864] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.567 [INFO][3864] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.1/26] block=192.168.66.0/26 
handle="k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.567 [INFO][3864] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.1/26] handle="k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:36.605428 containerd[1558]: 2025-09-16 04:24:36.567 [INFO][3864] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:24:36.605995 containerd[1558]: 2025-09-16 04:24:36.567 [INFO][3864] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.1/26] IPv6=[] ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" HandleID="k8s-pod-network.4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Workload="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" Sep 16 04:24:36.605995 containerd[1558]: 2025-09-16 04:24:36.572 [INFO][3851] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Namespace="calico-system" Pod="whisker-85b9fd968b-4j4vh" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0", GenerateName:"whisker-85b9fd968b-", Namespace:"calico-system", SelfLink:"", UID:"413d26b0-3b5a-451e-9f89-1507a866ee05", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85b9fd968b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"whisker-85b9fd968b-4j4vh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.66.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9c69bd2c5f9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:36.605995 containerd[1558]: 2025-09-16 04:24:36.572 [INFO][3851] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.1/32] ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Namespace="calico-system" Pod="whisker-85b9fd968b-4j4vh" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" Sep 16 04:24:36.605995 containerd[1558]: 2025-09-16 04:24:36.572 [INFO][3851] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c69bd2c5f9 ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Namespace="calico-system" Pod="whisker-85b9fd968b-4j4vh" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" Sep 16 04:24:36.605995 containerd[1558]: 2025-09-16 04:24:36.583 [INFO][3851] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Namespace="calico-system" Pod="whisker-85b9fd968b-4j4vh" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" Sep 16 04:24:36.605995 containerd[1558]: 2025-09-16 04:24:36.584 [INFO][3851] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Namespace="calico-system" Pod="whisker-85b9fd968b-4j4vh" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0", GenerateName:"whisker-85b9fd968b-", Namespace:"calico-system", SelfLink:"", UID:"413d26b0-3b5a-451e-9f89-1507a866ee05", ResourceVersion:"905", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"85b9fd968b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516", Pod:"whisker-85b9fd968b-4j4vh", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.66.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9c69bd2c5f9", MAC:"ce:a0:42:79:70:bb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:36.606224 containerd[1558]: 2025-09-16 04:24:36.602 [INFO][3851] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" Namespace="calico-system" Pod="whisker-85b9fd968b-4j4vh" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-whisker--85b9fd968b--4j4vh-eth0" Sep 16 04:24:36.648221 containerd[1558]: time="2025-09-16T04:24:36.647823880Z" level=info msg="connecting to shim 4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516" address="unix:///run/containerd/s/5c44943291f57d488e9f6fb29614ef2edddcce3f2d8f831710d32ed5c7a0b672" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:36.678041 systemd[1]: Started cri-containerd-4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516.scope - libcontainer container 4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516. 
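[Editor's note] The IPAM records above show Calico confirming the host's affinity for the block 192.168.66.0/26, claiming 192.168.66.1 from it, and the CNI plugin then writing the whisker endpoint with IPNetworks ["192.168.66.1/32"]. A minimal Go sketch (standard library only; the block and address are copied from the log) verifying the membership and the block's capacity:

    package main

    import (
        "fmt"
        "net"
    )

    func main() {
        // Block and address as reported by ipam/ipam.go in the records above.
        _, block, err := net.ParseCIDR("192.168.66.0/26")
        if err != nil {
            panic(err)
        }
        ip := net.ParseIP("192.168.66.1")

        ones, bits := block.Mask.Size() // 26, 32
        fmt.Printf("%v contains %v: %v\n", block, ip, block.Contains(ip))
        fmt.Printf("block capacity: %d addresses\n", 1<<(bits-ones)) // 64
    }

A /26 block holds 64 addresses, which is why all five pods scheduled below can draw from the same host-affine block.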
Sep 16 04:24:36.722885 containerd[1558]: time="2025-09-16T04:24:36.722813768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-85b9fd968b-4j4vh,Uid:413d26b0-3b5a-451e-9f89-1507a866ee05,Namespace:calico-system,Attempt:0,} returns sandbox id \"4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516\"" Sep 16 04:24:36.726604 containerd[1558]: time="2025-09-16T04:24:36.726519981Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 16 04:24:36.977078 kubelet[2783]: I0916 04:24:36.977000 2783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:24:37.709733 systemd-networkd[1418]: cali9c69bd2c5f9: Gained IPv6LL Sep 16 04:24:37.754049 kubelet[2783]: I0916 04:24:37.753970 2783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e16adeb-bdda-4f9a-bad5-38ec719a7ad7" path="/var/lib/kubelet/pods/5e16adeb-bdda-4f9a-bad5-38ec719a7ad7/volumes" Sep 16 04:24:38.502423 kubelet[2783]: I0916 04:24:38.502371 2783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:24:38.616694 containerd[1558]: time="2025-09-16T04:24:38.616597813Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:38.618798 containerd[1558]: time="2025-09-16T04:24:38.618758204Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 16 04:24:38.620611 containerd[1558]: time="2025-09-16T04:24:38.620003621Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:38.623171 containerd[1558]: time="2025-09-16T04:24:38.623135586Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:38.624598 containerd[1558]: time="2025-09-16T04:24:38.624553166Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.897979784s" Sep 16 04:24:38.625102 containerd[1558]: time="2025-09-16T04:24:38.624708848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 16 04:24:38.630978 containerd[1558]: time="2025-09-16T04:24:38.630839976Z" level=info msg="CreateContainer within sandbox \"4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 16 04:24:38.643320 containerd[1558]: time="2025-09-16T04:24:38.643195592Z" level=info msg="Container 451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:38.660659 containerd[1558]: time="2025-09-16T04:24:38.660571999Z" level=info msg="CreateContainer within sandbox \"4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845\"" Sep 16 
04:24:38.662928 containerd[1558]: time="2025-09-16T04:24:38.661444491Z" level=info msg="StartContainer for \"451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845\"" Sep 16 04:24:38.662928 containerd[1558]: time="2025-09-16T04:24:38.662469466Z" level=info msg="connecting to shim 451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845" address="unix:///run/containerd/s/5c44943291f57d488e9f6fb29614ef2edddcce3f2d8f831710d32ed5c7a0b672" protocol=ttrpc version=3 Sep 16 04:24:38.700202 systemd[1]: Started cri-containerd-451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845.scope - libcontainer container 451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845. Sep 16 04:24:38.751438 containerd[1558]: time="2025-09-16T04:24:38.750746683Z" level=info msg="StartContainer for \"451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845\" returns successfully" Sep 16 04:24:38.753969 containerd[1558]: time="2025-09-16T04:24:38.753788046Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 16 04:24:39.752402 containerd[1558]: time="2025-09-16T04:24:39.752354493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd6d5f94d-b8px6,Uid:5ac97367-6d73-46ba-bb67-eceba0a1415b,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:24:39.753603 containerd[1558]: time="2025-09-16T04:24:39.753555670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xxw8g,Uid:038e6674-1975-4134-b752-06e86fdb41a9,Namespace:kube-system,Attempt:0,}" Sep 16 04:24:39.753843 containerd[1558]: time="2025-09-16T04:24:39.753817354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xdvqq,Uid:34d6afd2-952d-4990-a809-94238195a22f,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:39.754086 containerd[1558]: time="2025-09-16T04:24:39.754047237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4l959,Uid:315abdbf-a1e5-4c09-b409-2dd38a2cfeeb,Namespace:kube-system,Attempt:0,}" Sep 16 04:24:40.004093 systemd-networkd[1418]: vxlan.calico: Link UP Sep 16 04:24:40.004107 systemd-networkd[1418]: vxlan.calico: Gained carrier Sep 16 04:24:40.150131 systemd-networkd[1418]: calica2478979a6: Link UP Sep 16 04:24:40.151447 systemd-networkd[1418]: calica2478979a6: Gained carrier Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:39.907 [INFO][4154] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0 coredns-674b8bbfcf- kube-system 315abdbf-a1e5-4c09-b409-2dd38a2cfeeb 833 0 2025-09-16 04:24:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 coredns-674b8bbfcf-4l959 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calica2478979a6 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Namespace="kube-system" Pod="coredns-674b8bbfcf-4l959" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:39.909 [INFO][4154] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Namespace="kube-system" Pod="coredns-674b8bbfcf-4l959" 
WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.058 [INFO][4197] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" HandleID="k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Workload="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.060 [INFO][4197] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" HandleID="k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Workload="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000328d40), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"coredns-674b8bbfcf-4l959", "timestamp":"2025-09-16 04:24:40.058853174 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.060 [INFO][4197] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.060 [INFO][4197] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.060 [INFO][4197] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.092 [INFO][4197] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.108 [INFO][4197] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.117 [INFO][4197] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.120 [INFO][4197] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.123 [INFO][4197] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.124 [INFO][4197] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.126 [INFO][4197] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.131 [INFO][4197] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 
2025-09-16 04:24:40.139 [INFO][4197] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.2/26] block=192.168.66.0/26 handle="k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.139 [INFO][4197] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.2/26] handle="k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.139 [INFO][4197] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:24:40.177758 containerd[1558]: 2025-09-16 04:24:40.139 [INFO][4197] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.2/26] IPv6=[] ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" HandleID="k8s-pod-network.a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Workload="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" Sep 16 04:24:40.178706 containerd[1558]: 2025-09-16 04:24:40.144 [INFO][4154] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Namespace="kube-system" Pod="coredns-674b8bbfcf-4l959" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"315abdbf-a1e5-4c09-b409-2dd38a2cfeeb", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"coredns-674b8bbfcf-4l959", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.66.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calica2478979a6", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:40.178706 containerd[1558]: 2025-09-16 04:24:40.144 [INFO][4154] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.2/32] ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Namespace="kube-system" Pod="coredns-674b8bbfcf-4l959" 
WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" Sep 16 04:24:40.178706 containerd[1558]: 2025-09-16 04:24:40.144 [INFO][4154] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica2478979a6 ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Namespace="kube-system" Pod="coredns-674b8bbfcf-4l959" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" Sep 16 04:24:40.178706 containerd[1558]: 2025-09-16 04:24:40.150 [INFO][4154] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Namespace="kube-system" Pod="coredns-674b8bbfcf-4l959" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" Sep 16 04:24:40.178847 containerd[1558]: 2025-09-16 04:24:40.158 [INFO][4154] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Namespace="kube-system" Pod="coredns-674b8bbfcf-4l959" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"315abdbf-a1e5-4c09-b409-2dd38a2cfeeb", ResourceVersion:"833", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa", Pod:"coredns-674b8bbfcf-4l959", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.66.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calica2478979a6", MAC:"da:d3:c9:ac:e5:a5", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:40.178847 containerd[1558]: 2025-09-16 04:24:40.172 [INFO][4154] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" Namespace="kube-system" Pod="coredns-674b8bbfcf-4l959" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--4l959-eth0" Sep 16 04:24:40.218615 containerd[1558]: time="2025-09-16T04:24:40.218539170Z" level=info msg="connecting to 
shim a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa" address="unix:///run/containerd/s/449e85db8c6f5c7de26e2094ec033e4239f95528b8a7c92d91dece710b3ccd43" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:40.263085 systemd[1]: Started cri-containerd-a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa.scope - libcontainer container a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa. Sep 16 04:24:40.287020 systemd-networkd[1418]: cali962d594e8be: Link UP Sep 16 04:24:40.290783 systemd-networkd[1418]: cali962d594e8be: Gained carrier Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:39.919 [INFO][4140] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0 goldmane-54d579b49d- calico-system 34d6afd2-952d-4990-a809-94238195a22f 837 0 2025-09-16 04:24:17 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 goldmane-54d579b49d-xdvqq eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali962d594e8be [] [] }} ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Namespace="calico-system" Pod="goldmane-54d579b49d-xdvqq" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:39.920 [INFO][4140] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Namespace="calico-system" Pod="goldmane-54d579b49d-xdvqq" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.077 [INFO][4204] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" HandleID="k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Workload="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.077 [INFO][4204] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" HandleID="k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Workload="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"goldmane-54d579b49d-xdvqq", "timestamp":"2025-09-16 04:24:40.07711475 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.077 [INFO][4204] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.139 [INFO][4204] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
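[Editor's note] The whisker pull above reports bytes read=4605606, a "Pulled image" size of 5974839 bytes, and a duration of 1.897979784s. A quick Go sketch of the implied rates; the arithmetic uses only the numbers quoted in the log, and treating "bytes read" as the on-the-wire (compressed) transfer size is an assumption:

    package main

    import "fmt"

    func main() {
        const bytesRead = 4605606   // "active requests=0, bytes read=4605606"
        const unpacked = 5974839    // size from the "Pulled image" message
        const seconds = 1.897979784 // duration reported by containerd

        fmt.Printf("transfer: %.2f MiB/s\n", bytesRead/seconds/(1<<20))
        fmt.Printf("unpacked: %.2f MiB/s\n", unpacked/seconds/(1<<20))
    }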
Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.139 [INFO][4204] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.199 [INFO][4204] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.212 [INFO][4204] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.232 [INFO][4204] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.235 [INFO][4204] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.240 [INFO][4204] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.240 [INFO][4204] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.247 [INFO][4204] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66 Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.256 [INFO][4204] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.271 [INFO][4204] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.3/26] block=192.168.66.0/26 handle="k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.271 [INFO][4204] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.3/26] handle="k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.272 [INFO][4204] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
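[Editor's note] Four CNI ADDs run concurrently here ([4197], [4202], [4204], [4212]), yet each logs "About to acquire host-wide IPAM lock" / "Acquired" / "Released", and the resulting addresses come out strictly sequentially: .2, .3, .4, .5 after whisker's .1. A toy Go model of that serialization, assuming only what the messages show (a single per-host mutex around allocation); this is an illustrative sketch, not Calico's actual implementation:

    package main

    import (
        "fmt"
        "sync"
    )

    // hostIPAM models a per-host allocator guarded by one lock, which is
    // why concurrent pod creations receive consecutive addresses.
    type hostIPAM struct {
        mu   sync.Mutex
        next int // next host index inside 192.168.66.0/26
    }

    func (h *hostIPAM) autoAssign(pod string) string {
        h.mu.Lock() // "Acquired host-wide IPAM lock."
        defer h.mu.Unlock()
        h.next++
        return fmt.Sprintf("192.168.66.%d/26 -> %s", h.next, pod)
    }

    func main() {
        ipam := &hostIPAM{}
        var wg sync.WaitGroup
        for _, pod := range []string{"coredns-4l959", "goldmane", "coredns-xxw8g", "apiserver"} {
            wg.Add(1)
            go func(p string) {
                defer wg.Done()
                fmt.Println(ipam.autoAssign(p))
            }(pod)
        }
        wg.Wait()
    }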
Sep 16 04:24:40.327701 containerd[1558]: 2025-09-16 04:24:40.272 [INFO][4204] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.3/26] IPv6=[] ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" HandleID="k8s-pod-network.c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Workload="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" Sep 16 04:24:40.328370 containerd[1558]: 2025-09-16 04:24:40.279 [INFO][4140] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Namespace="calico-system" Pod="goldmane-54d579b49d-xdvqq" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"34d6afd2-952d-4990-a809-94238195a22f", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"goldmane-54d579b49d-xdvqq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.66.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali962d594e8be", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:40.328370 containerd[1558]: 2025-09-16 04:24:40.279 [INFO][4140] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.3/32] ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Namespace="calico-system" Pod="goldmane-54d579b49d-xdvqq" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" Sep 16 04:24:40.328370 containerd[1558]: 2025-09-16 04:24:40.279 [INFO][4140] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali962d594e8be ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Namespace="calico-system" Pod="goldmane-54d579b49d-xdvqq" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" Sep 16 04:24:40.328370 containerd[1558]: 2025-09-16 04:24:40.294 [INFO][4140] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Namespace="calico-system" Pod="goldmane-54d579b49d-xdvqq" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" Sep 16 04:24:40.328370 containerd[1558]: 2025-09-16 04:24:40.295 [INFO][4140] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" 
Namespace="calico-system" Pod="goldmane-54d579b49d-xdvqq" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"34d6afd2-952d-4990-a809-94238195a22f", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66", Pod:"goldmane-54d579b49d-xdvqq", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.66.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali962d594e8be", MAC:"f2:1a:7f:db:95:ac", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:40.328370 containerd[1558]: 2025-09-16 04:24:40.324 [INFO][4140] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" Namespace="calico-system" Pod="goldmane-54d579b49d-xdvqq" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-goldmane--54d579b49d--xdvqq-eth0" Sep 16 04:24:40.367398 containerd[1558]: time="2025-09-16T04:24:40.367340854Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-4l959,Uid:315abdbf-a1e5-4c09-b409-2dd38a2cfeeb,Namespace:kube-system,Attempt:0,} returns sandbox id \"a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa\"" Sep 16 04:24:40.378556 containerd[1558]: time="2025-09-16T04:24:40.378422969Z" level=info msg="CreateContainer within sandbox \"a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 04:24:40.381252 containerd[1558]: time="2025-09-16T04:24:40.381012446Z" level=info msg="connecting to shim c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66" address="unix:///run/containerd/s/5bab783ec2646ba661d1ad631248d5fa08b92f9854f0222082dd6e505a1a54d6" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:40.412638 containerd[1558]: time="2025-09-16T04:24:40.411254029Z" level=info msg="Container 6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:40.427907 containerd[1558]: time="2025-09-16T04:24:40.427864062Z" level=info msg="CreateContainer within sandbox \"a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3\"" Sep 16 04:24:40.432448 containerd[1558]: time="2025-09-16T04:24:40.432411406Z" 
level=info msg="StartContainer for \"6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3\"" Sep 16 04:24:40.434906 systemd-networkd[1418]: caliac25f8b6413: Link UP Sep 16 04:24:40.438253 containerd[1558]: time="2025-09-16T04:24:40.438217087Z" level=info msg="connecting to shim 6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3" address="unix:///run/containerd/s/449e85db8c6f5c7de26e2094ec033e4239f95528b8a7c92d91dece710b3ccd43" protocol=ttrpc version=3 Sep 16 04:24:40.438718 systemd-networkd[1418]: caliac25f8b6413: Gained carrier Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:39.924 [INFO][4148] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0 coredns-674b8bbfcf- kube-system 038e6674-1975-4134-b752-06e86fdb41a9 838 0 2025-09-16 04:24:00 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 coredns-674b8bbfcf-xxw8g eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] caliac25f8b6413 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Namespace="kube-system" Pod="coredns-674b8bbfcf-xxw8g" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:39.927 [INFO][4148] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Namespace="kube-system" Pod="coredns-674b8bbfcf-xxw8g" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.108 [INFO][4212] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" HandleID="k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Workload="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.108 [INFO][4212] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" HandleID="k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Workload="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400031ef20), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"coredns-674b8bbfcf-xxw8g", "timestamp":"2025-09-16 04:24:40.107751939 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.108 [INFO][4212] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.272 [INFO][4212] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.272 [INFO][4212] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.305 [INFO][4212] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.321 [INFO][4212] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.338 [INFO][4212] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.347 [INFO][4212] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.358 [INFO][4212] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.358 [INFO][4212] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.363 [INFO][4212] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903 Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.384 [INFO][4212] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.401 [INFO][4212] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.4/26] block=192.168.66.0/26 handle="k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.401 [INFO][4212] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.4/26] handle="k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.401 [INFO][4212] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:24:40.486677 containerd[1558]: 2025-09-16 04:24:40.401 [INFO][4212] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.4/26] IPv6=[] ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" HandleID="k8s-pod-network.0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Workload="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" Sep 16 04:24:40.487379 containerd[1558]: 2025-09-16 04:24:40.412 [INFO][4148] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Namespace="kube-system" Pod="coredns-674b8bbfcf-xxw8g" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"038e6674-1975-4134-b752-06e86fdb41a9", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"coredns-674b8bbfcf-xxw8g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.66.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliac25f8b6413", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:40.487379 containerd[1558]: 2025-09-16 04:24:40.413 [INFO][4148] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.4/32] ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Namespace="kube-system" Pod="coredns-674b8bbfcf-xxw8g" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" Sep 16 04:24:40.487379 containerd[1558]: 2025-09-16 04:24:40.414 [INFO][4148] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliac25f8b6413 ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Namespace="kube-system" Pod="coredns-674b8bbfcf-xxw8g" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" Sep 16 04:24:40.487379 containerd[1558]: 2025-09-16 04:24:40.444 [INFO][4148] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-xxw8g" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" Sep 16 04:24:40.487500 containerd[1558]: 2025-09-16 04:24:40.449 [INFO][4148] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Namespace="kube-system" Pod="coredns-674b8bbfcf-xxw8g" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"038e6674-1975-4134-b752-06e86fdb41a9", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903", Pod:"coredns-674b8bbfcf-xxw8g", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.66.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"caliac25f8b6413", MAC:"fa:de:a6:22:1e:dc", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:40.487500 containerd[1558]: 2025-09-16 04:24:40.475 [INFO][4148] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" Namespace="kube-system" Pod="coredns-674b8bbfcf-xxw8g" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-coredns--674b8bbfcf--xxw8g-eth0" Sep 16 04:24:40.494798 systemd[1]: Started cri-containerd-6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3.scope - libcontainer container 6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3. Sep 16 04:24:40.517296 systemd[1]: Started cri-containerd-c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66.scope - libcontainer container c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66. 
Sep 16 04:24:40.554363 systemd-networkd[1418]: calic544165cc29: Link UP Sep 16 04:24:40.562304 systemd-networkd[1418]: calic544165cc29: Gained carrier Sep 16 04:24:40.572746 containerd[1558]: time="2025-09-16T04:24:40.572665770Z" level=info msg="connecting to shim 0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903" address="unix:///run/containerd/s/65c49ce8e85ad8269e9956ad01e86a843b797b5bde5d85860c0162b4967eb6d2" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:39.918 [INFO][4135] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0 calico-apiserver-6bd6d5f94d- calico-apiserver 5ac97367-6d73-46ba-bb67-eceba0a1415b 836 0 2025-09-16 04:24:14 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6bd6d5f94d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 calico-apiserver-6bd6d5f94d-b8px6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic544165cc29 [] [] }} ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-b8px6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:39.920 [INFO][4135] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-b8px6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.110 [INFO][4202] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" HandleID="k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.111 [INFO][4202] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" HandleID="k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000333a50), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"calico-apiserver-6bd6d5f94d-b8px6", "timestamp":"2025-09-16 04:24:40.110819902 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.111 [INFO][4202] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.401 [INFO][4202] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.402 [INFO][4202] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.442 [INFO][4202] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.458 [INFO][4202] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.468 [INFO][4202] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.477 [INFO][4202] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.485 [INFO][4202] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.486 [INFO][4202] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.494 [INFO][4202] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43 Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.514 [INFO][4202] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.532 [INFO][4202] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.5/26] block=192.168.66.0/26 handle="k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.533 [INFO][4202] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.5/26] handle="k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:40.599693 containerd[1558]: 2025-09-16 04:24:40.533 [INFO][4202] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
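[Editor's note] The assignment records share a rigid shape ("ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[...] ... ContainerID=..."), which makes them easy to mine from journal output like this. A sketch of that extraction; the regular expression is mine, written against the lines above, not a format guaranteed by Calico:

    package main

    import (
        "fmt"
        "regexp"
    )

    var assigned = regexp.MustCompile(
        `ipam/ipam_plugin\.go 283: Calico CNI IPAM assigned addresses IPv4=\[([^\]]*)\].*?ContainerID="([0-9a-f]+)"`)

    func main() {
        // One record copied verbatim from the journal below.
        journal := `ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.5/26] IPv6=[] ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43"`
        for _, m := range assigned.FindAllStringSubmatch(journal, -1) {
            fmt.Printf("%s -> %s\n", m[1], m[2][:12]) // 192.168.66.5/26 -> bd0ee2805722
        }
    }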
Sep 16 04:24:40.600297 containerd[1558]: 2025-09-16 04:24:40.533 [INFO][4202] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.5/26] IPv6=[] ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" HandleID="k8s-pod-network.bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" Sep 16 04:24:40.600297 containerd[1558]: 2025-09-16 04:24:40.541 [INFO][4135] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-b8px6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0", GenerateName:"calico-apiserver-6bd6d5f94d-", Namespace:"calico-apiserver", SelfLink:"", UID:"5ac97367-6d73-46ba-bb67-eceba0a1415b", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bd6d5f94d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"calico-apiserver-6bd6d5f94d-b8px6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.66.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic544165cc29", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:40.600297 containerd[1558]: 2025-09-16 04:24:40.542 [INFO][4135] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.5/32] ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-b8px6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" Sep 16 04:24:40.600297 containerd[1558]: 2025-09-16 04:24:40.542 [INFO][4135] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic544165cc29 ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-b8px6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" Sep 16 04:24:40.600297 containerd[1558]: 2025-09-16 04:24:40.564 [INFO][4135] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-b8px6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" Sep 16 04:24:40.600434 containerd[1558]: 2025-09-16 04:24:40.567 
[INFO][4135] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-b8px6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0", GenerateName:"calico-apiserver-6bd6d5f94d-", Namespace:"calico-apiserver", SelfLink:"", UID:"5ac97367-6d73-46ba-bb67-eceba0a1415b", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bd6d5f94d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43", Pod:"calico-apiserver-6bd6d5f94d-b8px6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.66.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic544165cc29", MAC:"6e:c2:7f:57:48:bb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:40.600434 containerd[1558]: 2025-09-16 04:24:40.590 [INFO][4135] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-b8px6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--b8px6-eth0" Sep 16 04:24:40.639893 containerd[1558]: time="2025-09-16T04:24:40.639506026Z" level=info msg="StartContainer for \"6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3\" returns successfully" Sep 16 04:24:40.663529 containerd[1558]: time="2025-09-16T04:24:40.663283919Z" level=info msg="connecting to shim bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43" address="unix:///run/containerd/s/25c0b5e5fd408a0f0407614e170554fabc77228858f9529e9207c2e2446b5994" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:40.680277 systemd[1]: Started cri-containerd-0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903.scope - libcontainer container 0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903. Sep 16 04:24:40.707496 systemd[1]: Started cri-containerd-bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43.scope - libcontainer container bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43. 
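[Editor's note] Note the shim addresses in the "connecting to shim" records: each container dial reuses its pod sandbox's socket, e.g. whisker container 451152... and sandbox 4417d7... both use unix:///run/containerd/s/5c4494..., and coredns container 6b339d... shares 449e85... with sandbox a50447.... A small sketch grouping those events by address to recover the sandbox/container pairing (IDs and paths deliberately truncated here for brevity; the full values appear in the log):

    package main

    import "fmt"

    func main() {
        // shim socket -> IDs that dialed it, in log order, taken from
        // the msg="connecting to shim ..." records above.
        byAddr := map[string][]string{}
        for _, e := range []struct{ id, addr string }{
            {"4417d78058fd", "/run/containerd/s/5c4494..."}, // whisker sandbox
            {"451152038e74", "/run/containerd/s/5c4494..."}, // whisker container
            {"a50447aed513", "/run/containerd/s/449e85..."}, // coredns-4l959 sandbox
            {"6b339db8a545", "/run/containerd/s/449e85..."}, // coredns container
        } {
            byAddr[e.addr] = append(byAddr[e.addr], e.id)
        }
        for addr, ids := range byAddr {
            fmt.Println(addr, ids)
        }
    }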
Sep 16 04:24:40.751563 containerd[1558]: time="2025-09-16T04:24:40.751486954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c85z6,Uid:6e749571-a39a-485f-bcfb-603d4a5b22ed,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:40.751859 containerd[1558]: time="2025-09-16T04:24:40.751699677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd74b9558-fz99l,Uid:60cfb0b5-050e-42a6-8c96-a4c5067b5655,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:24:40.768406 containerd[1558]: time="2025-09-16T04:24:40.765613752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-xdvqq,Uid:34d6afd2-952d-4990-a809-94238195a22f,Namespace:calico-system,Attempt:0,} returns sandbox id \"c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66\"" Sep 16 04:24:40.812186 containerd[1558]: time="2025-09-16T04:24:40.812038602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-xxw8g,Uid:038e6674-1975-4134-b752-06e86fdb41a9,Namespace:kube-system,Attempt:0,} returns sandbox id \"0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903\"" Sep 16 04:24:40.830614 containerd[1558]: time="2025-09-16T04:24:40.830458300Z" level=info msg="CreateContainer within sandbox \"0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 16 04:24:40.868948 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount366431606.mount: Deactivated successfully. Sep 16 04:24:40.877000 containerd[1558]: time="2025-09-16T04:24:40.876886030Z" level=info msg="Container 1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:40.901238 containerd[1558]: time="2025-09-16T04:24:40.901182679Z" level=info msg="CreateContainer within sandbox \"0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87\"" Sep 16 04:24:40.903375 containerd[1558]: time="2025-09-16T04:24:40.903334704Z" level=info msg="StartContainer for \"1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87\"" Sep 16 04:24:40.904367 containerd[1558]: time="2025-09-16T04:24:40.904315155Z" level=info msg="connecting to shim 1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87" address="unix:///run/containerd/s/65c49ce8e85ad8269e9956ad01e86a843b797b5bde5d85860c0162b4967eb6d2" protocol=ttrpc version=3 Sep 16 04:24:40.977021 systemd[1]: Started cri-containerd-1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87.scope - libcontainer container 1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87. 
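Each "Started cri-containerd-<id>.scope" line above is systemd creating a transient scope unit named after the container's 64-hex-digit ID, so the container's processes land in their own cgroup. A small sketch of just the naming convention visible in this log (the surrounding kubelet/runtime plumbing is out of scope):

```go
package main

import (
	"fmt"
	"regexp"
)

// containerd container IDs, as seen throughout this log, are 64 lowercase hex digits.
var idRe = regexp.MustCompile(`^[0-9a-f]{64}$`)

// scopeUnit builds the transient unit name systemd reports when the CRI
// runtime delegates a container's cgroup to a .scope unit.
func scopeUnit(containerID string) (string, error) {
	if !idRe.MatchString(containerID) {
		return "", fmt.Errorf("unexpected container ID: %q", containerID)
	}
	return "cri-containerd-" + containerID + ".scope", nil
}

func main() {
	// The coredns container started above.
	unit, err := scopeUnit("1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87")
	if err != nil {
		panic(err)
	}
	fmt.Println(unit)
}
```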
Sep 16 04:24:41.109482 containerd[1558]: time="2025-09-16T04:24:41.108732206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd6d5f94d-b8px6,Uid:5ac97367-6d73-46ba-bb67-eceba0a1415b,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43\"" Sep 16 04:24:41.156340 kubelet[2783]: I0916 04:24:41.156268 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-4l959" podStartSLOduration=41.155210813 podStartE2EDuration="41.155210813s" podCreationTimestamp="2025-09-16 04:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:24:41.08929165 +0000 UTC m=+47.457932792" watchObservedRunningTime="2025-09-16 04:24:41.155210813 +0000 UTC m=+47.523851835" Sep 16 04:24:41.163954 systemd-networkd[1418]: vxlan.calico: Gained IPv6LL Sep 16 04:24:41.189353 containerd[1558]: time="2025-09-16T04:24:41.188791288Z" level=info msg="StartContainer for \"1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87\" returns successfully" Sep 16 04:24:41.267015 systemd-networkd[1418]: calicfc764fa7eb: Link UP Sep 16 04:24:41.267185 systemd-networkd[1418]: calicfc764fa7eb: Gained carrier Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:40.939 [INFO][4509] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0 csi-node-driver- calico-system 6e749571-a39a-485f-bcfb-603d4a5b22ed 715 0 2025-09-16 04:24:17 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 csi-node-driver-c85z6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calicfc764fa7eb [] [] }} ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Namespace="calico-system" Pod="csi-node-driver-c85z6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:40.940 [INFO][4509] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Namespace="calico-system" Pod="csi-node-driver-c85z6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.069 [INFO][4560] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" HandleID="k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Workload="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.069 [INFO][4560] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" HandleID="k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Workload="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000330830), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"csi-node-driver-c85z6", "timestamp":"2025-09-16 04:24:41.069149381 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.069 [INFO][4560] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.069 [INFO][4560] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.069 [INFO][4560] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.101 [INFO][4560] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.119 [INFO][4560] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.162 [INFO][4560] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.185 [INFO][4560] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.207 [INFO][4560] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.208 [INFO][4560] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.216 [INFO][4560] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505 Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.233 [INFO][4560] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.254 [INFO][4560] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.6/26] block=192.168.66.0/26 handle="k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.254 [INFO][4560] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.6/26] handle="k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.254 [INFO][4560] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
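The IPAM trace above is a lock-claim-write sequence: acquire the host-wide lock, load the node's affine block, pick a free ordinal, persist the claim ("Writing block in order to claim IPs"), then release. A much-simplified in-memory sketch of that shape — the types here are hypothetical; real Calico persists blocks in the datastore and coordinates across hosts:

```go
package main

import (
	"errors"
	"fmt"
	"net/netip"
	"sync"
)

// block models one /26 IPAM block with an in-use bitmap, roughly the shape
// implied by the "Attempting to assign 1 addresses from block" messages.
type block struct {
	mu    sync.Mutex // stands in for the host-wide IPAM lock
	cidr  netip.Prefix
	inUse [64]bool // a /26 holds 64 ordinals
}

func (b *block) assign() (netip.Addr, error) {
	b.mu.Lock()
	defer b.mu.Unlock() // "Released host-wide IPAM lock."
	addr := b.cidr.Addr()
	for ord := 0; ord < 64; ord++ {
		if !b.inUse[ord] {
			b.inUse[ord] = true // in real Calico: write the block back to claim
			return addr, nil
		}
		addr = addr.Next()
	}
	return netip.Addr{}, errors.New("block exhausted")
}

func main() {
	b := &block{cidr: netip.MustParsePrefix("192.168.66.0/26")}
	for i := 0; i < 3; i++ {
		ip, _ := b.assign()
		fmt.Println(ip) // .0, .1, .2; in the log, lower ordinals were already taken
	}
}
```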
Sep 16 04:24:41.308312 containerd[1558]: 2025-09-16 04:24:41.254 [INFO][4560] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.6/26] IPv6=[] ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" HandleID="k8s-pod-network.06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Workload="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" Sep 16 04:24:41.309766 containerd[1558]: 2025-09-16 04:24:41.262 [INFO][4509] cni-plugin/k8s.go 418: Populated endpoint ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Namespace="calico-system" Pod="csi-node-driver-c85z6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6e749571-a39a-485f-bcfb-603d4a5b22ed", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"csi-node-driver-c85z6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.66.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicfc764fa7eb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:41.309766 containerd[1558]: 2025-09-16 04:24:41.262 [INFO][4509] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.6/32] ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Namespace="calico-system" Pod="csi-node-driver-c85z6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" Sep 16 04:24:41.309766 containerd[1558]: 2025-09-16 04:24:41.262 [INFO][4509] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicfc764fa7eb ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Namespace="calico-system" Pod="csi-node-driver-c85z6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" Sep 16 04:24:41.309766 containerd[1558]: 2025-09-16 04:24:41.265 [INFO][4509] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Namespace="calico-system" Pod="csi-node-driver-c85z6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" Sep 16 04:24:41.309766 containerd[1558]: 2025-09-16 04:24:41.266 [INFO][4509] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Namespace="calico-system" Pod="csi-node-driver-c85z6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6e749571-a39a-485f-bcfb-603d4a5b22ed", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505", Pod:"csi-node-driver-c85z6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.66.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calicfc764fa7eb", MAC:"1a:45:e5:ab:68:8c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:41.309766 containerd[1558]: 2025-09-16 04:24:41.302 [INFO][4509] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" Namespace="calico-system" Pod="csi-node-driver-c85z6" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-csi--node--driver--c85z6-eth0" Sep 16 04:24:41.368473 containerd[1558]: time="2025-09-16T04:24:41.368338969Z" level=info msg="connecting to shim 06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505" address="unix:///run/containerd/s/a4907a4afeeea230949e61b0a90a7033c8c830cf22921f80abfd4cb25113e0d7" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:41.417755 systemd-networkd[1418]: cali03b14a9e1b3: Link UP Sep 16 04:24:41.418540 systemd-networkd[1418]: cali03b14a9e1b3: Gained carrier Sep 16 04:24:41.421063 systemd[1]: Started cri-containerd-06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505.scope - libcontainer container 06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505. 
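Host-side veth names in this log (calicfc764fa7eb, calic544165cc29, cali03b14a9e1b3, ...) are all "cali" plus 11 hex characters: exactly 15 bytes, the maximum Linux allows for an interface name (IFNAMSIZ minus the NUL). The sketch below shows only that truncation arithmetic; the hash function and its input are assumptions, since Calico's actual derivation is not visible in this log:

```go
package main

import (
	"crypto/sha1"
	"encoding/hex"
	"fmt"
)

// vethName builds "cali" + an 11-char hash suffix, 15 bytes total, fitting
// the Linux interface-name limit. Assumption: hashing a workload endpoint
// key with SHA-1; Calico's real input and hash choice may differ.
func vethName(endpointKey string) string {
	sum := sha1.Sum([]byte(endpointKey))
	return "cali" + hex.EncodeToString(sum[:])[:11]
}

func main() {
	name := vethName("calico-system/csi-node-driver-c85z6/eth0") // assumed key
	fmt.Println(name, len(name))                                 // always 15 characters
}
```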
Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:40.996 [INFO][4505] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0 calico-apiserver-6bd74b9558- calico-apiserver 60cfb0b5-050e-42a6-8c96-a4c5067b5655 835 0 2025-09-16 04:24:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6bd74b9558 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 calico-apiserver-6bd74b9558-fz99l eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali03b14a9e1b3 [] [] }} ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-fz99l" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:40.996 [INFO][4505] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-fz99l" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.203 [INFO][4579] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.203 [INFO][4579] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000102550), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"calico-apiserver-6bd74b9558-fz99l", "timestamp":"2025-09-16 04:24:41.203170567 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.203 [INFO][4579] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.255 [INFO][4579] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.255 [INFO][4579] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.293 [INFO][4579] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.304 [INFO][4579] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.340 [INFO][4579] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.345 [INFO][4579] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.354 [INFO][4579] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.355 [INFO][4579] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.360 [INFO][4579] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.376 [INFO][4579] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.395 [INFO][4579] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.7/26] block=192.168.66.0/26 handle="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.395 [INFO][4579] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.7/26] handle="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:41.448229 containerd[1558]: 2025-09-16 04:24:41.395 [INFO][4579] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:24:41.449862 containerd[1558]: 2025-09-16 04:24:41.395 [INFO][4579] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.7/26] IPv6=[] ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:41.449862 containerd[1558]: 2025-09-16 04:24:41.412 [INFO][4505] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-fz99l" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0", GenerateName:"calico-apiserver-6bd74b9558-", Namespace:"calico-apiserver", SelfLink:"", UID:"60cfb0b5-050e-42a6-8c96-a4c5067b5655", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bd74b9558", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"calico-apiserver-6bd74b9558-fz99l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.66.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali03b14a9e1b3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:41.449862 containerd[1558]: 2025-09-16 04:24:41.412 [INFO][4505] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.7/32] ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-fz99l" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:41.449862 containerd[1558]: 2025-09-16 04:24:41.412 [INFO][4505] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali03b14a9e1b3 ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-fz99l" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:41.449862 containerd[1558]: 2025-09-16 04:24:41.414 [INFO][4505] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-fz99l" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:41.450006 containerd[1558]: 2025-09-16 04:24:41.415 
[INFO][4505] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-fz99l" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0", GenerateName:"calico-apiserver-6bd74b9558-", Namespace:"calico-apiserver", SelfLink:"", UID:"60cfb0b5-050e-42a6-8c96-a4c5067b5655", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bd74b9558", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff", Pod:"calico-apiserver-6bd74b9558-fz99l", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.66.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali03b14a9e1b3", MAC:"aa:b4:f6:ff:af:6f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:41.450006 containerd[1558]: 2025-09-16 04:24:41.441 [INFO][4505] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-fz99l" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:41.498151 containerd[1558]: time="2025-09-16T04:24:41.498051751Z" level=info msg="connecting to shim 8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" address="unix:///run/containerd/s/5433bcb26b867e6ca9082c5cb7f07d7002b89e7d656459a4775673917c3ea24d" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:41.522324 containerd[1558]: time="2025-09-16T04:24:41.522096348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-c85z6,Uid:6e749571-a39a-485f-bcfb-603d4a5b22ed,Namespace:calico-system,Attempt:0,} returns sandbox id \"06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505\"" Sep 16 04:24:41.539788 systemd[1]: Started cri-containerd-8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff.scope - libcontainer container 8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff. 
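The MACs Calico records on these endpoints (aa:b4:f6:ff:af:6f above, 6e:c2:7f:57:48:bb and 1a:45:e5:ab:68:8c earlier) are all locally administered unicast addresses: bit 1 of the first octet set (locally administered), bit 0 clear (unicast). A sketch generating an address of that class — illustrative, not Calico's generator:

```go
package main

import (
	"crypto/rand"
	"fmt"
	"net"
)

// randomMAC returns a random locally administered, unicast MAC, the same
// class of address seen on the workload endpoints in this log.
func randomMAC() (net.HardwareAddr, error) {
	mac := make(net.HardwareAddr, 6)
	if _, err := rand.Read(mac); err != nil {
		return nil, err
	}
	mac[0] = (mac[0] | 0x02) &^ 0x01 // set the local bit, clear the multicast bit
	return mac, nil
}

func main() {
	mac, err := randomMAC()
	if err != nil {
		panic(err)
	}
	fmt.Println(mac) // e.g. 0xaa = 0b10101010: bit1 set, bit0 clear, like aa:b4:f6:ff:af:6f
}
```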
Sep 16 04:24:41.603905 containerd[1558]: time="2025-09-16T04:24:41.603860416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd74b9558-fz99l,Uid:60cfb0b5-050e-42a6-8c96-a4c5067b5655,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\"" Sep 16 04:24:41.612157 systemd-networkd[1418]: caliac25f8b6413: Gained IPv6LL Sep 16 04:24:41.740754 systemd-networkd[1418]: cali962d594e8be: Gained IPv6LL Sep 16 04:24:41.752445 containerd[1558]: time="2025-09-16T04:24:41.752398639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fb8d7647c-wmjlp,Uid:49803f3a-c196-489e-944f-acc9e7819ab4,Namespace:calico-system,Attempt:0,}" Sep 16 04:24:41.752686 containerd[1558]: time="2025-09-16T04:24:41.752402479Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd74b9558-ljjkj,Uid:bf6d1753-4f29-4a36-94fb-7d6f48d24a1e,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:24:41.987109 systemd-networkd[1418]: cali670052adcdb: Link UP Sep 16 04:24:41.988714 systemd-networkd[1418]: cali670052adcdb: Gained carrier Sep 16 04:24:41.997553 systemd-networkd[1418]: calica2478979a6: Gained IPv6LL Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.847 [INFO][4722] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0 calico-kube-controllers-5fb8d7647c- calico-system 49803f3a-c196-489e-944f-acc9e7819ab4 839 0 2025-09-16 04:24:18 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5fb8d7647c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 calico-kube-controllers-5fb8d7647c-wmjlp eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali670052adcdb [] [] }} ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Namespace="calico-system" Pod="calico-kube-controllers-5fb8d7647c-wmjlp" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.847 [INFO][4722] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Namespace="calico-system" Pod="calico-kube-controllers-5fb8d7647c-wmjlp" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.909 [INFO][4748] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" HandleID="k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.909 [INFO][4748] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" HandleID="k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3860), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"calico-kube-controllers-5fb8d7647c-wmjlp", "timestamp":"2025-09-16 04:24:41.906551894 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.909 [INFO][4748] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.909 [INFO][4748] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.909 [INFO][4748] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.927 [INFO][4748] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.936 [INFO][4748] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.943 [INFO][4748] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.947 [INFO][4748] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.951 [INFO][4748] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.951 [INFO][4748] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.956 [INFO][4748] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.964 [INFO][4748] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.976 [INFO][4748] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.8/26] block=192.168.66.0/26 handle="k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.976 [INFO][4748] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.8/26] handle="k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.023064 containerd[1558]: 2025-09-16 04:24:41.976 [INFO][4748] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:24:42.024831 containerd[1558]: 2025-09-16 04:24:41.976 [INFO][4748] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.8/26] IPv6=[] ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" HandleID="k8s-pod-network.bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" Sep 16 04:24:42.024831 containerd[1558]: 2025-09-16 04:24:41.981 [INFO][4722] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Namespace="calico-system" Pod="calico-kube-controllers-5fb8d7647c-wmjlp" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0", GenerateName:"calico-kube-controllers-5fb8d7647c-", Namespace:"calico-system", SelfLink:"", UID:"49803f3a-c196-489e-944f-acc9e7819ab4", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5fb8d7647c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"calico-kube-controllers-5fb8d7647c-wmjlp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.66.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali670052adcdb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:42.024831 containerd[1558]: 2025-09-16 04:24:41.981 [INFO][4722] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.8/32] ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Namespace="calico-system" Pod="calico-kube-controllers-5fb8d7647c-wmjlp" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" Sep 16 04:24:42.024831 containerd[1558]: 2025-09-16 04:24:41.982 [INFO][4722] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali670052adcdb ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Namespace="calico-system" Pod="calico-kube-controllers-5fb8d7647c-wmjlp" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" Sep 16 04:24:42.024831 containerd[1558]: 2025-09-16 04:24:41.985 [INFO][4722] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Namespace="calico-system" Pod="calico-kube-controllers-5fb8d7647c-wmjlp" 
WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" Sep 16 04:24:42.024967 containerd[1558]: 2025-09-16 04:24:41.985 [INFO][4722] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Namespace="calico-system" Pod="calico-kube-controllers-5fb8d7647c-wmjlp" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0", GenerateName:"calico-kube-controllers-5fb8d7647c-", Namespace:"calico-system", SelfLink:"", UID:"49803f3a-c196-489e-944f-acc9e7819ab4", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5fb8d7647c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c", Pod:"calico-kube-controllers-5fb8d7647c-wmjlp", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.66.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali670052adcdb", MAC:"16:35:ee:42:74:78", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:42.024967 containerd[1558]: 2025-09-16 04:24:42.017 [INFO][4722] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" Namespace="calico-system" Pod="calico-kube-controllers-5fb8d7647c-wmjlp" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--kube--controllers--5fb8d7647c--wmjlp-eth0" Sep 16 04:24:42.078250 containerd[1558]: time="2025-09-16T04:24:42.078169806Z" level=info msg="connecting to shim bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c" address="unix:///run/containerd/s/d3ae2d71b57a2cb6b2e02fe63a79444420b63f66eb8e20c8152977447c5f7077" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:42.149949 systemd[1]: Started cri-containerd-bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c.scope - libcontainer container bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c. 
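The "Gained IPv6LL" messages scattered through this log are systemd-networkd observing a link-local IPv6 address on each cali* interface. For classic EUI-64 addresses the host part is derived from the MAC: flip the universal/local bit of the first octet and splice ff:fe into the middle (networkd can also use stable-privacy addresses instead, so this is one possible derivation, not a claim about this node's configuration):

```go
package main

import (
	"fmt"
	"net"
	"net/netip"
)

// linkLocalEUI64 derives the classic fe80:: link-local address from a MAC:
// invert bit 1 of the first octet and insert ff:fe between the two halves.
func linkLocalEUI64(mac net.HardwareAddr) netip.Addr {
	var a [16]byte
	a[0], a[1] = 0xfe, 0x80 // fe80::/64 prefix
	a[8] = mac[0] ^ 0x02    // flip the universal/local bit
	a[9], a[10] = mac[1], mac[2]
	a[11], a[12] = 0xff, 0xfe
	a[13], a[14], a[15] = mac[3], mac[4], mac[5]
	return netip.AddrFrom16(a)
}

func main() {
	// The MAC recorded above for cali670052adcdb.
	mac, _ := net.ParseMAC("16:35:ee:42:74:78")
	fmt.Println(linkLocalEUI64(mac)) // fe80::1435:eeff:fe42:7478
}
```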
Sep 16 04:24:42.152293 kubelet[2783]: I0916 04:24:42.152070 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-xxw8g" podStartSLOduration=42.152010424 podStartE2EDuration="42.152010424s" podCreationTimestamp="2025-09-16 04:24:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:24:42.094868115 +0000 UTC m=+48.463509137" watchObservedRunningTime="2025-09-16 04:24:42.152010424 +0000 UTC m=+48.520651446" Sep 16 04:24:42.188876 systemd-networkd[1418]: calibfe1fbbb672: Link UP Sep 16 04:24:42.193797 systemd-networkd[1418]: calibfe1fbbb672: Gained carrier Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:41.884 [INFO][4729] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0 calico-apiserver-6bd74b9558- calico-apiserver bf6d1753-4f29-4a36-94fb-7d6f48d24a1e 840 0 2025-09-16 04:24:11 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6bd74b9558 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 calico-apiserver-6bd74b9558-ljjkj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calibfe1fbbb672 [] [] }} ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-ljjkj" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:41.884 [INFO][4729] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-ljjkj" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:41.938 [INFO][4754] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:41.938 [INFO][4754] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b640), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"calico-apiserver-6bd74b9558-ljjkj", "timestamp":"2025-09-16 04:24:41.938349625 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:41.938 [INFO][4754] ipam/ipam_plugin.go 353: About to 
acquire host-wide IPAM lock. Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:41.976 [INFO][4754] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:41.977 [INFO][4754] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.031 [INFO][4754] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.050 [INFO][4754] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.070 [INFO][4754] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.077 [INFO][4754] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.086 [INFO][4754] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.088 [INFO][4754] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.104 [INFO][4754] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.136 [INFO][4754] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.163 [INFO][4754] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.9/26] block=192.168.66.0/26 handle="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.163 [INFO][4754] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.9/26] handle="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:42.224860 containerd[1558]: 2025-09-16 04:24:42.163 [INFO][4754] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:24:42.226086 containerd[1558]: 2025-09-16 04:24:42.163 [INFO][4754] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.9/26] IPv6=[] ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:42.226086 containerd[1558]: 2025-09-16 04:24:42.174 [INFO][4729] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-ljjkj" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0", GenerateName:"calico-apiserver-6bd74b9558-", Namespace:"calico-apiserver", SelfLink:"", UID:"bf6d1753-4f29-4a36-94fb-7d6f48d24a1e", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bd74b9558", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"calico-apiserver-6bd74b9558-ljjkj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.66.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibfe1fbbb672", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:42.226086 containerd[1558]: 2025-09-16 04:24:42.174 [INFO][4729] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.9/32] ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-ljjkj" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:42.226086 containerd[1558]: 2025-09-16 04:24:42.174 [INFO][4729] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibfe1fbbb672 ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-ljjkj" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:42.226086 containerd[1558]: 2025-09-16 04:24:42.195 [INFO][4729] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-ljjkj" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:42.226457 containerd[1558]: 2025-09-16 04:24:42.198 
[INFO][4729] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-ljjkj" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0", GenerateName:"calico-apiserver-6bd74b9558-", Namespace:"calico-apiserver", SelfLink:"", UID:"bf6d1753-4f29-4a36-94fb-7d6f48d24a1e", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bd74b9558", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b", Pod:"calico-apiserver-6bd74b9558-ljjkj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.66.9/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calibfe1fbbb672", MAC:"0e:57:15:2d:4c:f4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:42.226457 containerd[1558]: 2025-09-16 04:24:42.222 [INFO][4729] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Namespace="calico-apiserver" Pod="calico-apiserver-6bd74b9558-ljjkj" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:42.287483 containerd[1558]: time="2025-09-16T04:24:42.286973480Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5fb8d7647c-wmjlp,Uid:49803f3a-c196-489e-944f-acc9e7819ab4,Namespace:calico-system,Attempt:0,} returns sandbox id \"bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c\"" Sep 16 04:24:42.296367 containerd[1558]: time="2025-09-16T04:24:42.296184967Z" level=info msg="connecting to shim 8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" address="unix:///run/containerd/s/acea9b0227aff4c794c3a3dccf47154d1db3cdde7a5bae36e787a11dea56e094" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:42.316896 systemd-networkd[1418]: calic544165cc29: Gained IPv6LL Sep 16 04:24:42.319040 systemd[1]: Started cri-containerd-8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b.scope - libcontainer container 8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b. 
Sep 16 04:24:42.356988 containerd[1558]: time="2025-09-16T04:24:42.356163534Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:42.359327 containerd[1558]: time="2025-09-16T04:24:42.359287030Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 16 04:24:42.361094 containerd[1558]: time="2025-09-16T04:24:42.361055016Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:42.367621 containerd[1558]: time="2025-09-16T04:24:42.367541284Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:42.368188 containerd[1558]: time="2025-09-16T04:24:42.368130240Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 3.614259193s" Sep 16 04:24:42.368188 containerd[1558]: time="2025-09-16T04:24:42.368185119Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 16 04:24:42.371253 containerd[1558]: time="2025-09-16T04:24:42.370807219Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 16 04:24:42.375559 containerd[1558]: time="2025-09-16T04:24:42.375513502Z" level=info msg="CreateContainer within sandbox \"4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 16 04:24:42.396939 containerd[1558]: time="2025-09-16T04:24:42.396861533Z" level=info msg="Container f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:42.405829 containerd[1558]: time="2025-09-16T04:24:42.405708863Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd74b9558-ljjkj,Uid:bf6d1753-4f29-4a36-94fb-7d6f48d24a1e,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\"" Sep 16 04:24:42.410495 containerd[1558]: time="2025-09-16T04:24:42.410424426Z" level=info msg="CreateContainer within sandbox \"4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1\"" Sep 16 04:24:42.411388 containerd[1558]: time="2025-09-16T04:24:42.411348219Z" level=info msg="StartContainer for \"f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1\"" Sep 16 04:24:42.413069 containerd[1558]: time="2025-09-16T04:24:42.413001206Z" level=info msg="connecting to shim f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1" address="unix:///run/containerd/s/5c44943291f57d488e9f6fb29614ef2edddcce3f2d8f831710d32ed5c7a0b672" protocol=ttrpc version=3 Sep 16 04:24:42.439775 systemd[1]: Started 
cri-containerd-f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1.scope - libcontainer container f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1. Sep 16 04:24:42.488493 containerd[1558]: time="2025-09-16T04:24:42.488446211Z" level=info msg="StartContainer for \"f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1\" returns successfully" Sep 16 04:24:42.537644 kubelet[2783]: I0916 04:24:42.537494 2783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:24:42.666653 containerd[1558]: time="2025-09-16T04:24:42.666567846Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"6a05af47e04efeb0a6767ca09675067045d1496d02d5ab493a633fa202e71702\" pid:4925 exited_at:{seconds:1757996682 nanos:665931131}" Sep 16 04:24:42.761463 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3605214589.mount: Deactivated successfully. Sep 16 04:24:42.781819 containerd[1558]: time="2025-09-16T04:24:42.781601299Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"47c882f0077f186b4940bc0c7cbcaa814b893954c4c1126144937bcff410c5fd\" pid:4948 exited_at:{seconds:1757996682 nanos:780648267}" Sep 16 04:24:43.019804 systemd-networkd[1418]: cali03b14a9e1b3: Gained IPv6LL Sep 16 04:24:43.112952 kubelet[2783]: I0916 04:24:43.112743 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-85b9fd968b-4j4vh" podStartSLOduration=1.468664811 podStartE2EDuration="7.112719071s" podCreationTimestamp="2025-09-16 04:24:36 +0000 UTC" firstStartedPulling="2025-09-16 04:24:36.725568648 +0000 UTC m=+43.094209670" lastFinishedPulling="2025-09-16 04:24:42.369622908 +0000 UTC m=+48.738263930" observedRunningTime="2025-09-16 04:24:43.11152448 +0000 UTC m=+49.480165502" watchObservedRunningTime="2025-09-16 04:24:43.112719071 +0000 UTC m=+49.481360133" Sep 16 04:24:43.213300 systemd-networkd[1418]: calicfc764fa7eb: Gained IPv6LL Sep 16 04:24:43.661018 systemd-networkd[1418]: cali670052adcdb: Gained IPv6LL Sep 16 04:24:44.109750 systemd-networkd[1418]: calibfe1fbbb672: Gained IPv6LL Sep 16 04:24:46.483297 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3796832439.mount: Deactivated successfully. 
Sep 16 04:24:47.099636 containerd[1558]: time="2025-09-16T04:24:47.099499807Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:47.104359 containerd[1558]: time="2025-09-16T04:24:47.104299062Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 16 04:24:47.106375 containerd[1558]: time="2025-09-16T04:24:47.106342291Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:47.110042 containerd[1558]: time="2025-09-16T04:24:47.110003352Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:47.111729 containerd[1558]: time="2025-09-16T04:24:47.111695663Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 4.740850925s" Sep 16 04:24:47.111858 containerd[1558]: time="2025-09-16T04:24:47.111843142Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 16 04:24:47.114190 containerd[1558]: time="2025-09-16T04:24:47.114153450Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 04:24:47.118110 containerd[1558]: time="2025-09-16T04:24:47.117934710Z" level=info msg="CreateContainer within sandbox \"c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 16 04:24:47.127531 containerd[1558]: time="2025-09-16T04:24:47.127491020Z" level=info msg="Container 8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:47.135505 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2303470263.mount: Deactivated successfully. Sep 16 04:24:47.146804 containerd[1558]: time="2025-09-16T04:24:47.146723158Z" level=info msg="CreateContainer within sandbox \"c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\"" Sep 16 04:24:47.148011 containerd[1558]: time="2025-09-16T04:24:47.147981392Z" level=info msg="StartContainer for \"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\"" Sep 16 04:24:47.150683 containerd[1558]: time="2025-09-16T04:24:47.150563378Z" level=info msg="connecting to shim 8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464" address="unix:///run/containerd/s/5bab783ec2646ba661d1ad631248d5fa08b92f9854f0222082dd6e505a1a54d6" protocol=ttrpc version=3 Sep 16 04:24:47.178059 systemd[1]: Started cri-containerd-8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464.scope - libcontainer container 8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464. 
Sep 16 04:24:47.235652 containerd[1558]: time="2025-09-16T04:24:47.235610411Z" level=info msg="StartContainer for \"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" returns successfully" Sep 16 04:24:48.341928 containerd[1558]: time="2025-09-16T04:24:48.341886275Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"a7f5abadcc5df413ea24172661ca6c4cc91d884ed51838a936cd1f7ab87d5347\" pid:5028 exit_status:1 exited_at:{seconds:1757996688 nanos:332781519}" Sep 16 04:24:49.401519 containerd[1558]: time="2025-09-16T04:24:49.401474393Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"321ed48ff81c58874f029599a3dc2b852400b19e3c1d3c398b6889147f08017f\" pid:5052 exit_status:1 exited_at:{seconds:1757996689 nanos:401045994}" Sep 16 04:24:50.313169 containerd[1558]: time="2025-09-16T04:24:50.313105434Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"cbaa86c34337327f92d9d03eeef5c0e818d395d91de2b2451a655fb83b309c4e\" pid:5079 exit_status:1 exited_at:{seconds:1757996690 nanos:312224797}" Sep 16 04:24:50.344709 containerd[1558]: time="2025-09-16T04:24:50.344663711Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:50.346433 containerd[1558]: time="2025-09-16T04:24:50.346389905Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 16 04:24:50.347894 containerd[1558]: time="2025-09-16T04:24:50.347797419Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:50.353042 containerd[1558]: time="2025-09-16T04:24:50.352958639Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:50.353839 containerd[1558]: time="2025-09-16T04:24:50.353801716Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 3.239607626s" Sep 16 04:24:50.354014 containerd[1558]: time="2025-09-16T04:24:50.353925436Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 16 04:24:50.355056 containerd[1558]: time="2025-09-16T04:24:50.355019271Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 16 04:24:50.358505 containerd[1558]: time="2025-09-16T04:24:50.358399338Z" level=info msg="CreateContainer within sandbox \"bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 04:24:50.374705 containerd[1558]: time="2025-09-16T04:24:50.374634315Z" level=info msg="Container b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb: CDI devices from CRI 
Config.CDIDevices: []" Sep 16 04:24:50.386878 containerd[1558]: time="2025-09-16T04:24:50.386808948Z" level=info msg="CreateContainer within sandbox \"bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb\"" Sep 16 04:24:50.388083 containerd[1558]: time="2025-09-16T04:24:50.387952144Z" level=info msg="StartContainer for \"b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb\"" Sep 16 04:24:50.389521 containerd[1558]: time="2025-09-16T04:24:50.389429378Z" level=info msg="connecting to shim b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb" address="unix:///run/containerd/s/25c0b5e5fd408a0f0407614e170554fabc77228858f9529e9207c2e2446b5994" protocol=ttrpc version=3 Sep 16 04:24:50.415792 systemd[1]: Started cri-containerd-b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb.scope - libcontainer container b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb. Sep 16 04:24:50.483481 containerd[1558]: time="2025-09-16T04:24:50.483425534Z" level=info msg="StartContainer for \"b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb\" returns successfully" Sep 16 04:24:51.166758 kubelet[2783]: I0916 04:24:51.165764 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-xdvqq" podStartSLOduration=27.829805666 podStartE2EDuration="34.165748283s" podCreationTimestamp="2025-09-16 04:24:17 +0000 UTC" firstStartedPulling="2025-09-16 04:24:40.777363277 +0000 UTC m=+47.146004299" lastFinishedPulling="2025-09-16 04:24:47.113305894 +0000 UTC m=+53.481946916" observedRunningTime="2025-09-16 04:24:48.17143417 +0000 UTC m=+54.540075232" watchObservedRunningTime="2025-09-16 04:24:51.165748283 +0000 UTC m=+57.534389265" Sep 16 04:24:52.199958 containerd[1558]: time="2025-09-16T04:24:52.199898330Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:52.203630 containerd[1558]: time="2025-09-16T04:24:52.202952841Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 16 04:24:52.204084 containerd[1558]: time="2025-09-16T04:24:52.204032398Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:52.208599 containerd[1558]: time="2025-09-16T04:24:52.208514704Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:52.210293 containerd[1558]: time="2025-09-16T04:24:52.210052780Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.854995069s" Sep 16 04:24:52.210293 containerd[1558]: time="2025-09-16T04:24:52.210091740Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 16 04:24:52.212484 
containerd[1558]: time="2025-09-16T04:24:52.212439853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 04:24:52.216231 containerd[1558]: time="2025-09-16T04:24:52.216177121Z" level=info msg="CreateContainer within sandbox \"06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 16 04:24:52.257995 containerd[1558]: time="2025-09-16T04:24:52.257438877Z" level=info msg="Container 2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:52.270715 containerd[1558]: time="2025-09-16T04:24:52.270674117Z" level=info msg="CreateContainer within sandbox \"06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e\"" Sep 16 04:24:52.271433 containerd[1558]: time="2025-09-16T04:24:52.271401915Z" level=info msg="StartContainer for \"2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e\"" Sep 16 04:24:52.273306 containerd[1558]: time="2025-09-16T04:24:52.273272469Z" level=info msg="connecting to shim 2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e" address="unix:///run/containerd/s/a4907a4afeeea230949e61b0a90a7033c8c830cf22921f80abfd4cb25113e0d7" protocol=ttrpc version=3 Sep 16 04:24:52.304822 systemd[1]: Started cri-containerd-2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e.scope - libcontainer container 2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e. Sep 16 04:24:52.374509 containerd[1558]: time="2025-09-16T04:24:52.374337724Z" level=info msg="StartContainer for \"2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e\" returns successfully" Sep 16 04:24:52.597443 containerd[1558]: time="2025-09-16T04:24:52.597313011Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:52.600350 containerd[1558]: time="2025-09-16T04:24:52.600282682Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 16 04:24:52.610453 containerd[1558]: time="2025-09-16T04:24:52.610381851Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 397.899159ms" Sep 16 04:24:52.610453 containerd[1558]: time="2025-09-16T04:24:52.610441971Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 16 04:24:52.614302 containerd[1558]: time="2025-09-16T04:24:52.614252440Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 16 04:24:52.619950 containerd[1558]: time="2025-09-16T04:24:52.619889143Z" level=info msg="CreateContainer within sandbox \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 04:24:52.649275 containerd[1558]: time="2025-09-16T04:24:52.649229934Z" level=info msg="Container 
4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:52.650040 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3051882332.mount: Deactivated successfully. Sep 16 04:24:52.663470 containerd[1558]: time="2025-09-16T04:24:52.663421051Z" level=info msg="CreateContainer within sandbox \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\"" Sep 16 04:24:52.665842 containerd[1558]: time="2025-09-16T04:24:52.665798884Z" level=info msg="StartContainer for \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\"" Sep 16 04:24:52.666983 containerd[1558]: time="2025-09-16T04:24:52.666950321Z" level=info msg="connecting to shim 4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b" address="unix:///run/containerd/s/5433bcb26b867e6ca9082c5cb7f07d7002b89e7d656459a4775673917c3ea24d" protocol=ttrpc version=3 Sep 16 04:24:52.692233 systemd[1]: Started cri-containerd-4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b.scope - libcontainer container 4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b. Sep 16 04:24:52.771112 containerd[1558]: time="2025-09-16T04:24:52.771078166Z" level=info msg="StartContainer for \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" returns successfully" Sep 16 04:24:53.169099 kubelet[2783]: I0916 04:24:53.169021 2783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:24:53.202611 kubelet[2783]: I0916 04:24:53.202238 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6bd6d5f94d-b8px6" podStartSLOduration=29.971188274 podStartE2EDuration="39.202220547s" podCreationTimestamp="2025-09-16 04:24:14 +0000 UTC" firstStartedPulling="2025-09-16 04:24:41.123763679 +0000 UTC m=+47.492404701" lastFinishedPulling="2025-09-16 04:24:50.354795952 +0000 UTC m=+56.723436974" observedRunningTime="2025-09-16 04:24:51.166831439 +0000 UTC m=+57.535472461" watchObservedRunningTime="2025-09-16 04:24:53.202220547 +0000 UTC m=+59.570861569" Sep 16 04:24:53.566441 kubelet[2783]: I0916 04:24:53.565554 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6bd74b9558-fz99l" podStartSLOduration=31.558044075 podStartE2EDuration="42.565537318s" podCreationTimestamp="2025-09-16 04:24:11 +0000 UTC" firstStartedPulling="2025-09-16 04:24:41.605824719 +0000 UTC m=+47.974465741" lastFinishedPulling="2025-09-16 04:24:52.613317962 +0000 UTC m=+58.981958984" observedRunningTime="2025-09-16 04:24:53.205645618 +0000 UTC m=+59.574286640" watchObservedRunningTime="2025-09-16 04:24:53.565537318 +0000 UTC m=+59.934178340" Sep 16 04:24:53.659490 systemd[1]: Created slice kubepods-besteffort-pod0d54ff7f_c5bb_415d_989d_604db3303766.slice - libcontainer container kubepods-besteffort-pod0d54ff7f_c5bb_415d_989d_604db3303766.slice. 
Sep 16 04:24:53.765395 kubelet[2783]: I0916 04:24:53.765352 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwfwb\" (UniqueName: \"kubernetes.io/projected/0d54ff7f-c5bb-415d-989d-604db3303766-kube-api-access-xwfwb\") pod \"calico-apiserver-6bd6d5f94d-k7n7n\" (UID: \"0d54ff7f-c5bb-415d-989d-604db3303766\") " pod="calico-apiserver/calico-apiserver-6bd6d5f94d-k7n7n" Sep 16 04:24:53.765642 kubelet[2783]: I0916 04:24:53.765606 2783 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/0d54ff7f-c5bb-415d-989d-604db3303766-calico-apiserver-certs\") pod \"calico-apiserver-6bd6d5f94d-k7n7n\" (UID: \"0d54ff7f-c5bb-415d-989d-604db3303766\") " pod="calico-apiserver/calico-apiserver-6bd6d5f94d-k7n7n" Sep 16 04:24:53.965133 containerd[1558]: time="2025-09-16T04:24:53.964857115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd6d5f94d-k7n7n,Uid:0d54ff7f-c5bb-415d-989d-604db3303766,Namespace:calico-apiserver,Attempt:0,}" Sep 16 04:24:54.174082 kubelet[2783]: I0916 04:24:54.174012 2783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:24:54.214601 systemd-networkd[1418]: cali51f2bca5a85: Link UP Sep 16 04:24:54.214892 systemd-networkd[1418]: cali51f2bca5a85: Gained carrier Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.070 [INFO][5216] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0 calico-apiserver-6bd6d5f94d- calico-apiserver 0d54ff7f-c5bb-415d-989d-604db3303766 1072 0 2025-09-16 04:24:53 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6bd6d5f94d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-n-21eb3e8385 calico-apiserver-6bd6d5f94d-k7n7n eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali51f2bca5a85 [] [] }} ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-k7n7n" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.072 [INFO][5216] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-k7n7n" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.132 [INFO][5231] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" HandleID="k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.132 [INFO][5231] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" HandleID="k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" 
Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-n-21eb3e8385", "pod":"calico-apiserver-6bd6d5f94d-k7n7n", "timestamp":"2025-09-16 04:24:54.132207089 +0000 UTC"}, Hostname:"ci-4459-0-0-n-21eb3e8385", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.133 [INFO][5231] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.133 [INFO][5231] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.133 [INFO][5231] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-n-21eb3e8385' Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.156 [INFO][5231] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.163 [INFO][5231] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.169 [INFO][5231] ipam/ipam.go 511: Trying affinity for 192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.174 [INFO][5231] ipam/ipam.go 158: Attempting to load block cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.178 [INFO][5231] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.66.0/26 host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.178 [INFO][5231] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.66.0/26 handle="k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.180 [INFO][5231] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.188 [INFO][5231] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.66.0/26 handle="k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.203 [INFO][5231] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.66.10/26] block=192.168.66.0/26 handle="k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.203 [INFO][5231] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.66.10/26] handle="k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" host="ci-4459-0-0-n-21eb3e8385" Sep 16 04:24:54.234667 containerd[1558]: 2025-09-16 04:24:54.203 [INFO][5231] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 16 04:24:54.237549 containerd[1558]: 2025-09-16 04:24:54.203 [INFO][5231] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.66.10/26] IPv6=[] ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" HandleID="k8s-pod-network.46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" Sep 16 04:24:54.237549 containerd[1558]: 2025-09-16 04:24:54.206 [INFO][5216] cni-plugin/k8s.go 418: Populated endpoint ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-k7n7n" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0", GenerateName:"calico-apiserver-6bd6d5f94d-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d54ff7f-c5bb-415d-989d-604db3303766", ResourceVersion:"1072", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bd6d5f94d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"", Pod:"calico-apiserver-6bd6d5f94d-k7n7n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.66.10/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali51f2bca5a85", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:54.237549 containerd[1558]: 2025-09-16 04:24:54.206 [INFO][5216] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.66.10/32] ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-k7n7n" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" Sep 16 04:24:54.237549 containerd[1558]: 2025-09-16 04:24:54.206 [INFO][5216] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali51f2bca5a85 ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-k7n7n" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" Sep 16 04:24:54.237549 containerd[1558]: 2025-09-16 04:24:54.210 [INFO][5216] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-k7n7n" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" Sep 16 04:24:54.237724 containerd[1558]: 2025-09-16 
04:24:54.211 [INFO][5216] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-k7n7n" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0", GenerateName:"calico-apiserver-6bd6d5f94d-", Namespace:"calico-apiserver", SelfLink:"", UID:"0d54ff7f-c5bb-415d-989d-604db3303766", ResourceVersion:"1072", Generation:0, CreationTimestamp:time.Date(2025, time.September, 16, 4, 24, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6bd6d5f94d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-n-21eb3e8385", ContainerID:"46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f", Pod:"calico-apiserver-6bd6d5f94d-k7n7n", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.66.10/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali51f2bca5a85", MAC:"56:e8:32:d1:94:e5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 16 04:24:54.237724 containerd[1558]: 2025-09-16 04:24:54.228 [INFO][5216] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" Namespace="calico-apiserver" Pod="calico-apiserver-6bd6d5f94d-k7n7n" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd6d5f94d--k7n7n-eth0" Sep 16 04:24:54.278739 containerd[1558]: time="2025-09-16T04:24:54.278634085Z" level=info msg="connecting to shim 46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f" address="unix:///run/containerd/s/741c7b745469ed5fba4afce8e4a62d3a5fa91af70cdeb4dadc293f2215310584" namespace=k8s.io protocol=ttrpc version=3 Sep 16 04:24:54.321985 systemd[1]: Started cri-containerd-46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f.scope - libcontainer container 46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f. 
Sep 16 04:24:54.472552 containerd[1558]: time="2025-09-16T04:24:54.472504775Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6bd6d5f94d-k7n7n,Uid:0d54ff7f-c5bb-415d-989d-604db3303766,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f\"" Sep 16 04:24:54.489002 containerd[1558]: time="2025-09-16T04:24:54.488149780Z" level=info msg="CreateContainer within sandbox \"46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 04:24:54.499940 containerd[1558]: time="2025-09-16T04:24:54.499881554Z" level=info msg="Container d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:54.514187 containerd[1558]: time="2025-09-16T04:24:54.514109283Z" level=info msg="CreateContainer within sandbox \"46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721\"" Sep 16 04:24:54.515258 containerd[1558]: time="2025-09-16T04:24:54.515232000Z" level=info msg="StartContainer for \"d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721\"" Sep 16 04:24:54.518718 containerd[1558]: time="2025-09-16T04:24:54.518634393Z" level=info msg="connecting to shim d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721" address="unix:///run/containerd/s/741c7b745469ed5fba4afce8e4a62d3a5fa91af70cdeb4dadc293f2215310584" protocol=ttrpc version=3 Sep 16 04:24:54.578847 systemd[1]: Started cri-containerd-d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721.scope - libcontainer container d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721. 
Sep 16 04:24:54.738081 containerd[1558]: time="2025-09-16T04:24:54.738007306Z" level=info msg="StartContainer for \"d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721\" returns successfully" Sep 16 04:24:56.205744 systemd-networkd[1418]: cali51f2bca5a85: Gained IPv6LL Sep 16 04:24:57.189953 kubelet[2783]: I0916 04:24:57.188610 2783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:24:57.803062 containerd[1558]: time="2025-09-16T04:24:57.802321736Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:57.803062 containerd[1558]: time="2025-09-16T04:24:57.803013055Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 16 04:24:57.805726 containerd[1558]: time="2025-09-16T04:24:57.805665372Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:57.810843 containerd[1558]: time="2025-09-16T04:24:57.810792687Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:57.812595 containerd[1558]: time="2025-09-16T04:24:57.812514525Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 5.198210405s" Sep 16 04:24:57.813201 containerd[1558]: time="2025-09-16T04:24:57.812795004Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 16 04:24:57.815298 containerd[1558]: time="2025-09-16T04:24:57.814799082Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 16 04:24:57.833960 containerd[1558]: time="2025-09-16T04:24:57.833916061Z" level=info msg="CreateContainer within sandbox \"bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 16 04:24:57.847945 containerd[1558]: time="2025-09-16T04:24:57.847895405Z" level=info msg="Container 330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:57.857008 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1649370071.mount: Deactivated successfully. 
Sep 16 04:24:57.859510 containerd[1558]: time="2025-09-16T04:24:57.859452553Z" level=info msg="CreateContainer within sandbox \"bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\"" Sep 16 04:24:57.860623 containerd[1558]: time="2025-09-16T04:24:57.860569751Z" level=info msg="StartContainer for \"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\"" Sep 16 04:24:57.863013 containerd[1558]: time="2025-09-16T04:24:57.862980669Z" level=info msg="connecting to shim 330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87" address="unix:///run/containerd/s/d3ae2d71b57a2cb6b2e02fe63a79444420b63f66eb8e20c8152977447c5f7077" protocol=ttrpc version=3 Sep 16 04:24:57.890960 systemd[1]: Started cri-containerd-330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87.scope - libcontainer container 330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87. Sep 16 04:24:57.991647 containerd[1558]: time="2025-09-16T04:24:57.991531567Z" level=info msg="StartContainer for \"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" returns successfully" Sep 16 04:24:58.226284 kubelet[2783]: I0916 04:24:58.226209 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6bd6d5f94d-k7n7n" podStartSLOduration=5.226189345 podStartE2EDuration="5.226189345s" podCreationTimestamp="2025-09-16 04:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-16 04:24:55.198480961 +0000 UTC m=+61.567122023" watchObservedRunningTime="2025-09-16 04:24:58.226189345 +0000 UTC m=+64.594830327" Sep 16 04:24:58.236177 containerd[1558]: time="2025-09-16T04:24:58.234933778Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:24:58.237829 containerd[1558]: time="2025-09-16T04:24:58.237622176Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 16 04:24:58.244597 containerd[1558]: time="2025-09-16T04:24:58.244535971Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 429.701769ms" Sep 16 04:24:58.244857 containerd[1558]: time="2025-09-16T04:24:58.244829971Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 16 04:24:58.249296 containerd[1558]: time="2025-09-16T04:24:58.249260647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 16 04:24:58.253143 containerd[1558]: time="2025-09-16T04:24:58.252635205Z" level=info msg="CreateContainer within sandbox \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 16 04:24:58.266219 containerd[1558]: time="2025-09-16T04:24:58.266178355Z" level=info msg="Container 
474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:24:58.280174 containerd[1558]: time="2025-09-16T04:24:58.280133944Z" level=info msg="CreateContainer within sandbox \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\"" Sep 16 04:24:58.282560 containerd[1558]: time="2025-09-16T04:24:58.282002382Z" level=info msg="StartContainer for \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\"" Sep 16 04:24:58.284764 containerd[1558]: time="2025-09-16T04:24:58.284722060Z" level=info msg="connecting to shim 474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c" address="unix:///run/containerd/s/acea9b0227aff4c794c3a3dccf47154d1db3cdde7a5bae36e787a11dea56e094" protocol=ttrpc version=3 Sep 16 04:24:58.318131 systemd[1]: Started cri-containerd-474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c.scope - libcontainer container 474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c. Sep 16 04:24:58.332382 containerd[1558]: time="2025-09-16T04:24:58.331780105Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"b7d0b8606d8d3faa5fbf86896e093bbdb5691d9670051697762858b8b8f4d2d6\" pid:5398 exit_status:1 exited_at:{seconds:1757996698 nanos:329734626}" Sep 16 04:24:58.464789 containerd[1558]: time="2025-09-16T04:24:58.464746524Z" level=info msg="StartContainer for \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" returns successfully" Sep 16 04:24:58.769720 kubelet[2783]: I0916 04:24:58.769642 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5fb8d7647c-wmjlp" podStartSLOduration=25.246791174 podStartE2EDuration="40.769622972s" podCreationTimestamp="2025-09-16 04:24:18 +0000 UTC" firstStartedPulling="2025-09-16 04:24:42.291432725 +0000 UTC m=+48.660073747" lastFinishedPulling="2025-09-16 04:24:57.814264523 +0000 UTC m=+64.182905545" observedRunningTime="2025-09-16 04:24:58.226809144 +0000 UTC m=+64.595450166" watchObservedRunningTime="2025-09-16 04:24:58.769622972 +0000 UTC m=+65.138263994" Sep 16 04:24:58.836213 kubelet[2783]: I0916 04:24:58.836171 2783 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 16 04:24:58.837373 containerd[1558]: time="2025-09-16T04:24:58.837239041Z" level=info msg="StopContainer for \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" with timeout 30 (s)" Sep 16 04:24:58.839110 containerd[1558]: time="2025-09-16T04:24:58.839070600Z" level=info msg="Stop container \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" with signal terminated" Sep 16 04:24:58.964215 systemd[1]: cri-containerd-4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b.scope: Deactivated successfully. 
Sep 16 04:24:58.971307 containerd[1558]: time="2025-09-16T04:24:58.971261219Z" level=info msg="received exit event container_id:\"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" id:\"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" pid:5183 exit_status:1 exited_at:{seconds:1757996698 nanos:970803540}" Sep 16 04:24:58.971778 containerd[1558]: time="2025-09-16T04:24:58.971387299Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" id:\"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" pid:5183 exit_status:1 exited_at:{seconds:1757996698 nanos:970803540}" Sep 16 04:24:59.009803 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b-rootfs.mount: Deactivated successfully. Sep 16 04:24:59.063547 containerd[1558]: time="2025-09-16T04:24:59.063014010Z" level=info msg="StopContainer for \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" returns successfully" Sep 16 04:24:59.064092 containerd[1558]: time="2025-09-16T04:24:59.063768130Z" level=info msg="StopPodSandbox for \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\"" Sep 16 04:24:59.064092 containerd[1558]: time="2025-09-16T04:24:59.063872250Z" level=info msg="Container to stop \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 16 04:24:59.088874 systemd[1]: cri-containerd-8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff.scope: Deactivated successfully. Sep 16 04:24:59.094346 containerd[1558]: time="2025-09-16T04:24:59.094302637Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" id:\"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" pid:4708 exit_status:137 exited_at:{seconds:1757996699 nanos:93863557}" Sep 16 04:24:59.142602 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff-rootfs.mount: Deactivated successfully. Sep 16 04:24:59.144394 containerd[1558]: time="2025-09-16T04:24:59.144319296Z" level=info msg="shim disconnected" id=8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff namespace=k8s.io Sep 16 04:24:59.144394 containerd[1558]: time="2025-09-16T04:24:59.144359096Z" level=warning msg="cleaning up after shim disconnected" id=8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff namespace=k8s.io Sep 16 04:24:59.144394 containerd[1558]: time="2025-09-16T04:24:59.144393416Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 16 04:24:59.180230 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff-shm.mount: Deactivated successfully. 
Sep 16 04:24:59.194451 containerd[1558]: time="2025-09-16T04:24:59.194367715Z" level=info msg="received exit event sandbox_id:\"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" exit_status:137 exited_at:{seconds:1757996699 nanos:93863557}" Sep 16 04:24:59.234958 containerd[1558]: time="2025-09-16T04:24:59.234917378Z" level=info msg="StopContainer for \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" with timeout 30 (s)" Sep 16 04:24:59.241465 containerd[1558]: time="2025-09-16T04:24:59.241306495Z" level=info msg="Stop container \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" with signal terminated" Sep 16 04:24:59.256967 kubelet[2783]: I0916 04:24:59.255427 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6bd74b9558-ljjkj" podStartSLOduration=32.415549727 podStartE2EDuration="48.255411409s" podCreationTimestamp="2025-09-16 04:24:11 +0000 UTC" firstStartedPulling="2025-09-16 04:24:42.407847407 +0000 UTC m=+48.776488429" lastFinishedPulling="2025-09-16 04:24:58.247709089 +0000 UTC m=+64.616350111" observedRunningTime="2025-09-16 04:24:59.254727369 +0000 UTC m=+65.623368431" watchObservedRunningTime="2025-09-16 04:24:59.255411409 +0000 UTC m=+65.624052431" Sep 16 04:24:59.259986 kubelet[2783]: I0916 04:24:59.259291 2783 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:24:59.304416 systemd[1]: cri-containerd-474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c.scope: Deactivated successfully. Sep 16 04:24:59.310978 containerd[1558]: time="2025-09-16T04:24:59.310752385Z" level=info msg="received exit event container_id:\"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" id:\"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" pid:5417 exit_status:1 exited_at:{seconds:1757996699 nanos:309907066}" Sep 16 04:24:59.318314 containerd[1558]: time="2025-09-16T04:24:59.317080863Z" level=info msg="TaskExit event in podsandbox handler container_id:\"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" id:\"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" pid:5417 exit_status:1 exited_at:{seconds:1757996699 nanos:309907066}" Sep 16 04:24:59.321808 systemd-networkd[1418]: cali03b14a9e1b3: Link DOWN Sep 16 04:24:59.321814 systemd-networkd[1418]: cali03b14a9e1b3: Lost carrier Sep 16 04:24:59.400739 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c-rootfs.mount: Deactivated successfully. 
Sep 16 04:24:59.419215 containerd[1558]: time="2025-09-16T04:24:59.419168620Z" level=info msg="StopContainer for \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" returns successfully" Sep 16 04:24:59.420896 containerd[1558]: time="2025-09-16T04:24:59.420799099Z" level=info msg="StopPodSandbox for \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\"" Sep 16 04:24:59.420896 containerd[1558]: time="2025-09-16T04:24:59.420876539Z" level=info msg="Container to stop \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Sep 16 04:24:59.428082 containerd[1558]: time="2025-09-16T04:24:59.427177296Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"c684bea37c51683f5e9651742d5b310d9abff8260d7c5e867fec8597950cdf70\" pid:5544 exited_at:{seconds:1757996699 nanos:424755257}" Sep 16 04:24:59.432928 systemd[1]: cri-containerd-8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b.scope: Deactivated successfully. Sep 16 04:24:59.443296 containerd[1558]: time="2025-09-16T04:24:59.443199969Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" id:\"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" pid:4865 exit_status:137 exited_at:{seconds:1757996699 nanos:441668050}" Sep 16 04:24:59.489490 containerd[1558]: time="2025-09-16T04:24:59.489250710Z" level=info msg="shim disconnected" id=8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b namespace=k8s.io Sep 16 04:24:59.489490 containerd[1558]: time="2025-09-16T04:24:59.489284030Z" level=warning msg="cleaning up after shim disconnected" id=8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b namespace=k8s.io Sep 16 04:24:59.489490 containerd[1558]: time="2025-09-16T04:24:59.489311150Z" level=info msg="cleaning up dead shim" namespace=k8s.io Sep 16 04:24:59.518430 containerd[1558]: time="2025-09-16T04:24:59.518366378Z" level=info msg="received exit event sandbox_id:\"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" exit_status:137 exited_at:{seconds:1757996699 nanos:441668050}" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.317 [INFO][5517] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.317 [INFO][5517] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" iface="eth0" netns="/var/run/netns/cni-4b26f7ef-20e7-7047-c2fc-bcc5886b026e" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.320 [INFO][5517] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" iface="eth0" netns="/var/run/netns/cni-4b26f7ef-20e7-7047-c2fc-bcc5886b026e" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.332 [INFO][5517] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" after=15.413833ms iface="eth0" netns="/var/run/netns/cni-4b26f7ef-20e7-7047-c2fc-bcc5886b026e" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.332 [INFO][5517] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.332 [INFO][5517] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.447 [INFO][5560] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.449 [INFO][5560] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.449 [INFO][5560] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.597 [INFO][5560] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.597 [INFO][5560] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.607 [INFO][5560] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:24:59.616148 containerd[1558]: 2025-09-16 04:24:59.613 [INFO][5517] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:24:59.618619 containerd[1558]: time="2025-09-16T04:24:59.618497455Z" level=info msg="TearDown network for sandbox \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" successfully" Sep 16 04:24:59.618619 containerd[1558]: time="2025-09-16T04:24:59.618535295Z" level=info msg="StopPodSandbox for \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" returns successfully" Sep 16 04:24:59.639891 systemd-networkd[1418]: calibfe1fbbb672: Link DOWN Sep 16 04:24:59.639903 systemd-networkd[1418]: calibfe1fbbb672: Lost carrier Sep 16 04:24:59.719090 kubelet[2783]: I0916 04:24:59.719038 2783 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/60cfb0b5-050e-42a6-8c96-a4c5067b5655-calico-apiserver-certs\") pod \"60cfb0b5-050e-42a6-8c96-a4c5067b5655\" (UID: \"60cfb0b5-050e-42a6-8c96-a4c5067b5655\") " Sep 16 04:24:59.720174 kubelet[2783]: I0916 04:24:59.719807 2783 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7tkv\" (UniqueName: \"kubernetes.io/projected/60cfb0b5-050e-42a6-8c96-a4c5067b5655-kube-api-access-f7tkv\") pod \"60cfb0b5-050e-42a6-8c96-a4c5067b5655\" (UID: \"60cfb0b5-050e-42a6-8c96-a4c5067b5655\") " Sep 16 04:24:59.731964 kubelet[2783]: I0916 04:24:59.731891 2783 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60cfb0b5-050e-42a6-8c96-a4c5067b5655-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "60cfb0b5-050e-42a6-8c96-a4c5067b5655" (UID: "60cfb0b5-050e-42a6-8c96-a4c5067b5655"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 16 04:24:59.732863 kubelet[2783]: I0916 04:24:59.732803 2783 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cfb0b5-050e-42a6-8c96-a4c5067b5655-kube-api-access-f7tkv" (OuterVolumeSpecName: "kube-api-access-f7tkv") pod "60cfb0b5-050e-42a6-8c96-a4c5067b5655" (UID: "60cfb0b5-050e-42a6-8c96-a4c5067b5655"). InnerVolumeSpecName "kube-api-access-f7tkv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 16 04:24:59.766789 systemd[1]: Removed slice kubepods-besteffort-pod60cfb0b5_050e_42a6_8c96_a4c5067b5655.slice - libcontainer container kubepods-besteffort-pod60cfb0b5_050e_42a6_8c96_a4c5067b5655.slice. Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.637 [INFO][5627] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.638 [INFO][5627] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" iface="eth0" netns="/var/run/netns/cni-6aea365e-969f-89d0-07bb-fefa39e0f501" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.638 [INFO][5627] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" iface="eth0" netns="/var/run/netns/cni-6aea365e-969f-89d0-07bb-fefa39e0f501" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.651 [INFO][5627] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" after=12.693435ms iface="eth0" netns="/var/run/netns/cni-6aea365e-969f-89d0-07bb-fefa39e0f501" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.651 [INFO][5627] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.651 [INFO][5627] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.688 [INFO][5635] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.690 [INFO][5635] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.690 [INFO][5635] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.760 [INFO][5635] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.760 [INFO][5635] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.770 [INFO][5635] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:24:59.774954 containerd[1558]: 2025-09-16 04:24:59.772 [INFO][5627] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:24:59.778257 containerd[1558]: time="2025-09-16T04:24:59.778200708Z" level=info msg="TearDown network for sandbox \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" successfully" Sep 16 04:24:59.778257 containerd[1558]: time="2025-09-16T04:24:59.778248868Z" level=info msg="StopPodSandbox for \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" returns successfully" Sep 16 04:24:59.820629 kubelet[2783]: I0916 04:24:59.820103 2783 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2dw2\" (UniqueName: \"kubernetes.io/projected/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e-kube-api-access-d2dw2\") pod \"bf6d1753-4f29-4a36-94fb-7d6f48d24a1e\" (UID: \"bf6d1753-4f29-4a36-94fb-7d6f48d24a1e\") " Sep 16 04:24:59.822683 kubelet[2783]: I0916 04:24:59.822639 2783 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e-calico-apiserver-certs\") pod \"bf6d1753-4f29-4a36-94fb-7d6f48d24a1e\" (UID: \"bf6d1753-4f29-4a36-94fb-7d6f48d24a1e\") " Sep 16 04:24:59.824720 kubelet[2783]: I0916 04:24:59.824323 2783 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/60cfb0b5-050e-42a6-8c96-a4c5067b5655-calico-apiserver-certs\") on node \"ci-4459-0-0-n-21eb3e8385\" DevicePath \"\"" Sep 16 04:24:59.826264 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b-rootfs.mount: Deactivated successfully. Sep 16 04:24:59.826372 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b-shm.mount: Deactivated successfully. Sep 16 04:24:59.826436 systemd[1]: run-netns-cni\x2d6aea365e\x2d969f\x2d89d0\x2d07bb\x2dfefa39e0f501.mount: Deactivated successfully. Sep 16 04:24:59.826484 systemd[1]: run-netns-cni\x2d4b26f7ef\x2d20e7\x2d7047\x2dc2fc\x2dbcc5886b026e.mount: Deactivated successfully. Sep 16 04:24:59.826535 systemd[1]: var-lib-kubelet-pods-60cfb0b5\x2d050e\x2d42a6\x2d8c96\x2da4c5067b5655-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2df7tkv.mount: Deactivated successfully. Sep 16 04:24:59.826605 systemd[1]: var-lib-kubelet-pods-60cfb0b5\x2d050e\x2d42a6\x2d8c96\x2da4c5067b5655-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 16 04:24:59.828563 kubelet[2783]: I0916 04:24:59.828529 2783 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f7tkv\" (UniqueName: \"kubernetes.io/projected/60cfb0b5-050e-42a6-8c96-a4c5067b5655-kube-api-access-f7tkv\") on node \"ci-4459-0-0-n-21eb3e8385\" DevicePath \"\"" Sep 16 04:24:59.839477 kubelet[2783]: I0916 04:24:59.839386 2783 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "bf6d1753-4f29-4a36-94fb-7d6f48d24a1e" (UID: "bf6d1753-4f29-4a36-94fb-7d6f48d24a1e"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 16 04:24:59.840298 systemd[1]: var-lib-kubelet-pods-bf6d1753\x2d4f29\x2d4a36\x2d94fb\x2d7d6f48d24a1e-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dd2dw2.mount: Deactivated successfully. 
Sep 16 04:24:59.840454 systemd[1]: var-lib-kubelet-pods-bf6d1753\x2d4f29\x2d4a36\x2d94fb\x2d7d6f48d24a1e-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Sep 16 04:24:59.843795 kubelet[2783]: I0916 04:24:59.843723 2783 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e-kube-api-access-d2dw2" (OuterVolumeSpecName: "kube-api-access-d2dw2") pod "bf6d1753-4f29-4a36-94fb-7d6f48d24a1e" (UID: "bf6d1753-4f29-4a36-94fb-7d6f48d24a1e"). InnerVolumeSpecName "kube-api-access-d2dw2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 16 04:24:59.929043 kubelet[2783]: I0916 04:24:59.929009 2783 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e-calico-apiserver-certs\") on node \"ci-4459-0-0-n-21eb3e8385\" DevicePath \"\"" Sep 16 04:24:59.929244 kubelet[2783]: I0916 04:24:59.929220 2783 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d2dw2\" (UniqueName: \"kubernetes.io/projected/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e-kube-api-access-d2dw2\") on node \"ci-4459-0-0-n-21eb3e8385\" DevicePath \"\"" Sep 16 04:25:00.268230 kubelet[2783]: I0916 04:25:00.267926 2783 scope.go:117] "RemoveContainer" containerID="474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c" Sep 16 04:25:00.283792 systemd[1]: Removed slice kubepods-besteffort-podbf6d1753_4f29_4a36_94fb_7d6f48d24a1e.slice - libcontainer container kubepods-besteffort-podbf6d1753_4f29_4a36_94fb_7d6f48d24a1e.slice. Sep 16 04:25:00.297530 containerd[1558]: time="2025-09-16T04:25:00.297177945Z" level=info msg="RemoveContainer for \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\"" Sep 16 04:25:00.315395 containerd[1558]: time="2025-09-16T04:25:00.315296623Z" level=info msg="RemoveContainer for \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" returns successfully" Sep 16 04:25:00.317218 kubelet[2783]: I0916 04:25:00.316882 2783 scope.go:117] "RemoveContainer" containerID="474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c" Sep 16 04:25:00.317813 containerd[1558]: time="2025-09-16T04:25:00.317727183Z" level=error msg="ContainerStatus for \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\": not found" Sep 16 04:25:00.317991 kubelet[2783]: E0916 04:25:00.317888 2783 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\": not found" containerID="474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c" Sep 16 04:25:00.317991 kubelet[2783]: I0916 04:25:00.317925 2783 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c"} err="failed to get container status \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\": rpc error: code = NotFound desc = an error occurred when try to find container \"474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c\": not found" Sep 16 04:25:00.415139 containerd[1558]: time="2025-09-16T04:25:00.414427573Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:00.415139 containerd[1558]: time="2025-09-16T04:25:00.415100773Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 16 04:25:00.416440 containerd[1558]: time="2025-09-16T04:25:00.416413493Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:00.419220 containerd[1558]: time="2025-09-16T04:25:00.419174253Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 16 04:25:00.420693 containerd[1558]: time="2025-09-16T04:25:00.419704613Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 2.170242406s" Sep 16 04:25:00.420693 containerd[1558]: time="2025-09-16T04:25:00.420642933Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 16 04:25:00.426476 containerd[1558]: time="2025-09-16T04:25:00.426383652Z" level=info msg="CreateContainer within sandbox \"06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 16 04:25:00.446516 containerd[1558]: time="2025-09-16T04:25:00.444864850Z" level=info msg="Container 40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:25:00.458517 containerd[1558]: time="2025-09-16T04:25:00.458466889Z" level=info msg="CreateContainer within sandbox \"06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7\"" Sep 16 04:25:00.461071 containerd[1558]: time="2025-09-16T04:25:00.461004249Z" level=info msg="StartContainer for \"40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7\"" Sep 16 04:25:00.463349 containerd[1558]: time="2025-09-16T04:25:00.463293849Z" level=info msg="connecting to shim 40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7" address="unix:///run/containerd/s/a4907a4afeeea230949e61b0a90a7033c8c830cf22921f80abfd4cb25113e0d7" protocol=ttrpc version=3 Sep 16 04:25:00.500967 systemd[1]: Started cri-containerd-40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7.scope - libcontainer container 40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7. 
Sep 16 04:25:00.634321 containerd[1558]: time="2025-09-16T04:25:00.634194472Z" level=info msg="StartContainer for \"40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7\" returns successfully" Sep 16 04:25:00.877034 kubelet[2783]: I0916 04:25:00.876983 2783 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 16 04:25:00.883455 kubelet[2783]: I0916 04:25:00.883413 2783 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 16 04:25:01.311602 containerd[1558]: time="2025-09-16T04:25:01.311537464Z" level=info msg="TaskExit event in podsandbox handler exit_status:137 exited_at:{seconds:1757996699 nanos:441668050}" Sep 16 04:25:01.753403 kubelet[2783]: I0916 04:25:01.753355 2783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60cfb0b5-050e-42a6-8c96-a4c5067b5655" path="/var/lib/kubelet/pods/60cfb0b5-050e-42a6-8c96-a4c5067b5655/volumes" Sep 16 04:25:01.753849 kubelet[2783]: I0916 04:25:01.753737 2783 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf6d1753-4f29-4a36-94fb-7d6f48d24a1e" path="/var/lib/kubelet/pods/bf6d1753-4f29-4a36-94fb-7d6f48d24a1e/volumes" Sep 16 04:25:12.797552 containerd[1558]: time="2025-09-16T04:25:12.797500696Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"994e10999f639b3fcdc843da1e9b82d212be0eef21a89599de04a78e233e1c3b\" pid:5716 exited_at:{seconds:1757996712 nanos:797059094}" Sep 16 04:25:20.224551 containerd[1558]: time="2025-09-16T04:25:20.224437892Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"fb66407edfe224face6ae56c175f40a949fd03f21c31c25e9fb34b2d1f8b3b17\" pid:5743 exited_at:{seconds:1757996720 nanos:223913570}" Sep 16 04:25:20.259890 kubelet[2783]: I0916 04:25:20.259664 2783 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-c85z6" podStartSLOduration=44.363847881 podStartE2EDuration="1m3.259644736s" podCreationTimestamp="2025-09-16 04:24:17 +0000 UTC" firstStartedPulling="2025-09-16 04:24:41.525618278 +0000 UTC m=+47.894259260" lastFinishedPulling="2025-09-16 04:25:00.421415093 +0000 UTC m=+66.790056115" observedRunningTime="2025-09-16 04:25:01.29077778 +0000 UTC m=+67.659418802" watchObservedRunningTime="2025-09-16 04:25:20.259644736 +0000 UTC m=+86.628285838" Sep 16 04:25:20.936341 containerd[1558]: time="2025-09-16T04:25:20.936276524Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"2621d3c2f933a4bbfb6667542828ef6cc36af8c0299384a7b8b313adc0255e46\" pid:5768 exited_at:{seconds:1757996720 nanos:935967403}" Sep 16 04:25:29.324613 containerd[1558]: time="2025-09-16T04:25:29.324457585Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"7ced265fce712a508717a2073396ff1bc362049fb4610d393804827aa8a64bf7\" pid:5797 exited_at:{seconds:1757996729 nanos:323146617}" Sep 16 04:25:42.761320 containerd[1558]: time="2025-09-16T04:25:42.761142823Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" 
id:\"38e5e4ac518b8192a9dfeccb0cf05031edbf481e653a750c9e6caa790cc026ce\" pid:5822 exited_at:{seconds:1757996742 nanos:760550298}" Sep 16 04:25:50.218552 containerd[1558]: time="2025-09-16T04:25:50.218467229Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"b0a74c39c84783b4196da936f34ec42219e4898287d92c45f9ceec8ce02d3f6f\" pid:5846 exited_at:{seconds:1757996750 nanos:218052986}" Sep 16 04:25:53.768746 kubelet[2783]: I0916 04:25:53.768656 2783 scope.go:117] "RemoveContainer" containerID="4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b" Sep 16 04:25:53.772513 containerd[1558]: time="2025-09-16T04:25:53.772479349Z" level=info msg="RemoveContainer for \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\"" Sep 16 04:25:53.779343 containerd[1558]: time="2025-09-16T04:25:53.779301684Z" level=info msg="RemoveContainer for \"4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b\" returns successfully" Sep 16 04:25:53.781055 containerd[1558]: time="2025-09-16T04:25:53.781018218Z" level=info msg="StopPodSandbox for \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\"" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.827 [WARNING][5866] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.827 [INFO][5866] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.827 [INFO][5866] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" iface="eth0" netns="" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.827 [INFO][5866] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.827 [INFO][5866] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.857 [INFO][5873] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.857 [INFO][5873] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.857 [INFO][5873] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.877 [WARNING][5873] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.877 [INFO][5873] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.881 [INFO][5873] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:25:53.894054 containerd[1558]: 2025-09-16 04:25:53.887 [INFO][5866] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:25:53.894054 containerd[1558]: time="2025-09-16T04:25:53.894004057Z" level=info msg="TearDown network for sandbox \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" successfully" Sep 16 04:25:53.894054 containerd[1558]: time="2025-09-16T04:25:53.894031937Z" level=info msg="StopPodSandbox for \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" returns successfully" Sep 16 04:25:53.907454 containerd[1558]: time="2025-09-16T04:25:53.906660000Z" level=info msg="RemovePodSandbox for \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\"" Sep 16 04:25:53.907454 containerd[1558]: time="2025-09-16T04:25:53.906708960Z" level=info msg="Forcibly stopping sandbox \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\"" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.948 [WARNING][5887] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.948 [INFO][5887] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.948 [INFO][5887] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" iface="eth0" netns="" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.948 [INFO][5887] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.948 [INFO][5887] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.973 [INFO][5895] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.974 [INFO][5895] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.974 [INFO][5895] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.983 [WARNING][5895] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.983 [INFO][5895] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" HandleID="k8s-pod-network.8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--fz99l-eth0" Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.986 [INFO][5895] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:25:53.989998 containerd[1558]: 2025-09-16 04:25:53.987 [INFO][5887] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff" Sep 16 04:25:53.990565 containerd[1558]: time="2025-09-16T04:25:53.990039557Z" level=info msg="TearDown network for sandbox \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" successfully" Sep 16 04:25:53.992767 containerd[1558]: time="2025-09-16T04:25:53.992717619Z" level=info msg="Ensure that sandbox 8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff in task-service has been cleanup successfully" Sep 16 04:25:53.996264 containerd[1558]: time="2025-09-16T04:25:53.996223208Z" level=info msg="RemovePodSandbox \"8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff\" returns successfully" Sep 16 04:25:53.997796 containerd[1558]: time="2025-09-16T04:25:53.997765780Z" level=info msg="StopPodSandbox for \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\"" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.047 [WARNING][5909] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.047 [INFO][5909] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.047 [INFO][5909] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" iface="eth0" netns="" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.047 [INFO][5909] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.047 [INFO][5909] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.072 [INFO][5916] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.072 [INFO][5916] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.073 [INFO][5916] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.084 [WARNING][5916] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.084 [INFO][5916] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.086 [INFO][5916] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:25:54.090135 containerd[1558]: 2025-09-16 04:25:54.088 [INFO][5909] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:25:54.090135 containerd[1558]: time="2025-09-16T04:25:54.090020575Z" level=info msg="TearDown network for sandbox \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" successfully" Sep 16 04:25:54.090135 containerd[1558]: time="2025-09-16T04:25:54.090045136Z" level=info msg="StopPodSandbox for \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" returns successfully" Sep 16 04:25:54.094087 containerd[1558]: time="2025-09-16T04:25:54.090624220Z" level=info msg="RemovePodSandbox for \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\"" Sep 16 04:25:54.094087 containerd[1558]: time="2025-09-16T04:25:54.090654861Z" level=info msg="Forcibly stopping sandbox \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\"" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.137 [WARNING][5930] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" WorkloadEndpoint="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.137 [INFO][5930] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.137 [INFO][5930] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" iface="eth0" netns="" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.137 [INFO][5930] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.137 [INFO][5930] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.159 [INFO][5937] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.159 [INFO][5937] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.159 [INFO][5937] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.171 [WARNING][5937] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.171 [INFO][5937] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" HandleID="k8s-pod-network.8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Workload="ci--4459--0--0--n--21eb3e8385-k8s-calico--apiserver--6bd74b9558--ljjkj-eth0" Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.173 [INFO][5937] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 16 04:25:54.179415 containerd[1558]: 2025-09-16 04:25:54.176 [INFO][5930] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b" Sep 16 04:25:54.179861 containerd[1558]: time="2025-09-16T04:25:54.179490548Z" level=info msg="TearDown network for sandbox \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" successfully" Sep 16 04:25:54.182134 containerd[1558]: time="2025-09-16T04:25:54.182089489Z" level=info msg="Ensure that sandbox 8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b in task-service has been cleanup successfully" Sep 16 04:25:54.187217 containerd[1558]: time="2025-09-16T04:25:54.187151771Z" level=info msg="RemovePodSandbox \"8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b\" returns successfully" Sep 16 04:25:59.303346 containerd[1558]: time="2025-09-16T04:25:59.303246771Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"5d6c9d00255574794474c2a9ebb1b9ee57023c2176f76c9ea98b9d24242f518f\" pid:5955 exited_at:{seconds:1757996759 nanos:302829248}" Sep 16 04:26:05.589146 containerd[1558]: time="2025-09-16T04:26:05.589099972Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"e1982a2a9118696a784d64917a30a1f794a5e83a30537951353160940556f7df\" pid:5984 exited_at:{seconds:1757996765 nanos:588413726}" Sep 16 04:26:12.756251 containerd[1558]: time="2025-09-16T04:26:12.756206970Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"23fa4d11a46a15f80f122613f3b516c7fbd39e2da7406adbac230fac8e1124f1\" pid:6013 exited_at:{seconds:1757996772 nanos:755898807}" Sep 16 04:26:20.220450 containerd[1558]: time="2025-09-16T04:26:20.220407971Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"387416176a311f0836c8e8421da138dbef92043ec754f2c8afa93c417549d308\" pid:6053 exited_at:{seconds:1757996780 nanos:219848046}" Sep 16 04:26:20.790029 containerd[1558]: time="2025-09-16T04:26:20.789924216Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"03a8d44a6755e6f922f4f7f0128ba254049f2d977c5f2b2852e48df873ab6cb1\" pid:6076 exited_at:{seconds:1757996780 nanos:789188490}" Sep 16 04:26:29.290864 containerd[1558]: time="2025-09-16T04:26:29.290815442Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"eda7c6649d76c6f4e9c03b19f3f1615dc4f684178fe7ffd571259112e68f82d7\" pid:6101 exited_at:{seconds:1757996789 nanos:290130235}" Sep 16 04:26:32.790756 systemd[1]: Started sshd@7-138.201.119.17:22-139.178.89.65:57714.service - OpenSSH per-connection server daemon (139.178.89.65:57714). Sep 16 04:26:33.796621 sshd[6118]: Accepted publickey for core from 139.178.89.65 port 57714 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:26:33.799122 sshd-session[6118]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:33.806443 systemd-logind[1529]: New session 8 of user core. Sep 16 04:26:33.811813 systemd[1]: Started session-8.scope - Session 8 of User core. 
Sep 16 04:26:34.581700 sshd[6121]: Connection closed by 139.178.89.65 port 57714 Sep 16 04:26:34.582710 sshd-session[6118]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:34.589114 systemd-logind[1529]: Session 8 logged out. Waiting for processes to exit. Sep 16 04:26:34.590995 systemd[1]: sshd@7-138.201.119.17:22-139.178.89.65:57714.service: Deactivated successfully. Sep 16 04:26:34.594891 systemd[1]: session-8.scope: Deactivated successfully. Sep 16 04:26:34.597803 systemd-logind[1529]: Removed session 8. Sep 16 04:26:39.758436 systemd[1]: Started sshd@8-138.201.119.17:22-139.178.89.65:57716.service - OpenSSH per-connection server daemon (139.178.89.65:57716). Sep 16 04:26:40.773655 sshd[6133]: Accepted publickey for core from 139.178.89.65 port 57716 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:26:40.775223 sshd-session[6133]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:40.781568 systemd-logind[1529]: New session 9 of user core. Sep 16 04:26:40.790886 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 16 04:26:41.549417 sshd[6136]: Connection closed by 139.178.89.65 port 57716 Sep 16 04:26:41.550300 sshd-session[6133]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:41.557151 systemd[1]: sshd@8-138.201.119.17:22-139.178.89.65:57716.service: Deactivated successfully. Sep 16 04:26:41.560810 systemd[1]: session-9.scope: Deactivated successfully. Sep 16 04:26:41.562355 systemd-logind[1529]: Session 9 logged out. Waiting for processes to exit. Sep 16 04:26:41.564284 systemd-logind[1529]: Removed session 9. Sep 16 04:26:42.752629 containerd[1558]: time="2025-09-16T04:26:42.752500947Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"593f6858e7f74a58cd96bdfdd3afe2c10eff06a8a4bf0f84f307b376fa7344e8\" pid:6162 exited_at:{seconds:1757996802 nanos:751928422}" Sep 16 04:26:46.721176 systemd[1]: Started sshd@9-138.201.119.17:22-139.178.89.65:58580.service - OpenSSH per-connection server daemon (139.178.89.65:58580). Sep 16 04:26:47.725750 sshd[6172]: Accepted publickey for core from 139.178.89.65 port 58580 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:26:47.728171 sshd-session[6172]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:47.733250 systemd-logind[1529]: New session 10 of user core. Sep 16 04:26:47.739854 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 16 04:26:48.482485 sshd[6175]: Connection closed by 139.178.89.65 port 58580 Sep 16 04:26:48.483414 sshd-session[6172]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:48.489786 systemd[1]: sshd@9-138.201.119.17:22-139.178.89.65:58580.service: Deactivated successfully. Sep 16 04:26:48.494274 systemd[1]: session-10.scope: Deactivated successfully. Sep 16 04:26:48.496209 systemd-logind[1529]: Session 10 logged out. Waiting for processes to exit. Sep 16 04:26:48.498287 systemd-logind[1529]: Removed session 10. 
Sep 16 04:26:50.226158 containerd[1558]: time="2025-09-16T04:26:50.225885239Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"c3f2b416b484df3101fe2b783acddc8bac885714ed5f6fd54ca50f1e71be3d98\" pid:6201 exited_at:{seconds:1757996810 nanos:225264092}" Sep 16 04:26:53.659429 systemd[1]: Started sshd@10-138.201.119.17:22-139.178.89.65:59214.service - OpenSSH per-connection server daemon (139.178.89.65:59214). Sep 16 04:26:54.672977 sshd[6211]: Accepted publickey for core from 139.178.89.65 port 59214 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:26:54.674320 sshd-session[6211]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:26:54.680787 systemd-logind[1529]: New session 11 of user core. Sep 16 04:26:54.686918 systemd[1]: Started session-11.scope - Session 11 of User core. Sep 16 04:26:55.437248 sshd[6216]: Connection closed by 139.178.89.65 port 59214 Sep 16 04:26:55.437133 sshd-session[6211]: pam_unix(sshd:session): session closed for user core Sep 16 04:26:55.445482 systemd[1]: sshd@10-138.201.119.17:22-139.178.89.65:59214.service: Deactivated successfully. Sep 16 04:26:55.445877 systemd-logind[1529]: Session 11 logged out. Waiting for processes to exit. Sep 16 04:26:55.448571 systemd[1]: session-11.scope: Deactivated successfully. Sep 16 04:26:55.452366 systemd-logind[1529]: Removed session 11. Sep 16 04:26:59.301673 containerd[1558]: time="2025-09-16T04:26:59.301469323Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"3e7f05dceb6c67fb0efe05227f55cd01c6bf215128349004d363fe5a994c82d8\" pid:6240 exited_at:{seconds:1757996819 nanos:300544647}" Sep 16 04:27:00.608698 systemd[1]: Started sshd@11-138.201.119.17:22-139.178.89.65:52276.service - OpenSSH per-connection server daemon (139.178.89.65:52276). Sep 16 04:27:01.606302 sshd[6250]: Accepted publickey for core from 139.178.89.65 port 52276 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:01.608354 sshd-session[6250]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:01.614552 systemd-logind[1529]: New session 12 of user core. Sep 16 04:27:01.624921 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 16 04:27:02.369715 sshd[6255]: Connection closed by 139.178.89.65 port 52276 Sep 16 04:27:02.370311 sshd-session[6250]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:02.376343 systemd[1]: sshd@11-138.201.119.17:22-139.178.89.65:52276.service: Deactivated successfully. Sep 16 04:27:02.379522 systemd[1]: session-12.scope: Deactivated successfully. Sep 16 04:27:02.383476 systemd-logind[1529]: Session 12 logged out. Waiting for processes to exit. Sep 16 04:27:02.385441 systemd-logind[1529]: Removed session 12. Sep 16 04:27:05.700415 containerd[1558]: time="2025-09-16T04:27:05.700313628Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"65190c062454c8d207e57ed17506c5105f7ca4637bc81832975e9aa0231071b3\" pid:6279 exited_at:{seconds:1757996825 nanos:699958535}" Sep 16 04:27:07.552874 systemd[1]: Started sshd@12-138.201.119.17:22-139.178.89.65:52290.service - OpenSSH per-connection server daemon (139.178.89.65:52290). 
Sep 16 04:27:08.618670 sshd[6289]: Accepted publickey for core from 139.178.89.65 port 52290 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:08.621717 sshd-session[6289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:08.628937 systemd-logind[1529]: New session 13 of user core. Sep 16 04:27:08.634845 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 16 04:27:09.437621 sshd[6292]: Connection closed by 139.178.89.65 port 52290 Sep 16 04:27:09.436631 sshd-session[6289]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:09.441065 systemd-logind[1529]: Session 13 logged out. Waiting for processes to exit. Sep 16 04:27:09.442378 systemd[1]: sshd@12-138.201.119.17:22-139.178.89.65:52290.service: Deactivated successfully. Sep 16 04:27:09.444774 systemd[1]: session-13.scope: Deactivated successfully. Sep 16 04:27:09.447912 systemd-logind[1529]: Removed session 13. Sep 16 04:27:12.758360 containerd[1558]: time="2025-09-16T04:27:12.758312222Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"50887a61c0c7b6aa9c5309b242f6853272836516a15754414925169c48e0debd\" pid:6317 exited_at:{seconds:1757996832 nanos:758008092}" Sep 16 04:27:14.610255 systemd[1]: Started sshd@13-138.201.119.17:22-139.178.89.65:48618.service - OpenSSH per-connection server daemon (139.178.89.65:48618). Sep 16 04:27:15.594318 sshd[6330]: Accepted publickey for core from 139.178.89.65 port 48618 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:15.596419 sshd-session[6330]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:15.601758 systemd-logind[1529]: New session 14 of user core. Sep 16 04:27:15.614114 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 16 04:27:16.345260 sshd[6333]: Connection closed by 139.178.89.65 port 48618 Sep 16 04:27:16.346324 sshd-session[6330]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:16.353326 systemd[1]: sshd@13-138.201.119.17:22-139.178.89.65:48618.service: Deactivated successfully. Sep 16 04:27:16.356503 systemd[1]: session-14.scope: Deactivated successfully. Sep 16 04:27:16.360362 systemd-logind[1529]: Session 14 logged out. Waiting for processes to exit. Sep 16 04:27:16.361637 systemd-logind[1529]: Removed session 14. Sep 16 04:27:20.221175 containerd[1558]: time="2025-09-16T04:27:20.221128134Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"f13b269260c15d88ac53b8ac13bd012c232478fb6c8480baf8d91663b84dc2a9\" pid:6358 exited_at:{seconds:1757996840 nanos:220692400}" Sep 16 04:27:20.786165 containerd[1558]: time="2025-09-16T04:27:20.786105812Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"9f5b88902e6b2b4d066767640b2e912559a03177dd0b8aaadd09c3a51f981da5\" pid:6381 exited_at:{seconds:1757996840 nanos:785266785}" Sep 16 04:27:21.527788 systemd[1]: Started sshd@14-138.201.119.17:22-139.178.89.65:56352.service - OpenSSH per-connection server daemon (139.178.89.65:56352). 
Sep 16 04:27:22.533556 sshd[6398]: Accepted publickey for core from 139.178.89.65 port 56352 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:22.536427 sshd-session[6398]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:22.542189 systemd-logind[1529]: New session 15 of user core. Sep 16 04:27:22.547890 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 16 04:27:23.300653 sshd[6401]: Connection closed by 139.178.89.65 port 56352 Sep 16 04:27:23.301290 sshd-session[6398]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:23.306505 systemd[1]: sshd@14-138.201.119.17:22-139.178.89.65:56352.service: Deactivated successfully. Sep 16 04:27:23.309149 systemd[1]: session-15.scope: Deactivated successfully. Sep 16 04:27:23.311719 systemd-logind[1529]: Session 15 logged out. Waiting for processes to exit. Sep 16 04:27:23.314051 systemd-logind[1529]: Removed session 15. Sep 16 04:27:28.479196 systemd[1]: Started sshd@15-138.201.119.17:22-139.178.89.65:56354.service - OpenSSH per-connection server daemon (139.178.89.65:56354). Sep 16 04:27:29.298149 containerd[1558]: time="2025-09-16T04:27:29.298103556Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"9415b0094a344bb4e95f92a6a831944631304536e92d21e6939099fc43acc421\" pid:6429 exited_at:{seconds:1757996849 nanos:297830788}" Sep 16 04:27:29.483561 sshd[6414]: Accepted publickey for core from 139.178.89.65 port 56354 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:29.486228 sshd-session[6414]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:29.494193 systemd-logind[1529]: New session 16 of user core. Sep 16 04:27:29.498813 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 16 04:27:30.256452 sshd[6438]: Connection closed by 139.178.89.65 port 56354 Sep 16 04:27:30.257482 sshd-session[6414]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:30.262611 systemd-logind[1529]: Session 16 logged out. Waiting for processes to exit. Sep 16 04:27:30.262716 systemd[1]: sshd@15-138.201.119.17:22-139.178.89.65:56354.service: Deactivated successfully. Sep 16 04:27:30.265782 systemd[1]: session-16.scope: Deactivated successfully. Sep 16 04:27:30.271191 systemd-logind[1529]: Removed session 16. Sep 16 04:27:35.432188 systemd[1]: Started sshd@16-138.201.119.17:22-139.178.89.65:42042.service - OpenSSH per-connection server daemon (139.178.89.65:42042). Sep 16 04:27:36.443413 sshd[6454]: Accepted publickey for core from 139.178.89.65 port 42042 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:36.445544 sshd-session[6454]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:36.454806 systemd-logind[1529]: New session 17 of user core. Sep 16 04:27:36.459812 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 16 04:27:37.234366 sshd[6460]: Connection closed by 139.178.89.65 port 42042 Sep 16 04:27:37.235209 sshd-session[6454]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:37.242287 systemd[1]: sshd@16-138.201.119.17:22-139.178.89.65:42042.service: Deactivated successfully. Sep 16 04:27:37.247379 systemd[1]: session-17.scope: Deactivated successfully. Sep 16 04:27:37.250300 systemd-logind[1529]: Session 17 logged out. Waiting for processes to exit. 
Sep 16 04:27:37.253995 systemd-logind[1529]: Removed session 17. Sep 16 04:27:42.411734 systemd[1]: Started sshd@17-138.201.119.17:22-139.178.89.65:44986.service - OpenSSH per-connection server daemon (139.178.89.65:44986). Sep 16 04:27:42.757130 containerd[1558]: time="2025-09-16T04:27:42.756820207Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"2363f8d71d6b8dbc9b3c55ea4371f68eb0dfa4654d792ea9b4fce4375255849b\" pid:6492 exited_at:{seconds:1757996862 nanos:755777900}" Sep 16 04:27:43.437535 sshd[6476]: Accepted publickey for core from 139.178.89.65 port 44986 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:43.440799 sshd-session[6476]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:43.448647 systemd-logind[1529]: New session 18 of user core. Sep 16 04:27:43.458947 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 16 04:27:44.199251 sshd[6503]: Connection closed by 139.178.89.65 port 44986 Sep 16 04:27:44.200358 sshd-session[6476]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:44.206825 systemd-logind[1529]: Session 18 logged out. Waiting for processes to exit. Sep 16 04:27:44.207045 systemd[1]: sshd@17-138.201.119.17:22-139.178.89.65:44986.service: Deactivated successfully. Sep 16 04:27:44.209920 systemd[1]: session-18.scope: Deactivated successfully. Sep 16 04:27:44.215009 systemd-logind[1529]: Removed session 18. Sep 16 04:27:49.373911 systemd[1]: Started sshd@18-138.201.119.17:22-139.178.89.65:44996.service - OpenSSH per-connection server daemon (139.178.89.65:44996). Sep 16 04:27:50.220540 containerd[1558]: time="2025-09-16T04:27:50.220489801Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"ff29f13aa0cd09227e8c25ede50a56ec5473298663870b9a5e0b0afca90c6ecc\" pid:6538 exited_at:{seconds:1757996870 nanos:219884626}" Sep 16 04:27:50.365780 sshd[6522]: Accepted publickey for core from 139.178.89.65 port 44996 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:50.367519 sshd-session[6522]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:50.375981 systemd-logind[1529]: New session 19 of user core. Sep 16 04:27:50.382870 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 16 04:27:51.130193 sshd[6548]: Connection closed by 139.178.89.65 port 44996 Sep 16 04:27:51.130835 sshd-session[6522]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:51.137934 systemd[1]: sshd@18-138.201.119.17:22-139.178.89.65:44996.service: Deactivated successfully. Sep 16 04:27:51.140977 systemd[1]: session-19.scope: Deactivated successfully. Sep 16 04:27:51.143518 systemd-logind[1529]: Session 19 logged out. Waiting for processes to exit. Sep 16 04:27:51.145463 systemd-logind[1529]: Removed session 19. Sep 16 04:27:56.316785 systemd[1]: Started sshd@19-138.201.119.17:22-139.178.89.65:46240.service - OpenSSH per-connection server daemon (139.178.89.65:46240). Sep 16 04:27:57.375644 sshd[6576]: Accepted publickey for core from 139.178.89.65 port 46240 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:27:57.376738 sshd-session[6576]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:27:57.381723 systemd-logind[1529]: New session 20 of user core. 
Sep 16 04:27:57.391942 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 16 04:27:58.172758 sshd[6579]: Connection closed by 139.178.89.65 port 46240 Sep 16 04:27:58.173578 sshd-session[6576]: pam_unix(sshd:session): session closed for user core Sep 16 04:27:58.178979 systemd-logind[1529]: Session 20 logged out. Waiting for processes to exit. Sep 16 04:27:58.179200 systemd[1]: sshd@19-138.201.119.17:22-139.178.89.65:46240.service: Deactivated successfully. Sep 16 04:27:58.182063 systemd[1]: session-20.scope: Deactivated successfully. Sep 16 04:27:58.187649 systemd-logind[1529]: Removed session 20. Sep 16 04:27:59.295574 containerd[1558]: time="2025-09-16T04:27:59.295528147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"bdb9473616ac71496ed1b3244b2427742035f49c82e2a6db905a5398381f1ea2\" pid:6603 exited_at:{seconds:1757996879 nanos:294986174}" Sep 16 04:28:03.351896 systemd[1]: Started sshd@20-138.201.119.17:22-139.178.89.65:39540.service - OpenSSH per-connection server daemon (139.178.89.65:39540). Sep 16 04:28:04.367303 sshd[6615]: Accepted publickey for core from 139.178.89.65 port 39540 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:04.369023 sshd-session[6615]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:28:04.379651 systemd-logind[1529]: New session 21 of user core. Sep 16 04:28:04.385930 systemd[1]: Started session-21.scope - Session 21 of User core. Sep 16 04:28:05.145701 sshd[6618]: Connection closed by 139.178.89.65 port 39540 Sep 16 04:28:05.146661 sshd-session[6615]: pam_unix(sshd:session): session closed for user core Sep 16 04:28:05.153259 systemd[1]: sshd@20-138.201.119.17:22-139.178.89.65:39540.service: Deactivated successfully. Sep 16 04:28:05.157577 systemd[1]: session-21.scope: Deactivated successfully. Sep 16 04:28:05.159876 systemd-logind[1529]: Session 21 logged out. Waiting for processes to exit. Sep 16 04:28:05.161388 systemd-logind[1529]: Removed session 21. Sep 16 04:28:05.594261 containerd[1558]: time="2025-09-16T04:28:05.593929927Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"cc8d08948ca2eb61fee9b7e83ee7b177f17dc7fcf12ce218dcda14aade9d1360\" pid:6643 exited_at:{seconds:1757996885 nanos:593615040}" Sep 16 04:28:10.319484 systemd[1]: Started sshd@21-138.201.119.17:22-139.178.89.65:51198.service - OpenSSH per-connection server daemon (139.178.89.65:51198). Sep 16 04:28:11.319339 sshd[6653]: Accepted publickey for core from 139.178.89.65 port 51198 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:11.321388 sshd-session[6653]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:28:11.328720 systemd-logind[1529]: New session 22 of user core. Sep 16 04:28:11.331813 systemd[1]: Started session-22.scope - Session 22 of User core. Sep 16 04:28:12.080612 sshd[6656]: Connection closed by 139.178.89.65 port 51198 Sep 16 04:28:12.079993 sshd-session[6653]: pam_unix(sshd:session): session closed for user core Sep 16 04:28:12.087314 systemd[1]: sshd@21-138.201.119.17:22-139.178.89.65:51198.service: Deactivated successfully. Sep 16 04:28:12.088243 systemd-logind[1529]: Session 22 logged out. Waiting for processes to exit. Sep 16 04:28:12.090899 systemd[1]: session-22.scope: Deactivated successfully. 
Sep 16 04:28:12.094468 systemd-logind[1529]: Removed session 22. Sep 16 04:28:12.750313 containerd[1558]: time="2025-09-16T04:28:12.750269488Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"61f8d7a5acc09e759cd622537be9793a00fe4dbd491149bba287cfad65ad44ea\" pid:6679 exited_at:{seconds:1757996892 nanos:749459671}" Sep 16 04:28:17.256473 systemd[1]: Started sshd@22-138.201.119.17:22-139.178.89.65:51208.service - OpenSSH per-connection server daemon (139.178.89.65:51208). Sep 16 04:28:18.201224 update_engine[1531]: I20250916 04:28:18.201099 1531 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Sep 16 04:28:18.202737 update_engine[1531]: I20250916 04:28:18.201195 1531 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Sep 16 04:28:18.203328 update_engine[1531]: I20250916 04:28:18.202883 1531 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Sep 16 04:28:18.203849 update_engine[1531]: I20250916 04:28:18.203816 1531 omaha_request_params.cc:62] Current group set to developer Sep 16 04:28:18.206429 update_engine[1531]: I20250916 04:28:18.206378 1531 update_attempter.cc:499] Already updated boot flags. Skipping. Sep 16 04:28:18.207199 update_engine[1531]: I20250916 04:28:18.206573 1531 update_attempter.cc:643] Scheduling an action processor start. Sep 16 04:28:18.207199 update_engine[1531]: I20250916 04:28:18.206669 1531 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 16 04:28:18.210772 update_engine[1531]: I20250916 04:28:18.210742 1531 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 16 04:28:18.210970 update_engine[1531]: I20250916 04:28:18.210953 1531 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 16 04:28:18.211238 update_engine[1531]: I20250916 04:28:18.211216 1531 omaha_request_action.cc:272] Request: Sep 16 04:28:18.211238 update_engine[1531]: Sep 16 04:28:18.211238 update_engine[1531]: Sep 16 04:28:18.211238 update_engine[1531]: Sep 16 04:28:18.211238 update_engine[1531]: Sep 16 04:28:18.211238 update_engine[1531]: Sep 16 04:28:18.211238 update_engine[1531]: Sep 16 04:28:18.211238 update_engine[1531]: Sep 16 04:28:18.211238 update_engine[1531]: Sep 16 04:28:18.212653 update_engine[1531]: I20250916 04:28:18.211608 1531 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 04:28:18.215255 update_engine[1531]: I20250916 04:28:18.214969 1531 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 04:28:18.216434 update_engine[1531]: I20250916 04:28:18.216404 1531 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 16 04:28:18.217381 update_engine[1531]: E20250916 04:28:18.217350 1531 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 04:28:18.217516 update_engine[1531]: I20250916 04:28:18.217497 1531 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 16 04:28:18.220361 locksmithd[1582]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 16 04:28:18.273115 sshd[6692]: Accepted publickey for core from 139.178.89.65 port 51208 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:18.274975 sshd-session[6692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:28:18.280189 systemd-logind[1529]: New session 23 of user core. Sep 16 04:28:18.286897 systemd[1]: Started session-23.scope - Session 23 of User core. Sep 16 04:28:19.046104 sshd[6695]: Connection closed by 139.178.89.65 port 51208 Sep 16 04:28:19.048881 sshd-session[6692]: pam_unix(sshd:session): session closed for user core Sep 16 04:28:19.056291 systemd-logind[1529]: Session 23 logged out. Waiting for processes to exit. Sep 16 04:28:19.057303 systemd[1]: sshd@22-138.201.119.17:22-139.178.89.65:51208.service: Deactivated successfully. Sep 16 04:28:19.060001 systemd[1]: session-23.scope: Deactivated successfully. Sep 16 04:28:19.063187 systemd-logind[1529]: Removed session 23. Sep 16 04:28:20.227484 containerd[1558]: time="2025-09-16T04:28:20.227287277Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"97a2f6d89b825e51468fab307c5bd45b119a7467b491ddceee429abce592bb4a\" pid:6718 exited_at:{seconds:1757996900 nanos:226946830}" Sep 16 04:28:20.788768 containerd[1558]: time="2025-09-16T04:28:20.788720795Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"b1f2bf6b26f01264f8ddcba4a7d3088bdd21ad015c800cb991cb63f09d69163c\" pid:6740 exited_at:{seconds:1757996900 nanos:788103942}" Sep 16 04:28:24.223229 systemd[1]: Started sshd@23-138.201.119.17:22-139.178.89.65:35250.service - OpenSSH per-connection server daemon (139.178.89.65:35250). Sep 16 04:28:25.226094 sshd[6751]: Accepted publickey for core from 139.178.89.65 port 35250 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:25.229008 sshd-session[6751]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:28:25.235668 systemd-logind[1529]: New session 24 of user core. Sep 16 04:28:25.245975 systemd[1]: Started session-24.scope - Session 24 of User core. Sep 16 04:28:25.986897 sshd[6754]: Connection closed by 139.178.89.65 port 35250 Sep 16 04:28:25.986792 sshd-session[6751]: pam_unix(sshd:session): session closed for user core Sep 16 04:28:25.992117 systemd-logind[1529]: Session 24 logged out. Waiting for processes to exit. Sep 16 04:28:25.992779 systemd[1]: sshd@23-138.201.119.17:22-139.178.89.65:35250.service: Deactivated successfully. Sep 16 04:28:25.997241 systemd[1]: session-24.scope: Deactivated successfully. Sep 16 04:28:26.000570 systemd-logind[1529]: Removed session 24. 
Sep 16 04:28:28.197736 update_engine[1531]: I20250916 04:28:28.197557 1531 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 04:28:28.199269 update_engine[1531]: I20250916 04:28:28.198400 1531 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 04:28:28.199808 update_engine[1531]: I20250916 04:28:28.199558 1531 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 04:28:28.199947 update_engine[1531]: E20250916 04:28:28.199915 1531 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 04:28:28.200046 update_engine[1531]: I20250916 04:28:28.200008 1531 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 16 04:28:29.298666 containerd[1558]: time="2025-09-16T04:28:29.298568617Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"cd279878e8726e572df07b31fb2311888fc36b74c3034be5f9444f67b31a4ba4\" pid:6777 exited_at:{seconds:1757996909 nanos:297328912}" Sep 16 04:28:31.164623 systemd[1]: Started sshd@24-138.201.119.17:22-139.178.89.65:49836.service - OpenSSH per-connection server daemon (139.178.89.65:49836). Sep 16 04:28:32.167650 sshd[6788]: Accepted publickey for core from 139.178.89.65 port 49836 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:32.169498 sshd-session[6788]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:28:32.175033 systemd-logind[1529]: New session 25 of user core. Sep 16 04:28:32.184244 systemd[1]: Started session-25.scope - Session 25 of User core. Sep 16 04:28:32.935652 sshd[6793]: Connection closed by 139.178.89.65 port 49836 Sep 16 04:28:32.935357 sshd-session[6788]: pam_unix(sshd:session): session closed for user core Sep 16 04:28:32.941348 systemd[1]: sshd@24-138.201.119.17:22-139.178.89.65:49836.service: Deactivated successfully. Sep 16 04:28:32.945080 systemd[1]: session-25.scope: Deactivated successfully. Sep 16 04:28:32.946940 systemd-logind[1529]: Session 25 logged out. Waiting for processes to exit. Sep 16 04:28:32.949115 systemd-logind[1529]: Removed session 25. Sep 16 04:28:38.109268 systemd[1]: Started sshd@25-138.201.119.17:22-139.178.89.65:49852.service - OpenSSH per-connection server daemon (139.178.89.65:49852). Sep 16 04:28:38.200102 update_engine[1531]: I20250916 04:28:38.199970 1531 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 04:28:38.200823 update_engine[1531]: I20250916 04:28:38.200747 1531 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 04:28:38.201559 update_engine[1531]: I20250916 04:28:38.201494 1531 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 16 04:28:38.202019 update_engine[1531]: E20250916 04:28:38.201964 1531 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 04:28:38.202084 update_engine[1531]: I20250916 04:28:38.202062 1531 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 16 04:28:39.127165 sshd[6806]: Accepted publickey for core from 139.178.89.65 port 49852 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:39.129934 sshd-session[6806]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:28:39.135879 systemd-logind[1529]: New session 26 of user core. Sep 16 04:28:39.141920 systemd[1]: Started session-26.scope - Session 26 of User core. 
Sep 16 04:28:39.947735 sshd[6809]: Connection closed by 139.178.89.65 port 49852 Sep 16 04:28:39.948142 sshd-session[6806]: pam_unix(sshd:session): session closed for user core Sep 16 04:28:39.953906 systemd-logind[1529]: Session 26 logged out. Waiting for processes to exit. Sep 16 04:28:39.955055 systemd[1]: sshd@25-138.201.119.17:22-139.178.89.65:49852.service: Deactivated successfully. Sep 16 04:28:39.962388 systemd[1]: session-26.scope: Deactivated successfully. Sep 16 04:28:39.968019 systemd-logind[1529]: Removed session 26. Sep 16 04:28:42.757496 containerd[1558]: time="2025-09-16T04:28:42.757442916Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"a67713ea778186bf1381ead2a28b5c8913c8c34649595c0de0636639e2a254a3\" pid:6836 exited_at:{seconds:1757996922 nanos:756568980}" Sep 16 04:28:45.126252 systemd[1]: Started sshd@26-138.201.119.17:22-139.178.89.65:44920.service - OpenSSH per-connection server daemon (139.178.89.65:44920). Sep 16 04:28:46.126939 sshd[6848]: Accepted publickey for core from 139.178.89.65 port 44920 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:46.129007 sshd-session[6848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:28:46.135466 systemd-logind[1529]: New session 27 of user core. Sep 16 04:28:46.140820 systemd[1]: Started session-27.scope - Session 27 of User core. Sep 16 04:28:46.880429 sshd[6851]: Connection closed by 139.178.89.65 port 44920 Sep 16 04:28:46.880980 sshd-session[6848]: pam_unix(sshd:session): session closed for user core Sep 16 04:28:46.886730 systemd[1]: sshd@26-138.201.119.17:22-139.178.89.65:44920.service: Deactivated successfully. Sep 16 04:28:46.891158 systemd[1]: session-27.scope: Deactivated successfully. Sep 16 04:28:46.892922 systemd-logind[1529]: Session 27 logged out. Waiting for processes to exit. Sep 16 04:28:46.894555 systemd-logind[1529]: Removed session 27. 
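The pattern above repeats throughout this section: sshd accepts a publickey login for user core from 139.178.89.65, systemd-logind opens session N, and roughly a second later the client disconnects and the session is removed. A minimal sketch for pairing those open/close entries into session durations, assuming only the systemd-logind message formats visible in this log (the timestamps carry no year, so one is supplied as an argument):

```python
import re
from datetime import datetime

# Matches the systemd-logind entries seen in this log, e.g.
#   Sep 16 04:28:39.135879 systemd-logind[1529]: New session 26 of user core.
#   Sep 16 04:28:39.968019 systemd-logind[1529]: Removed session 26.
LOGIND = re.compile(
    r"(?P<ts>\w{3} \d{1,2} \d{2}:\d{2}:\d{2}\.\d{6}) systemd-logind\[\d+\]: "
    r"(?:New session (?P<new>\d+) of user \w+\.|Removed session (?P<gone>\d+)\.)"
)

def session_durations(lines, year=2025):
    """Yield (session_id, seconds) for each open/close pair in the log."""
    opened = {}
    for line in lines:
        for m in LOGIND.finditer(line):  # several entries share one physical line here
            ts = datetime.strptime(f"{year} {m['ts']}", "%Y %b %d %H:%M:%S.%f")
            if m["new"]:
                opened[m["new"]] = ts
            elif m["gone"] in opened:
                yield m["gone"], (ts - opened.pop(m["gone"])).total_seconds()
```

Run over this excerpt it would show each session lasting under a second; session 26, for example, opens at 04:28:39.135879 and is removed at 04:28:39.968019.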
Sep 16 04:28:47.746761 containerd[1558]: time="2025-09-16T04:28:47.746637870Z" level=warning msg="container event discarded" container=25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16 type=CONTAINER_CREATED_EVENT Sep 16 04:28:47.758056 containerd[1558]: time="2025-09-16T04:28:47.757923877Z" level=warning msg="container event discarded" container=25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16 type=CONTAINER_STARTED_EVENT Sep 16 04:28:47.816364 containerd[1558]: time="2025-09-16T04:28:47.816244547Z" level=warning msg="container event discarded" container=ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3 type=CONTAINER_CREATED_EVENT Sep 16 04:28:47.816364 containerd[1558]: time="2025-09-16T04:28:47.816330748Z" level=warning msg="container event discarded" container=35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a type=CONTAINER_CREATED_EVENT Sep 16 04:28:47.816364 containerd[1558]: time="2025-09-16T04:28:47.816370509Z" level=warning msg="container event discarded" container=35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a type=CONTAINER_STARTED_EVENT Sep 16 04:28:47.828957 containerd[1558]: time="2025-09-16T04:28:47.828900779Z" level=warning msg="container event discarded" container=c41d732404fb27be368f03904e6fc1d598d68e0f76f231d0bbdc764cc7014ccd type=CONTAINER_CREATED_EVENT Sep 16 04:28:47.829187 containerd[1558]: time="2025-09-16T04:28:47.829145383Z" level=warning msg="container event discarded" container=c41d732404fb27be368f03904e6fc1d598d68e0f76f231d0bbdc764cc7014ccd type=CONTAINER_STARTED_EVENT Sep 16 04:28:47.854512 containerd[1558]: time="2025-09-16T04:28:47.854379446Z" level=warning msg="container event discarded" container=d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e type=CONTAINER_CREATED_EVENT Sep 16 04:28:47.854512 containerd[1558]: time="2025-09-16T04:28:47.854454367Z" level=warning msg="container event discarded" container=f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381 type=CONTAINER_CREATED_EVENT Sep 16 04:28:47.940031 containerd[1558]: time="2025-09-16T04:28:47.939857893Z" level=warning msg="container event discarded" container=ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3 type=CONTAINER_STARTED_EVENT Sep 16 04:28:47.994455 containerd[1558]: time="2025-09-16T04:28:47.994375892Z" level=warning msg="container event discarded" container=d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e type=CONTAINER_STARTED_EVENT Sep 16 04:28:47.994736 containerd[1558]: time="2025-09-16T04:28:47.994560136Z" level=warning msg="container event discarded" container=f847b5840d56e10162db69c0c939205a09af2a0a7433acb7962de50e0f991381 type=CONTAINER_STARTED_EVENT Sep 16 04:28:48.197126 update_engine[1531]: I20250916 04:28:48.197000 1531 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 04:28:48.197695 update_engine[1531]: I20250916 04:28:48.197149 1531 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 04:28:48.197897 update_engine[1531]: I20250916 04:28:48.197832 1531 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 16 04:28:48.198481 update_engine[1531]: E20250916 04:28:48.198318 1531 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 04:28:48.198571 update_engine[1531]: I20250916 04:28:48.198474 1531 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 16 04:28:48.198571 update_engine[1531]: I20250916 04:28:48.198501 1531 omaha_request_action.cc:617] Omaha request response: Sep 16 04:28:48.198707 update_engine[1531]: E20250916 04:28:48.198675 1531 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 16 04:28:48.198751 update_engine[1531]: I20250916 04:28:48.198709 1531 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 16 04:28:48.198751 update_engine[1531]: I20250916 04:28:48.198719 1531 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 04:28:48.198751 update_engine[1531]: I20250916 04:28:48.198726 1531 update_attempter.cc:306] Processing Done. Sep 16 04:28:48.198751 update_engine[1531]: E20250916 04:28:48.198745 1531 update_attempter.cc:619] Update failed. Sep 16 04:28:48.198859 update_engine[1531]: I20250916 04:28:48.198755 1531 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 16 04:28:48.198859 update_engine[1531]: I20250916 04:28:48.198763 1531 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 16 04:28:48.198859 update_engine[1531]: I20250916 04:28:48.198773 1531 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Sep 16 04:28:48.199223 update_engine[1531]: I20250916 04:28:48.198886 1531 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 16 04:28:48.199223 update_engine[1531]: I20250916 04:28:48.198925 1531 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 16 04:28:48.199223 update_engine[1531]: I20250916 04:28:48.198935 1531 omaha_request_action.cc:272] Request: Sep 16 04:28:48.199223 update_engine[1531]: Sep 16 04:28:48.199223 update_engine[1531]: Sep 16 04:28:48.199223 update_engine[1531]: Sep 16 04:28:48.199223 update_engine[1531]: Sep 16 04:28:48.199223 update_engine[1531]: Sep 16 04:28:48.199223 update_engine[1531]: Sep 16 04:28:48.199223 update_engine[1531]: I20250916 04:28:48.198944 1531 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 16 04:28:48.199223 update_engine[1531]: I20250916 04:28:48.198973 1531 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 16 04:28:48.199558 update_engine[1531]: I20250916 04:28:48.199410 1531 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 16 04:28:48.199784 locksmithd[1582]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 16 04:28:48.200436 update_engine[1531]: E20250916 04:28:48.199908 1531 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 16 04:28:48.200436 update_engine[1531]: I20250916 04:28:48.199993 1531 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 16 04:28:48.200436 update_engine[1531]: I20250916 04:28:48.200007 1531 omaha_request_action.cc:617] Omaha request response: Sep 16 04:28:48.200436 update_engine[1531]: I20250916 04:28:48.200018 1531 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 04:28:48.200436 update_engine[1531]: I20250916 04:28:48.200026 1531 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 16 04:28:48.200436 update_engine[1531]: I20250916 04:28:48.200033 1531 update_attempter.cc:306] Processing Done. Sep 16 04:28:48.200436 update_engine[1531]: I20250916 04:28:48.200043 1531 update_attempter.cc:310] Error event sent. Sep 16 04:28:48.200436 update_engine[1531]: I20250916 04:28:48.200055 1531 update_check_scheduler.cc:74] Next update check in 41m48s Sep 16 04:28:48.200917 locksmithd[1582]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 16 04:28:50.214592 containerd[1558]: time="2025-09-16T04:28:50.214523053Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"80fca1a52728c76a529210629756da7044d4b9206865b24aed3a4863f330f055\" pid:6875 exited_at:{seconds:1757996930 nanos:214115326}" Sep 16 04:28:52.050812 systemd[1]: Started sshd@27-138.201.119.17:22-139.178.89.65:50740.service - OpenSSH per-connection server daemon (139.178.89.65:50740). Sep 16 04:28:53.049045 sshd[6886]: Accepted publickey for core from 139.178.89.65 port 50740 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:53.051502 sshd-session[6886]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:28:53.057478 systemd-logind[1529]: New session 28 of user core. Sep 16 04:28:53.067326 systemd[1]: Started session-28.scope - Session 28 of User core. Sep 16 04:28:53.804440 sshd[6889]: Connection closed by 139.178.89.65 port 50740 Sep 16 04:28:53.805362 sshd-session[6886]: pam_unix(sshd:session): session closed for user core Sep 16 04:28:53.811500 systemd[1]: sshd@27-138.201.119.17:22-139.178.89.65:50740.service: Deactivated successfully. Sep 16 04:28:53.814939 systemd[1]: session-28.scope: Deactivated successfully. Sep 16 04:28:53.817299 systemd-logind[1529]: Session 28 logged out. Waiting for processes to exit. Sep 16 04:28:53.819884 systemd-logind[1529]: Removed session 28. Sep 16 04:28:58.981409 systemd[1]: Started sshd@28-138.201.119.17:22-139.178.89.65:50744.service - OpenSSH per-connection server daemon (139.178.89.65:50744). 
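The update_engine excerpt above is one complete failed update cycle: the Omaha request is posted to the literal server name "disabled", curl's DNS lookup for the host "disabled" fails, three one-second retries are burned, error 2000 is converted to kActionCodeOmahaErrorInHTTPResponse (payload error 37), an error event is sent, and the next check is scheduled 41m48s out. On Flatcar this pattern usually means automatic update checks were deliberately turned off, commonly via SERVER=disabled in /etc/flatcar/update.conf; that file is not shown in this log, so treat the cause as an assumption. A small sketch that condenses one such cycle out of the raw lines:

```python
import re

# Message fragments taken verbatim from the update_engine lines above.
RETRY = re.compile(r"No HTTP response, retry (\d+)")
NEXT  = re.compile(r"Next update check in (\d+)m(\d+)s")
ERROR = re.compile(r"Converting error code (\d+) to (\w+)")

def summarize_update_attempt(lines):
    """Condense an update_engine attempt like the one above into one dict."""
    out = {"retries": 0, "error": None, "next_check_s": None}
    for line in lines:
        if (m := RETRY.search(line)):
            out["retries"] = max(out["retries"], int(m.group(1)))
        elif (m := ERROR.search(line)):
            out["error"] = (int(m.group(1)), m.group(2))
        elif (m := NEXT.search(line)):
            out["next_check_s"] = int(m.group(1)) * 60 + int(m.group(2))
    return out
```

On the excerpt above this reports retries=3, error=(2000, 'kActionCodeOmahaErrorInHTTPResponse'), and next_check_s=2508 (41m48s).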
Sep 16 04:28:59.299406 containerd[1558]: time="2025-09-16T04:28:59.298960391Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"430325fa5c5e35829508bce82c779ca98e2e9eb485d938901df1c060767e38f1\" pid:6920 exited_at:{seconds:1757996939 nanos:298329420}" Sep 16 04:28:59.991685 sshd[6904]: Accepted publickey for core from 139.178.89.65 port 50744 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:28:59.995059 sshd-session[6904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:00.001098 systemd-logind[1529]: New session 29 of user core. Sep 16 04:29:00.008898 systemd[1]: Started session-29.scope - Session 29 of User core. Sep 16 04:29:00.757529 sshd[6930]: Connection closed by 139.178.89.65 port 50744 Sep 16 04:29:00.758407 sshd-session[6904]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:00.762557 systemd[1]: sshd@28-138.201.119.17:22-139.178.89.65:50744.service: Deactivated successfully. Sep 16 04:29:00.765195 systemd[1]: session-29.scope: Deactivated successfully. Sep 16 04:29:00.766860 systemd-logind[1529]: Session 29 logged out. Waiting for processes to exit. Sep 16 04:29:00.769279 systemd-logind[1529]: Removed session 29. Sep 16 04:29:00.976048 containerd[1558]: time="2025-09-16T04:29:00.975911253Z" level=warning msg="container event discarded" container=38ce084cf76605850da2aab9ea80c8bb75d199bc5f3c2390d8b6f75a68533aab type=CONTAINER_CREATED_EVENT Sep 16 04:29:00.976048 containerd[1558]: time="2025-09-16T04:29:00.976007455Z" level=warning msg="container event discarded" container=38ce084cf76605850da2aab9ea80c8bb75d199bc5f3c2390d8b6f75a68533aab type=CONTAINER_STARTED_EVENT Sep 16 04:29:01.006016 containerd[1558]: time="2025-09-16T04:29:01.005936577Z" level=warning msg="container event discarded" container=b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60 type=CONTAINER_CREATED_EVENT Sep 16 04:29:01.089083 containerd[1558]: time="2025-09-16T04:29:01.088840701Z" level=warning msg="container event discarded" container=a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304 type=CONTAINER_CREATED_EVENT Sep 16 04:29:01.089083 containerd[1558]: time="2025-09-16T04:29:01.088924662Z" level=warning msg="container event discarded" container=a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304 type=CONTAINER_STARTED_EVENT Sep 16 04:29:01.106735 containerd[1558]: time="2025-09-16T04:29:01.106562089Z" level=warning msg="container event discarded" container=b4c8e44787afeb928ee322c6d5869cdcd1bcc9bedf0ce757824a50a3f5615d60 type=CONTAINER_STARTED_EVENT Sep 16 04:29:04.207831 containerd[1558]: time="2025-09-16T04:29:04.207639158Z" level=warning msg="container event discarded" container=6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2 type=CONTAINER_CREATED_EVENT Sep 16 04:29:04.273938 containerd[1558]: time="2025-09-16T04:29:04.273860380Z" level=warning msg="container event discarded" container=6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2 type=CONTAINER_STARTED_EVENT Sep 16 04:29:05.588211 containerd[1558]: time="2025-09-16T04:29:05.588162407Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"e2f5ff5bacdb6793b0ffef5ec51cae2eff3011270588570503d5d037d4aac6ba\" pid:6956 exited_at:{seconds:1757996945 nanos:587869081}"
Sep 16 04:29:05.933484 systemd[1]: Started sshd@29-138.201.119.17:22-139.178.89.65:47126.service - OpenSSH per-connection server daemon (139.178.89.65:47126). Sep 16 04:29:06.943526 sshd[6966]: Accepted publickey for core from 139.178.89.65 port 47126 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:29:06.946079 sshd-session[6966]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:06.953672 systemd-logind[1529]: New session 30 of user core. Sep 16 04:29:06.963967 systemd[1]: Started session-30.scope - Session 30 of User core. Sep 16 04:29:07.700888 sshd[6969]: Connection closed by 139.178.89.65 port 47126 Sep 16 04:29:07.701545 sshd-session[6966]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:07.707820 systemd[1]: sshd@29-138.201.119.17:22-139.178.89.65:47126.service: Deactivated successfully. Sep 16 04:29:07.710187 systemd[1]: session-30.scope: Deactivated successfully. Sep 16 04:29:07.713832 systemd-logind[1529]: Session 30 logged out. Waiting for processes to exit. Sep 16 04:29:07.717825 systemd-logind[1529]: Removed session 30. Sep 16 04:29:12.843953 containerd[1558]: time="2025-09-16T04:29:12.843885666Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"40b2b22dca5d1d3a14b3156b517b35ec998f03a3d607ee89b09654549fdcf648\" pid:6993 exited_at:{seconds:1757996952 nanos:843217175}" Sep 16 04:29:12.873044 systemd[1]: Started sshd@30-138.201.119.17:22-139.178.89.65:44550.service - OpenSSH per-connection server daemon (139.178.89.65:44550). Sep 16 04:29:13.873207 sshd[7005]: Accepted publickey for core from 139.178.89.65 port 44550 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:29:13.875220 sshd-session[7005]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:13.881224 systemd-logind[1529]: New session 31 of user core. Sep 16 04:29:13.888863 systemd[1]: Started session-31.scope - Session 31 of User core. Sep 16 04:29:14.631704 sshd[7008]: Connection closed by 139.178.89.65 port 44550 Sep 16 04:29:14.632966 sshd-session[7005]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:14.638572 systemd[1]: sshd@30-138.201.119.17:22-139.178.89.65:44550.service: Deactivated successfully. Sep 16 04:29:14.641102 systemd[1]: session-31.scope: Deactivated successfully. Sep 16 04:29:14.642416 systemd-logind[1529]: Session 31 logged out. Waiting for processes to exit. Sep 16 04:29:14.644114 systemd-logind[1529]: Removed session 31.
Sep 16 04:29:18.216546 containerd[1558]: time="2025-09-16T04:29:18.216448486Z" level=warning msg="container event discarded" container=bb700e25270771d109442898fb786f24e9082587621305ec0c13586c76e6b01c type=CONTAINER_CREATED_EVENT Sep 16 04:29:18.216546 containerd[1558]: time="2025-09-16T04:29:18.216532447Z" level=warning msg="container event discarded" container=bb700e25270771d109442898fb786f24e9082587621305ec0c13586c76e6b01c type=CONTAINER_STARTED_EVENT Sep 16 04:29:18.302945 containerd[1558]: time="2025-09-16T04:29:18.302814834Z" level=warning msg="container event discarded" container=f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80 type=CONTAINER_CREATED_EVENT Sep 16 04:29:18.302945 containerd[1558]: time="2025-09-16T04:29:18.302894956Z" level=warning msg="container event discarded" container=f7c71579cce0ec213c5a474c65ab702dd5d554093c1d9cbbb0691207270a2b80 type=CONTAINER_STARTED_EVENT Sep 16 04:29:19.806113 systemd[1]: Started sshd@31-138.201.119.17:22-139.178.89.65:44556.service - OpenSSH per-connection server daemon (139.178.89.65:44556). Sep 16 04:29:20.220744 containerd[1558]: time="2025-09-16T04:29:20.220676684Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"fab21fa1be6b10f9e8fc9f2a886f2894e0422f4d5e53dee83a1f4c459237452e\" pid:7036 exited_at:{seconds:1757996960 nanos:220321918}" Sep 16 04:29:20.792921 containerd[1558]: time="2025-09-16T04:29:20.792874095Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"6e6fd0a289d8dc9080705281417d437ca06753260a994ef633d6316c04e15521\" pid:7058 exited_at:{seconds:1757996960 nanos:792428968}" Sep 16 04:29:20.808618 sshd[7021]: Accepted publickey for core from 139.178.89.65 port 44556 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:29:20.810046 sshd-session[7021]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:20.816460 systemd-logind[1529]: New session 32 of user core. Sep 16 04:29:20.826970 systemd[1]: Started session-32.scope - Session 32 of User core. Sep 16 04:29:20.831296 containerd[1558]: time="2025-09-16T04:29:20.831219006Z" level=warning msg="container event discarded" container=b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52 type=CONTAINER_CREATED_EVENT Sep 16 04:29:20.927095 containerd[1558]: time="2025-09-16T04:29:20.926983421Z" level=warning msg="container event discarded" container=b1f80d0a12a6fc0383ef34b6e955dfad8330dcfc1a715afb5778206817e5ed52 type=CONTAINER_STARTED_EVENT Sep 16 04:29:21.581005 sshd[7068]: Connection closed by 139.178.89.65 port 44556 Sep 16 04:29:21.581975 sshd-session[7021]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:21.587723 systemd-logind[1529]: Session 32 logged out. Waiting for processes to exit. Sep 16 04:29:21.588441 systemd[1]: sshd@31-138.201.119.17:22-139.178.89.65:44556.service: Deactivated successfully. Sep 16 04:29:21.591068 systemd[1]: session-32.scope: Deactivated successfully. Sep 16 04:29:21.593939 systemd-logind[1529]: Removed session 32. 
Sep 16 04:29:22.614073 containerd[1558]: time="2025-09-16T04:29:22.614001072Z" level=warning msg="container event discarded" container=27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd type=CONTAINER_CREATED_EVENT Sep 16 04:29:22.701550 containerd[1558]: time="2025-09-16T04:29:22.701438863Z" level=warning msg="container event discarded" container=27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd type=CONTAINER_STARTED_EVENT Sep 16 04:29:22.860808 containerd[1558]: time="2025-09-16T04:29:22.860709108Z" level=warning msg="container event discarded" container=27e83d286fc9cdbaf5cff59a691df5b5febd2877eb429abaf956916a6e1488fd type=CONTAINER_STOPPED_EVENT Sep 16 04:29:26.646085 containerd[1558]: time="2025-09-16T04:29:26.645959207Z" level=warning msg="container event discarded" container=281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e type=CONTAINER_CREATED_EVENT Sep 16 04:29:26.725538 containerd[1558]: time="2025-09-16T04:29:26.725430254Z" level=warning msg="container event discarded" container=281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e type=CONTAINER_STARTED_EVENT Sep 16 04:29:26.751290 systemd[1]: Started sshd@32-138.201.119.17:22-139.178.89.65:60754.service - OpenSSH per-connection server daemon (139.178.89.65:60754). Sep 16 04:29:27.398264 containerd[1558]: time="2025-09-16T04:29:27.398161535Z" level=warning msg="container event discarded" container=281d18e013cf1a67e834c91d17d97d83bc6492f001b7ed005aa794a7febe5f3e type=CONTAINER_STOPPED_EVENT Sep 16 04:29:27.741956 sshd[7088]: Accepted publickey for core from 139.178.89.65 port 60754 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:29:27.744825 sshd-session[7088]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:27.750126 systemd-logind[1529]: New session 33 of user core. Sep 16 04:29:27.761946 systemd[1]: Started session-33.scope - Session 33 of User core. Sep 16 04:29:28.492950 sshd[7105]: Connection closed by 139.178.89.65 port 60754 Sep 16 04:29:28.493729 sshd-session[7088]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:28.499579 systemd-logind[1529]: Session 33 logged out. Waiting for processes to exit. Sep 16 04:29:28.500748 systemd[1]: sshd@32-138.201.119.17:22-139.178.89.65:60754.service: Deactivated successfully. Sep 16 04:29:28.504896 systemd[1]: session-33.scope: Deactivated successfully. Sep 16 04:29:28.507864 systemd-logind[1529]: Removed session 33. Sep 16 04:29:29.302318 containerd[1558]: time="2025-09-16T04:29:29.302275161Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"48bdf121893c14a3ee0de52e66ba91a62de92bd4d64aa33526dc7702842afed1\" pid:7130 exited_at:{seconds:1757996969 nanos:301790753}" Sep 16 04:29:33.667117 systemd[1]: Started sshd@33-138.201.119.17:22-139.178.89.65:49996.service - OpenSSH per-connection server daemon (139.178.89.65:49996). Sep 16 04:29:34.671163 sshd[7142]: Accepted publickey for core from 139.178.89.65 port 49996 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:29:34.673544 sshd-session[7142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:34.679756 systemd-logind[1529]: New session 34 of user core. Sep 16 04:29:34.685977 systemd[1]: Started session-34.scope - Session 34 of User core. 
Sep 16 04:29:35.335124 containerd[1558]: time="2025-09-16T04:29:35.335017074Z" level=warning msg="container event discarded" container=29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994 type=CONTAINER_CREATED_EVENT Sep 16 04:29:35.433011 sshd[7145]: Connection closed by 139.178.89.65 port 49996 Sep 16 04:29:35.434405 sshd-session[7142]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:35.437446 containerd[1558]: time="2025-09-16T04:29:35.437347417Z" level=warning msg="container event discarded" container=29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994 type=CONTAINER_STARTED_EVENT Sep 16 04:29:35.441946 systemd-logind[1529]: Session 34 logged out. Waiting for processes to exit. Sep 16 04:29:35.442442 systemd[1]: sshd@33-138.201.119.17:22-139.178.89.65:49996.service: Deactivated successfully. Sep 16 04:29:35.445418 systemd[1]: session-34.scope: Deactivated successfully. Sep 16 04:29:35.448318 systemd-logind[1529]: Removed session 34. Sep 16 04:29:36.733214 containerd[1558]: time="2025-09-16T04:29:36.733113469Z" level=warning msg="container event discarded" container=4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516 type=CONTAINER_CREATED_EVENT Sep 16 04:29:36.733214 containerd[1558]: time="2025-09-16T04:29:36.733180590Z" level=warning msg="container event discarded" container=4417d78058fd69e26692971def28e61ab64ec9900779c4f7a0794b4295d5c516 type=CONTAINER_STARTED_EVENT Sep 16 04:29:38.669493 containerd[1558]: time="2025-09-16T04:29:38.669389924Z" level=warning msg="container event discarded" container=451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845 type=CONTAINER_CREATED_EVENT Sep 16 04:29:38.758819 containerd[1558]: time="2025-09-16T04:29:38.758741173Z" level=warning msg="container event discarded" container=451152038e74f0f7850cc160d814d3f2086e1df717bc30b43170540391224845 type=CONTAINER_STARTED_EVENT Sep 16 04:29:40.377523 containerd[1558]: time="2025-09-16T04:29:40.377318151Z" level=warning msg="container event discarded" container=a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa type=CONTAINER_CREATED_EVENT Sep 16 04:29:40.377523 containerd[1558]: time="2025-09-16T04:29:40.377390513Z" level=warning msg="container event discarded" container=a50447aed51378baaf791458d903162d129025aab136eaff7349f82a32bb64fa type=CONTAINER_STARTED_EVENT Sep 16 04:29:40.435701 containerd[1558]: time="2025-09-16T04:29:40.435638907Z" level=warning msg="container event discarded" container=6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3 type=CONTAINER_CREATED_EVENT Sep 16 04:29:40.608924 systemd[1]: Started sshd@34-138.201.119.17:22-139.178.89.65:56010.service - OpenSSH per-connection server daemon (139.178.89.65:56010). 
Sep 16 04:29:40.647375 containerd[1558]: time="2025-09-16T04:29:40.647234269Z" level=warning msg="container event discarded" container=6b339db8a545c6fde5a18f98c80055240a0af9c453697fe215632520c162eed3 type=CONTAINER_STARTED_EVENT Sep 16 04:29:40.775660 containerd[1558]: time="2025-09-16T04:29:40.775533643Z" level=warning msg="container event discarded" container=c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66 type=CONTAINER_CREATED_EVENT Sep 16 04:29:40.776724 containerd[1558]: time="2025-09-16T04:29:40.776661701Z" level=warning msg="container event discarded" container=c97c66d2619171743082a8e92c9b5c4f676446e0607614f7bb7f5a5ca57c4b66 type=CONTAINER_STARTED_EVENT Sep 16 04:29:40.822226 containerd[1558]: time="2025-09-16T04:29:40.822129215Z" level=warning msg="container event discarded" container=0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903 type=CONTAINER_CREATED_EVENT Sep 16 04:29:40.822226 containerd[1558]: time="2025-09-16T04:29:40.822214456Z" level=warning msg="container event discarded" container=0ec7bc1e72af1c64d5e4f4781d71b30a078a8f8aeca5117c149752f003b07903 type=CONTAINER_STARTED_EVENT Sep 16 04:29:40.909894 containerd[1558]: time="2025-09-16T04:29:40.909452426Z" level=warning msg="container event discarded" container=1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87 type=CONTAINER_CREATED_EVENT Sep 16 04:29:41.119017 containerd[1558]: time="2025-09-16T04:29:41.118848349Z" level=warning msg="container event discarded" container=bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43 type=CONTAINER_CREATED_EVENT Sep 16 04:29:41.119017 containerd[1558]: time="2025-09-16T04:29:41.118929551Z" level=warning msg="container event discarded" container=bd0ee2805722d3138f5f509cff73c25da2471dbc49ab7fb41ffc01c84b62cf43 type=CONTAINER_STARTED_EVENT Sep 16 04:29:41.161473 containerd[1558]: time="2025-09-16T04:29:41.161197773Z" level=warning msg="container event discarded" container=1bab73f621173acd92b434faa9fb4109780778a33414ff1e082d1886b7bccc87 type=CONTAINER_STARTED_EVENT Sep 16 04:29:41.531602 containerd[1558]: time="2025-09-16T04:29:41.531301572Z" level=warning msg="container event discarded" container=06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505 type=CONTAINER_CREATED_EVENT Sep 16 04:29:41.531602 containerd[1558]: time="2025-09-16T04:29:41.531388893Z" level=warning msg="container event discarded" container=06b339a4ad9ecc43049a43a52397442958c95ad5d6b17f2bc727827d99d90505 type=CONTAINER_STARTED_EVENT Sep 16 04:29:41.614937 containerd[1558]: time="2025-09-16T04:29:41.614816680Z" level=warning msg="container event discarded" container=8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff type=CONTAINER_CREATED_EVENT Sep 16 04:29:41.614937 containerd[1558]: time="2025-09-16T04:29:41.614881721Z" level=warning msg="container event discarded" container=8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff type=CONTAINER_STARTED_EVENT Sep 16 04:29:41.619571 sshd[7159]: Accepted publickey for core from 139.178.89.65 port 56010 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:29:41.622118 sshd-session[7159]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:41.628674 systemd-logind[1529]: New session 35 of user core. Sep 16 04:29:41.637922 systemd[1]: Started session-35.scope - Session 35 of User core. 
Sep 16 04:29:42.294849 containerd[1558]: time="2025-09-16T04:29:42.294720564Z" level=warning msg="container event discarded" container=bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c type=CONTAINER_CREATED_EVENT Sep 16 04:29:42.294849 containerd[1558]: time="2025-09-16T04:29:42.294803645Z" level=warning msg="container event discarded" container=bf9451cd5cf370340927d8330d7d3ece7ed4d22949a75087d5ae4a5d86b2be7c type=CONTAINER_STARTED_EVENT Sep 16 04:29:42.396128 sshd[7162]: Connection closed by 139.178.89.65 port 56010 Sep 16 04:29:42.395285 sshd-session[7159]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:42.401300 systemd[1]: sshd@34-138.201.119.17:22-139.178.89.65:56010.service: Deactivated successfully. Sep 16 04:29:42.403478 systemd[1]: session-35.scope: Deactivated successfully. Sep 16 04:29:42.404757 systemd-logind[1529]: Session 35 logged out. Waiting for processes to exit. Sep 16 04:29:42.406451 systemd-logind[1529]: Removed session 35. Sep 16 04:29:42.416225 containerd[1558]: time="2025-09-16T04:29:42.416100502Z" level=warning msg="container event discarded" container=8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b type=CONTAINER_CREATED_EVENT Sep 16 04:29:42.416225 containerd[1558]: time="2025-09-16T04:29:42.416187503Z" level=warning msg="container event discarded" container=8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b type=CONTAINER_STARTED_EVENT Sep 16 04:29:42.416225 containerd[1558]: time="2025-09-16T04:29:42.416207744Z" level=warning msg="container event discarded" container=f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1 type=CONTAINER_CREATED_EVENT Sep 16 04:29:42.497737 containerd[1558]: time="2025-09-16T04:29:42.497647017Z" level=warning msg="container event discarded" container=f2ad6edd96e4ad657b591a8040fcb79cc1db93fcd2751a146b0d4c28fd3953d1 type=CONTAINER_STARTED_EVENT Sep 16 04:29:42.838450 containerd[1558]: time="2025-09-16T04:29:42.837895698Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"b24084dc4a3a0b5a84aeaa03a9b26edb04bbaf5de96939a31fc073255a78fecf\" pid:7186 exited_at:{seconds:1757996982 nanos:837344209}" Sep 16 04:29:47.156286 containerd[1558]: time="2025-09-16T04:29:47.156103417Z" level=warning msg="container event discarded" container=8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464 type=CONTAINER_CREATED_EVENT Sep 16 04:29:47.243901 containerd[1558]: time="2025-09-16T04:29:47.243821096Z" level=warning msg="container event discarded" container=8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464 type=CONTAINER_STARTED_EVENT Sep 16 04:29:47.564286 systemd[1]: Started sshd@35-138.201.119.17:22-139.178.89.65:56012.service - OpenSSH per-connection server daemon (139.178.89.65:56012). Sep 16 04:29:48.561398 sshd[7198]: Accepted publickey for core from 139.178.89.65 port 56012 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:29:48.564569 sshd-session[7198]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:48.570515 systemd-logind[1529]: New session 36 of user core. Sep 16 04:29:48.583973 systemd[1]: Started session-36.scope - Session 36 of User core. 
Sep 16 04:29:49.316099 sshd[7201]: Connection closed by 139.178.89.65 port 56012 Sep 16 04:29:49.316998 sshd-session[7198]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:49.324149 systemd[1]: sshd@35-138.201.119.17:22-139.178.89.65:56012.service: Deactivated successfully. Sep 16 04:29:49.327106 systemd[1]: session-36.scope: Deactivated successfully. Sep 16 04:29:49.329413 systemd-logind[1529]: Session 36 logged out. Waiting for processes to exit. Sep 16 04:29:49.331435 systemd-logind[1529]: Removed session 36. Sep 16 04:29:50.225338 containerd[1558]: time="2025-09-16T04:29:50.225263171Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"72145c7b5311e440020db8556c39783be35e60fe89994c5e2d1372b8b5b946a8\" pid:7225 exited_at:{seconds:1757996990 nanos:224804764}" Sep 16 04:29:50.394791 containerd[1558]: time="2025-09-16T04:29:50.394697462Z" level=warning msg="container event discarded" container=b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb type=CONTAINER_CREATED_EVENT Sep 16 04:29:50.492479 containerd[1558]: time="2025-09-16T04:29:50.492051481Z" level=warning msg="container event discarded" container=b20ad674036d9cd03941d34ba6ace764c0aa12809506f287c499c307d8d9beeb type=CONTAINER_STARTED_EVENT Sep 16 04:29:52.280653 containerd[1558]: time="2025-09-16T04:29:52.280498031Z" level=warning msg="container event discarded" container=2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e type=CONTAINER_CREATED_EVENT Sep 16 04:29:52.383056 containerd[1558]: time="2025-09-16T04:29:52.382915283Z" level=warning msg="container event discarded" container=2f14fe3453e235fec7aeba82e5e5aba14bd81e2eb43fa8691efa8431790e1a0e type=CONTAINER_STARTED_EVENT Sep 16 04:29:52.673087 containerd[1558]: time="2025-09-16T04:29:52.672993176Z" level=warning msg="container event discarded" container=4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b type=CONTAINER_CREATED_EVENT Sep 16 04:29:52.780711 containerd[1558]: time="2025-09-16T04:29:52.780564988Z" level=warning msg="container event discarded" container=4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b type=CONTAINER_STARTED_EVENT Sep 16 04:29:54.483292 containerd[1558]: time="2025-09-16T04:29:54.483176195Z" level=warning msg="container event discarded" container=46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f type=CONTAINER_CREATED_EVENT Sep 16 04:29:54.483292 containerd[1558]: time="2025-09-16T04:29:54.483249236Z" level=warning msg="container event discarded" container=46d2afbbefec18f3f564c1fb3ddd0c85be740c86a17f0ac9efa33e6b0fbe940f type=CONTAINER_STARTED_EVENT Sep 16 04:29:54.493334 systemd[1]: Started sshd@36-138.201.119.17:22-139.178.89.65:52466.service - OpenSSH per-connection server daemon (139.178.89.65:52466). 
Sep 16 04:29:54.522123 containerd[1558]: time="2025-09-16T04:29:54.522049149Z" level=warning msg="container event discarded" container=d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721 type=CONTAINER_CREATED_EVENT Sep 16 04:29:54.745979 containerd[1558]: time="2025-09-16T04:29:54.745798813Z" level=warning msg="container event discarded" container=d054e762edef0daf7dc5cd405a95100f699c02a9e17a6ef9b09ebd721eb34721 type=CONTAINER_STARTED_EVENT Sep 16 04:29:55.489365 sshd[7237]: Accepted publickey for core from 139.178.89.65 port 52466 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:29:55.491518 sshd-session[7237]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:29:55.498483 systemd-logind[1529]: New session 37 of user core. Sep 16 04:29:55.502873 systemd[1]: Started session-37.scope - Session 37 of User core. Sep 16 04:29:56.244248 sshd[7240]: Connection closed by 139.178.89.65 port 52466 Sep 16 04:29:56.243672 sshd-session[7237]: pam_unix(sshd:session): session closed for user core Sep 16 04:29:56.249260 systemd[1]: sshd@36-138.201.119.17:22-139.178.89.65:52466.service: Deactivated successfully. Sep 16 04:29:56.249289 systemd-logind[1529]: Session 37 logged out. Waiting for processes to exit. Sep 16 04:29:56.251799 systemd[1]: session-37.scope: Deactivated successfully. Sep 16 04:29:56.256191 systemd-logind[1529]: Removed session 37. Sep 16 04:29:57.869275 containerd[1558]: time="2025-09-16T04:29:57.869064818Z" level=warning msg="container event discarded" container=330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87 type=CONTAINER_CREATED_EVENT Sep 16 04:29:57.998408 containerd[1558]: time="2025-09-16T04:29:57.998319626Z" level=warning msg="container event discarded" container=330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87 type=CONTAINER_STARTED_EVENT Sep 16 04:29:58.289428 containerd[1558]: time="2025-09-16T04:29:58.289353650Z" level=warning msg="container event discarded" container=474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c type=CONTAINER_CREATED_EVENT Sep 16 04:29:58.472514 containerd[1558]: time="2025-09-16T04:29:58.472434673Z" level=warning msg="container event discarded" container=474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c type=CONTAINER_STARTED_EVENT Sep 16 04:29:59.071780 containerd[1558]: time="2025-09-16T04:29:59.071681380Z" level=warning msg="container event discarded" container=4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b type=CONTAINER_STOPPED_EVENT Sep 16 04:29:59.207647 containerd[1558]: time="2025-09-16T04:29:59.207501802Z" level=warning msg="container event discarded" container=8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff type=CONTAINER_STOPPED_EVENT Sep 16 04:29:59.296392 containerd[1558]: time="2025-09-16T04:29:59.296336830Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"60b538abe02b0f2c006726375b8cbbed87bc642663896ea4023da65ec2526eca\" pid:7264 exited_at:{seconds:1757996999 nanos:296121227}" Sep 16 04:29:59.431971 containerd[1558]: time="2025-09-16T04:29:59.431847286Z" level=warning msg="container event discarded" container=474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c type=CONTAINER_STOPPED_EVENT
Sep 16 04:29:59.530408 containerd[1558]: time="2025-09-16T04:29:59.530328221Z" level=warning msg="container event discarded" container=8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b type=CONTAINER_STOPPED_EVENT Sep 16 04:30:00.326295 containerd[1558]: time="2025-09-16T04:30:00.326207413Z" level=warning msg="container event discarded" container=474e26dba50d2f668922af260af9db09d6b600a2e5be6020d5efee03be1c0d4c type=CONTAINER_DELETED_EVENT Sep 16 04:30:00.467628 containerd[1558]: time="2025-09-16T04:30:00.467498794Z" level=warning msg="container event discarded" container=40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7 type=CONTAINER_CREATED_EVENT Sep 16 04:30:00.642469 containerd[1558]: time="2025-09-16T04:30:00.642384124Z" level=warning msg="container event discarded" container=40c8d2832f3200c57a2ac59776774f318ab0d75a054cd3c02d37d7492942e7e7 type=CONTAINER_STARTED_EVENT Sep 16 04:30:01.418995 systemd[1]: Started sshd@37-138.201.119.17:22-139.178.89.65:39474.service - OpenSSH per-connection server daemon (139.178.89.65:39474). Sep 16 04:30:02.423879 sshd[7282]: Accepted publickey for core from 139.178.89.65 port 39474 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:02.426197 sshd-session[7282]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:02.434096 systemd-logind[1529]: New session 38 of user core. Sep 16 04:30:02.438805 systemd[1]: Started session-38.scope - Session 38 of User core. Sep 16 04:30:03.187993 sshd[7285]: Connection closed by 139.178.89.65 port 39474 Sep 16 04:30:03.188864 sshd-session[7282]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:03.195100 systemd[1]: sshd@37-138.201.119.17:22-139.178.89.65:39474.service: Deactivated successfully. Sep 16 04:30:03.198884 systemd[1]: session-38.scope: Deactivated successfully. Sep 16 04:30:03.201057 systemd-logind[1529]: Session 38 logged out. Waiting for processes to exit. Sep 16 04:30:03.203715 systemd-logind[1529]: Removed session 38. Sep 16 04:30:05.596527 containerd[1558]: time="2025-09-16T04:30:05.596470504Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"a43ec38500caa2cb2ee4306a534856fcaf477052c0ddc4d2af99d64856ddb19f\" pid:7309 exited_at:{seconds:1757997005 nanos:596246981}" Sep 16 04:30:08.366840 systemd[1]: Started sshd@38-138.201.119.17:22-139.178.89.65:39486.service - OpenSSH per-connection server daemon (139.178.89.65:39486). Sep 16 04:30:09.375185 sshd[7319]: Accepted publickey for core from 139.178.89.65 port 39486 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:09.376837 sshd-session[7319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:09.381946 systemd-logind[1529]: New session 39 of user core. Sep 16 04:30:09.394907 systemd[1]: Started session-39.scope - Session 39 of User core. Sep 16 04:30:10.138813 sshd[7322]: Connection closed by 139.178.89.65 port 39486 Sep 16 04:30:10.138700 sshd-session[7319]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:10.145289 systemd[1]: sshd@38-138.201.119.17:22-139.178.89.65:39486.service: Deactivated successfully. Sep 16 04:30:10.149536 systemd[1]: session-39.scope: Deactivated successfully. Sep 16 04:30:10.152391 systemd-logind[1529]: Session 39 logged out. Waiting for processes to exit. Sep 16 04:30:10.154891 systemd-logind[1529]: Removed session 39.
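The recurring containerd "container event discarded" warnings above likely mean no subscriber was consuming containerd's container-event stream (or it was not keeping up), so CREATED/STARTED/STOPPED/DELETED notifications for the named containers were dropped rather than delivered; the container IDs and event types are still recoverable from the messages themselves. A minimal tally over such lines, assuming only the message format shown in this log:

```python
import re
from collections import Counter

# containerd warnings look like:
#   ... level=warning msg="container event discarded"
#       container=<64-hex id> type=CONTAINER_STARTED_EVENT
DISCARDED = re.compile(
    r'msg="container event discarded" container=(?P<cid>[0-9a-f]{64}) '
    r'type=(?P<kind>CONTAINER_[A-Z_]+_EVENT)'
)

def discarded_tally(lines):
    """Count dropped containerd events by (short container id, event type)."""
    tally = Counter()
    for line in lines:
        for m in DISCARDED.finditer(line):
            tally[(m["cid"][:12], m["kind"])] += 1
    return tally
```

The resulting Counter makes it easy to see which containers lost which lifecycle notifications, e.g. that 474e26dba50d accumulated CREATED, STARTED, STOPPED, and DELETED drops in this excerpt.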
Sep 16 04:30:12.797114 containerd[1558]: time="2025-09-16T04:30:12.797029187Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"9c7f77ea1e1dcac2d9a12f891c4870156acb38c3ea159d796bbe0fb16a2fae51\" pid:7345 exited_at:{seconds:1757997012 nanos:796567020}" Sep 16 04:30:15.312710 systemd[1]: Started sshd@39-138.201.119.17:22-139.178.89.65:58196.service - OpenSSH per-connection server daemon (139.178.89.65:58196). Sep 16 04:30:16.324094 sshd[7357]: Accepted publickey for core from 139.178.89.65 port 58196 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:16.325675 sshd-session[7357]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:16.332045 systemd-logind[1529]: New session 40 of user core. Sep 16 04:30:16.339801 systemd[1]: Started session-40.scope - Session 40 of User core. Sep 16 04:30:17.089384 sshd[7360]: Connection closed by 139.178.89.65 port 58196 Sep 16 04:30:17.088639 sshd-session[7357]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:17.094306 systemd[1]: sshd@39-138.201.119.17:22-139.178.89.65:58196.service: Deactivated successfully. Sep 16 04:30:17.094960 systemd-logind[1529]: Session 40 logged out. Waiting for processes to exit. Sep 16 04:30:17.097400 systemd[1]: session-40.scope: Deactivated successfully. Sep 16 04:30:17.100333 systemd-logind[1529]: Removed session 40. Sep 16 04:30:20.214697 containerd[1558]: time="2025-09-16T04:30:20.214654680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"025af863734b7e2f38d6e0bff38a75d0f9930d4664ac9143a321e9c12e5877b2\" pid:7387 exited_at:{seconds:1757997020 nanos:214249714}" Sep 16 04:30:20.784129 containerd[1558]: time="2025-09-16T04:30:20.784079803Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"5b194bad7f2cec468a83c91f16cbcd045cb62b32628cb560d0e85007e9498fbb\" pid:7411 exited_at:{seconds:1757997020 nanos:783756038}" Sep 16 04:30:22.267839 systemd[1]: Started sshd@40-138.201.119.17:22-139.178.89.65:54572.service - OpenSSH per-connection server daemon (139.178.89.65:54572). Sep 16 04:30:23.278807 sshd[7422]: Accepted publickey for core from 139.178.89.65 port 54572 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:23.282187 sshd-session[7422]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:23.288898 systemd-logind[1529]: New session 41 of user core. Sep 16 04:30:23.293812 systemd[1]: Started session-41.scope - Session 41 of User core. Sep 16 04:30:24.037952 sshd[7425]: Connection closed by 139.178.89.65 port 54572 Sep 16 04:30:24.038782 sshd-session[7422]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:24.044147 systemd[1]: sshd@40-138.201.119.17:22-139.178.89.65:54572.service: Deactivated successfully. Sep 16 04:30:24.046218 systemd[1]: session-41.scope: Deactivated successfully. Sep 16 04:30:24.049076 systemd-logind[1529]: Session 41 logged out. Waiting for processes to exit. Sep 16 04:30:24.050761 systemd-logind[1529]: Removed session 41. Sep 16 04:30:29.214336 systemd[1]: Started sshd@41-138.201.119.17:22-139.178.89.65:54576.service - OpenSSH per-connection server daemon (139.178.89.65:54576). 
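The TaskExit records that punctuate this section come from short-lived execs (periodic health probes, presumably) inside a few long-lived containers; each record carries the exec's exit time as epoch seconds and nanos. A sketch that groups those exit times per container and reports the gap between consecutive probes, assuming the escaped quoting shown above (the inner quotes arrive as \"):

```python
import re
from collections import defaultdict

# TaskExit records carry the epoch time of each exec's exit, e.g.
#   ... container_id:\"330eda4a...\" id:\"9c7f77ea...\" pid:7345
#   exited_at:{seconds:1757997012 nanos:796567020}
TASK_EXIT = re.compile(
    r'container_id:\\?"(?P<cid>[0-9a-f]{64})\\?".*?'
    r'exited_at:\{seconds:(?P<sec>\d+) nanos:(?P<ns>\d+)\}'
)

def probe_intervals(lines):
    """Group TaskExit times per container and return gaps between them."""
    exits = defaultdict(list)
    for line in lines:
        for m in TASK_EXIT.finditer(line):
            exits[m["cid"][:12]].append(int(m["sec"]) + int(m["ns"]) / 1e9)
    return {cid: [round(b - a, 1) for a, b in zip(ts, ts[1:])]
            for cid, ts in exits.items()}
```

Applied to this excerpt, the per-container gaps make the regular probe cadence in containers 330eda4a, 29accf52, and 8d6c6929 visible at a glance.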
Sep 16 04:30:29.295347 containerd[1558]: time="2025-09-16T04:30:29.295277404Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"df5202a1e09e52c9c91cfe8db4630b0d2921c93ac2fcea533620d560614c7873\" pid:7455 exited_at:{seconds:1757997029 nanos:294689755}" Sep 16 04:30:30.217061 sshd[7438]: Accepted publickey for core from 139.178.89.65 port 54576 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:30.219741 sshd-session[7438]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:30.224695 systemd-logind[1529]: New session 42 of user core. Sep 16 04:30:30.230283 systemd[1]: Started session-42.scope - Session 42 of User core. Sep 16 04:30:30.981116 sshd[7464]: Connection closed by 139.178.89.65 port 54576 Sep 16 04:30:30.982079 sshd-session[7438]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:30.987319 systemd[1]: sshd@41-138.201.119.17:22-139.178.89.65:54576.service: Deactivated successfully. Sep 16 04:30:30.990801 systemd[1]: session-42.scope: Deactivated successfully. Sep 16 04:30:30.992895 systemd-logind[1529]: Session 42 logged out. Waiting for processes to exit. Sep 16 04:30:30.994774 systemd-logind[1529]: Removed session 42. Sep 16 04:30:36.159846 systemd[1]: Started sshd@42-138.201.119.17:22-139.178.89.65:51702.service - OpenSSH per-connection server daemon (139.178.89.65:51702). Sep 16 04:30:37.171237 sshd[7478]: Accepted publickey for core from 139.178.89.65 port 51702 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:37.173106 sshd-session[7478]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:37.178825 systemd-logind[1529]: New session 43 of user core. Sep 16 04:30:37.188903 systemd[1]: Started session-43.scope - Session 43 of User core. Sep 16 04:30:37.940557 sshd[7481]: Connection closed by 139.178.89.65 port 51702 Sep 16 04:30:37.940397 sshd-session[7478]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:37.945952 systemd[1]: sshd@42-138.201.119.17:22-139.178.89.65:51702.service: Deactivated successfully. Sep 16 04:30:37.948708 systemd[1]: session-43.scope: Deactivated successfully. Sep 16 04:30:37.949818 systemd-logind[1529]: Session 43 logged out. Waiting for processes to exit. Sep 16 04:30:37.952149 systemd-logind[1529]: Removed session 43. Sep 16 04:30:38.111839 systemd[1]: Started sshd@43-138.201.119.17:22-139.178.89.65:51716.service - OpenSSH per-connection server daemon (139.178.89.65:51716). Sep 16 04:30:39.096353 sshd[7494]: Accepted publickey for core from 139.178.89.65 port 51716 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:39.099012 sshd-session[7494]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:39.104719 systemd-logind[1529]: New session 44 of user core. Sep 16 04:30:39.111914 systemd[1]: Started session-44.scope - Session 44 of User core. Sep 16 04:30:39.888423 sshd[7497]: Connection closed by 139.178.89.65 port 51716 Sep 16 04:30:39.889404 sshd-session[7494]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:39.896185 systemd[1]: sshd@43-138.201.119.17:22-139.178.89.65:51716.service: Deactivated successfully. Sep 16 04:30:39.898769 systemd[1]: session-44.scope: Deactivated successfully. Sep 16 04:30:39.901455 systemd-logind[1529]: Session 44 logged out. Waiting for processes to exit. 
Sep 16 04:30:39.903412 systemd-logind[1529]: Removed session 44. Sep 16 04:30:40.062831 systemd[1]: Started sshd@44-138.201.119.17:22-139.178.89.65:51724.service - OpenSSH per-connection server daemon (139.178.89.65:51724). Sep 16 04:30:41.069441 sshd[7507]: Accepted publickey for core from 139.178.89.65 port 51724 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:41.071838 sshd-session[7507]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:41.076979 systemd-logind[1529]: New session 45 of user core. Sep 16 04:30:41.086044 systemd[1]: Started session-45.scope - Session 45 of User core. Sep 16 04:30:41.827854 sshd[7510]: Connection closed by 139.178.89.65 port 51724 Sep 16 04:30:41.830433 sshd-session[7507]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:41.836555 systemd[1]: sshd@44-138.201.119.17:22-139.178.89.65:51724.service: Deactivated successfully. Sep 16 04:30:41.840080 systemd[1]: session-45.scope: Deactivated successfully. Sep 16 04:30:41.841931 systemd-logind[1529]: Session 45 logged out. Waiting for processes to exit. Sep 16 04:30:41.844363 systemd-logind[1529]: Removed session 45. Sep 16 04:30:42.761756 containerd[1558]: time="2025-09-16T04:30:42.761678545Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"81090b46441a779eca2a3638c0d16c34668fdc9fd114eca8d34cee0437eb8d65\" pid:7533 exited_at:{seconds:1757997042 nanos:760857973}" Sep 16 04:30:46.998018 systemd[1]: Started sshd@45-138.201.119.17:22-139.178.89.65:35876.service - OpenSSH per-connection server daemon (139.178.89.65:35876). Sep 16 04:30:47.999628 sshd[7544]: Accepted publickey for core from 139.178.89.65 port 35876 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:30:48.001099 sshd-session[7544]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:30:48.007818 systemd-logind[1529]: New session 46 of user core. Sep 16 04:30:48.013794 systemd[1]: Started session-46.scope - Session 46 of User core. Sep 16 04:30:48.750898 sshd[7547]: Connection closed by 139.178.89.65 port 35876 Sep 16 04:30:48.751847 sshd-session[7544]: pam_unix(sshd:session): session closed for user core Sep 16 04:30:48.758012 systemd[1]: sshd@45-138.201.119.17:22-139.178.89.65:35876.service: Deactivated successfully. Sep 16 04:30:48.760122 systemd[1]: session-46.scope: Deactivated successfully. Sep 16 04:30:48.761977 systemd-logind[1529]: Session 46 logged out. Waiting for processes to exit. Sep 16 04:30:48.764320 systemd-logind[1529]: Removed session 46. Sep 16 04:30:50.221781 containerd[1558]: time="2025-09-16T04:30:50.221680648Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"20b9b08edca8560c8f78be8a341434710c2ace4bc6c3390523218d934a26b2a0\" pid:7572 exited_at:{seconds:1757997050 nanos:221325563}" Sep 16 04:30:53.790020 containerd[1558]: time="2025-09-16T04:30:53.789892718Z" level=warning msg="container event discarded" container=4d0ab61559473cb2339e578525aaf069f866621a7afcf3eab88ca02d4a4a511b type=CONTAINER_DELETED_EVENT Sep 16 04:30:53.928163 systemd[1]: Started sshd@46-138.201.119.17:22-139.178.89.65:44002.service - OpenSSH per-connection server daemon (139.178.89.65:44002). 
Sep 16 04:30:54.006430 containerd[1558]: time="2025-09-16T04:30:54.006265576Z" level=warning msg="container event discarded" container=8a610be6a24d2e391acbed96ab712c6beb8ff0fbbbbd95157a98bf1bdc94ecff type=CONTAINER_DELETED_EVENT
Sep 16 04:30:54.197443 containerd[1558]: time="2025-09-16T04:30:54.197332710Z" level=warning msg="container event discarded" container=8f4206c779cd8e784c940367c2f359b2973b5440f615fa238c911344451e797b type=CONTAINER_DELETED_EVENT
Sep 16 04:30:54.935011 sshd[7585]: Accepted publickey for core from 139.178.89.65 port 44002 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:30:54.937039 sshd-session[7585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:30:54.941576 systemd-logind[1529]: New session 47 of user core.
Sep 16 04:30:54.949899 systemd[1]: Started session-47.scope - Session 47 of User core.
Sep 16 04:30:55.715319 sshd[7588]: Connection closed by 139.178.89.65 port 44002
Sep 16 04:30:55.716495 sshd-session[7585]: pam_unix(sshd:session): session closed for user core
Sep 16 04:30:55.721966 systemd-logind[1529]: Session 47 logged out. Waiting for processes to exit.
Sep 16 04:30:55.722671 systemd[1]: sshd@46-138.201.119.17:22-139.178.89.65:44002.service: Deactivated successfully.
Sep 16 04:30:55.726111 systemd[1]: session-47.scope: Deactivated successfully.
Sep 16 04:30:55.728060 systemd-logind[1529]: Removed session 47.
Sep 16 04:30:59.298651 containerd[1558]: time="2025-09-16T04:30:59.298540634Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"0f9e4dd7ab192908a2d4f699e2fa61037f815eafc79bb87a87fd219dd4d8787b\" pid:7617 exited_at:{seconds:1757997059 nanos:297989386}"
Sep 16 04:31:00.885640 systemd[1]: Started sshd@47-138.201.119.17:22-139.178.89.65:51156.service - OpenSSH per-connection server daemon (139.178.89.65:51156).
Sep 16 04:31:01.888668 sshd[7641]: Accepted publickey for core from 139.178.89.65 port 51156 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:01.891701 sshd-session[7641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:01.897235 systemd-logind[1529]: New session 48 of user core.
Sep 16 04:31:01.905842 systemd[1]: Started session-48.scope - Session 48 of User core.
Sep 16 04:31:02.640242 sshd[7646]: Connection closed by 139.178.89.65 port 51156
Sep 16 04:31:02.641375 sshd-session[7641]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:02.647564 systemd[1]: sshd@47-138.201.119.17:22-139.178.89.65:51156.service: Deactivated successfully.
Sep 16 04:31:02.651303 systemd[1]: session-48.scope: Deactivated successfully.
Sep 16 04:31:02.652762 systemd-logind[1529]: Session 48 logged out. Waiting for processes to exit.
Sep 16 04:31:02.654971 systemd-logind[1529]: Removed session 48.
Sep 16 04:31:05.588211 containerd[1558]: time="2025-09-16T04:31:05.588156109Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"a2c9a5fdbef6ebe7413c54a0079eaa1a8a0d3f3b95e6f2c9f79f3ab70f8b0af0\" pid:7669 exited_at:{seconds:1757997065 nanos:587654381}"
Sep 16 04:31:07.811662 systemd[1]: Started sshd@48-138.201.119.17:22-139.178.89.65:51172.service - OpenSSH per-connection server daemon (139.178.89.65:51172).
Sep 16 04:31:08.807519 sshd[7680]: Accepted publickey for core from 139.178.89.65 port 51172 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:08.810237 sshd-session[7680]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:08.816646 systemd-logind[1529]: New session 49 of user core.
Sep 16 04:31:08.824860 systemd[1]: Started session-49.scope - Session 49 of User core.
Sep 16 04:31:09.568702 sshd[7683]: Connection closed by 139.178.89.65 port 51172
Sep 16 04:31:09.569681 sshd-session[7680]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:09.575617 systemd-logind[1529]: Session 49 logged out. Waiting for processes to exit.
Sep 16 04:31:09.575743 systemd[1]: sshd@48-138.201.119.17:22-139.178.89.65:51172.service: Deactivated successfully.
Sep 16 04:31:09.580734 systemd[1]: session-49.scope: Deactivated successfully.
Sep 16 04:31:09.586633 systemd-logind[1529]: Removed session 49.
Sep 16 04:31:12.753689 containerd[1558]: time="2025-09-16T04:31:12.753638541Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"11ee70dd6518e3d93ec3ce8f44a5f00f9ddb160dc130488e6b23b76a08700489\" pid:7708 exited_at:{seconds:1757997072 nanos:753211290}"
Sep 16 04:31:14.739072 systemd[1]: Started sshd@49-138.201.119.17:22-139.178.89.65:41202.service - OpenSSH per-connection server daemon (139.178.89.65:41202).
Sep 16 04:31:15.737708 sshd[7720]: Accepted publickey for core from 139.178.89.65 port 41202 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:15.739383 sshd-session[7720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:15.745547 systemd-logind[1529]: New session 50 of user core.
Sep 16 04:31:15.751787 systemd[1]: Started session-50.scope - Session 50 of User core.
Sep 16 04:31:16.495999 sshd[7723]: Connection closed by 139.178.89.65 port 41202
Sep 16 04:31:16.495895 sshd-session[7720]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:16.503467 systemd-logind[1529]: Session 50 logged out. Waiting for processes to exit.
Sep 16 04:31:16.504426 systemd[1]: sshd@49-138.201.119.17:22-139.178.89.65:41202.service: Deactivated successfully.
Sep 16 04:31:16.509186 systemd[1]: session-50.scope: Deactivated successfully.
Sep 16 04:31:16.512208 systemd-logind[1529]: Removed session 50.
Sep 16 04:31:20.217333 containerd[1558]: time="2025-09-16T04:31:20.217273735Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"7e083f612e3165375409319d829c5153678f384f68b2a32a4aca374026cd917b\" pid:7748 exited_at:{seconds:1757997080 nanos:216491235}"
Sep 16 04:31:20.793077 containerd[1558]: time="2025-09-16T04:31:20.792972568Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"25c87d8d53a6822d18db2174b92c5c7e8dc5f7df5653aeaec18dcf32a6cd06a6\" pid:7772 exited_at:{seconds:1757997080 nanos:792354672}"
Sep 16 04:31:21.666268 systemd[1]: Started sshd@50-138.201.119.17:22-139.178.89.65:44668.service - OpenSSH per-connection server daemon (139.178.89.65:44668).
Sep 16 04:31:22.664120 sshd[7782]: Accepted publickey for core from 139.178.89.65 port 44668 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:22.666075 sshd-session[7782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:22.673080 systemd-logind[1529]: New session 51 of user core.
Sep 16 04:31:22.686221 systemd[1]: Started session-51.scope - Session 51 of User core.
Sep 16 04:31:23.417205 sshd[7785]: Connection closed by 139.178.89.65 port 44668
Sep 16 04:31:23.418179 sshd-session[7782]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:23.425132 systemd-logind[1529]: Session 51 logged out. Waiting for processes to exit.
Sep 16 04:31:23.426361 systemd[1]: sshd@50-138.201.119.17:22-139.178.89.65:44668.service: Deactivated successfully.
Sep 16 04:31:23.429242 systemd[1]: session-51.scope: Deactivated successfully.
Sep 16 04:31:23.432265 systemd-logind[1529]: Removed session 51.
Sep 16 04:31:28.594698 systemd[1]: Started sshd@51-138.201.119.17:22-139.178.89.65:44672.service - OpenSSH per-connection server daemon (139.178.89.65:44672).
Sep 16 04:31:29.301292 containerd[1558]: time="2025-09-16T04:31:29.301253586Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"bed782085ffc79e982ac7b6ec9a3a90b38cd2ecca1cc83a26984606ad202ee29\" pid:7812 exited_at:{seconds:1757997089 nanos:300818736}"
Sep 16 04:31:29.586042 sshd[7797]: Accepted publickey for core from 139.178.89.65 port 44672 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:29.587449 sshd-session[7797]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:29.594248 systemd-logind[1529]: New session 52 of user core.
Sep 16 04:31:29.598799 systemd[1]: Started session-52.scope - Session 52 of User core.
Sep 16 04:31:30.341805 sshd[7821]: Connection closed by 139.178.89.65 port 44672
Sep 16 04:31:30.342529 sshd-session[7797]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:30.350444 systemd[1]: sshd@51-138.201.119.17:22-139.178.89.65:44672.service: Deactivated successfully.
Sep 16 04:31:30.354786 systemd[1]: session-52.scope: Deactivated successfully.
Sep 16 04:31:30.356632 systemd-logind[1529]: Session 52 logged out. Waiting for processes to exit.
Sep 16 04:31:30.358930 systemd-logind[1529]: Removed session 52.
Sep 16 04:31:35.514264 systemd[1]: Started sshd@52-138.201.119.17:22-139.178.89.65:58374.service - OpenSSH per-connection server daemon (139.178.89.65:58374).
Sep 16 04:31:36.503610 sshd[7835]: Accepted publickey for core from 139.178.89.65 port 58374 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:36.505539 sshd-session[7835]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:36.515686 systemd-logind[1529]: New session 53 of user core.
Sep 16 04:31:36.521706 systemd[1]: Started session-53.scope - Session 53 of User core.
Sep 16 04:31:37.256146 sshd[7838]: Connection closed by 139.178.89.65 port 58374
Sep 16 04:31:37.257872 sshd-session[7835]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:37.263706 systemd-logind[1529]: Session 53 logged out. Waiting for processes to exit.
Sep 16 04:31:37.263811 systemd[1]: sshd@52-138.201.119.17:22-139.178.89.65:58374.service: Deactivated successfully.
Sep 16 04:31:37.266910 systemd[1]: session-53.scope: Deactivated successfully.
Sep 16 04:31:37.270232 systemd-logind[1529]: Removed session 53.
Sep 16 04:31:42.435412 systemd[1]: Started sshd@53-138.201.119.17:22-139.178.89.65:53582.service - OpenSSH per-connection server daemon (139.178.89.65:53582).
Sep 16 04:31:42.757505 containerd[1558]: time="2025-09-16T04:31:42.757137891Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"a460db5535ee247185154ef7206409df63052ed2c0a3bd57688351c9472a133d\" pid:7865 exited_at:{seconds:1757997102 nanos:756854484}"
Sep 16 04:31:43.426930 sshd[7849]: Accepted publickey for core from 139.178.89.65 port 53582 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:43.429260 sshd-session[7849]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:43.436109 systemd-logind[1529]: New session 54 of user core.
Sep 16 04:31:43.452938 systemd[1]: Started session-54.scope - Session 54 of User core.
Sep 16 04:31:44.172772 sshd[7879]: Connection closed by 139.178.89.65 port 53582
Sep 16 04:31:44.173683 sshd-session[7849]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:44.179002 systemd[1]: sshd@53-138.201.119.17:22-139.178.89.65:53582.service: Deactivated successfully.
Sep 16 04:31:44.182867 systemd[1]: session-54.scope: Deactivated successfully.
Sep 16 04:31:44.185609 systemd-logind[1529]: Session 54 logged out. Waiting for processes to exit.
Sep 16 04:31:44.187792 systemd-logind[1529]: Removed session 54.
Sep 16 04:31:49.345265 systemd[1]: Started sshd@54-138.201.119.17:22-139.178.89.65:53584.service - OpenSSH per-connection server daemon (139.178.89.65:53584).
Sep 16 04:31:50.256457 containerd[1558]: time="2025-09-16T04:31:50.256398857Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"7437208147b8bfc3d985f5134ac3cc56951292a7f1d5f0c420873d4dfe8a562e\" pid:7907 exited_at:{seconds:1757997110 nanos:255523037}"
Sep 16 04:31:50.352949 sshd[7890]: Accepted publickey for core from 139.178.89.65 port 53584 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:50.355656 sshd-session[7890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:50.361600 systemd-logind[1529]: New session 55 of user core.
Sep 16 04:31:50.367367 systemd[1]: Started session-55.scope - Session 55 of User core.
Sep 16 04:31:51.124523 sshd[7917]: Connection closed by 139.178.89.65 port 53584
Sep 16 04:31:51.123622 sshd-session[7890]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:51.129817 systemd-logind[1529]: Session 55 logged out. Waiting for processes to exit.
Sep 16 04:31:51.130478 systemd[1]: sshd@54-138.201.119.17:22-139.178.89.65:53584.service: Deactivated successfully.
Sep 16 04:31:51.135707 systemd[1]: session-55.scope: Deactivated successfully.
Sep 16 04:31:51.138727 systemd-logind[1529]: Removed session 55.
Sep 16 04:31:56.308438 systemd[1]: Started sshd@55-138.201.119.17:22-139.178.89.65:40854.service - OpenSSH per-connection server daemon (139.178.89.65:40854).
Sep 16 04:31:57.327505 sshd[7931]: Accepted publickey for core from 139.178.89.65 port 40854 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:31:57.329563 sshd-session[7931]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:31:57.335236 systemd-logind[1529]: New session 56 of user core.
Sep 16 04:31:57.352365 systemd[1]: Started session-56.scope - Session 56 of User core.
Sep 16 04:31:58.093698 sshd[7934]: Connection closed by 139.178.89.65 port 40854
Sep 16 04:31:58.094562 sshd-session[7931]: pam_unix(sshd:session): session closed for user core
Sep 16 04:31:58.100272 systemd-logind[1529]: Session 56 logged out. Waiting for processes to exit.
Sep 16 04:31:58.100526 systemd[1]: sshd@55-138.201.119.17:22-139.178.89.65:40854.service: Deactivated successfully.
Sep 16 04:31:58.103187 systemd[1]: session-56.scope: Deactivated successfully.
Sep 16 04:31:58.107453 systemd-logind[1529]: Removed session 56.
Sep 16 04:31:59.300089 containerd[1558]: time="2025-09-16T04:31:59.299941116Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"1e3473221f0d2ed1269d9afdc5cc4402834b01ec8e8c7c3b060e8f894b12cb9b\" pid:7958 exited_at:{seconds:1757997119 nanos:299506986}"
Sep 16 04:32:03.265931 systemd[1]: Started sshd@56-138.201.119.17:22-139.178.89.65:44746.service - OpenSSH per-connection server daemon (139.178.89.65:44746).
Sep 16 04:32:04.259207 sshd[7970]: Accepted publickey for core from 139.178.89.65 port 44746 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:04.261247 sshd-session[7970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:04.266747 systemd-logind[1529]: New session 57 of user core.
Sep 16 04:32:04.272855 systemd[1]: Started session-57.scope - Session 57 of User core.
Sep 16 04:32:05.016938 sshd[7973]: Connection closed by 139.178.89.65 port 44746
Sep 16 04:32:05.017697 sshd-session[7970]: pam_unix(sshd:session): session closed for user core
Sep 16 04:32:05.027885 systemd[1]: sshd@56-138.201.119.17:22-139.178.89.65:44746.service: Deactivated successfully.
Sep 16 04:32:05.033285 systemd[1]: session-57.scope: Deactivated successfully.
Sep 16 04:32:05.034892 systemd-logind[1529]: Session 57 logged out. Waiting for processes to exit.
Sep 16 04:32:05.037231 systemd-logind[1529]: Removed session 57.
Sep 16 04:32:05.596473 containerd[1558]: time="2025-09-16T04:32:05.596415588Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"0f5b65d0b3b326c7e0457ef920e27d2c5367da47def0538edb4e45ba7caf82f0\" pid:7998 exited_at:{seconds:1757997125 nanos:596095260}"
Sep 16 04:32:10.193545 systemd[1]: Started sshd@57-138.201.119.17:22-139.178.89.65:60416.service - OpenSSH per-connection server daemon (139.178.89.65:60416).
Sep 16 04:32:11.212181 sshd[8008]: Accepted publickey for core from 139.178.89.65 port 60416 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:11.214283 sshd-session[8008]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:11.220725 systemd-logind[1529]: New session 58 of user core.
Sep 16 04:32:11.230878 systemd[1]: Started session-58.scope - Session 58 of User core.
Sep 16 04:32:11.984251 sshd[8011]: Connection closed by 139.178.89.65 port 60416
Sep 16 04:32:11.985245 sshd-session[8008]: pam_unix(sshd:session): session closed for user core
Sep 16 04:32:11.992667 systemd[1]: sshd@57-138.201.119.17:22-139.178.89.65:60416.service: Deactivated successfully.
Sep 16 04:32:11.997169 systemd[1]: session-58.scope: Deactivated successfully.
Sep 16 04:32:11.998440 systemd-logind[1529]: Session 58 logged out. Waiting for processes to exit.
Sep 16 04:32:12.000041 systemd-logind[1529]: Removed session 58.
Sep 16 04:32:12.756344 containerd[1558]: time="2025-09-16T04:32:12.756122693Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"5ef2e2acc286c6af965b2b5c8dd0433119ad648564d1f2198100c9d08a803040\" pid:8035 exited_at:{seconds:1757997132 nanos:755798246}"
Sep 16 04:32:17.154791 systemd[1]: Started sshd@58-138.201.119.17:22-139.178.89.65:60428.service - OpenSSH per-connection server daemon (139.178.89.65:60428).
Sep 16 04:32:18.159686 sshd[8048]: Accepted publickey for core from 139.178.89.65 port 60428 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:18.161433 sshd-session[8048]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:18.170490 systemd-logind[1529]: New session 59 of user core.
Sep 16 04:32:18.177126 systemd[1]: Started session-59.scope - Session 59 of User core.
Sep 16 04:32:18.918403 sshd[8051]: Connection closed by 139.178.89.65 port 60428
Sep 16 04:32:18.919319 sshd-session[8048]: pam_unix(sshd:session): session closed for user core
Sep 16 04:32:18.926148 systemd[1]: sshd@58-138.201.119.17:22-139.178.89.65:60428.service: Deactivated successfully.
Sep 16 04:32:18.928754 systemd[1]: session-59.scope: Deactivated successfully.
Sep 16 04:32:18.930420 systemd-logind[1529]: Session 59 logged out. Waiting for processes to exit.
Sep 16 04:32:18.932334 systemd-logind[1529]: Removed session 59.
Sep 16 04:32:20.311288 containerd[1558]: time="2025-09-16T04:32:20.311249779Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"2f74558485ef97632bb30da06ab1eea1d2c0fb769e348b1f31520575d11fe412\" pid:8076 exited_at:{seconds:1757997140 nanos:310599245}"
Sep 16 04:32:20.962447 containerd[1558]: time="2025-09-16T04:32:20.962292680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"4a2c0396d4fd2a42711f5413a7a2f3212eafe9919e0ed7cf2582d9e97b972036\" pid:8098 exited_at:{seconds:1757997140 nanos:961984553}"
Sep 16 04:32:24.089196 systemd[1]: Started sshd@59-138.201.119.17:22-139.178.89.65:44034.service - OpenSSH per-connection server daemon (139.178.89.65:44034).
Sep 16 04:32:25.100185 sshd[8109]: Accepted publickey for core from 139.178.89.65 port 44034 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:25.102689 sshd-session[8109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:25.107758 systemd-logind[1529]: New session 60 of user core.
Sep 16 04:32:25.117957 systemd[1]: Started session-60.scope - Session 60 of User core.
Sep 16 04:32:25.858144 sshd[8112]: Connection closed by 139.178.89.65 port 44034
Sep 16 04:32:25.858797 sshd-session[8109]: pam_unix(sshd:session): session closed for user core
Sep 16 04:32:25.865334 systemd[1]: sshd@59-138.201.119.17:22-139.178.89.65:44034.service: Deactivated successfully.
Sep 16 04:32:25.867632 systemd[1]: session-60.scope: Deactivated successfully.
Sep 16 04:32:25.871024 systemd-logind[1529]: Session 60 logged out. Waiting for processes to exit.
Sep 16 04:32:25.872728 systemd-logind[1529]: Removed session 60.
Sep 16 04:32:29.296529 containerd[1558]: time="2025-09-16T04:32:29.296486841Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"13df23011b6bb05a83b0a689e39414f7b485078fd0001ba779a56e2e28cf265b\" pid:8136 exited_at:{seconds:1757997149 nanos:296146154}"
Sep 16 04:32:31.028980 systemd[1]: Started sshd@60-138.201.119.17:22-139.178.89.65:41810.service - OpenSSH per-connection server daemon (139.178.89.65:41810).
Sep 16 04:32:32.031418 sshd[8152]: Accepted publickey for core from 139.178.89.65 port 41810 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:32.034508 sshd-session[8152]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:32.039918 systemd-logind[1529]: New session 61 of user core.
Sep 16 04:32:32.048004 systemd[1]: Started session-61.scope - Session 61 of User core.
Sep 16 04:32:32.788658 sshd[8157]: Connection closed by 139.178.89.65 port 41810
Sep 16 04:32:32.789400 sshd-session[8152]: pam_unix(sshd:session): session closed for user core
Sep 16 04:32:32.793978 systemd[1]: sshd@60-138.201.119.17:22-139.178.89.65:41810.service: Deactivated successfully.
Sep 16 04:32:32.796839 systemd[1]: session-61.scope: Deactivated successfully.
Sep 16 04:32:32.799621 systemd-logind[1529]: Session 61 logged out. Waiting for processes to exit.
Sep 16 04:32:32.801128 systemd-logind[1529]: Removed session 61.
Sep 16 04:32:37.970775 systemd[1]: Started sshd@61-138.201.119.17:22-139.178.89.65:41818.service - OpenSSH per-connection server daemon (139.178.89.65:41818).
Sep 16 04:32:38.976167 sshd[8172]: Accepted publickey for core from 139.178.89.65 port 41818 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:38.978370 sshd-session[8172]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:38.984653 systemd-logind[1529]: New session 62 of user core.
Sep 16 04:32:38.988811 systemd[1]: Started session-62.scope - Session 62 of User core.
Sep 16 04:32:39.736184 sshd[8189]: Connection closed by 139.178.89.65 port 41818
Sep 16 04:32:39.737158 sshd-session[8172]: pam_unix(sshd:session): session closed for user core
Sep 16 04:32:39.743820 systemd[1]: sshd@61-138.201.119.17:22-139.178.89.65:41818.service: Deactivated successfully.
Sep 16 04:32:39.747297 systemd[1]: session-62.scope: Deactivated successfully.
Sep 16 04:32:39.749675 systemd-logind[1529]: Session 62 logged out. Waiting for processes to exit.
Sep 16 04:32:39.752418 systemd-logind[1529]: Removed session 62.
Sep 16 04:32:42.765108 containerd[1558]: time="2025-09-16T04:32:42.765032250Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"c8ce2c062b0a2472b1099eceba879a9f6d80bf91b5dd25a8e3c5b52fa215dab1\" pid:8211 exited_at:{seconds:1757997162 nanos:764140792}"
Sep 16 04:32:44.915393 systemd[1]: Started sshd@62-138.201.119.17:22-139.178.89.65:35166.service - OpenSSH per-connection server daemon (139.178.89.65:35166).
Sep 16 04:32:45.922373 sshd[8224]: Accepted publickey for core from 139.178.89.65 port 35166 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:45.924295 sshd-session[8224]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:45.929723 systemd-logind[1529]: New session 63 of user core.
Sep 16 04:32:45.939943 systemd[1]: Started session-63.scope - Session 63 of User core.
Sep 16 04:32:46.683661 sshd[8227]: Connection closed by 139.178.89.65 port 35166
Sep 16 04:32:46.684638 sshd-session[8224]: pam_unix(sshd:session): session closed for user core
Sep 16 04:32:46.689844 systemd[1]: sshd@62-138.201.119.17:22-139.178.89.65:35166.service: Deactivated successfully.
Sep 16 04:32:46.692770 systemd[1]: session-63.scope: Deactivated successfully.
Sep 16 04:32:46.695646 systemd-logind[1529]: Session 63 logged out. Waiting for processes to exit.
Sep 16 04:32:46.697498 systemd-logind[1529]: Removed session 63.
Sep 16 04:32:50.223814 containerd[1558]: time="2025-09-16T04:32:50.223745803Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"97d32bcf433a3793de90c8c64aa28c4a21af6438d1d514831079c1e5e37cf7e0\" pid:8251 exited_at:{seconds:1757997170 nanos:223327914}"
Sep 16 04:32:51.857081 systemd[1]: Started sshd@63-138.201.119.17:22-139.178.89.65:50842.service - OpenSSH per-connection server daemon (139.178.89.65:50842).
Sep 16 04:32:52.863241 sshd[8261]: Accepted publickey for core from 139.178.89.65 port 50842 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:52.867302 sshd-session[8261]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:52.873661 systemd-logind[1529]: New session 64 of user core.
Sep 16 04:32:52.879886 systemd[1]: Started session-64.scope - Session 64 of User core.
Sep 16 04:32:53.650620 sshd[8264]: Connection closed by 139.178.89.65 port 50842
Sep 16 04:32:53.651608 sshd-session[8261]: pam_unix(sshd:session): session closed for user core
Sep 16 04:32:53.657126 systemd[1]: sshd@63-138.201.119.17:22-139.178.89.65:50842.service: Deactivated successfully.
Sep 16 04:32:53.661750 systemd[1]: session-64.scope: Deactivated successfully.
Sep 16 04:32:53.663783 systemd-logind[1529]: Session 64 logged out. Waiting for processes to exit.
Sep 16 04:32:53.668229 systemd-logind[1529]: Removed session 64.
Sep 16 04:32:58.821914 systemd[1]: Started sshd@64-138.201.119.17:22-139.178.89.65:50848.service - OpenSSH per-connection server daemon (139.178.89.65:50848).
Sep 16 04:32:59.295896 containerd[1558]: time="2025-09-16T04:32:59.295836512Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"2094fa04ae5ef7a717ac67e910b01e719824d5d4ac475f928549f02e5c113003\" pid:8293 exited_at:{seconds:1757997179 nanos:294620288}"
Sep 16 04:32:59.811703 sshd[8278]: Accepted publickey for core from 139.178.89.65 port 50848 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:32:59.813519 sshd-session[8278]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:32:59.818888 systemd-logind[1529]: New session 65 of user core.
Sep 16 04:32:59.825884 systemd[1]: Started session-65.scope - Session 65 of User core.
Sep 16 04:33:00.569809 sshd[8302]: Connection closed by 139.178.89.65 port 50848
Sep 16 04:33:00.571035 sshd-session[8278]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:00.578042 systemd[1]: sshd@64-138.201.119.17:22-139.178.89.65:50848.service: Deactivated successfully.
Sep 16 04:33:00.581984 systemd[1]: session-65.scope: Deactivated successfully.
Sep 16 04:33:00.583862 systemd-logind[1529]: Session 65 logged out. Waiting for processes to exit.
Sep 16 04:33:00.586435 systemd-logind[1529]: Removed session 65.
Sep 16 04:33:05.593968 containerd[1558]: time="2025-09-16T04:33:05.593703866Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"30dde70283efaa2f23b94dacacbadcc99de0c9729cd40322ea54d4b4c4468f55\" pid:8333 exited_at:{seconds:1757997185 nanos:593134055}"
Sep 16 04:33:05.747862 systemd[1]: Started sshd@65-138.201.119.17:22-139.178.89.65:58002.service - OpenSSH per-connection server daemon (139.178.89.65:58002).
Sep 16 04:33:06.759038 sshd[8344]: Accepted publickey for core from 139.178.89.65 port 58002 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:33:06.760956 sshd-session[8344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:33:06.765632 systemd-logind[1529]: New session 66 of user core.
Sep 16 04:33:06.774816 systemd[1]: Started session-66.scope - Session 66 of User core.
Sep 16 04:33:07.525291 sshd[8347]: Connection closed by 139.178.89.65 port 58002
Sep 16 04:33:07.526239 sshd-session[8344]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:07.533075 systemd[1]: sshd@65-138.201.119.17:22-139.178.89.65:58002.service: Deactivated successfully.
Sep 16 04:33:07.537366 systemd[1]: session-66.scope: Deactivated successfully.
Sep 16 04:33:07.538733 systemd-logind[1529]: Session 66 logged out. Waiting for processes to exit.
Sep 16 04:33:07.540395 systemd-logind[1529]: Removed session 66.
Sep 16 04:33:12.691912 systemd[1]: Started sshd@66-138.201.119.17:22-139.178.89.65:48658.service - OpenSSH per-connection server daemon (139.178.89.65:48658).
Sep 16 04:33:12.762810 containerd[1558]: time="2025-09-16T04:33:12.762760301Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"331bddcb611ac12d0a8758faf52fac38d35678101027367f641085e9637b0b69\" pid:8372 exited_at:{seconds:1757997192 nanos:762368333}"
Sep 16 04:33:13.677247 sshd[8371]: Accepted publickey for core from 139.178.89.65 port 48658 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:33:13.679970 sshd-session[8371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:33:13.686092 systemd-logind[1529]: New session 67 of user core.
Sep 16 04:33:13.691888 systemd[1]: Started session-67.scope - Session 67 of User core.
Sep 16 04:33:14.421527 sshd[8386]: Connection closed by 139.178.89.65 port 48658
Sep 16 04:33:14.422287 sshd-session[8371]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:14.427488 systemd[1]: sshd@66-138.201.119.17:22-139.178.89.65:48658.service: Deactivated successfully.
Sep 16 04:33:14.430488 systemd[1]: session-67.scope: Deactivated successfully.
Sep 16 04:33:14.432635 systemd-logind[1529]: Session 67 logged out. Waiting for processes to exit.
Sep 16 04:33:14.434286 systemd-logind[1529]: Removed session 67.
Sep 16 04:33:19.599003 systemd[1]: Started sshd@67-138.201.119.17:22-139.178.89.65:48660.service - OpenSSH per-connection server daemon (139.178.89.65:48660).
Sep 16 04:33:20.227003 containerd[1558]: time="2025-09-16T04:33:20.226958263Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"fb11ad77c01b6a580004b589c470a210411c2056d9de6ba5cef74d949da56c04\" pid:8417 exited_at:{seconds:1757997200 nanos:226343691}"
Sep 16 04:33:20.623412 sshd[8398]: Accepted publickey for core from 139.178.89.65 port 48660 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:33:20.625157 sshd-session[8398]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:33:20.631727 systemd-logind[1529]: New session 68 of user core.
Sep 16 04:33:20.636779 systemd[1]: Started session-68.scope - Session 68 of User core.
Sep 16 04:33:20.860312 containerd[1558]: time="2025-09-16T04:33:20.860250562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"de17d86a7314de41c5ee45001ea5960d0fd298a3958e9f4418635901b4927e87\" pid:8440 exited_at:{seconds:1757997200 nanos:858876016}"
Sep 16 04:33:21.395175 sshd[8427]: Connection closed by 139.178.89.65 port 48660
Sep 16 04:33:21.394949 sshd-session[8398]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:21.402070 systemd[1]: sshd@67-138.201.119.17:22-139.178.89.65:48660.service: Deactivated successfully.
Sep 16 04:33:21.405824 systemd[1]: session-68.scope: Deactivated successfully.
Sep 16 04:33:21.407327 systemd-logind[1529]: Session 68 logged out. Waiting for processes to exit.
Sep 16 04:33:21.409431 systemd-logind[1529]: Removed session 68.
Sep 16 04:33:26.578648 systemd[1]: Started sshd@68-138.201.119.17:22-139.178.89.65:34022.service - OpenSSH per-connection server daemon (139.178.89.65:34022).
Sep 16 04:33:27.650662 sshd[8461]: Accepted publickey for core from 139.178.89.65 port 34022 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:33:27.652839 sshd-session[8461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:33:27.661073 systemd-logind[1529]: New session 69 of user core.
Sep 16 04:33:27.664774 systemd[1]: Started session-69.scope - Session 69 of User core.
Sep 16 04:33:28.450208 sshd[8464]: Connection closed by 139.178.89.65 port 34022
Sep 16 04:33:28.450770 sshd-session[8461]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:28.459206 systemd[1]: sshd@68-138.201.119.17:22-139.178.89.65:34022.service: Deactivated successfully.
Sep 16 04:33:28.462496 systemd[1]: session-69.scope: Deactivated successfully.
Sep 16 04:33:28.464335 systemd-logind[1529]: Session 69 logged out. Waiting for processes to exit.
Sep 16 04:33:28.467420 systemd-logind[1529]: Removed session 69.
Sep 16 04:33:29.302818 containerd[1558]: time="2025-09-16T04:33:29.302644147Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"a92960d77d6834fe3fc3d4b73fa3e0db846b7394ccff8e45e059b8085276a62c\" pid:8487 exited_at:{seconds:1757997209 nanos:299967457}"
Sep 16 04:33:33.628958 systemd[1]: Started sshd@69-138.201.119.17:22-139.178.89.65:35092.service - OpenSSH per-connection server daemon (139.178.89.65:35092).
Sep 16 04:33:34.630462 sshd[8499]: Accepted publickey for core from 139.178.89.65 port 35092 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:33:34.632631 sshd-session[8499]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:33:34.638857 systemd-logind[1529]: New session 70 of user core.
Sep 16 04:33:34.650911 systemd[1]: Started session-70.scope - Session 70 of User core.
Sep 16 04:33:35.392417 sshd[8502]: Connection closed by 139.178.89.65 port 35092
Sep 16 04:33:35.393909 sshd-session[8499]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:35.399180 systemd-logind[1529]: Session 70 logged out. Waiting for processes to exit.
Sep 16 04:33:35.399437 systemd[1]: sshd@69-138.201.119.17:22-139.178.89.65:35092.service: Deactivated successfully.
Sep 16 04:33:35.402419 systemd[1]: session-70.scope: Deactivated successfully.
Sep 16 04:33:35.406489 systemd-logind[1529]: Removed session 70.
Sep 16 04:33:40.571235 systemd[1]: Started sshd@70-138.201.119.17:22-139.178.89.65:60740.service - OpenSSH per-connection server daemon (139.178.89.65:60740).
Sep 16 04:33:41.593508 sshd[8514]: Accepted publickey for core from 139.178.89.65 port 60740 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:33:41.595417 sshd-session[8514]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:33:41.602679 systemd-logind[1529]: New session 71 of user core.
Sep 16 04:33:41.611943 systemd[1]: Started session-71.scope - Session 71 of User core.
Sep 16 04:33:42.367475 sshd[8517]: Connection closed by 139.178.89.65 port 60740
Sep 16 04:33:42.368796 sshd-session[8514]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:42.375959 systemd[1]: sshd@70-138.201.119.17:22-139.178.89.65:60740.service: Deactivated successfully.
Sep 16 04:33:42.379057 systemd[1]: session-71.scope: Deactivated successfully.
Sep 16 04:33:42.381004 systemd-logind[1529]: Session 71 logged out. Waiting for processes to exit.
Sep 16 04:33:42.382776 systemd-logind[1529]: Removed session 71.
Sep 16 04:33:42.761698 containerd[1558]: time="2025-09-16T04:33:42.761640673Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"eb824cac5c8bb9ca47fdc760d0479aa552a48b2e5924cba365410d41c377a18d\" pid:8540 exited_at:{seconds:1757997222 nanos:761066182}"
Sep 16 04:33:47.541748 systemd[1]: Started sshd@71-138.201.119.17:22-139.178.89.65:60746.service - OpenSSH per-connection server daemon (139.178.89.65:60746).
Sep 16 04:33:48.542262 sshd[8552]: Accepted publickey for core from 139.178.89.65 port 60746 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:33:48.545264 sshd-session[8552]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:33:48.551926 systemd-logind[1529]: New session 72 of user core.
Sep 16 04:33:48.556804 systemd[1]: Started session-72.scope - Session 72 of User core.
Sep 16 04:33:49.307813 sshd[8555]: Connection closed by 139.178.89.65 port 60746
Sep 16 04:33:49.308844 sshd-session[8552]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:49.314847 systemd-logind[1529]: Session 72 logged out. Waiting for processes to exit.
Sep 16 04:33:49.316318 systemd[1]: sshd@71-138.201.119.17:22-139.178.89.65:60746.service: Deactivated successfully.
Sep 16 04:33:49.318773 systemd[1]: session-72.scope: Deactivated successfully.
Sep 16 04:33:49.319919 systemd-logind[1529]: Removed session 72.
Sep 16 04:33:50.223447 containerd[1558]: time="2025-09-16T04:33:50.223366580Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"23e0dacb76f980e63f780b5544c31ce27c8071e02ee6a70af24c51e4b1960103\" pid:8578 exited_at:{seconds:1757997230 nanos:222944172}"
Sep 16 04:33:54.478779 systemd[1]: Started sshd@72-138.201.119.17:22-139.178.89.65:54488.service - OpenSSH per-connection server daemon (139.178.89.65:54488).
Sep 16 04:33:55.485207 sshd[8590]: Accepted publickey for core from 139.178.89.65 port 54488 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:33:55.487530 sshd-session[8590]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:33:55.494647 systemd-logind[1529]: New session 73 of user core.
Sep 16 04:33:55.501943 systemd[1]: Started session-73.scope - Session 73 of User core.
Sep 16 04:33:56.286655 sshd[8593]: Connection closed by 139.178.89.65 port 54488
Sep 16 04:33:56.287672 sshd-session[8590]: pam_unix(sshd:session): session closed for user core
Sep 16 04:33:56.293308 systemd[1]: sshd@72-138.201.119.17:22-139.178.89.65:54488.service: Deactivated successfully.
Sep 16 04:33:56.298440 systemd[1]: session-73.scope: Deactivated successfully.
Sep 16 04:33:56.299783 systemd-logind[1529]: Session 73 logged out. Waiting for processes to exit.
Sep 16 04:33:56.301712 systemd-logind[1529]: Removed session 73.
Sep 16 04:33:59.297987 containerd[1558]: time="2025-09-16T04:33:59.297922283Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"703f2d1643b9e2c7a5c2643f135c5c7c783afe0ab9b4dec30919b38b2f2d6969\" pid:8616 exited_at:{seconds:1757997239 nanos:297144589}"
Sep 16 04:34:01.462903 systemd[1]: Started sshd@73-138.201.119.17:22-139.178.89.65:49700.service - OpenSSH per-connection server daemon (139.178.89.65:49700).
Sep 16 04:34:02.457337 sshd[8628]: Accepted publickey for core from 139.178.89.65 port 49700 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:02.459830 sshd-session[8628]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:02.467637 systemd-logind[1529]: New session 74 of user core.
Sep 16 04:34:02.488929 systemd[1]: Started session-74.scope - Session 74 of User core.
Sep 16 04:34:03.211814 sshd[8631]: Connection closed by 139.178.89.65 port 49700
Sep 16 04:34:03.212631 sshd-session[8628]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:03.217724 systemd[1]: sshd@73-138.201.119.17:22-139.178.89.65:49700.service: Deactivated successfully.
Sep 16 04:34:03.220245 systemd[1]: session-74.scope: Deactivated successfully.
Sep 16 04:34:03.222394 systemd-logind[1529]: Session 74 logged out. Waiting for processes to exit.
Sep 16 04:34:03.225162 systemd-logind[1529]: Removed session 74.
Sep 16 04:34:05.588114 containerd[1558]: time="2025-09-16T04:34:05.588056406Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"416cd5b0edca192ae0df294d6d084113667f5fe4b72480cedfe6244e6f204aa9\" pid:8654 exited_at:{seconds:1757997245 nanos:587816041}"
Sep 16 04:34:08.382743 systemd[1]: Started sshd@74-138.201.119.17:22-139.178.89.65:49704.service - OpenSSH per-connection server daemon (139.178.89.65:49704).
Sep 16 04:34:09.385982 sshd[8671]: Accepted publickey for core from 139.178.89.65 port 49704 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:09.388564 sshd-session[8671]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:09.397684 systemd-logind[1529]: New session 75 of user core.
Sep 16 04:34:09.400867 systemd[1]: Started session-75.scope - Session 75 of User core.
Sep 16 04:34:10.182234 sshd[8674]: Connection closed by 139.178.89.65 port 49704
Sep 16 04:34:10.183194 sshd-session[8671]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:10.188519 systemd[1]: sshd@74-138.201.119.17:22-139.178.89.65:49704.service: Deactivated successfully.
Sep 16 04:34:10.191374 systemd[1]: session-75.scope: Deactivated successfully.
Sep 16 04:34:10.192604 systemd-logind[1529]: Session 75 logged out. Waiting for processes to exit.
Sep 16 04:34:10.194571 systemd-logind[1529]: Removed session 75.
Sep 16 04:34:12.756664 containerd[1558]: time="2025-09-16T04:34:12.756620460Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"9f40274926a76648a63dd711557532f71e27fdb89e60000861ff7920edf0ba65\" pid:8711 exited_at:{seconds:1757997252 nanos:756141291}"
Sep 16 04:34:15.353829 systemd[1]: Started sshd@75-138.201.119.17:22-139.178.89.65:36240.service - OpenSSH per-connection server daemon (139.178.89.65:36240).
Sep 16 04:34:16.349789 sshd[8724]: Accepted publickey for core from 139.178.89.65 port 36240 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:16.352426 sshd-session[8724]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:16.359661 systemd-logind[1529]: New session 76 of user core.
Sep 16 04:34:16.367317 systemd[1]: Started session-76.scope - Session 76 of User core.
Sep 16 04:34:17.113727 sshd[8727]: Connection closed by 139.178.89.65 port 36240
Sep 16 04:34:17.114701 sshd-session[8724]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:17.121517 systemd[1]: sshd@75-138.201.119.17:22-139.178.89.65:36240.service: Deactivated successfully.
Sep 16 04:34:17.125284 systemd[1]: session-76.scope: Deactivated successfully.
Sep 16 04:34:17.126965 systemd-logind[1529]: Session 76 logged out. Waiting for processes to exit.
Sep 16 04:34:17.128640 systemd-logind[1529]: Removed session 76.
Sep 16 04:34:20.233210 containerd[1558]: time="2025-09-16T04:34:20.233148670Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"89e122b1cc7188b008420e6e49f7ad0cd6cc30debd48fe21c3d29a23c662d3b9\" pid:8751 exited_at:{seconds:1757997260 nanos:232667502}"
Sep 16 04:34:20.791887 containerd[1558]: time="2025-09-16T04:34:20.791829451Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"474f1b054fa62aaa1d1cffd3a44843bd8073b5a58cf45a8d21350371a6a5c7e0\" pid:8773 exited_at:{seconds:1757997260 nanos:791475885}"
Sep 16 04:34:22.291770 systemd[1]: Started sshd@76-138.201.119.17:22-139.178.89.65:40078.service - OpenSSH per-connection server daemon (139.178.89.65:40078).
Sep 16 04:34:23.292256 sshd[8785]: Accepted publickey for core from 139.178.89.65 port 40078 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:23.294977 sshd-session[8785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:23.300542 systemd-logind[1529]: New session 77 of user core.
Sep 16 04:34:23.304835 systemd[1]: Started session-77.scope - Session 77 of User core.
Sep 16 04:34:24.060687 sshd[8788]: Connection closed by 139.178.89.65 port 40078
Sep 16 04:34:24.061949 sshd-session[8785]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:24.067137 systemd-logind[1529]: Session 77 logged out. Waiting for processes to exit.
Sep 16 04:34:24.067550 systemd[1]: sshd@76-138.201.119.17:22-139.178.89.65:40078.service: Deactivated successfully.
Sep 16 04:34:24.071256 systemd[1]: session-77.scope: Deactivated successfully.
Sep 16 04:34:24.073892 systemd-logind[1529]: Removed session 77.
Sep 16 04:34:29.242205 systemd[1]: Started sshd@77-138.201.119.17:22-139.178.89.65:40080.service - OpenSSH per-connection server daemon (139.178.89.65:40080).
Sep 16 04:34:29.299767 containerd[1558]: time="2025-09-16T04:34:29.299689270Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"07c36e52473773598e0f62a2ad7b60d063f005e2f202e218d348e21186a92f6b\" pid:8814 exited_at:{seconds:1757997269 nanos:298951417}"
Sep 16 04:34:30.257465 sshd[8800]: Accepted publickey for core from 139.178.89.65 port 40080 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:30.260075 sshd-session[8800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:30.265905 systemd-logind[1529]: New session 78 of user core.
Sep 16 04:34:30.272772 systemd[1]: Started session-78.scope - Session 78 of User core.
Sep 16 04:34:31.028366 sshd[8823]: Connection closed by 139.178.89.65 port 40080
Sep 16 04:34:31.029374 sshd-session[8800]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:31.034922 systemd-logind[1529]: Session 78 logged out. Waiting for processes to exit.
Sep 16 04:34:31.035442 systemd[1]: sshd@77-138.201.119.17:22-139.178.89.65:40080.service: Deactivated successfully.
Sep 16 04:34:31.038872 systemd[1]: session-78.scope: Deactivated successfully.
Sep 16 04:34:31.040687 systemd-logind[1529]: Removed session 78.
Sep 16 04:34:36.199897 systemd[1]: Started sshd@78-138.201.119.17:22-139.178.89.65:51058.service - OpenSSH per-connection server daemon (139.178.89.65:51058).
Sep 16 04:34:37.192179 sshd[8836]: Accepted publickey for core from 139.178.89.65 port 51058 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:37.194314 sshd-session[8836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:37.201696 systemd-logind[1529]: New session 79 of user core.
Sep 16 04:34:37.210939 systemd[1]: Started session-79.scope - Session 79 of User core.
Sep 16 04:34:37.954400 sshd[8839]: Connection closed by 139.178.89.65 port 51058
Sep 16 04:34:37.955259 sshd-session[8836]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:37.961483 systemd[1]: sshd@78-138.201.119.17:22-139.178.89.65:51058.service: Deactivated successfully.
Sep 16 04:34:37.965229 systemd[1]: session-79.scope: Deactivated successfully.
Sep 16 04:34:37.966231 systemd-logind[1529]: Session 79 logged out. Waiting for processes to exit.
Sep 16 04:34:37.967897 systemd-logind[1529]: Removed session 79.
Sep 16 04:34:42.762944 containerd[1558]: time="2025-09-16T04:34:42.762690415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"78cbc4e6a80b4ba047984bdf680980dffe886655ffb55ba4cd111a3d669e3f3d\" pid:8862 exited_at:{seconds:1757997282 nanos:761647917}"
Sep 16 04:34:43.132062 systemd[1]: Started sshd@79-138.201.119.17:22-139.178.89.65:47296.service - OpenSSH per-connection server daemon (139.178.89.65:47296).
Sep 16 04:34:44.151250 sshd[8874]: Accepted publickey for core from 139.178.89.65 port 47296 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:44.153780 sshd-session[8874]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:44.158472 systemd-logind[1529]: New session 80 of user core.
Sep 16 04:34:44.163792 systemd[1]: Started session-80.scope - Session 80 of User core.
Sep 16 04:34:44.922614 sshd[8877]: Connection closed by 139.178.89.65 port 47296
Sep 16 04:34:44.923015 sshd-session[8874]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:44.928715 systemd-logind[1529]: Session 80 logged out. Waiting for processes to exit.
Sep 16 04:34:44.929822 systemd[1]: sshd@79-138.201.119.17:22-139.178.89.65:47296.service: Deactivated successfully.
Sep 16 04:34:44.934332 systemd[1]: session-80.scope: Deactivated successfully.
Sep 16 04:34:44.939084 systemd-logind[1529]: Removed session 80.
Sep 16 04:34:45.094885 systemd[1]: Started sshd@80-138.201.119.17:22-139.178.89.65:47300.service - OpenSSH per-connection server daemon (139.178.89.65:47300).
Sep 16 04:34:46.087323 sshd[8888]: Accepted publickey for core from 139.178.89.65 port 47300 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:46.089330 sshd-session[8888]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:46.095910 systemd-logind[1529]: New session 81 of user core.
Sep 16 04:34:46.104919 systemd[1]: Started session-81.scope - Session 81 of User core.
Sep 16 04:34:46.983859 sshd[8891]: Connection closed by 139.178.89.65 port 47300
Sep 16 04:34:46.985430 sshd-session[8888]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:46.990808 systemd[1]: sshd@80-138.201.119.17:22-139.178.89.65:47300.service: Deactivated successfully.
Sep 16 04:34:46.990966 systemd-logind[1529]: Session 81 logged out. Waiting for processes to exit.
Sep 16 04:34:46.993110 systemd[1]: session-81.scope: Deactivated successfully.
Sep 16 04:34:46.995607 systemd-logind[1529]: Removed session 81.
Sep 16 04:34:47.156892 systemd[1]: Started sshd@81-138.201.119.17:22-139.178.89.65:47316.service - OpenSSH per-connection server daemon (139.178.89.65:47316).
Sep 16 04:34:48.160705 sshd[8902]: Accepted publickey for core from 139.178.89.65 port 47316 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:48.162759 sshd-session[8902]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:48.170679 systemd-logind[1529]: New session 82 of user core.
Sep 16 04:34:48.172783 systemd[1]: Started session-82.scope - Session 82 of User core.
Sep 16 04:34:49.543293 sshd[8906]: Connection closed by 139.178.89.65 port 47316
Sep 16 04:34:49.544241 sshd-session[8902]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:49.550167 systemd[1]: sshd@81-138.201.119.17:22-139.178.89.65:47316.service: Deactivated successfully.
Sep 16 04:34:49.554595 systemd[1]: session-82.scope: Deactivated successfully.
Sep 16 04:34:49.558038 systemd-logind[1529]: Session 82 logged out. Waiting for processes to exit.
Sep 16 04:34:49.560572 systemd-logind[1529]: Removed session 82.
Sep 16 04:34:49.716560 systemd[1]: Started sshd@82-138.201.119.17:22-139.178.89.65:47322.service - OpenSSH per-connection server daemon (139.178.89.65:47322).
Sep 16 04:34:50.219328 containerd[1558]: time="2025-09-16T04:34:50.219265677Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"cb29cd1949128173a244d8b605eae7f80573c3c4c417db0b804b07077191825a\" pid:8944 exited_at:{seconds:1757997290 nanos:218833269}"
Sep 16 04:34:50.726543 sshd[8928]: Accepted publickey for core from 139.178.89.65 port 47322 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:50.728752 sshd-session[8928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:50.734632 systemd-logind[1529]: New session 83 of user core.
Sep 16 04:34:50.741823 systemd[1]: Started session-83.scope - Session 83 of User core.
Sep 16 04:34:51.617813 sshd[8954]: Connection closed by 139.178.89.65 port 47322
Sep 16 04:34:51.616968 sshd-session[8928]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:51.622083 systemd[1]: sshd@82-138.201.119.17:22-139.178.89.65:47322.service: Deactivated successfully.
Sep 16 04:34:51.625834 systemd[1]: session-83.scope: Deactivated successfully.
Sep 16 04:34:51.626887 systemd-logind[1529]: Session 83 logged out. Waiting for processes to exit.
Sep 16 04:34:51.628475 systemd-logind[1529]: Removed session 83.
Sep 16 04:34:51.789956 systemd[1]: Started sshd@83-138.201.119.17:22-139.178.89.65:43906.service - OpenSSH per-connection server daemon (139.178.89.65:43906).
Sep 16 04:34:52.786780 sshd[8963]: Accepted publickey for core from 139.178.89.65 port 43906 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:52.789371 sshd-session[8963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:52.795638 systemd-logind[1529]: New session 84 of user core.
Sep 16 04:34:52.800909 systemd[1]: Started session-84.scope - Session 84 of User core.
Sep 16 04:34:53.549075 sshd[8966]: Connection closed by 139.178.89.65 port 43906
Sep 16 04:34:53.550889 sshd-session[8963]: pam_unix(sshd:session): session closed for user core
Sep 16 04:34:53.557305 systemd[1]: sshd@83-138.201.119.17:22-139.178.89.65:43906.service: Deactivated successfully.
Sep 16 04:34:53.559972 systemd[1]: session-84.scope: Deactivated successfully.
Sep 16 04:34:53.561388 systemd-logind[1529]: Session 84 logged out. Waiting for processes to exit.
Sep 16 04:34:53.565404 systemd-logind[1529]: Removed session 84.
Sep 16 04:34:58.723977 systemd[1]: Started sshd@84-138.201.119.17:22-139.178.89.65:43916.service - OpenSSH per-connection server daemon (139.178.89.65:43916).
Sep 16 04:34:59.455662 containerd[1558]: time="2025-09-16T04:34:59.455168512Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"3c5a107798f9fdebc2c54f3bf87caa06e6416ce64a91d4ee69300388ca120dfe\" pid:8994 exited_at:{seconds:1757997299 nanos:454819386}"
Sep 16 04:34:59.712680 sshd[8979]: Accepted publickey for core from 139.178.89.65 port 43916 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:34:59.714347 sshd-session[8979]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:34:59.719970 systemd-logind[1529]: New session 85 of user core.
Sep 16 04:34:59.726798 systemd[1]: Started session-85.scope - Session 85 of User core.
Sep 16 04:35:00.472567 sshd[9003]: Connection closed by 139.178.89.65 port 43916
Sep 16 04:35:00.474011 sshd-session[8979]: pam_unix(sshd:session): session closed for user core
Sep 16 04:35:00.479739 systemd[1]: sshd@84-138.201.119.17:22-139.178.89.65:43916.service: Deactivated successfully.
Sep 16 04:35:00.483069 systemd[1]: session-85.scope: Deactivated successfully.
Sep 16 04:35:00.485878 systemd-logind[1529]: Session 85 logged out. Waiting for processes to exit.
Sep 16 04:35:00.487647 systemd-logind[1529]: Removed session 85.
Sep 16 04:35:05.589943 containerd[1558]: time="2025-09-16T04:35:05.589878924Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"da1ec99097fc5dbfb9babdde35d8ac581ff8ac94e6f7f5667c5d423c4d218562\" pid:9029 exited_at:{seconds:1757997305 nanos:589059390}"
Sep 16 04:35:05.651752 systemd[1]: Started sshd@85-138.201.119.17:22-139.178.89.65:45934.service - OpenSSH per-connection server daemon (139.178.89.65:45934).
Sep 16 04:35:06.645986 sshd[9040]: Accepted publickey for core from 139.178.89.65 port 45934 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk
Sep 16 04:35:06.648209 sshd-session[9040]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 16 04:35:06.655624 systemd-logind[1529]: New session 86 of user core.
Sep 16 04:35:06.663874 systemd[1]: Started session-86.scope - Session 86 of User core.
Sep 16 04:35:07.408490 sshd[9043]: Connection closed by 139.178.89.65 port 45934
Sep 16 04:35:07.409310 sshd-session[9040]: pam_unix(sshd:session): session closed for user core
Sep 16 04:35:07.415823 systemd[1]: sshd@85-138.201.119.17:22-139.178.89.65:45934.service: Deactivated successfully.
Sep 16 04:35:07.419362 systemd[1]: session-86.scope: Deactivated successfully.
Sep 16 04:35:07.421853 systemd-logind[1529]: Session 86 logged out. Waiting for processes to exit.
Sep 16 04:35:07.424634 systemd-logind[1529]: Removed session 86.
Sep 16 04:35:12.586407 systemd[1]: Started sshd@86-138.201.119.17:22-139.178.89.65:47922.service - OpenSSH per-connection server daemon (139.178.89.65:47922).
Sep 16 04:35:12.754284 containerd[1558]: time="2025-09-16T04:35:12.754224519Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"3497b957d51cd8a4e0efd7c3d446626644b6e4dd18be788a43934e4ff473db6f\" pid:9072 exited_at:{seconds:1757997312 nanos:753187502}" Sep 16 04:35:13.595245 sshd[9055]: Accepted publickey for core from 139.178.89.65 port 47922 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:35:13.597475 sshd-session[9055]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:35:13.603903 systemd-logind[1529]: New session 87 of user core. Sep 16 04:35:13.609966 systemd[1]: Started session-87.scope - Session 87 of User core. Sep 16 04:35:14.349007 sshd[9083]: Connection closed by 139.178.89.65 port 47922 Sep 16 04:35:14.349857 sshd-session[9055]: pam_unix(sshd:session): session closed for user core Sep 16 04:35:14.355190 systemd[1]: sshd@86-138.201.119.17:22-139.178.89.65:47922.service: Deactivated successfully. Sep 16 04:35:14.357225 systemd[1]: session-87.scope: Deactivated successfully. Sep 16 04:35:14.358156 systemd-logind[1529]: Session 87 logged out. Waiting for processes to exit. Sep 16 04:35:14.360234 systemd-logind[1529]: Removed session 87. Sep 16 04:35:19.530468 systemd[1]: Started sshd@87-138.201.119.17:22-139.178.89.65:47936.service - OpenSSH per-connection server daemon (139.178.89.65:47936). Sep 16 04:35:20.206916 containerd[1558]: time="2025-09-16T04:35:20.206794735Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"523093283f718bd18abd861de94f9c898ab7dd322ff046e9d5cfe0c263f3ed9d\" pid:9111 exited_at:{seconds:1757997320 nanos:206466249}" Sep 16 04:35:20.540385 sshd[9094]: Accepted publickey for core from 139.178.89.65 port 47936 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:35:20.542848 sshd-session[9094]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:35:20.550834 systemd-logind[1529]: New session 88 of user core. Sep 16 04:35:20.555781 systemd[1]: Started session-88.scope - Session 88 of User core. Sep 16 04:35:20.792350 containerd[1558]: time="2025-09-16T04:35:20.792225033Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"d2d80156611e1ce28714ecc6c04425764ccc7e9b1e01b0c6d21935fd0b6dc7ed\" pid:9134 exited_at:{seconds:1757997320 nanos:791888507}" Sep 16 04:35:21.306015 sshd[9121]: Connection closed by 139.178.89.65 port 47936 Sep 16 04:35:21.306980 sshd-session[9094]: pam_unix(sshd:session): session closed for user core Sep 16 04:35:21.313038 systemd-logind[1529]: Session 88 logged out. Waiting for processes to exit. Sep 16 04:35:21.313886 systemd[1]: sshd@87-138.201.119.17:22-139.178.89.65:47936.service: Deactivated successfully. Sep 16 04:35:21.316341 systemd[1]: session-88.scope: Deactivated successfully. Sep 16 04:35:21.319992 systemd-logind[1529]: Removed session 88. Sep 16 04:35:26.479315 systemd[1]: Started sshd@88-138.201.119.17:22-139.178.89.65:44752.service - OpenSSH per-connection server daemon (139.178.89.65:44752). 
Sep 16 04:35:27.494643 sshd[9162]: Accepted publickey for core from 139.178.89.65 port 44752 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:35:27.495578 sshd-session[9162]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:35:27.501649 systemd-logind[1529]: New session 89 of user core. Sep 16 04:35:27.504783 systemd[1]: Started session-89.scope - Session 89 of User core. Sep 16 04:35:28.252984 sshd[9165]: Connection closed by 139.178.89.65 port 44752 Sep 16 04:35:28.254836 sshd-session[9162]: pam_unix(sshd:session): session closed for user core Sep 16 04:35:28.261432 systemd[1]: sshd@88-138.201.119.17:22-139.178.89.65:44752.service: Deactivated successfully. Sep 16 04:35:28.263790 systemd[1]: session-89.scope: Deactivated successfully. Sep 16 04:35:28.265830 systemd-logind[1529]: Session 89 logged out. Waiting for processes to exit. Sep 16 04:35:28.267751 systemd-logind[1529]: Removed session 89. Sep 16 04:35:29.320649 containerd[1558]: time="2025-09-16T04:35:29.320331364Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"4e6eabb2a83395532aac55442d130919a8ecf1d33f58a5ba291f804940d27b1f\" pid:9189 exited_at:{seconds:1757997329 nanos:319846876}" Sep 16 04:35:33.438739 systemd[1]: Started sshd@89-138.201.119.17:22-139.178.89.65:35544.service - OpenSSH per-connection server daemon (139.178.89.65:35544). Sep 16 04:35:34.438725 sshd[9201]: Accepted publickey for core from 139.178.89.65 port 35544 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:35:34.441181 sshd-session[9201]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:35:34.448540 systemd-logind[1529]: New session 90 of user core. Sep 16 04:35:34.453902 systemd[1]: Started session-90.scope - Session 90 of User core. Sep 16 04:35:35.192461 sshd[9204]: Connection closed by 139.178.89.65 port 35544 Sep 16 04:35:35.193037 sshd-session[9201]: pam_unix(sshd:session): session closed for user core Sep 16 04:35:35.199300 systemd-logind[1529]: Session 90 logged out. Waiting for processes to exit. Sep 16 04:35:35.199474 systemd[1]: sshd@89-138.201.119.17:22-139.178.89.65:35544.service: Deactivated successfully. Sep 16 04:35:35.204013 systemd[1]: session-90.scope: Deactivated successfully. Sep 16 04:35:35.207324 systemd-logind[1529]: Removed session 90. Sep 16 04:35:40.362521 systemd[1]: Started sshd@90-138.201.119.17:22-139.178.89.65:59242.service - OpenSSH per-connection server daemon (139.178.89.65:59242). Sep 16 04:35:41.352488 sshd[9225]: Accepted publickey for core from 139.178.89.65 port 59242 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:35:41.355272 sshd-session[9225]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:35:41.361684 systemd-logind[1529]: New session 91 of user core. Sep 16 04:35:41.365965 systemd[1]: Started session-91.scope - Session 91 of User core. Sep 16 04:35:42.105514 sshd[9228]: Connection closed by 139.178.89.65 port 59242 Sep 16 04:35:42.105279 sshd-session[9225]: pam_unix(sshd:session): session closed for user core Sep 16 04:35:42.110996 systemd[1]: sshd@90-138.201.119.17:22-139.178.89.65:59242.service: Deactivated successfully. Sep 16 04:35:42.113974 systemd[1]: session-91.scope: Deactivated successfully. Sep 16 04:35:42.115241 systemd-logind[1529]: Session 91 logged out. Waiting for processes to exit. 
Sep 16 04:35:42.117360 systemd-logind[1529]: Removed session 91. Sep 16 04:35:42.749601 containerd[1558]: time="2025-09-16T04:35:42.749547806Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"53e3eba90b47d48a52946160d367f9f664d911a2981f0c1e12d54aa6ee30b6f6\" pid:9264 exited_at:{seconds:1757997342 nanos:749189520}" Sep 16 04:35:47.281505 systemd[1]: Started sshd@91-138.201.119.17:22-139.178.89.65:59244.service - OpenSSH per-connection server daemon (139.178.89.65:59244). Sep 16 04:35:48.284536 sshd[9276]: Accepted publickey for core from 139.178.89.65 port 59244 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:35:48.287233 sshd-session[9276]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:35:48.292679 systemd-logind[1529]: New session 92 of user core. Sep 16 04:35:48.301823 systemd[1]: Started session-92.scope - Session 92 of User core. Sep 16 04:35:49.053611 sshd[9279]: Connection closed by 139.178.89.65 port 59244 Sep 16 04:35:49.052854 sshd-session[9276]: pam_unix(sshd:session): session closed for user core Sep 16 04:35:49.058006 systemd-logind[1529]: Session 92 logged out. Waiting for processes to exit. Sep 16 04:35:49.058505 systemd[1]: sshd@91-138.201.119.17:22-139.178.89.65:59244.service: Deactivated successfully. Sep 16 04:35:49.062031 systemd[1]: session-92.scope: Deactivated successfully. Sep 16 04:35:49.064481 systemd-logind[1529]: Removed session 92. Sep 16 04:35:50.217204 containerd[1558]: time="2025-09-16T04:35:50.217161124Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"11a22178ef6b659717d38ce4a9515f6f880f5388feaab42ab141a63d23632615\" pid:9304 exited_at:{seconds:1757997350 nanos:216675436}" Sep 16 04:35:54.229923 systemd[1]: Started sshd@92-138.201.119.17:22-139.178.89.65:46694.service - OpenSSH per-connection server daemon (139.178.89.65:46694). Sep 16 04:35:55.231836 sshd[9317]: Accepted publickey for core from 139.178.89.65 port 46694 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:35:55.233917 sshd-session[9317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:35:55.241672 systemd-logind[1529]: New session 93 of user core. Sep 16 04:35:55.254894 systemd[1]: Started session-93.scope - Session 93 of User core. Sep 16 04:35:56.005949 sshd[9320]: Connection closed by 139.178.89.65 port 46694 Sep 16 04:35:56.006866 sshd-session[9317]: pam_unix(sshd:session): session closed for user core Sep 16 04:35:56.011608 systemd[1]: sshd@92-138.201.119.17:22-139.178.89.65:46694.service: Deactivated successfully. Sep 16 04:35:56.014400 systemd[1]: session-93.scope: Deactivated successfully. Sep 16 04:35:56.016112 systemd-logind[1529]: Session 93 logged out. Waiting for processes to exit. Sep 16 04:35:56.018213 systemd-logind[1529]: Removed session 93. Sep 16 04:35:59.298459 containerd[1558]: time="2025-09-16T04:35:59.298378591Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"1796d4e37f00da234efe23dee391cebaae10571daa820dd797f495cbb8528510\" pid:9344 exited_at:{seconds:1757997359 nanos:297956224}" Sep 16 04:36:01.176840 systemd[1]: Started sshd@93-138.201.119.17:22-139.178.89.65:35804.service - OpenSSH per-connection server daemon (139.178.89.65:35804). 
Sep 16 04:36:02.178933 sshd[9354]: Accepted publickey for core from 139.178.89.65 port 35804 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:02.181303 sshd-session[9354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:02.189842 systemd-logind[1529]: New session 94 of user core. Sep 16 04:36:02.197853 systemd[1]: Started session-94.scope - Session 94 of User core. Sep 16 04:36:02.939467 sshd[9359]: Connection closed by 139.178.89.65 port 35804 Sep 16 04:36:02.939861 sshd-session[9354]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:02.946653 systemd[1]: session-94.scope: Deactivated successfully. Sep 16 04:36:02.946701 systemd-logind[1529]: Session 94 logged out. Waiting for processes to exit. Sep 16 04:36:02.949172 systemd[1]: sshd@93-138.201.119.17:22-139.178.89.65:35804.service: Deactivated successfully. Sep 16 04:36:02.955070 systemd-logind[1529]: Removed session 94. Sep 16 04:36:05.591992 containerd[1558]: time="2025-09-16T04:36:05.591839125Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"b2e7cc0175a95f0e02865588d367ee8af6a46d1d9dbbf716f054dfc275be7658\" pid:9382 exited_at:{seconds:1757997365 nanos:591410999}" Sep 16 04:36:08.113160 systemd[1]: Started sshd@94-138.201.119.17:22-139.178.89.65:35810.service - OpenSSH per-connection server daemon (139.178.89.65:35810). Sep 16 04:36:09.113952 sshd[9392]: Accepted publickey for core from 139.178.89.65 port 35810 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:09.115984 sshd-session[9392]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:09.125691 systemd-logind[1529]: New session 95 of user core. Sep 16 04:36:09.128822 systemd[1]: Started session-95.scope - Session 95 of User core. Sep 16 04:36:09.875300 sshd[9395]: Connection closed by 139.178.89.65 port 35810 Sep 16 04:36:09.875061 sshd-session[9392]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:09.881428 systemd[1]: sshd@94-138.201.119.17:22-139.178.89.65:35810.service: Deactivated successfully. Sep 16 04:36:09.885510 systemd[1]: session-95.scope: Deactivated successfully. Sep 16 04:36:09.888845 systemd-logind[1529]: Session 95 logged out. Waiting for processes to exit. Sep 16 04:36:09.891411 systemd-logind[1529]: Removed session 95. Sep 16 04:36:12.763723 containerd[1558]: time="2025-09-16T04:36:12.763667605Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"b93f514fbc0807a23e7609400ed909f995587bb391238cccf892acda038323e7\" pid:9418 exited_at:{seconds:1757997372 nanos:763009714}" Sep 16 04:36:15.051170 systemd[1]: Started sshd@95-138.201.119.17:22-139.178.89.65:47788.service - OpenSSH per-connection server daemon (139.178.89.65:47788). Sep 16 04:36:16.055997 sshd[9430]: Accepted publickey for core from 139.178.89.65 port 47788 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:16.058069 sshd-session[9430]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:16.063812 systemd-logind[1529]: New session 96 of user core. Sep 16 04:36:16.069864 systemd[1]: Started session-96.scope - Session 96 of User core. 
Sep 16 04:36:16.809550 sshd[9433]: Connection closed by 139.178.89.65 port 47788 Sep 16 04:36:16.810112 sshd-session[9430]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:16.816614 systemd[1]: sshd@95-138.201.119.17:22-139.178.89.65:47788.service: Deactivated successfully. Sep 16 04:36:16.818862 systemd[1]: session-96.scope: Deactivated successfully. Sep 16 04:36:16.819724 systemd-logind[1529]: Session 96 logged out. Waiting for processes to exit. Sep 16 04:36:16.821446 systemd-logind[1529]: Removed session 96. Sep 16 04:36:20.217471 containerd[1558]: time="2025-09-16T04:36:20.217419524Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"1988ebc43e83eacc969ca630412c148509afba0b51b909f0fdaaadd3910aa621\" pid:9456 exited_at:{seconds:1757997380 nanos:217072638}" Sep 16 04:36:20.792011 containerd[1558]: time="2025-09-16T04:36:20.791970580Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"63c775cc158f6699eea4426fc983d3786c8fa1d8cebe139120b52a56c6058cd2\" pid:9477 exited_at:{seconds:1757997380 nanos:791536333}" Sep 16 04:36:21.983433 systemd[1]: Started sshd@96-138.201.119.17:22-139.178.89.65:51572.service - OpenSSH per-connection server daemon (139.178.89.65:51572). Sep 16 04:36:22.979573 sshd[9487]: Accepted publickey for core from 139.178.89.65 port 51572 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:22.982337 sshd-session[9487]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:22.989853 systemd-logind[1529]: New session 97 of user core. Sep 16 04:36:22.993817 systemd[1]: Started session-97.scope - Session 97 of User core. Sep 16 04:36:23.734534 sshd[9490]: Connection closed by 139.178.89.65 port 51572 Sep 16 04:36:23.734405 sshd-session[9487]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:23.740989 systemd-logind[1529]: Session 97 logged out. Waiting for processes to exit. Sep 16 04:36:23.741708 systemd[1]: sshd@96-138.201.119.17:22-139.178.89.65:51572.service: Deactivated successfully. Sep 16 04:36:23.744227 systemd[1]: session-97.scope: Deactivated successfully. Sep 16 04:36:23.745867 systemd-logind[1529]: Removed session 97. Sep 16 04:36:28.909093 systemd[1]: Started sshd@97-138.201.119.17:22-139.178.89.65:51580.service - OpenSSH per-connection server daemon (139.178.89.65:51580). Sep 16 04:36:29.295556 containerd[1558]: time="2025-09-16T04:36:29.295440723Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"34802b36fde0a6a91120c9359c7bbdc24d0d213b5f873ec4e053908b6625e2fa\" pid:9517 exited_at:{seconds:1757997389 nanos:295029236}" Sep 16 04:36:29.922319 sshd[9502]: Accepted publickey for core from 139.178.89.65 port 51580 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:29.924911 sshd-session[9502]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:29.931495 systemd-logind[1529]: New session 98 of user core. Sep 16 04:36:29.938890 systemd[1]: Started session-98.scope - Session 98 of User core. 
Sep 16 04:36:30.687616 sshd[9527]: Connection closed by 139.178.89.65 port 51580 Sep 16 04:36:30.688535 sshd-session[9502]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:30.695576 systemd[1]: sshd@97-138.201.119.17:22-139.178.89.65:51580.service: Deactivated successfully. Sep 16 04:36:30.699471 systemd[1]: session-98.scope: Deactivated successfully. Sep 16 04:36:30.702427 systemd-logind[1529]: Session 98 logged out. Waiting for processes to exit. Sep 16 04:36:30.704523 systemd-logind[1529]: Removed session 98. Sep 16 04:36:35.858867 systemd[1]: Started sshd@98-138.201.119.17:22-139.178.89.65:57520.service - OpenSSH per-connection server daemon (139.178.89.65:57520). Sep 16 04:36:36.760298 systemd[1]: Started sshd@99-138.201.119.17:22-183.245.9.13:50720.service - OpenSSH per-connection server daemon (183.245.9.13:50720). Sep 16 04:36:36.861420 sshd[9540]: Accepted publickey for core from 139.178.89.65 port 57520 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:36.864490 sshd-session[9540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:36.869727 systemd-logind[1529]: New session 99 of user core. Sep 16 04:36:36.877919 systemd[1]: Started session-99.scope - Session 99 of User core. Sep 16 04:36:37.613800 sshd[9546]: Connection closed by 139.178.89.65 port 57520 Sep 16 04:36:37.614384 sshd-session[9540]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:37.621134 systemd[1]: sshd@98-138.201.119.17:22-139.178.89.65:57520.service: Deactivated successfully. Sep 16 04:36:37.624696 systemd[1]: session-99.scope: Deactivated successfully. Sep 16 04:36:37.625682 systemd-logind[1529]: Session 99 logged out. Waiting for processes to exit. Sep 16 04:36:37.628024 systemd-logind[1529]: Removed session 99. Sep 16 04:36:42.774387 containerd[1558]: time="2025-09-16T04:36:42.774341516Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"fb00a41a8f9939f4fe2cc4e1447bbeadd6057703cde1e9232acd8822ef677f23\" pid:9569 exited_at:{seconds:1757997402 nanos:773749627}" Sep 16 04:36:42.784044 systemd[1]: Started sshd@100-138.201.119.17:22-139.178.89.65:57104.service - OpenSSH per-connection server daemon (139.178.89.65:57104). Sep 16 04:36:43.810172 sshd[9581]: Accepted publickey for core from 139.178.89.65 port 57104 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:43.812971 sshd-session[9581]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:43.819036 systemd-logind[1529]: New session 100 of user core. Sep 16 04:36:43.826175 systemd[1]: Started session-100.scope - Session 100 of User core. Sep 16 04:36:44.575240 sshd[9584]: Connection closed by 139.178.89.65 port 57104 Sep 16 04:36:44.576709 sshd-session[9581]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:44.581765 systemd-logind[1529]: Session 100 logged out. Waiting for processes to exit. Sep 16 04:36:44.581898 systemd[1]: sshd@100-138.201.119.17:22-139.178.89.65:57104.service: Deactivated successfully. Sep 16 04:36:44.584251 systemd[1]: session-100.scope: Deactivated successfully. Sep 16 04:36:44.587375 systemd-logind[1529]: Removed session 100. Sep 16 04:36:49.749991 systemd[1]: Started sshd@101-138.201.119.17:22-139.178.89.65:57108.service - OpenSSH per-connection server daemon (139.178.89.65:57108). 
Sep 16 04:36:50.219451 containerd[1558]: time="2025-09-16T04:36:50.219359307Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"1898b4cac11f9b813d667e8aa9b68b21309422295f2a3c669c40c4222ffa695c\" pid:9614 exited_at:{seconds:1757997410 nanos:219020462}" Sep 16 04:36:50.757283 sshd[9596]: Accepted publickey for core from 139.178.89.65 port 57108 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:50.759624 sshd-session[9596]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:50.766309 systemd-logind[1529]: New session 101 of user core. Sep 16 04:36:50.771854 systemd[1]: Started session-101.scope - Session 101 of User core. Sep 16 04:36:51.529465 sshd[9624]: Connection closed by 139.178.89.65 port 57108 Sep 16 04:36:51.529778 sshd-session[9596]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:51.534714 systemd[1]: sshd@101-138.201.119.17:22-139.178.89.65:57108.service: Deactivated successfully. Sep 16 04:36:51.536854 systemd[1]: session-101.scope: Deactivated successfully. Sep 16 04:36:51.537737 systemd-logind[1529]: Session 101 logged out. Waiting for processes to exit. Sep 16 04:36:51.540059 systemd-logind[1529]: Removed session 101. Sep 16 04:36:56.704831 systemd[1]: Started sshd@102-138.201.119.17:22-139.178.89.65:32814.service - OpenSSH per-connection server daemon (139.178.89.65:32814). Sep 16 04:36:57.709540 sshd[9638]: Accepted publickey for core from 139.178.89.65 port 32814 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:36:57.711432 sshd-session[9638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:36:57.718653 systemd-logind[1529]: New session 102 of user core. Sep 16 04:36:57.727911 systemd[1]: Started session-102.scope - Session 102 of User core. Sep 16 04:36:58.468746 sshd[9641]: Connection closed by 139.178.89.65 port 32814 Sep 16 04:36:58.469806 sshd-session[9638]: pam_unix(sshd:session): session closed for user core Sep 16 04:36:58.475397 systemd[1]: sshd@102-138.201.119.17:22-139.178.89.65:32814.service: Deactivated successfully. Sep 16 04:36:58.478127 systemd[1]: session-102.scope: Deactivated successfully. Sep 16 04:36:58.479507 systemd-logind[1529]: Session 102 logged out. Waiting for processes to exit. Sep 16 04:36:58.481253 systemd-logind[1529]: Removed session 102. Sep 16 04:36:59.307327 containerd[1558]: time="2025-09-16T04:36:59.307258848Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"0297836c46d188b035e4c745132883e56422795a7f6d426fd59c511fb4aac96c\" pid:9663 exited_at:{seconds:1757997419 nanos:306754200}" Sep 16 04:37:03.641637 systemd[1]: Started sshd@103-138.201.119.17:22-139.178.89.65:48864.service - OpenSSH per-connection server daemon (139.178.89.65:48864). Sep 16 04:37:04.656909 sshd[9674]: Accepted publickey for core from 139.178.89.65 port 48864 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:37:04.661121 sshd-session[9674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:04.668081 systemd-logind[1529]: New session 103 of user core. Sep 16 04:37:04.674787 systemd[1]: Started session-103.scope - Session 103 of User core. 
Sep 16 04:37:05.450424 sshd[9677]: Connection closed by 139.178.89.65 port 48864 Sep 16 04:37:05.452814 sshd-session[9674]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:05.457271 systemd[1]: session-103.scope: Deactivated successfully. Sep 16 04:37:05.457300 systemd-logind[1529]: Session 103 logged out. Waiting for processes to exit. Sep 16 04:37:05.458565 systemd[1]: sshd@103-138.201.119.17:22-139.178.89.65:48864.service: Deactivated successfully. Sep 16 04:37:05.466315 systemd-logind[1529]: Removed session 103. Sep 16 04:37:05.626394 containerd[1558]: time="2025-09-16T04:37:05.626351026Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"cdf23a56ca09e8eca6a1e6dfb085b2f28fb1691144bc6d61a1b09b2d7831b84e\" pid:9699 exited_at:{seconds:1757997425 nanos:626050901}" Sep 16 04:37:10.618264 systemd[1]: Started sshd@104-138.201.119.17:22-139.178.89.65:57314.service - OpenSSH per-connection server daemon (139.178.89.65:57314). Sep 16 04:37:11.616197 sshd[9709]: Accepted publickey for core from 139.178.89.65 port 57314 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:37:11.618643 sshd-session[9709]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:11.627221 systemd-logind[1529]: New session 104 of user core. Sep 16 04:37:11.631819 systemd[1]: Started session-104.scope - Session 104 of User core. Sep 16 04:37:12.370621 sshd[9712]: Connection closed by 139.178.89.65 port 57314 Sep 16 04:37:12.370794 sshd-session[9709]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:12.378184 systemd[1]: sshd@104-138.201.119.17:22-139.178.89.65:57314.service: Deactivated successfully. Sep 16 04:37:12.380140 systemd[1]: session-104.scope: Deactivated successfully. Sep 16 04:37:12.381566 systemd-logind[1529]: Session 104 logged out. Waiting for processes to exit. Sep 16 04:37:12.384746 systemd-logind[1529]: Removed session 104. Sep 16 04:37:12.758063 containerd[1558]: time="2025-09-16T04:37:12.757927129Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"ecf9dda5d6a23114fe0d042891f6b9a7365315e33e30f92a5d0f001edb7a55ac\" pid:9735 exited_at:{seconds:1757997432 nanos:757399921}" Sep 16 04:37:14.014053 systemd[1]: Started sshd@105-138.201.119.17:22-179.33.186.151:5795.service - OpenSSH per-connection server daemon (179.33.186.151:5795). Sep 16 04:37:15.199270 sshd[9754]: Invalid user oracle from 179.33.186.151 port 5795 Sep 16 04:37:15.397605 sshd[9754]: Received disconnect from 179.33.186.151 port 5795:11: Bye Bye [preauth] Sep 16 04:37:15.397605 sshd[9754]: Disconnected from invalid user oracle 179.33.186.151 port 5795 [preauth] Sep 16 04:37:15.400615 systemd[1]: sshd@105-138.201.119.17:22-179.33.186.151:5795.service: Deactivated successfully. Sep 16 04:37:17.552723 systemd[1]: Started sshd@106-138.201.119.17:22-139.178.89.65:57328.service - OpenSSH per-connection server daemon (139.178.89.65:57328). Sep 16 04:37:17.555830 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories... Sep 16 04:37:17.586974 systemd-tmpfiles[9775]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Sep 16 04:37:17.586992 systemd-tmpfiles[9775]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. 
Sep 16 04:37:17.587193 systemd-tmpfiles[9775]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Sep 16 04:37:17.587872 systemd-tmpfiles[9775]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Sep 16 04:37:17.589237 systemd-tmpfiles[9775]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Sep 16 04:37:17.589478 systemd-tmpfiles[9775]: ACLs are not supported, ignoring. Sep 16 04:37:17.589539 systemd-tmpfiles[9775]: ACLs are not supported, ignoring. Sep 16 04:37:17.595242 systemd-tmpfiles[9775]: Detected autofs mount point /boot during canonicalization of boot. Sep 16 04:37:17.595256 systemd-tmpfiles[9775]: Skipping /boot Sep 16 04:37:17.601313 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully. Sep 16 04:37:17.602679 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories. Sep 16 04:37:17.609344 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully. Sep 16 04:37:18.551086 sshd[9774]: Accepted publickey for core from 139.178.89.65 port 57328 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:37:18.553400 sshd-session[9774]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:18.559788 systemd-logind[1529]: New session 105 of user core. Sep 16 04:37:18.569985 systemd[1]: Started session-105.scope - Session 105 of User core. Sep 16 04:37:19.309566 sshd[9780]: Connection closed by 139.178.89.65 port 57328 Sep 16 04:37:19.309449 sshd-session[9774]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:19.315904 systemd-logind[1529]: Session 105 logged out. Waiting for processes to exit. Sep 16 04:37:19.317205 systemd[1]: sshd@106-138.201.119.17:22-139.178.89.65:57328.service: Deactivated successfully. Sep 16 04:37:19.319571 systemd[1]: session-105.scope: Deactivated successfully. Sep 16 04:37:19.323610 systemd-logind[1529]: Removed session 105. Sep 16 04:37:20.235916 containerd[1558]: time="2025-09-16T04:37:20.235860544Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"3e94e3869ebf9b49a8b7c87327b1964792bd8334bee2994c9ec969ac370f6715\" pid:9803 exited_at:{seconds:1757997440 nanos:235317336}" Sep 16 04:37:20.789832 containerd[1558]: time="2025-09-16T04:37:20.789758543Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"b99726352b784c93b15f24a7367a6654c7d1bc55647edcef60b7059724614bc2\" pid:9826 exited_at:{seconds:1757997440 nanos:789241015}" Sep 16 04:37:24.489050 systemd[1]: Started sshd@107-138.201.119.17:22-139.178.89.65:52798.service - OpenSSH per-connection server daemon (139.178.89.65:52798). Sep 16 04:37:25.489310 sshd[9838]: Accepted publickey for core from 139.178.89.65 port 52798 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:37:25.491507 sshd-session[9838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:25.497578 systemd-logind[1529]: New session 106 of user core. Sep 16 04:37:25.507903 systemd[1]: Started session-106.scope - Session 106 of User core. 
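[Editor's note, not part of the journal] The systemd-tmpfiles-clean run above warns about duplicate tmpfiles.d lines for /var/lib/nfs/sm, /var/lib/nfs/sm.bak, /root, /var/log/journal, and /var/lib/systemd: systemd-tmpfiles reads the tmpfiles.d fragments and ignores any later line for a path it has already seen, which is exactly what "Duplicate line for path ..., ignoring" reports. A minimal Python sketch of that duplicate detection follows (illustrative only; the fragment names and lines are hypothetical, and real tmpfiles.d parsing handles many more fields and the sorted-file precedence rules):

from collections import defaultdict

# Illustrative helper, not part of the journal: report tmpfiles.d paths
# configured more than once, the situation systemd-tmpfiles warns about.
def find_duplicate_paths(fragments):
    """fragments maps a fragment name to its list of tmpfiles.d lines."""
    seen = defaultdict(list)  # path -> [(fragment, line number), ...]
    for name, lines in fragments.items():
        for lineno, line in enumerate(lines, start=1):
            line = line.strip()
            if not line or line.startswith("#"):
                continue  # skip blank lines and comments
            fields = line.split()
            if len(fields) >= 2:  # layout: "type path mode user group age ..."
                seen[fields[1]].append((name, lineno))
    # Only paths that appear more than once are duplicates.
    return {path: locs for path, locs in seen.items() if len(locs) > 1}

# Hypothetical fragment contents echoing the warnings above.
fragments = {
    "nfs-utils.conf": ["d /var/lib/nfs/sm 0700 rpcuser rpcuser -",
                       "d /var/lib/nfs/sm.bak 0700 rpcuser rpcuser -"],
    "extra.conf": ["d /var/lib/nfs/sm 0700 rpcuser rpcuser -"],
}
print(find_duplicate_paths(fragments))
# {'/var/lib/nfs/sm': [('nfs-utils.conf', 1), ('extra.conf', 1)]}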
Sep 16 04:37:26.247802 sshd[9841]: Connection closed by 139.178.89.65 port 52798 Sep 16 04:37:26.249339 sshd-session[9838]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:26.255534 systemd[1]: sshd@107-138.201.119.17:22-139.178.89.65:52798.service: Deactivated successfully. Sep 16 04:37:26.257991 systemd[1]: session-106.scope: Deactivated successfully. Sep 16 04:37:26.259009 systemd-logind[1529]: Session 106 logged out. Waiting for processes to exit. Sep 16 04:37:26.261452 systemd-logind[1529]: Removed session 106. Sep 16 04:37:29.309267 containerd[1558]: time="2025-09-16T04:37:29.309187021Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"11d5e1d9792c9e1a86fa9bcfe27231c26612e47c2191f28a3c6b11f853952e43\" pid:9863 exited_at:{seconds:1757997449 nanos:308537051}" Sep 16 04:37:31.420844 systemd[1]: Started sshd@108-138.201.119.17:22-139.178.89.65:39052.service - OpenSSH per-connection server daemon (139.178.89.65:39052). Sep 16 04:37:32.428758 sshd[9875]: Accepted publickey for core from 139.178.89.65 port 39052 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:37:32.431472 sshd-session[9875]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:32.437727 systemd-logind[1529]: New session 107 of user core. Sep 16 04:37:32.446958 systemd[1]: Started session-107.scope - Session 107 of User core. Sep 16 04:37:33.184319 sshd[9878]: Connection closed by 139.178.89.65 port 39052 Sep 16 04:37:33.184197 sshd-session[9875]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:33.191321 systemd[1]: sshd@108-138.201.119.17:22-139.178.89.65:39052.service: Deactivated successfully. Sep 16 04:37:33.194021 systemd[1]: session-107.scope: Deactivated successfully. Sep 16 04:37:33.195630 systemd-logind[1529]: Session 107 logged out. Waiting for processes to exit. Sep 16 04:37:33.198005 systemd-logind[1529]: Removed session 107. Sep 16 04:37:38.354959 systemd[1]: Started sshd@109-138.201.119.17:22-139.178.89.65:39058.service - OpenSSH per-connection server daemon (139.178.89.65:39058). Sep 16 04:37:39.364248 sshd[9891]: Accepted publickey for core from 139.178.89.65 port 39058 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:37:39.366550 sshd-session[9891]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:39.374135 systemd-logind[1529]: New session 108 of user core. Sep 16 04:37:39.378856 systemd[1]: Started session-108.scope - Session 108 of User core. Sep 16 04:37:40.128124 sshd[9894]: Connection closed by 139.178.89.65 port 39058 Sep 16 04:37:40.129039 sshd-session[9891]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:40.135287 systemd-logind[1529]: Session 108 logged out. Waiting for processes to exit. Sep 16 04:37:40.136228 systemd[1]: sshd@109-138.201.119.17:22-139.178.89.65:39058.service: Deactivated successfully. Sep 16 04:37:40.139890 systemd[1]: session-108.scope: Deactivated successfully. Sep 16 04:37:40.143689 systemd-logind[1529]: Removed session 108. 
Sep 16 04:37:42.756694 containerd[1558]: time="2025-09-16T04:37:42.756642066Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"f2652375df53306fe97f26c816e6298127c1ca42166a1a00ea488d36c0686de7\" pid:9917 exited_at:{seconds:1757997462 nanos:755940655}" Sep 16 04:37:45.303061 systemd[1]: Started sshd@110-138.201.119.17:22-139.178.89.65:48116.service - OpenSSH per-connection server daemon (139.178.89.65:48116). Sep 16 04:37:46.313226 sshd[9929]: Accepted publickey for core from 139.178.89.65 port 48116 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:37:46.315074 sshd-session[9929]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:46.321045 systemd-logind[1529]: New session 109 of user core. Sep 16 04:37:46.326867 systemd[1]: Started session-109.scope - Session 109 of User core. Sep 16 04:37:47.079860 sshd[9932]: Connection closed by 139.178.89.65 port 48116 Sep 16 04:37:47.080570 sshd-session[9929]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:47.086918 systemd-logind[1529]: Session 109 logged out. Waiting for processes to exit. Sep 16 04:37:47.087048 systemd[1]: sshd@110-138.201.119.17:22-139.178.89.65:48116.service: Deactivated successfully. Sep 16 04:37:47.090907 systemd[1]: session-109.scope: Deactivated successfully. Sep 16 04:37:47.094784 systemd-logind[1529]: Removed session 109. Sep 16 04:37:50.229061 containerd[1558]: time="2025-09-16T04:37:50.229006318Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"492fc9bf9eff9383be8e3994f9da291cb6fbdec09c9df2933f786bcc0760b3ad\" pid:9955 exited_at:{seconds:1757997470 nanos:228528631}" Sep 16 04:37:52.253991 systemd[1]: Started sshd@111-138.201.119.17:22-139.178.89.65:60720.service - OpenSSH per-connection server daemon (139.178.89.65:60720). Sep 16 04:37:53.253444 sshd[9967]: Accepted publickey for core from 139.178.89.65 port 60720 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:37:53.255641 sshd-session[9967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:37:53.260571 systemd-logind[1529]: New session 110 of user core. Sep 16 04:37:53.269075 systemd[1]: Started session-110.scope - Session 110 of User core. Sep 16 04:37:54.021894 sshd[9970]: Connection closed by 139.178.89.65 port 60720 Sep 16 04:37:54.022426 sshd-session[9967]: pam_unix(sshd:session): session closed for user core Sep 16 04:37:54.029846 systemd[1]: sshd@111-138.201.119.17:22-139.178.89.65:60720.service: Deactivated successfully. Sep 16 04:37:54.033569 systemd[1]: session-110.scope: Deactivated successfully. Sep 16 04:37:54.035424 systemd-logind[1529]: Session 110 logged out. Waiting for processes to exit. Sep 16 04:37:54.037499 systemd-logind[1529]: Removed session 110. Sep 16 04:37:59.194833 systemd[1]: Started sshd@112-138.201.119.17:22-139.178.89.65:60726.service - OpenSSH per-connection server daemon (139.178.89.65:60726). 
Sep 16 04:37:59.294555 containerd[1558]: time="2025-09-16T04:37:59.294480980Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"f83da47fe73d933fde532dd0993cc82d4e8a0e2e87ae67be0094d2fc3c909537\" pid:10001 exited_at:{seconds:1757997479 nanos:294006573}" Sep 16 04:38:00.196609 sshd[9984]: Accepted publickey for core from 139.178.89.65 port 60726 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:00.198782 sshd-session[9984]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:00.204328 systemd-logind[1529]: New session 111 of user core. Sep 16 04:38:00.210890 systemd[1]: Started session-111.scope - Session 111 of User core. Sep 16 04:38:00.957176 sshd[10011]: Connection closed by 139.178.89.65 port 60726 Sep 16 04:38:00.957770 sshd-session[9984]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:00.963366 systemd[1]: sshd@112-138.201.119.17:22-139.178.89.65:60726.service: Deactivated successfully. Sep 16 04:38:00.968157 systemd[1]: session-111.scope: Deactivated successfully. Sep 16 04:38:00.971476 systemd-logind[1529]: Session 111 logged out. Waiting for processes to exit. Sep 16 04:38:00.974370 systemd-logind[1529]: Removed session 111. Sep 16 04:38:05.587975 containerd[1558]: time="2025-09-16T04:38:05.587912117Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"62abb270b61d24a18af187ec688d682dc49466e890c86d7bb45f004edd66e395\" pid:10037 exited_at:{seconds:1757997485 nanos:587281027}" Sep 16 04:38:06.133795 systemd[1]: Started sshd@113-138.201.119.17:22-139.178.89.65:45070.service - OpenSSH per-connection server daemon (139.178.89.65:45070). Sep 16 04:38:07.145269 sshd[10047]: Accepted publickey for core from 139.178.89.65 port 45070 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:07.148379 sshd-session[10047]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:07.155651 systemd-logind[1529]: New session 112 of user core. Sep 16 04:38:07.162803 systemd[1]: Started session-112.scope - Session 112 of User core. Sep 16 04:38:07.941870 sshd[10050]: Connection closed by 139.178.89.65 port 45070 Sep 16 04:38:07.946041 sshd-session[10047]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:07.953687 systemd[1]: sshd@113-138.201.119.17:22-139.178.89.65:45070.service: Deactivated successfully. Sep 16 04:38:07.958417 systemd[1]: session-112.scope: Deactivated successfully. Sep 16 04:38:07.961398 systemd-logind[1529]: Session 112 logged out. Waiting for processes to exit. Sep 16 04:38:07.964277 systemd-logind[1529]: Removed session 112. Sep 16 04:38:12.761087 containerd[1558]: time="2025-09-16T04:38:12.760961105Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"6f6a7389ddb947bd3e6bf6a6ec94da6f8f66f6fe5c04469a1249f2d0d54358fb\" pid:10074 exited_at:{seconds:1757997492 nanos:759709045}" Sep 16 04:38:13.119366 systemd[1]: Started sshd@114-138.201.119.17:22-139.178.89.65:37770.service - OpenSSH per-connection server daemon (139.178.89.65:37770). 
Sep 16 04:38:14.126892 sshd[10087]: Accepted publickey for core from 139.178.89.65 port 37770 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:14.129372 sshd-session[10087]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:14.137041 systemd-logind[1529]: New session 113 of user core. Sep 16 04:38:14.141855 systemd[1]: Started session-113.scope - Session 113 of User core. Sep 16 04:38:14.895686 sshd[10090]: Connection closed by 139.178.89.65 port 37770 Sep 16 04:38:14.896643 sshd-session[10087]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:14.902759 systemd[1]: sshd@114-138.201.119.17:22-139.178.89.65:37770.service: Deactivated successfully. Sep 16 04:38:14.905572 systemd[1]: session-113.scope: Deactivated successfully. Sep 16 04:38:14.906777 systemd-logind[1529]: Session 113 logged out. Waiting for processes to exit. Sep 16 04:38:14.909371 systemd-logind[1529]: Removed session 113. Sep 16 04:38:20.067946 systemd[1]: Started sshd@115-138.201.119.17:22-139.178.89.65:37774.service - OpenSSH per-connection server daemon (139.178.89.65:37774). Sep 16 04:38:20.206946 systemd[1]: Started sshd@116-138.201.119.17:22-211.201.163.70:46066.service - OpenSSH per-connection server daemon (211.201.163.70:46066). Sep 16 04:38:20.225293 containerd[1558]: time="2025-09-16T04:38:20.223596619Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"e7b8dae58e8fd97366d183d8042ac76f760abd08555373a54e4584598e413cfa\" pid:10119 exited_at:{seconds:1757997500 nanos:223029211}" Sep 16 04:38:20.792422 containerd[1558]: time="2025-09-16T04:38:20.792361841Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"7654f8d3d652bbcbf1589053e80cf6d6a8324ef1211b8d43d296f056cdefd63d\" pid:10147 exited_at:{seconds:1757997500 nanos:791514468}" Sep 16 04:38:21.072261 sshd[10103]: Accepted publickey for core from 139.178.89.65 port 37774 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:21.075314 sshd-session[10103]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:21.082176 systemd-logind[1529]: New session 114 of user core. Sep 16 04:38:21.084811 systemd[1]: Started session-114.scope - Session 114 of User core. Sep 16 04:38:21.508554 sshd[10131]: Invalid user test from 211.201.163.70 port 46066 Sep 16 04:38:21.752901 sshd[10131]: Received disconnect from 211.201.163.70 port 46066:11: Bye Bye [preauth] Sep 16 04:38:21.752901 sshd[10131]: Disconnected from invalid user test 211.201.163.70 port 46066 [preauth] Sep 16 04:38:21.756700 systemd[1]: sshd@116-138.201.119.17:22-211.201.163.70:46066.service: Deactivated successfully. Sep 16 04:38:21.831015 sshd[10163]: Connection closed by 139.178.89.65 port 37774 Sep 16 04:38:21.831465 sshd-session[10103]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:21.839501 systemd-logind[1529]: Session 114 logged out. Waiting for processes to exit. Sep 16 04:38:21.840212 systemd[1]: sshd@115-138.201.119.17:22-139.178.89.65:37774.service: Deactivated successfully. Sep 16 04:38:21.844289 systemd[1]: session-114.scope: Deactivated successfully. Sep 16 04:38:21.847213 systemd-logind[1529]: Removed session 114. Sep 16 04:38:27.016382 systemd[1]: Started sshd@117-138.201.119.17:22-139.178.89.65:34774.service - OpenSSH per-connection server daemon (139.178.89.65:34774). 
Sep 16 04:38:28.019828 sshd[10178]: Accepted publickey for core from 139.178.89.65 port 34774 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:28.021773 sshd-session[10178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:28.026177 systemd-logind[1529]: New session 115 of user core. Sep 16 04:38:28.034056 systemd[1]: Started session-115.scope - Session 115 of User core. Sep 16 04:38:28.781008 sshd[10181]: Connection closed by 139.178.89.65 port 34774 Sep 16 04:38:28.783776 sshd-session[10178]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:28.788971 systemd-logind[1529]: Session 115 logged out. Waiting for processes to exit. Sep 16 04:38:28.789241 systemd[1]: sshd@117-138.201.119.17:22-139.178.89.65:34774.service: Deactivated successfully. Sep 16 04:38:28.792561 systemd[1]: session-115.scope: Deactivated successfully. Sep 16 04:38:28.795647 systemd-logind[1529]: Removed session 115. Sep 16 04:38:29.297947 containerd[1558]: time="2025-09-16T04:38:29.297892804Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"83eea19a0b13e73516e04c67a4dfa4e9f2314621eba4258d1f7bca2ea4b40da5\" pid:10203 exited_at:{seconds:1757997509 nanos:297310155}" Sep 16 04:38:33.947114 systemd[1]: Started sshd@118-138.201.119.17:22-139.178.89.65:42208.service - OpenSSH per-connection server daemon (139.178.89.65:42208). Sep 16 04:38:34.949324 sshd[10215]: Accepted publickey for core from 139.178.89.65 port 42208 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:34.951513 sshd-session[10215]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:34.956392 systemd-logind[1529]: New session 116 of user core. Sep 16 04:38:34.961023 systemd[1]: Started session-116.scope - Session 116 of User core. Sep 16 04:38:35.697001 sshd[10218]: Connection closed by 139.178.89.65 port 42208 Sep 16 04:38:35.697723 sshd-session[10215]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:35.703231 systemd[1]: sshd@118-138.201.119.17:22-139.178.89.65:42208.service: Deactivated successfully. Sep 16 04:38:35.706046 systemd[1]: session-116.scope: Deactivated successfully. Sep 16 04:38:35.708468 systemd-logind[1529]: Session 116 logged out. Waiting for processes to exit. Sep 16 04:38:35.710255 systemd-logind[1529]: Removed session 116. Sep 16 04:38:38.061132 systemd[1]: sshd@99-138.201.119.17:22-183.245.9.13:50720.service: Deactivated successfully. Sep 16 04:38:40.878806 systemd[1]: Started sshd@119-138.201.119.17:22-139.178.89.65:38138.service - OpenSSH per-connection server daemon (139.178.89.65:38138). Sep 16 04:38:41.889199 sshd[10234]: Accepted publickey for core from 139.178.89.65 port 38138 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:41.891409 sshd-session[10234]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:41.897836 systemd-logind[1529]: New session 117 of user core. Sep 16 04:38:41.908903 systemd[1]: Started session-117.scope - Session 117 of User core. Sep 16 04:38:42.648845 sshd[10237]: Connection closed by 139.178.89.65 port 38138 Sep 16 04:38:42.649418 sshd-session[10234]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:42.656261 systemd[1]: sshd@119-138.201.119.17:22-139.178.89.65:38138.service: Deactivated successfully. 
Sep 16 04:38:42.659715 systemd[1]: session-117.scope: Deactivated successfully. Sep 16 04:38:42.661951 systemd-logind[1529]: Session 117 logged out. Waiting for processes to exit. Sep 16 04:38:42.663771 systemd-logind[1529]: Removed session 117. Sep 16 04:38:42.754954 containerd[1558]: time="2025-09-16T04:38:42.754870005Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"ac413c10ad24fb622f791b15ad540b13c0907b320d95ea6e759ba6ad3c9f4073\" pid:10262 exited_at:{seconds:1757997522 nanos:753742028}" Sep 16 04:38:47.822536 systemd[1]: Started sshd@120-138.201.119.17:22-139.178.89.65:38148.service - OpenSSH per-connection server daemon (139.178.89.65:38148). Sep 16 04:38:48.822370 sshd[10274]: Accepted publickey for core from 139.178.89.65 port 38148 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:48.824191 sshd-session[10274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:48.831198 systemd-logind[1529]: New session 118 of user core. Sep 16 04:38:48.843935 systemd[1]: Started session-118.scope - Session 118 of User core. Sep 16 04:38:49.579801 sshd[10277]: Connection closed by 139.178.89.65 port 38148 Sep 16 04:38:49.580826 sshd-session[10274]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:49.586408 systemd[1]: sshd@120-138.201.119.17:22-139.178.89.65:38148.service: Deactivated successfully. Sep 16 04:38:49.589479 systemd[1]: session-118.scope: Deactivated successfully. Sep 16 04:38:49.591671 systemd-logind[1529]: Session 118 logged out. Waiting for processes to exit. Sep 16 04:38:49.594117 systemd-logind[1529]: Removed session 118. Sep 16 04:38:50.212325 containerd[1558]: time="2025-09-16T04:38:50.212257698Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8d6c6929c13327a75d8c268f852bf6c0f9e08ea12f3d02fe73bf74d4acf04464\" id:\"c3747a4f69ca82ebbb6b22c980387c33ebd14c78e6cd3b75502ba57eed11f9e0\" pid:10299 exited_at:{seconds:1757997530 nanos:211907532}" Sep 16 04:38:54.755316 systemd[1]: Started sshd@121-138.201.119.17:22-139.178.89.65:35410.service - OpenSSH per-connection server daemon (139.178.89.65:35410). Sep 16 04:38:55.445671 systemd[1]: Started sshd@122-138.201.119.17:22-103.154.77.2:58800.service - OpenSSH per-connection server daemon (103.154.77.2:58800). Sep 16 04:38:55.771695 sshd[10333]: Accepted publickey for core from 139.178.89.65 port 35410 ssh2: RSA SHA256:hnZQROmedaG+reQAaWvmG41QCRiTlF3QrQA4Qzar5jk Sep 16 04:38:55.774032 sshd-session[10333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 16 04:38:55.779277 systemd-logind[1529]: New session 119 of user core. Sep 16 04:38:55.793866 systemd[1]: Started session-119.scope - Session 119 of User core. Sep 16 04:38:56.551970 sshd[10340]: Connection closed by 139.178.89.65 port 35410 Sep 16 04:38:56.553142 sshd-session[10333]: pam_unix(sshd:session): session closed for user core Sep 16 04:38:56.558621 systemd[1]: sshd@121-138.201.119.17:22-139.178.89.65:35410.service: Deactivated successfully. Sep 16 04:38:56.561326 systemd[1]: session-119.scope: Deactivated successfully. Sep 16 04:38:56.562539 systemd-logind[1529]: Session 119 logged out. Waiting for processes to exit. Sep 16 04:38:56.565290 systemd-logind[1529]: Removed session 119. 
Sep 16 04:38:56.662088 sshd[10337]: Received disconnect from 103.154.77.2 port 58800:11: Bye Bye [preauth] Sep 16 04:38:56.662088 sshd[10337]: Disconnected from authenticating user root 103.154.77.2 port 58800 [preauth] Sep 16 04:38:56.665687 systemd[1]: sshd@122-138.201.119.17:22-103.154.77.2:58800.service: Deactivated successfully. Sep 16 04:38:59.297758 containerd[1558]: time="2025-09-16T04:38:59.297678507Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"a5ac49fb2ce78efaf9b5799b460ac5a5f1b6afd843deeef1ff4905d15e2339b8\" pid:10365 exited_at:{seconds:1757997539 nanos:296731532}" Sep 16 04:39:05.594222 containerd[1558]: time="2025-09-16T04:39:05.594088843Z" level=info msg="TaskExit event in podsandbox handler container_id:\"330eda4a58ea998791d702c8e63e01b0da33f16b43b46f482cb45be65d7b3d87\" id:\"839f718b4e136692706be4bf2e426506f0f378bd268f3c9ac47a792b56d7dd73\" pid:10388 exited_at:{seconds:1757997545 nanos:593502554}" Sep 16 04:39:12.528761 kubelet[2783]: E0916 04:39:12.528606 2783 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:35526->10.0.0.2:2379: read: connection timed out" Sep 16 04:39:12.535048 systemd[1]: cri-containerd-d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e.scope: Deactivated successfully. Sep 16 04:39:12.535438 systemd[1]: cri-containerd-d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e.scope: Consumed 11.695s CPU time, 28.3M memory peak, 4.4M read from disk. Sep 16 04:39:12.539326 containerd[1558]: time="2025-09-16T04:39:12.539252714Z" level=info msg="received exit event container_id:\"d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e\" id:\"d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e\" pid:2629 exit_status:1 exited_at:{seconds:1757997552 nanos:538886788}" Sep 16 04:39:12.540168 containerd[1558]: time="2025-09-16T04:39:12.539479597Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e\" id:\"d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e\" pid:2629 exit_status:1 exited_at:{seconds:1757997552 nanos:538886788}" Sep 16 04:39:12.565548 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e-rootfs.mount: Deactivated successfully. Sep 16 04:39:12.619260 systemd[1]: cri-containerd-6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2.scope: Deactivated successfully. Sep 16 04:39:12.619758 systemd[1]: cri-containerd-6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2.scope: Consumed 44.967s CPU time, 111.1M memory peak, 5M read from disk. 
Sep 16 04:39:12.624899 containerd[1558]: time="2025-09-16T04:39:12.624833030Z" level=info msg="received exit event container_id:\"6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2\" id:\"6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2\" pid:3104 exit_status:1 exited_at:{seconds:1757997552 nanos:624147660}" Sep 16 04:39:12.625472 containerd[1558]: time="2025-09-16T04:39:12.625404559Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2\" id:\"6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2\" pid:3104 exit_status:1 exited_at:{seconds:1757997552 nanos:624147660}" Sep 16 04:39:12.651520 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2-rootfs.mount: Deactivated successfully. Sep 16 04:39:12.761835 containerd[1558]: time="2025-09-16T04:39:12.761790217Z" level=info msg="TaskExit event in podsandbox handler container_id:\"29accf524a610bb91eb08b15c29dbaaf8febb57b21dcb514741ee9efee567994\" id:\"6f1d0832d90f468732e1df6f6c56c8249fff8035ab7432e932446da1aa8403a0\" pid:10432 exited_at:{seconds:1757997552 nanos:761000445}" Sep 16 04:39:12.957318 kubelet[2783]: I0916 04:39:12.957224 2783 scope.go:117] "RemoveContainer" containerID="6d0da9451e670ade73d62e52afea4f69fc4d068a0dd91185a8485e2dc740c3a2" Sep 16 04:39:12.962676 kubelet[2783]: I0916 04:39:12.962535 2783 scope.go:117] "RemoveContainer" containerID="d0e4c211bee4804610e277b52afb873471a087a0945a8b1a439aed3e0eed084e" Sep 16 04:39:12.968848 containerd[1558]: time="2025-09-16T04:39:12.968785001Z" level=info msg="CreateContainer within sandbox \"35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 16 04:39:12.971798 containerd[1558]: time="2025-09-16T04:39:12.971759606Z" level=info msg="CreateContainer within sandbox \"a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 16 04:39:12.981776 containerd[1558]: time="2025-09-16T04:39:12.981443355Z" level=info msg="Container 9062c7cb746e345ba2fea24f91d2cae23ba5f5dded0801cd8f0ad52b0222d3dc: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:39:12.984330 containerd[1558]: time="2025-09-16T04:39:12.983186502Z" level=info msg="Container 4cf2971e5adcd43099378b61bb228786564e324e5457e612124ac73bc1386277: CDI devices from CRI Config.CDIDevices: []" Sep 16 04:39:12.992405 containerd[1558]: time="2025-09-16T04:39:12.992351803Z" level=info msg="CreateContainer within sandbox \"a08e43c1cc4799704c01eac2015473bc4e38d82fc3562b0ca970af3383c54304\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"9062c7cb746e345ba2fea24f91d2cae23ba5f5dded0801cd8f0ad52b0222d3dc\"" Sep 16 04:39:12.993392 containerd[1558]: time="2025-09-16T04:39:12.993021653Z" level=info msg="StartContainer for \"9062c7cb746e345ba2fea24f91d2cae23ba5f5dded0801cd8f0ad52b0222d3dc\"" Sep 16 04:39:12.994188 containerd[1558]: time="2025-09-16T04:39:12.994158631Z" level=info msg="connecting to shim 9062c7cb746e345ba2fea24f91d2cae23ba5f5dded0801cd8f0ad52b0222d3dc" address="unix:///run/containerd/s/a24b3ee215d638acb2ac861b8db52a92b34871befd8ffd2c366186494f3a0212" protocol=ttrpc version=3 Sep 16 04:39:12.994406 containerd[1558]: time="2025-09-16T04:39:12.994282193Z" level=info msg="CreateContainer within sandbox \"35be61e851d5cdef65cba45040bb388023aae4354947b26310910b2e0937094a\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"4cf2971e5adcd43099378b61bb228786564e324e5457e612124ac73bc1386277\""
Sep 16 04:39:12.994895 containerd[1558]: time="2025-09-16T04:39:12.994864322Z" level=info msg="StartContainer for \"4cf2971e5adcd43099378b61bb228786564e324e5457e612124ac73bc1386277\"" Sep 16 04:39:13.000393 containerd[1558]: time="2025-09-16T04:39:13.000271085Z" level=info msg="connecting to shim 4cf2971e5adcd43099378b61bb228786564e324e5457e612124ac73bc1386277" address="unix:///run/containerd/s/d09735b2fa9581fe7380e0596dc34a489bc4a91955e9d9321c9d3103713fedc2" protocol=ttrpc version=3 Sep 16 04:39:13.028000 systemd[1]: Started cri-containerd-4cf2971e5adcd43099378b61bb228786564e324e5457e612124ac73bc1386277.scope - libcontainer container 4cf2971e5adcd43099378b61bb228786564e324e5457e612124ac73bc1386277. Sep 16 04:39:13.030704 systemd[1]: Started cri-containerd-9062c7cb746e345ba2fea24f91d2cae23ba5f5dded0801cd8f0ad52b0222d3dc.scope - libcontainer container 9062c7cb746e345ba2fea24f91d2cae23ba5f5dded0801cd8f0ad52b0222d3dc. Sep 16 04:39:13.090409 containerd[1558]: time="2025-09-16T04:39:13.090354790Z" level=info msg="StartContainer for \"9062c7cb746e345ba2fea24f91d2cae23ba5f5dded0801cd8f0ad52b0222d3dc\" returns successfully" Sep 16 04:39:13.119350 containerd[1558]: time="2025-09-16T04:39:13.119197834Z" level=info msg="StartContainer for \"4cf2971e5adcd43099378b61bb228786564e324e5457e612124ac73bc1386277\" returns successfully" Sep 16 04:39:13.872777 systemd[1]: cri-containerd-ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3.scope: Deactivated successfully. Sep 16 04:39:13.874703 systemd[1]: cri-containerd-ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3.scope: Consumed 11.837s CPU time, 63.3M memory peak, 4M read from disk. Sep 16 04:39:13.877961 containerd[1558]: time="2025-09-16T04:39:13.877913582Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3\" id:\"ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3\" pid:2599 exit_status:1 exited_at:{seconds:1757997553 nanos:877401014}" Sep 16 04:39:13.878265 containerd[1558]: time="2025-09-16T04:39:13.878003583Z" level=info msg="received exit event container_id:\"ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3\" id:\"ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3\" pid:2599 exit_status:1 exited_at:{seconds:1757997553 nanos:877401014}" Sep 16 04:39:13.914133 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3-rootfs.mount: Deactivated successfully.
Sep 16 04:39:13.977641 kubelet[2783]: I0916 04:39:13.977608 2783 scope.go:117] "RemoveContainer" containerID="ed4ab31cc3567f76094356286f56f2fd375bdc859e654c50dbe6dd9f919537b3"
Sep 16 04:39:13.980596 containerd[1558]: time="2025-09-16T04:39:13.980550800Z" level=info msg="CreateContainer within sandbox \"25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 16 04:39:13.991592 containerd[1558]: time="2025-09-16T04:39:13.991301766Z" level=info msg="Container 6092755eb23719b0f02ae6038993ec1908bada2d955fd5d206e5c75b8da6993a: CDI devices from CRI Config.CDIDevices: []"
Sep 16 04:39:14.005447 containerd[1558]: time="2025-09-16T04:39:14.005392902Z" level=info msg="CreateContainer within sandbox \"25a57597e8d27d89915fcf44b056cafadc4c1a306859fe4e7e1fea3e53bf1b16\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"6092755eb23719b0f02ae6038993ec1908bada2d955fd5d206e5c75b8da6993a\""
Sep 16 04:39:14.006057 containerd[1558]: time="2025-09-16T04:39:14.006026512Z" level=info msg="StartContainer for \"6092755eb23719b0f02ae6038993ec1908bada2d955fd5d206e5c75b8da6993a\""
Sep 16 04:39:14.007457 containerd[1558]: time="2025-09-16T04:39:14.007422294Z" level=info msg="connecting to shim 6092755eb23719b0f02ae6038993ec1908bada2d955fd5d206e5c75b8da6993a" address="unix:///run/containerd/s/009a445214b32cd95b63504b1d68350db14d5bf1fe2c2bf56ca99ff8190e3ff1" protocol=ttrpc version=3
Sep 16 04:39:14.045797 systemd[1]: Started cri-containerd-6092755eb23719b0f02ae6038993ec1908bada2d955fd5d206e5c75b8da6993a.scope - libcontainer container 6092755eb23719b0f02ae6038993ec1908bada2d955fd5d206e5c75b8da6993a.
Sep 16 04:39:14.119570 containerd[1558]: time="2025-09-16T04:39:14.118571683Z" level=info msg="StartContainer for \"6092755eb23719b0f02ae6038993ec1908bada2d955fd5d206e5c75b8da6993a\" returns successfully"
Sep 16 04:39:16.821898 kubelet[2783]: E0916 04:39:16.819797 2783 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:35368->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-0-0-n-21eb3e8385.1865a97443aae3c3 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-0-0-n-21eb3e8385,UID:7d4a01f5e8540fe2b3198bc7c49f082d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-n-21eb3e8385,},FirstTimestamp:2025-09-16 04:39:06.374620099 +0000 UTC m=+912.743261161,LastTimestamp:2025-09-16 04:39:06.374620099 +0000 UTC m=+912.743261161,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-n-21eb3e8385,}"