Jul 6 23:26:16.793484 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Jul 6 23:26:16.793510 kernel: Linux version 6.12.35-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Sun Jul 6 21:52:18 -00 2025 Jul 6 23:26:16.793520 kernel: KASLR enabled Jul 6 23:26:16.793525 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Jul 6 23:26:16.793531 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218 Jul 6 23:26:16.793536 kernel: random: crng init done Jul 6 23:26:16.793543 kernel: secureboot: Secure boot disabled Jul 6 23:26:16.793548 kernel: ACPI: Early table checksum verification disabled Jul 6 23:26:16.793554 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) Jul 6 23:26:16.793560 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) Jul 6 23:26:16.793567 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793572 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793578 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793584 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793591 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793598 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793605 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793611 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793617 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jul 6 23:26:16.793623 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) Jul 6 23:26:16.793629 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Jul 6 23:26:16.793635 kernel: ACPI: Use ACPI SPCR as default console: Yes Jul 6 23:26:16.793641 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] Jul 6 23:26:16.793647 kernel: NODE_DATA(0) allocated [mem 0x13967ddc0-0x139684fff] Jul 6 23:26:16.793653 kernel: Zone ranges: Jul 6 23:26:16.793660 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Jul 6 23:26:16.793666 kernel: DMA32 empty Jul 6 23:26:16.793672 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] Jul 6 23:26:16.793678 kernel: Device empty Jul 6 23:26:16.793684 kernel: Movable zone start for each node Jul 6 23:26:16.793690 kernel: Early memory node ranges Jul 6 23:26:16.793696 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff] Jul 6 23:26:16.793702 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff] Jul 6 23:26:16.793708 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff] Jul 6 23:26:16.793714 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] Jul 6 23:26:16.793720 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] Jul 6 23:26:16.793726 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] Jul 6 23:26:16.793732 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] Jul 6 23:26:16.793739 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] Jul 6 23:26:16.793745 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff] Jul 6 23:26:16.793754 kernel: Initmem 
setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] Jul 6 23:26:16.793768 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Jul 6 23:26:16.793778 kernel: psci: probing for conduit method from ACPI. Jul 6 23:26:16.793788 kernel: psci: PSCIv1.1 detected in firmware. Jul 6 23:26:16.793797 kernel: psci: Using standard PSCI v0.2 function IDs Jul 6 23:26:16.793803 kernel: psci: Trusted OS migration not required Jul 6 23:26:16.793810 kernel: psci: SMC Calling Convention v1.1 Jul 6 23:26:16.793816 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Jul 6 23:26:16.793856 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Jul 6 23:26:16.793864 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Jul 6 23:26:16.793871 kernel: pcpu-alloc: [0] 0 [0] 1 Jul 6 23:26:16.793878 kernel: Detected PIPT I-cache on CPU0 Jul 6 23:26:16.793884 kernel: CPU features: detected: GIC system register CPU interface Jul 6 23:26:16.793891 kernel: CPU features: detected: Spectre-v4 Jul 6 23:26:16.793900 kernel: CPU features: detected: Spectre-BHB Jul 6 23:26:16.793906 kernel: CPU features: kernel page table isolation forced ON by KASLR Jul 6 23:26:16.793913 kernel: CPU features: detected: Kernel page table isolation (KPTI) Jul 6 23:26:16.793919 kernel: CPU features: detected: ARM erratum 1418040 Jul 6 23:26:16.793926 kernel: CPU features: detected: SSBS not fully self-synchronizing Jul 6 23:26:16.793932 kernel: alternatives: applying boot alternatives Jul 6 23:26:16.793940 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=dd2d39de40482a23e9bb75390ff5ca85cd9bd34d902b8049121a8373f8cb2ef2 Jul 6 23:26:16.793947 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Jul 6 23:26:16.793953 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jul 6 23:26:16.793960 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jul 6 23:26:16.793967 kernel: Fallback order for Node 0: 0 Jul 6 23:26:16.793974 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000 Jul 6 23:26:16.793980 kernel: Policy zone: Normal Jul 6 23:26:16.793986 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jul 6 23:26:16.793993 kernel: software IO TLB: area num 2. Jul 6 23:26:16.793999 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB) Jul 6 23:26:16.794006 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jul 6 23:26:16.794012 kernel: rcu: Preemptible hierarchical RCU implementation. Jul 6 23:26:16.794019 kernel: rcu: RCU event tracing is enabled. Jul 6 23:26:16.794026 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jul 6 23:26:16.794070 kernel: Trampoline variant of Tasks RCU enabled. Jul 6 23:26:16.794079 kernel: Tracing variant of Tasks RCU enabled. Jul 6 23:26:16.794089 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jul 6 23:26:16.794095 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jul 6 23:26:16.794102 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jul 6 23:26:16.794108 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jul 6 23:26:16.794115 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Jul 6 23:26:16.794121 kernel: GICv3: 256 SPIs implemented Jul 6 23:26:16.794128 kernel: GICv3: 0 Extended SPIs implemented Jul 6 23:26:16.794134 kernel: Root IRQ handler: gic_handle_irq Jul 6 23:26:16.794140 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Jul 6 23:26:16.794147 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Jul 6 23:26:16.794153 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Jul 6 23:26:16.794159 kernel: ITS [mem 0x08080000-0x0809ffff] Jul 6 23:26:16.794167 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1) Jul 6 23:26:16.794174 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1) Jul 6 23:26:16.794181 kernel: GICv3: using LPI property table @0x0000000100120000 Jul 6 23:26:16.794187 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000 Jul 6 23:26:16.794193 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jul 6 23:26:16.794200 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jul 6 23:26:16.794206 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Jul 6 23:26:16.794213 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Jul 6 23:26:16.794220 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Jul 6 23:26:16.794226 kernel: Console: colour dummy device 80x25 Jul 6 23:26:16.794233 kernel: ACPI: Core revision 20240827 Jul 6 23:26:16.794241 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Jul 6 23:26:16.794247 kernel: pid_max: default: 32768 minimum: 301 Jul 6 23:26:16.794254 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jul 6 23:26:16.794261 kernel: landlock: Up and running. Jul 6 23:26:16.794267 kernel: SELinux: Initializing. Jul 6 23:26:16.794274 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jul 6 23:26:16.794281 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jul 6 23:26:16.794287 kernel: rcu: Hierarchical SRCU implementation. Jul 6 23:26:16.794294 kernel: rcu: Max phase no-delay instances is 400. Jul 6 23:26:16.794302 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jul 6 23:26:16.794309 kernel: Remapping and enabling EFI services. Jul 6 23:26:16.794315 kernel: smp: Bringing up secondary CPUs ... Jul 6 23:26:16.794322 kernel: Detected PIPT I-cache on CPU1 Jul 6 23:26:16.794328 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Jul 6 23:26:16.794335 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000 Jul 6 23:26:16.794342 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jul 6 23:26:16.794348 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Jul 6 23:26:16.794355 kernel: smp: Brought up 1 node, 2 CPUs Jul 6 23:26:16.794364 kernel: SMP: Total of 2 processors activated. 
Jul 6 23:26:16.794375 kernel: CPU: All CPU(s) started at EL1 Jul 6 23:26:16.794382 kernel: CPU features: detected: 32-bit EL0 Support Jul 6 23:26:16.794390 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Jul 6 23:26:16.794397 kernel: CPU features: detected: Common not Private translations Jul 6 23:26:16.794404 kernel: CPU features: detected: CRC32 instructions Jul 6 23:26:16.794411 kernel: CPU features: detected: Enhanced Virtualization Traps Jul 6 23:26:16.794418 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Jul 6 23:26:16.794427 kernel: CPU features: detected: LSE atomic instructions Jul 6 23:26:16.794434 kernel: CPU features: detected: Privileged Access Never Jul 6 23:26:16.794441 kernel: CPU features: detected: RAS Extension Support Jul 6 23:26:16.794448 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Jul 6 23:26:16.794455 kernel: alternatives: applying system-wide alternatives Jul 6 23:26:16.794462 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1 Jul 6 23:26:16.794469 kernel: Memory: 3875624K/4096000K available (11072K kernel code, 2428K rwdata, 9032K rodata, 39424K init, 1035K bss, 215284K reserved, 0K cma-reserved) Jul 6 23:26:16.794477 kernel: devtmpfs: initialized Jul 6 23:26:16.794484 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jul 6 23:26:16.794492 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jul 6 23:26:16.794499 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Jul 6 23:26:16.794506 kernel: 0 pages in range for non-PLT usage Jul 6 23:26:16.794513 kernel: 508480 pages in range for PLT usage Jul 6 23:26:16.794520 kernel: pinctrl core: initialized pinctrl subsystem Jul 6 23:26:16.794527 kernel: SMBIOS 3.0.0 present. Jul 6 23:26:16.794534 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 Jul 6 23:26:16.794541 kernel: DMI: Memory slots populated: 1/1 Jul 6 23:26:16.794548 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jul 6 23:26:16.794556 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Jul 6 23:26:16.794563 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jul 6 23:26:16.794570 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jul 6 23:26:16.794577 kernel: audit: initializing netlink subsys (disabled) Jul 6 23:26:16.794584 kernel: audit: type=2000 audit(0.020:1): state=initialized audit_enabled=0 res=1 Jul 6 23:26:16.794591 kernel: thermal_sys: Registered thermal governor 'step_wise' Jul 6 23:26:16.794598 kernel: cpuidle: using governor menu Jul 6 23:26:16.794605 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Jul 6 23:26:16.794612 kernel: ASID allocator initialised with 32768 entries Jul 6 23:26:16.794620 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jul 6 23:26:16.794627 kernel: Serial: AMBA PL011 UART driver Jul 6 23:26:16.794634 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jul 6 23:26:16.794641 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jul 6 23:26:16.794648 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jul 6 23:26:16.794655 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jul 6 23:26:16.794662 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jul 6 23:26:16.794676 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jul 6 23:26:16.794687 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jul 6 23:26:16.794698 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jul 6 23:26:16.794705 kernel: ACPI: Added _OSI(Module Device) Jul 6 23:26:16.794712 kernel: ACPI: Added _OSI(Processor Device) Jul 6 23:26:16.794719 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jul 6 23:26:16.794726 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jul 6 23:26:16.794733 kernel: ACPI: Interpreter enabled Jul 6 23:26:16.794739 kernel: ACPI: Using GIC for interrupt routing Jul 6 23:26:16.794746 kernel: ACPI: MCFG table detected, 1 entries Jul 6 23:26:16.794753 kernel: ACPI: CPU0 has been hot-added Jul 6 23:26:16.794761 kernel: ACPI: CPU1 has been hot-added Jul 6 23:26:16.794768 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jul 6 23:26:16.794775 kernel: printk: legacy console [ttyAMA0] enabled Jul 6 23:26:16.794782 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jul 6 23:26:16.794945 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jul 6 23:26:16.795016 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jul 6 23:26:16.795098 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jul 6 23:26:16.795161 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jul 6 23:26:16.795216 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jul 6 23:26:16.795225 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jul 6 23:26:16.795232 kernel: PCI host bridge to bus 0000:00 Jul 6 23:26:16.795301 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jul 6 23:26:16.795353 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jul 6 23:26:16.795404 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jul 6 23:26:16.795456 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jul 6 23:26:16.795529 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Jul 6 23:26:16.795598 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint Jul 6 23:26:16.795658 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff] Jul 6 23:26:16.795716 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref] Jul 6 23:26:16.795783 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.795881 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff] Jul 6 23:26:16.795948 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jul 6 23:26:16.796007 kernel: pci 0000:00:02.0: 
bridge window [mem 0x11000000-0x111fffff] Jul 6 23:26:16.796088 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Jul 6 23:26:16.796160 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.796220 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff] Jul 6 23:26:16.796277 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jul 6 23:26:16.796334 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff] Jul 6 23:26:16.796402 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.796461 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff] Jul 6 23:26:16.796519 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jul 6 23:26:16.796576 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff] Jul 6 23:26:16.796633 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Jul 6 23:26:16.796698 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.796758 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff] Jul 6 23:26:16.796815 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jul 6 23:26:16.796886 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff] Jul 6 23:26:16.796945 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Jul 6 23:26:16.797009 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.797640 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff] Jul 6 23:26:16.797719 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jul 6 23:26:16.797778 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Jul 6 23:26:16.797894 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Jul 6 23:26:16.797974 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.798070 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff] Jul 6 23:26:16.798137 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jul 6 23:26:16.798195 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff] Jul 6 23:26:16.798252 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Jul 6 23:26:16.798319 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.798384 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff] Jul 6 23:26:16.798443 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jul 6 23:26:16.798900 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff] Jul 6 23:26:16.798964 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref] Jul 6 23:26:16.799052 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.799119 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff] Jul 6 23:26:16.799185 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jul 6 23:26:16.799242 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff] Jul 6 23:26:16.799307 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jul 6 23:26:16.799365 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff] Jul 6 23:26:16.799422 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jul 6 23:26:16.799479 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff] Jul 6 23:26:16.799548 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint Jul 6 23:26:16.799611 kernel: pci 
0000:00:04.0: BAR 0 [io 0x0000-0x0007] Jul 6 23:26:16.799686 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jul 6 23:26:16.799747 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff] Jul 6 23:26:16.799806 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Jul 6 23:26:16.799882 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jul 6 23:26:16.799954 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jul 6 23:26:16.800026 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit] Jul 6 23:26:16.802224 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Jul 6 23:26:16.802293 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff] Jul 6 23:26:16.802355 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Jul 6 23:26:16.802425 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jul 6 23:26:16.802486 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Jul 6 23:26:16.802560 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jul 6 23:26:16.802628 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff] Jul 6 23:26:16.802690 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Jul 6 23:26:16.802758 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Jul 6 23:26:16.802819 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff] Jul 6 23:26:16.802903 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Jul 6 23:26:16.802974 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jul 6 23:26:16.803078 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff] Jul 6 23:26:16.803146 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref] Jul 6 23:26:16.803207 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jul 6 23:26:16.803269 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Jul 6 23:26:16.803328 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Jul 6 23:26:16.803386 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Jul 6 23:26:16.803448 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Jul 6 23:26:16.803506 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Jul 6 23:26:16.803567 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Jul 6 23:26:16.803629 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jul 6 23:26:16.803688 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Jul 6 23:26:16.803745 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Jul 6 23:26:16.803805 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jul 6 23:26:16.803907 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Jul 6 23:26:16.803974 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Jul 6 23:26:16.806117 kernel: pci 
0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jul 6 23:26:16.806236 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Jul 6 23:26:16.806299 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Jul 6 23:26:16.806363 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jul 6 23:26:16.806423 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Jul 6 23:26:16.806481 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Jul 6 23:26:16.806551 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jul 6 23:26:16.806610 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Jul 6 23:26:16.806667 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Jul 6 23:26:16.806728 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jul 6 23:26:16.806785 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Jul 6 23:26:16.806863 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Jul 6 23:26:16.806929 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jul 6 23:26:16.806994 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Jul 6 23:26:16.807075 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Jul 6 23:26:16.807139 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jul 6 23:26:16.807197 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Jul 6 23:26:16.807259 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jul 6 23:26:16.807317 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jul 6 23:26:16.807378 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jul 6 23:26:16.807442 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jul 6 23:26:16.807502 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jul 6 23:26:16.807561 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jul 6 23:26:16.807618 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jul 6 23:26:16.807675 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jul 6 23:26:16.807737 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jul 6 23:26:16.807795 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jul 6 23:26:16.807870 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jul 6 23:26:16.807934 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jul 6 23:26:16.807994 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jul 6 23:26:16.810147 kernel: pci 
0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jul 6 23:26:16.810236 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jul 6 23:26:16.810297 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jul 6 23:26:16.810362 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned Jul 6 23:26:16.810421 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned Jul 6 23:26:16.810482 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned Jul 6 23:26:16.810548 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Jul 6 23:26:16.810612 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned Jul 6 23:26:16.810673 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Jul 6 23:26:16.810748 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned Jul 6 23:26:16.810834 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned Jul 6 23:26:16.810907 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned Jul 6 23:26:16.810974 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Jul 6 23:26:16.811058 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned Jul 6 23:26:16.811120 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Jul 6 23:26:16.811181 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned Jul 6 23:26:16.811239 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Jul 6 23:26:16.811299 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned Jul 6 23:26:16.812970 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Jul 6 23:26:16.813655 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned Jul 6 23:26:16.813739 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Jul 6 23:26:16.813808 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned Jul 6 23:26:16.813897 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned Jul 6 23:26:16.813971 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned Jul 6 23:26:16.815194 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jul 6 23:26:16.815293 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jul 6 23:26:16.815366 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jul 6 23:26:16.815430 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jul 6 23:26:16.815491 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Jul 6 23:26:16.815552 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Jul 6 23:26:16.815611 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jul 6 23:26:16.815678 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jul 6 23:26:16.815742 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jul 6 23:26:16.815810 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Jul 6 23:26:16.815916 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Jul 6 23:26:16.815979 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jul 6 23:26:16.816818 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Jul 6 23:26:16.816935 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jul 6 23:26:16.817007 kernel: pci 
0000:00:02.2: PCI bridge to [bus 03] Jul 6 23:26:16.817096 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Jul 6 23:26:16.817166 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Jul 6 23:26:16.817226 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jul 6 23:26:16.817296 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jul 6 23:26:16.817356 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jul 6 23:26:16.817415 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Jul 6 23:26:16.817472 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Jul 6 23:26:16.817531 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jul 6 23:26:16.817597 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Jul 6 23:26:16.817657 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jul 6 23:26:16.817715 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jul 6 23:26:16.817772 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Jul 6 23:26:16.817841 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Jul 6 23:26:16.817904 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jul 6 23:26:16.817971 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jul 6 23:26:16.818107 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jul 6 23:26:16.818186 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jul 6 23:26:16.818262 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Jul 6 23:26:16.818320 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Jul 6 23:26:16.818378 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jul 6 23:26:16.818442 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned Jul 6 23:26:16.818503 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned Jul 6 23:26:16.818565 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned Jul 6 23:26:16.818625 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jul 6 23:26:16.818695 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Jul 6 23:26:16.818756 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] Jul 6 23:26:16.818862 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jul 6 23:26:16.818956 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jul 6 23:26:16.819022 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Jul 6 23:26:16.819112 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Jul 6 23:26:16.819173 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jul 6 23:26:16.819242 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jul 6 23:26:16.819301 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Jul 6 23:26:16.819365 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Jul 6 23:26:16.819426 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jul 6 23:26:16.819498 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jul 6 23:26:16.819552 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Jul 6 23:26:16.819604 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jul 6 23:26:16.819674 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Jul 6 
23:26:16.819731 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jul 6 23:26:16.819784 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jul 6 23:26:16.819857 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Jul 6 23:26:16.819914 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jul 6 23:26:16.819969 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jul 6 23:26:16.820030 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Jul 6 23:26:16.820491 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jul 6 23:26:16.820554 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jul 6 23:26:16.820619 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Jul 6 23:26:16.820672 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jul 6 23:26:16.820736 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jul 6 23:26:16.820798 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Jul 6 23:26:16.820906 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jul 6 23:26:16.820970 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jul 6 23:26:16.821086 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Jul 6 23:26:16.821148 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jul 6 23:26:16.821204 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jul 6 23:26:16.821268 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Jul 6 23:26:16.821322 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jul 6 23:26:16.821374 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jul 6 23:26:16.821438 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Jul 6 23:26:16.821491 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jul 6 23:26:16.821544 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jul 6 23:26:16.821607 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Jul 6 23:26:16.821663 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jul 6 23:26:16.821717 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jul 6 23:26:16.821727 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jul 6 23:26:16.821737 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jul 6 23:26:16.821745 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jul 6 23:26:16.821752 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jul 6 23:26:16.821759 kernel: iommu: Default domain type: Translated Jul 6 23:26:16.821767 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jul 6 23:26:16.821774 kernel: efivars: Registered efivars operations Jul 6 23:26:16.821781 kernel: vgaarb: loaded Jul 6 23:26:16.821789 kernel: clocksource: Switched to clocksource arch_sys_counter Jul 6 23:26:16.821796 kernel: VFS: Disk quotas dquot_6.6.0 Jul 6 23:26:16.821805 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 6 23:26:16.821813 kernel: pnp: PnP ACPI init Jul 6 23:26:16.821901 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jul 6 23:26:16.821913 kernel: pnp: PnP ACPI: found 1 devices Jul 6 23:26:16.821920 kernel: NET: Registered PF_INET protocol family Jul 6 23:26:16.821928 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jul 6 23:26:16.821935 kernel: 
tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jul 6 23:26:16.821943 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 6 23:26:16.821950 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 6 23:26:16.821960 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jul 6 23:26:16.821967 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jul 6 23:26:16.821975 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 6 23:26:16.821982 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 6 23:26:16.821989 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 6 23:26:16.822069 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jul 6 23:26:16.822081 kernel: PCI: CLS 0 bytes, default 64 Jul 6 23:26:16.822091 kernel: kvm [1]: HYP mode not available Jul 6 23:26:16.822100 kernel: Initialise system trusted keyrings Jul 6 23:26:16.822107 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jul 6 23:26:16.822115 kernel: Key type asymmetric registered Jul 6 23:26:16.822122 kernel: Asymmetric key parser 'x509' registered Jul 6 23:26:16.822129 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jul 6 23:26:16.822137 kernel: io scheduler mq-deadline registered Jul 6 23:26:16.822145 kernel: io scheduler kyber registered Jul 6 23:26:16.822152 kernel: io scheduler bfq registered Jul 6 23:26:16.822160 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jul 6 23:26:16.822222 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Jul 6 23:26:16.822284 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Jul 6 23:26:16.822343 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 23:26:16.822447 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Jul 6 23:26:16.822512 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Jul 6 23:26:16.822573 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 23:26:16.822654 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Jul 6 23:26:16.822726 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Jul 6 23:26:16.822793 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 23:26:16.822873 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Jul 6 23:26:16.822939 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Jul 6 23:26:16.823012 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 23:26:16.825152 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Jul 6 23:26:16.825243 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Jul 6 23:26:16.825304 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 23:26:16.825367 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Jul 6 23:26:16.825432 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Jul 6 23:26:16.825494 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 
23:26:16.825556 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Jul 6 23:26:16.825669 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Jul 6 23:26:16.825733 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 23:26:16.825795 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Jul 6 23:26:16.825872 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Jul 6 23:26:16.825936 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 23:26:16.825954 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jul 6 23:26:16.826016 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Jul 6 23:26:16.826095 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Jul 6 23:26:16.826163 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 6 23:26:16.826174 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jul 6 23:26:16.826182 kernel: ACPI: button: Power Button [PWRB] Jul 6 23:26:16.826190 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jul 6 23:26:16.826255 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jul 6 23:26:16.826325 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Jul 6 23:26:16.826335 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 6 23:26:16.826343 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jul 6 23:26:16.826405 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Jul 6 23:26:16.826415 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Jul 6 23:26:16.826423 kernel: thunder_xcv, ver 1.0 Jul 6 23:26:16.826430 kernel: thunder_bgx, ver 1.0 Jul 6 23:26:16.826438 kernel: nicpf, ver 1.0 Jul 6 23:26:16.826445 kernel: nicvf, ver 1.0 Jul 6 23:26:16.826517 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jul 6 23:26:16.826574 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-07-06T23:26:16 UTC (1751844376) Jul 6 23:26:16.826584 kernel: hid: raw HID events driver (C) Jiri Kosina Jul 6 23:26:16.826592 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jul 6 23:26:16.826599 kernel: watchdog: NMI not fully supported Jul 6 23:26:16.826606 kernel: watchdog: Hard watchdog permanently disabled Jul 6 23:26:16.826614 kernel: NET: Registered PF_INET6 protocol family Jul 6 23:26:16.826621 kernel: Segment Routing with IPv6 Jul 6 23:26:16.826630 kernel: In-situ OAM (IOAM) with IPv6 Jul 6 23:26:16.826638 kernel: NET: Registered PF_PACKET protocol family Jul 6 23:26:16.826645 kernel: Key type dns_resolver registered Jul 6 23:26:16.826653 kernel: registered taskstats version 1 Jul 6 23:26:16.826660 kernel: Loading compiled-in X.509 certificates Jul 6 23:26:16.826668 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.35-flatcar: 90fb300ebe1fa0773739bb35dad461c5679d8dfb' Jul 6 23:26:16.826675 kernel: Demotion targets for Node 0: null Jul 6 23:26:16.826682 kernel: Key type .fscrypt registered Jul 6 23:26:16.826689 kernel: Key type fscrypt-provisioning registered Jul 6 23:26:16.826698 kernel: ima: No TPM chip found, activating TPM-bypass! 
Jul 6 23:26:16.826706 kernel: ima: Allocated hash algorithm: sha1 Jul 6 23:26:16.826713 kernel: ima: No architecture policies found Jul 6 23:26:16.826720 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jul 6 23:26:16.826728 kernel: clk: Disabling unused clocks Jul 6 23:26:16.826735 kernel: PM: genpd: Disabling unused power domains Jul 6 23:26:16.826742 kernel: Warning: unable to open an initial console. Jul 6 23:26:16.826750 kernel: Freeing unused kernel memory: 39424K Jul 6 23:26:16.826757 kernel: Run /init as init process Jul 6 23:26:16.826766 kernel: with arguments: Jul 6 23:26:16.826773 kernel: /init Jul 6 23:26:16.826781 kernel: with environment: Jul 6 23:26:16.826788 kernel: HOME=/ Jul 6 23:26:16.826797 kernel: TERM=linux Jul 6 23:26:16.826804 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 6 23:26:16.826812 systemd[1]: Successfully made /usr/ read-only. Jul 6 23:26:16.826857 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 6 23:26:16.826869 systemd[1]: Detected virtualization kvm. Jul 6 23:26:16.826877 systemd[1]: Detected architecture arm64. Jul 6 23:26:16.826884 systemd[1]: Running in initrd. Jul 6 23:26:16.826892 systemd[1]: No hostname configured, using default hostname. Jul 6 23:26:16.826900 systemd[1]: Hostname set to . Jul 6 23:26:16.826908 systemd[1]: Initializing machine ID from VM UUID. Jul 6 23:26:16.826916 systemd[1]: Queued start job for default target initrd.target. Jul 6 23:26:16.826923 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 6 23:26:16.826933 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 6 23:26:16.826941 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 6 23:26:16.826949 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 6 23:26:16.826957 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 6 23:26:16.826965 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 6 23:26:16.826974 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 6 23:26:16.826984 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 6 23:26:16.826992 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 6 23:26:16.827000 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 6 23:26:16.827008 systemd[1]: Reached target paths.target - Path Units. Jul 6 23:26:16.827016 systemd[1]: Reached target slices.target - Slice Units. Jul 6 23:26:16.827024 systemd[1]: Reached target swap.target - Swaps. Jul 6 23:26:16.829059 systemd[1]: Reached target timers.target - Timer Units. Jul 6 23:26:16.829089 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 6 23:26:16.829099 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 6 23:26:16.829112 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). 
Jul 6 23:26:16.829120 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 6 23:26:16.829128 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 6 23:26:16.829136 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 6 23:26:16.829144 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 6 23:26:16.829152 systemd[1]: Reached target sockets.target - Socket Units. Jul 6 23:26:16.829160 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 6 23:26:16.829168 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 6 23:26:16.829177 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 6 23:26:16.829186 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 6 23:26:16.829193 systemd[1]: Starting systemd-fsck-usr.service... Jul 6 23:26:16.829201 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 6 23:26:16.829209 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 6 23:26:16.829217 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 6 23:26:16.829225 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 6 23:26:16.829262 systemd-journald[243]: Collecting audit messages is disabled. Jul 6 23:26:16.829284 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 6 23:26:16.829294 systemd[1]: Finished systemd-fsck-usr.service. Jul 6 23:26:16.829302 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 6 23:26:16.829310 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 6 23:26:16.829318 kernel: Bridge firewalling registered Jul 6 23:26:16.829326 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 6 23:26:16.829334 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 6 23:26:16.829342 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 6 23:26:16.829351 systemd-journald[243]: Journal started Jul 6 23:26:16.829370 systemd-journald[243]: Runtime Journal (/run/log/journal/03b6491ed6914f54a91c7fe06a22ceb8) is 8M, max 76.5M, 68.5M free. Jul 6 23:26:16.831103 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 6 23:26:16.792690 systemd-modules-load[244]: Inserted module 'overlay' Jul 6 23:26:16.833579 systemd[1]: Started systemd-journald.service - Journal Service. Jul 6 23:26:16.819381 systemd-modules-load[244]: Inserted module 'br_netfilter' Jul 6 23:26:16.838621 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 6 23:26:16.847335 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 6 23:26:16.852702 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 6 23:26:16.855467 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 6 23:26:16.864835 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. 
Jul 6 23:26:16.867702 systemd-tmpfiles[270]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 6 23:26:16.869018 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jul 6 23:26:16.873184 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 6 23:26:16.875590 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 6 23:26:16.883341 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 6 23:26:16.900672 dracut-cmdline[281]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=dd2d39de40482a23e9bb75390ff5ca85cd9bd34d902b8049121a8373f8cb2ef2 Jul 6 23:26:16.927752 systemd-resolved[285]: Positive Trust Anchors: Jul 6 23:26:16.928611 systemd-resolved[285]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 6 23:26:16.929452 systemd-resolved[285]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 6 23:26:16.936110 systemd-resolved[285]: Defaulting to hostname 'linux'. Jul 6 23:26:16.937189 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 6 23:26:16.938639 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 6 23:26:17.002105 kernel: SCSI subsystem initialized Jul 6 23:26:17.007074 kernel: Loading iSCSI transport class v2.0-870. Jul 6 23:26:17.015085 kernel: iscsi: registered transport (tcp) Jul 6 23:26:17.028083 kernel: iscsi: registered transport (qla4xxx) Jul 6 23:26:17.028164 kernel: QLogic iSCSI HBA Driver Jul 6 23:26:17.051301 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 6 23:26:17.071175 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 6 23:26:17.073673 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 6 23:26:17.134782 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 6 23:26:17.139158 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jul 6 23:26:17.209113 kernel: raid6: neonx8 gen() 15694 MB/s Jul 6 23:26:17.226097 kernel: raid6: neonx4 gen() 15735 MB/s Jul 6 23:26:17.243112 kernel: raid6: neonx2 gen() 13192 MB/s Jul 6 23:26:17.260089 kernel: raid6: neonx1 gen() 10419 MB/s Jul 6 23:26:17.277194 kernel: raid6: int64x8 gen() 6873 MB/s Jul 6 23:26:17.294077 kernel: raid6: int64x4 gen() 7313 MB/s Jul 6 23:26:17.311098 kernel: raid6: int64x2 gen() 6076 MB/s Jul 6 23:26:17.328104 kernel: raid6: int64x1 gen() 5033 MB/s Jul 6 23:26:17.328200 kernel: raid6: using algorithm neonx4 gen() 15735 MB/s Jul 6 23:26:17.345118 kernel: raid6: .... 
xor() 12279 MB/s, rmw enabled Jul 6 23:26:17.345207 kernel: raid6: using neon recovery algorithm Jul 6 23:26:17.350352 kernel: xor: measuring software checksum speed Jul 6 23:26:17.350439 kernel: 8regs : 20660 MB/sec Jul 6 23:26:17.351234 kernel: 32regs : 21664 MB/sec Jul 6 23:26:17.351274 kernel: arm64_neon : 22758 MB/sec Jul 6 23:26:17.351291 kernel: xor: using function: arm64_neon (22758 MB/sec) Jul 6 23:26:17.406140 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 6 23:26:17.415597 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 6 23:26:17.420262 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 6 23:26:17.448870 systemd-udevd[493]: Using default interface naming scheme 'v255'. Jul 6 23:26:17.454447 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 6 23:26:17.459424 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 6 23:26:17.487134 dracut-pre-trigger[501]: rd.md=0: removing MD RAID activation Jul 6 23:26:17.525093 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 6 23:26:17.528003 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 6 23:26:17.589278 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 6 23:26:17.591947 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 6 23:26:17.671108 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jul 6 23:26:17.672064 kernel: scsi host0: Virtio SCSI HBA Jul 6 23:26:17.681333 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Jul 6 23:26:17.681410 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jul 6 23:26:17.706086 kernel: ACPI: bus type USB registered Jul 6 23:26:17.706137 kernel: usbcore: registered new interface driver usbfs Jul 6 23:26:17.707088 kernel: usbcore: registered new interface driver hub Jul 6 23:26:17.707126 kernel: usbcore: registered new device driver usb Jul 6 23:26:17.726055 kernel: sd 0:0:0:1: Power-on or device reset occurred Jul 6 23:26:17.726264 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jul 6 23:26:17.727863 kernel: sd 0:0:0:1: [sda] Write Protect is off Jul 6 23:26:17.728063 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Jul 6 23:26:17.728159 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jul 6 23:26:17.731065 kernel: sr 0:0:0:0: Power-on or device reset occurred Jul 6 23:26:17.731250 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Jul 6 23:26:17.732067 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jul 6 23:26:17.732571 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 6 23:26:17.732737 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 6 23:26:17.739147 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jul 6 23:26:17.739335 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jul 6 23:26:17.739358 kernel: GPT:17805311 != 80003071 Jul 6 23:26:17.739370 kernel: GPT:Alternate GPT header not at the end of the disk. Jul 6 23:26:17.739381 kernel: GPT:17805311 != 80003071 Jul 6 23:26:17.739392 kernel: GPT: Use GNU Parted to correct GPT errors. Jul 6 23:26:17.739402 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 6 23:26:17.739152 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Jul 6 23:26:17.741182 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Jul 6 23:26:17.743663 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 6 23:26:17.755645 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 6 23:26:17.755822 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jul 6 23:26:17.755930 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jul 6 23:26:17.759212 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 6 23:26:17.759385 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jul 6 23:26:17.762416 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jul 6 23:26:17.766310 kernel: hub 1-0:1.0: USB hub found Jul 6 23:26:17.769074 kernel: hub 1-0:1.0: 4 ports detected Jul 6 23:26:17.771104 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jul 6 23:26:17.773253 kernel: hub 2-0:1.0: USB hub found Jul 6 23:26:17.773441 kernel: hub 2-0:1.0: 4 ports detected Jul 6 23:26:17.781168 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 6 23:26:17.822961 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jul 6 23:26:17.850120 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jul 6 23:26:17.868431 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Jul 6 23:26:17.869733 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jul 6 23:26:17.880199 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 6 23:26:17.881159 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 6 23:26:17.888679 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 6 23:26:17.889498 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 6 23:26:17.891319 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 6 23:26:17.893713 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 6 23:26:17.895138 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 6 23:26:17.917554 disk-uuid[599]: Primary Header is updated. Jul 6 23:26:17.917554 disk-uuid[599]: Secondary Entries is updated. Jul 6 23:26:17.917554 disk-uuid[599]: Secondary Header is updated. Jul 6 23:26:17.925715 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
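The GPT complaints above ("GPT:17805311 != 80003071", "Alternate GPT header not at the end of the disk") usually mean the image was written for a smaller device than it now occupies: the backup header still sits where the original image ended rather than on the last sector, and the disk-uuid step above then rewrites it ("Secondary Header is updated"). A quick worked check of the numbers, assuming the 512-byte logical blocks reported for sda:

    SECTOR_SIZE = 512
    total_sectors = 80003072        # "[sda] 80003072 512-byte logical blocks" above

    expected_backup_lba = total_sectors - 1   # backup GPT header belongs on the last LBA
    found_backup_lba = 17805311               # where the kernel actually found it

    print(expected_backup_lba)                      # 80003071, the value in the warning
    print(found_backup_lba == expected_backup_lba)  # False -> the GPT warnings above
    print(round(total_sectors * SECTOR_SIZE / 10**9, 1))  # 41.0 (GB), as reported for sda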
Jul 6 23:26:17.930056 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 6 23:26:18.012078 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jul 6 23:26:18.145339 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jul 6 23:26:18.145417 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jul 6 23:26:18.146122 kernel: usbcore: registered new interface driver usbhid Jul 6 23:26:18.146162 kernel: usbhid: USB HID core driver Jul 6 23:26:18.249081 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jul 6 23:26:18.378069 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jul 6 23:26:18.432134 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jul 6 23:26:18.948268 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 6 23:26:18.949409 disk-uuid[604]: The operation has completed successfully. Jul 6 23:26:19.014195 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 6 23:26:19.015667 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 6 23:26:19.042529 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 6 23:26:19.062088 sh[624]: Success Jul 6 23:26:19.080127 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 6 23:26:19.080196 kernel: device-mapper: uevent: version 1.0.3 Jul 6 23:26:19.081160 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 6 23:26:19.090067 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jul 6 23:26:19.145929 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Jul 6 23:26:19.148675 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 6 23:26:19.168582 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jul 6 23:26:19.188499 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 6 23:26:19.188585 kernel: BTRFS: device fsid aa7ffdf7-f152-4ceb-bd0e-b3b3f8f8b296 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (637) Jul 6 23:26:19.192046 kernel: BTRFS info (device dm-0): first mount of filesystem aa7ffdf7-f152-4ceb-bd0e-b3b3f8f8b296 Jul 6 23:26:19.192106 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jul 6 23:26:19.192124 kernel: BTRFS info (device dm-0): using free-space-tree Jul 6 23:26:19.202557 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 6 23:26:19.204503 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 6 23:26:19.206356 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 6 23:26:19.207323 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 6 23:26:19.211437 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Jul 6 23:26:19.239078 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (670) Jul 6 23:26:19.241423 kernel: BTRFS info (device sda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:26:19.241477 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jul 6 23:26:19.242084 kernel: BTRFS info (device sda6): using free-space-tree Jul 6 23:26:19.254093 kernel: BTRFS info (device sda6): last unmount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:26:19.255422 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jul 6 23:26:19.259323 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jul 6 23:26:19.360910 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 6 23:26:19.363579 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 6 23:26:19.404700 systemd-networkd[812]: lo: Link UP Jul 6 23:26:19.404711 systemd-networkd[812]: lo: Gained carrier Jul 6 23:26:19.406562 systemd-networkd[812]: Enumeration completed Jul 6 23:26:19.407380 systemd-networkd[812]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:26:19.407384 systemd-networkd[812]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 6 23:26:19.407758 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 6 23:26:19.409762 systemd-networkd[812]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:26:19.409766 systemd-networkd[812]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 6 23:26:19.413208 ignition[727]: Ignition 2.21.0 Jul 6 23:26:19.410825 systemd-networkd[812]: eth0: Link UP Jul 6 23:26:19.413214 ignition[727]: Stage: fetch-offline Jul 6 23:26:19.410828 systemd-networkd[812]: eth0: Gained carrier Jul 6 23:26:19.413252 ignition[727]: no configs at "/usr/lib/ignition/base.d" Jul 6 23:26:19.410838 systemd-networkd[812]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:26:19.413260 ignition[727]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 6 23:26:19.411630 systemd[1]: Reached target network.target - Network. Jul 6 23:26:19.413442 ignition[727]: parsed url from cmdline: "" Jul 6 23:26:19.415886 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jul 6 23:26:19.413446 ignition[727]: no config URL provided Jul 6 23:26:19.416991 systemd-networkd[812]: eth1: Link UP Jul 6 23:26:19.413450 ignition[727]: reading system config file "/usr/lib/ignition/user.ign" Jul 6 23:26:19.416995 systemd-networkd[812]: eth1: Gained carrier Jul 6 23:26:19.413456 ignition[727]: no config at "/usr/lib/ignition/user.ign" Jul 6 23:26:19.417005 systemd-networkd[812]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:26:19.413460 ignition[727]: failed to fetch config: resource requires networking Jul 6 23:26:19.420178 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Jul 6 23:26:19.413718 ignition[727]: Ignition finished successfully Jul 6 23:26:19.445119 systemd-networkd[812]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 6 23:26:19.446711 ignition[817]: Ignition 2.21.0 Jul 6 23:26:19.446736 ignition[817]: Stage: fetch Jul 6 23:26:19.447002 ignition[817]: no configs at "/usr/lib/ignition/base.d" Jul 6 23:26:19.447017 ignition[817]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 6 23:26:19.447750 ignition[817]: parsed url from cmdline: "" Jul 6 23:26:19.447757 ignition[817]: no config URL provided Jul 6 23:26:19.447766 ignition[817]: reading system config file "/usr/lib/ignition/user.ign" Jul 6 23:26:19.447780 ignition[817]: no config at "/usr/lib/ignition/user.ign" Jul 6 23:26:19.447958 ignition[817]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jul 6 23:26:19.454596 ignition[817]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable Jul 6 23:26:19.485158 systemd-networkd[812]: eth0: DHCPv4 address 91.99.177.85/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jul 6 23:26:19.656259 ignition[817]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 Jul 6 23:26:19.662868 ignition[817]: GET result: OK Jul 6 23:26:19.663004 ignition[817]: parsing config with SHA512: 09b1ada9fbee1c39b93c93c92a22cc5ee2efc04885505bad3fb96e6a64d2049c69eeae222eb8b04a71da51c07d9d13f60d0915e385f6cf81d1392da0ec48332b Jul 6 23:26:19.668684 unknown[817]: fetched base config from "system" Jul 6 23:26:19.668694 unknown[817]: fetched base config from "system" Jul 6 23:26:19.669189 ignition[817]: fetch: fetch complete Jul 6 23:26:19.668708 unknown[817]: fetched user config from "hetzner" Jul 6 23:26:19.669194 ignition[817]: fetch: fetch passed Jul 6 23:26:19.674968 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jul 6 23:26:19.669258 ignition[817]: Ignition finished successfully Jul 6 23:26:19.677996 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jul 6 23:26:19.708576 ignition[825]: Ignition 2.21.0 Jul 6 23:26:19.709174 ignition[825]: Stage: kargs Jul 6 23:26:19.709512 ignition[825]: no configs at "/usr/lib/ignition/base.d" Jul 6 23:26:19.709536 ignition[825]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 6 23:26:19.711714 ignition[825]: kargs: kargs passed Jul 6 23:26:19.711781 ignition[825]: Ignition finished successfully Jul 6 23:26:19.714375 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jul 6 23:26:19.717076 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jul 6 23:26:19.739710 ignition[832]: Ignition 2.21.0 Jul 6 23:26:19.739736 ignition[832]: Stage: disks Jul 6 23:26:19.739926 ignition[832]: no configs at "/usr/lib/ignition/base.d" Jul 6 23:26:19.739937 ignition[832]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 6 23:26:19.744058 ignition[832]: disks: disks passed Jul 6 23:26:19.744533 ignition[832]: Ignition finished successfully Jul 6 23:26:19.746284 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jul 6 23:26:19.747133 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jul 6 23:26:19.747966 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jul 6 23:26:19.749132 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 6 23:26:19.750240 systemd[1]: Reached target sysinit.target - System Initialization. 
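The fetch stage above shows the pattern Ignition follows on this platform: the first GET to http://169.254.169.254/hetzner/v1/userdata fails while the network is still unreachable, a second attempt succeeds once DHCP has configured the interfaces, and the payload is identified by its SHA512 digest before parsing. A rough sketch of that retry-then-hash flow using only the Python standard library; the endpoint is taken from the log, while the attempt count, delay, and error handling here are illustrative rather than Ignition's actual implementation:

    import hashlib
    import time
    import urllib.error
    import urllib.request

    USERDATA_URL = "http://169.254.169.254/hetzner/v1/userdata"  # endpoint seen in the log

    def fetch_userdata(url, attempts=5, delay=2.0):
        """GET the userdata, retrying while the network is still coming up."""
        for attempt in range(1, attempts + 1):
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    return resp.read()
            except (urllib.error.URLError, OSError) as err:
                print(f"GET attempt #{attempt} failed: {err}")
                time.sleep(delay)
        raise RuntimeError("could not reach the metadata service")

    payload = fetch_userdata(USERDATA_URL)
    # Ignition logs the SHA512 of the config it is about to parse; same idea here.
    print("parsing config with SHA512:", hashlib.sha512(payload).hexdigest())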
Jul 6 23:26:19.751350 systemd[1]: Reached target basic.target - Basic System. Jul 6 23:26:19.753470 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jul 6 23:26:19.788107 systemd-fsck[840]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Jul 6 23:26:19.791415 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jul 6 23:26:19.794372 systemd[1]: Mounting sysroot.mount - /sysroot... Jul 6 23:26:19.882064 kernel: EXT4-fs (sda9): mounted filesystem a6b10247-fbe6-4a25-95d9-ddd4b58604ec r/w with ordered data mode. Quota mode: none. Jul 6 23:26:19.883233 systemd[1]: Mounted sysroot.mount - /sysroot. Jul 6 23:26:19.885154 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jul 6 23:26:19.888782 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 6 23:26:19.890510 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jul 6 23:26:19.894275 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jul 6 23:26:19.894938 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jul 6 23:26:19.894981 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jul 6 23:26:19.905625 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jul 6 23:26:19.912484 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jul 6 23:26:19.920080 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (848) Jul 6 23:26:19.923076 kernel: BTRFS info (device sda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:26:19.923131 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jul 6 23:26:19.923144 kernel: BTRFS info (device sda6): using free-space-tree Jul 6 23:26:19.941419 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 6 23:26:19.972293 coreos-metadata[850]: Jul 06 23:26:19.971 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jul 6 23:26:19.973579 initrd-setup-root[875]: cut: /sysroot/etc/passwd: No such file or directory Jul 6 23:26:19.975186 coreos-metadata[850]: Jul 06 23:26:19.973 INFO Fetch successful Jul 6 23:26:19.975186 coreos-metadata[850]: Jul 06 23:26:19.974 INFO wrote hostname ci-4344-1-1-3-d8bdec45b1 to /sysroot/etc/hostname Jul 6 23:26:19.978002 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jul 6 23:26:19.983364 initrd-setup-root[883]: cut: /sysroot/etc/group: No such file or directory Jul 6 23:26:19.988754 initrd-setup-root[890]: cut: /sysroot/etc/shadow: No such file or directory Jul 6 23:26:19.993423 initrd-setup-root[897]: cut: /sysroot/etc/gshadow: No such file or directory Jul 6 23:26:20.094815 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jul 6 23:26:20.096512 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jul 6 23:26:20.100062 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jul 6 23:26:20.123075 kernel: BTRFS info (device sda6): last unmount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:26:20.144298 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
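flatcar-metadata-hostname.service above has one small job before switch-root: ask the metadata service for the instance hostname and drop it into the new root ("wrote hostname ci-4344-1-1-3-d8bdec45b1 to /sysroot/etc/hostname"). A minimal stand-in for that step, assuming the same endpoint; the output path is a parameter because writing /sysroot/etc/hostname only makes sense from inside the initrd:

    import urllib.request

    METADATA_URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"  # from the log

    def write_hostname(target="/tmp/hostname.demo"):
        """Fetch the instance hostname and write it out, newline-terminated."""
        with urllib.request.urlopen(METADATA_URL, timeout=10) as resp:
            hostname = resp.read().decode().strip()
        with open(target, "w") as fh:
            fh.write(hostname + "\n")
        return hostname

    print("wrote hostname", write_hostname())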
Jul 6 23:26:20.153380 ignition[966]: INFO : Ignition 2.21.0 Jul 6 23:26:20.153380 ignition[966]: INFO : Stage: mount Jul 6 23:26:20.156611 ignition[966]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 6 23:26:20.156611 ignition[966]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 6 23:26:20.156611 ignition[966]: INFO : mount: mount passed Jul 6 23:26:20.156611 ignition[966]: INFO : Ignition finished successfully Jul 6 23:26:20.157779 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jul 6 23:26:20.160642 systemd[1]: Starting ignition-files.service - Ignition (files)... Jul 6 23:26:20.185853 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jul 6 23:26:20.193236 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jul 6 23:26:20.219058 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (976) Jul 6 23:26:20.221112 kernel: BTRFS info (device sda6): first mount of filesystem 492b2e2a-5dd7-445f-b930-e9dd6acadf93 Jul 6 23:26:20.221177 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jul 6 23:26:20.221200 kernel: BTRFS info (device sda6): using free-space-tree Jul 6 23:26:20.229298 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jul 6 23:26:20.263080 ignition[993]: INFO : Ignition 2.21.0 Jul 6 23:26:20.266145 ignition[993]: INFO : Stage: files Jul 6 23:26:20.266145 ignition[993]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 6 23:26:20.266145 ignition[993]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 6 23:26:20.268117 ignition[993]: DEBUG : files: compiled without relabeling support, skipping Jul 6 23:26:20.268117 ignition[993]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jul 6 23:26:20.269650 ignition[993]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jul 6 23:26:20.271341 ignition[993]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jul 6 23:26:20.272273 ignition[993]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jul 6 23:26:20.272273 ignition[993]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jul 6 23:26:20.272120 unknown[993]: wrote ssh authorized keys file for user: core Jul 6 23:26:20.275149 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Jul 6 23:26:20.275149 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Jul 6 23:26:20.324142 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jul 6 23:26:20.544270 systemd-networkd[812]: eth1: Gained IPv6LL Jul 6 23:26:21.120323 systemd-networkd[812]: eth0: Gained IPv6LL Jul 6 23:26:23.599205 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Jul 6 23:26:23.599205 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jul 6 23:26:23.606252 ignition[993]: 
INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jul 6 23:26:23.606252 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jul 6 23:26:23.619224 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jul 6 23:26:23.619224 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jul 6 23:26:23.619224 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 Jul 6 23:26:24.250987 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jul 6 23:26:24.475955 ignition[993]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Jul 6 23:26:24.475955 ignition[993]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jul 6 23:26:24.479609 ignition[993]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 6 23:26:24.481248 ignition[993]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jul 6 23:26:24.481248 ignition[993]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jul 6 23:26:24.481248 ignition[993]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jul 6 23:26:24.481248 ignition[993]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jul 6 23:26:24.481248 ignition[993]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jul 6 23:26:24.481248 ignition[993]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jul 6 23:26:24.481248 ignition[993]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jul 6 23:26:24.481248 ignition[993]: INFO : files: op(f): [finished] setting preset to enabled for 
"prepare-helm.service" Jul 6 23:26:24.481248 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jul 6 23:26:24.496169 ignition[993]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jul 6 23:26:24.496169 ignition[993]: INFO : files: files passed Jul 6 23:26:24.496169 ignition[993]: INFO : Ignition finished successfully Jul 6 23:26:24.484506 systemd[1]: Finished ignition-files.service - Ignition (files). Jul 6 23:26:24.486679 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jul 6 23:26:24.489124 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jul 6 23:26:24.503878 systemd[1]: ignition-quench.service: Deactivated successfully. Jul 6 23:26:24.504857 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jul 6 23:26:24.509860 initrd-setup-root-after-ignition[1023]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 6 23:26:24.509860 initrd-setup-root-after-ignition[1023]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jul 6 23:26:24.512356 initrd-setup-root-after-ignition[1027]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jul 6 23:26:24.514357 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 6 23:26:24.516208 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jul 6 23:26:24.518192 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jul 6 23:26:24.568344 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jul 6 23:26:24.568522 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jul 6 23:26:24.570508 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jul 6 23:26:24.571813 systemd[1]: Reached target initrd.target - Initrd Default Target. Jul 6 23:26:24.573633 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jul 6 23:26:24.577326 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jul 6 23:26:24.612432 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 6 23:26:24.615198 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jul 6 23:26:24.649954 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jul 6 23:26:24.651449 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 6 23:26:24.652264 systemd[1]: Stopped target timers.target - Timer Units. Jul 6 23:26:24.653483 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jul 6 23:26:24.653613 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jul 6 23:26:24.655827 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jul 6 23:26:24.656545 systemd[1]: Stopped target basic.target - Basic System. Jul 6 23:26:24.658213 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jul 6 23:26:24.659850 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jul 6 23:26:24.661487 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jul 6 23:26:24.663164 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. 
Jul 6 23:26:24.664667 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jul 6 23:26:24.666052 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jul 6 23:26:24.667506 systemd[1]: Stopped target sysinit.target - System Initialization. Jul 6 23:26:24.668708 systemd[1]: Stopped target local-fs.target - Local File Systems. Jul 6 23:26:24.671099 systemd[1]: Stopped target swap.target - Swaps. Jul 6 23:26:24.671852 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jul 6 23:26:24.672104 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jul 6 23:26:24.673983 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jul 6 23:26:24.674793 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 6 23:26:24.675942 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jul 6 23:26:24.676528 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 6 23:26:24.677351 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jul 6 23:26:24.677521 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jul 6 23:26:24.679244 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jul 6 23:26:24.679426 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jul 6 23:26:24.680533 systemd[1]: ignition-files.service: Deactivated successfully. Jul 6 23:26:24.680720 systemd[1]: Stopped ignition-files.service - Ignition (files). Jul 6 23:26:24.681526 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jul 6 23:26:24.681679 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jul 6 23:26:24.685164 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jul 6 23:26:24.685872 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jul 6 23:26:24.686072 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jul 6 23:26:24.690103 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jul 6 23:26:24.690612 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jul 6 23:26:24.690838 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jul 6 23:26:24.692373 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jul 6 23:26:24.692513 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jul 6 23:26:24.700950 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jul 6 23:26:24.701720 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jul 6 23:26:24.709826 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jul 6 23:26:24.714279 ignition[1047]: INFO : Ignition 2.21.0 Jul 6 23:26:24.714279 ignition[1047]: INFO : Stage: umount Jul 6 23:26:24.717193 ignition[1047]: INFO : no configs at "/usr/lib/ignition/base.d" Jul 6 23:26:24.717193 ignition[1047]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jul 6 23:26:24.717193 ignition[1047]: INFO : umount: umount passed Jul 6 23:26:24.717193 ignition[1047]: INFO : Ignition finished successfully Jul 6 23:26:24.718133 systemd[1]: sysroot-boot.service: Deactivated successfully. Jul 6 23:26:24.718236 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jul 6 23:26:24.720228 systemd[1]: ignition-mount.service: Deactivated successfully. 
Jul 6 23:26:24.720343 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jul 6 23:26:24.721426 systemd[1]: ignition-disks.service: Deactivated successfully. Jul 6 23:26:24.721468 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jul 6 23:26:24.722227 systemd[1]: ignition-kargs.service: Deactivated successfully. Jul 6 23:26:24.722268 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jul 6 23:26:24.723131 systemd[1]: ignition-fetch.service: Deactivated successfully. Jul 6 23:26:24.723173 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jul 6 23:26:24.724105 systemd[1]: Stopped target network.target - Network. Jul 6 23:26:24.724966 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jul 6 23:26:24.725016 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jul 6 23:26:24.726050 systemd[1]: Stopped target paths.target - Path Units. Jul 6 23:26:24.726907 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jul 6 23:26:24.730083 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 6 23:26:24.730718 systemd[1]: Stopped target slices.target - Slice Units. Jul 6 23:26:24.731681 systemd[1]: Stopped target sockets.target - Socket Units. Jul 6 23:26:24.732570 systemd[1]: iscsid.socket: Deactivated successfully. Jul 6 23:26:24.732611 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jul 6 23:26:24.733569 systemd[1]: iscsiuio.socket: Deactivated successfully. Jul 6 23:26:24.733600 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 6 23:26:24.734896 systemd[1]: ignition-setup.service: Deactivated successfully. Jul 6 23:26:24.734986 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jul 6 23:26:24.735812 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jul 6 23:26:24.735854 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jul 6 23:26:24.736837 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jul 6 23:26:24.736896 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jul 6 23:26:24.738222 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jul 6 23:26:24.739167 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jul 6 23:26:24.747770 systemd[1]: systemd-resolved.service: Deactivated successfully. Jul 6 23:26:24.747976 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jul 6 23:26:24.751590 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Jul 6 23:26:24.751862 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jul 6 23:26:24.751900 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 6 23:26:24.756377 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Jul 6 23:26:24.756634 systemd[1]: systemd-networkd.service: Deactivated successfully. Jul 6 23:26:24.757257 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jul 6 23:26:24.759351 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Jul 6 23:26:24.761962 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jul 6 23:26:24.764029 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jul 6 23:26:24.764243 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
Jul 6 23:26:24.766091 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jul 6 23:26:24.766605 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jul 6 23:26:24.766659 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jul 6 23:26:24.767453 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jul 6 23:26:24.767497 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jul 6 23:26:24.768226 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jul 6 23:26:24.768265 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jul 6 23:26:24.769449 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 6 23:26:24.771576 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Jul 6 23:26:24.786291 systemd[1]: systemd-udevd.service: Deactivated successfully. Jul 6 23:26:24.789773 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 6 23:26:24.791925 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jul 6 23:26:24.791990 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jul 6 23:26:24.794204 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jul 6 23:26:24.794236 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jul 6 23:26:24.795128 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jul 6 23:26:24.795176 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jul 6 23:26:24.796335 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jul 6 23:26:24.796377 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jul 6 23:26:24.796972 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jul 6 23:26:24.797017 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 6 23:26:24.798515 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jul 6 23:26:24.799134 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jul 6 23:26:24.799191 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jul 6 23:26:24.802699 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jul 6 23:26:24.802789 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 6 23:26:24.805848 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 6 23:26:24.805893 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 6 23:26:24.811864 systemd[1]: network-cleanup.service: Deactivated successfully. Jul 6 23:26:24.814097 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jul 6 23:26:24.815984 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jul 6 23:26:24.817116 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jul 6 23:26:24.818859 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jul 6 23:26:24.821424 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jul 6 23:26:24.842165 systemd[1]: Switching root. Jul 6 23:26:24.888820 systemd-journald[243]: Journal stopped Jul 6 23:26:25.913341 systemd-journald[243]: Received SIGTERM from PID 1 (systemd). 
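This point ends the initrd phase: services are torn down in reverse, "Switching root" hands control to the real root filesystem, and the initrd journald exits on SIGTERM. Because every entry carries a wall-clock timestamp, phase durations can be read straight off the log; a small sketch that diffs two of the timestamps quoted in this excerpt (the year is an assumption, since the syslog-style prefix does not record one):

    from datetime import datetime

    FMT = "%Y %b %d %H:%M:%S.%f"
    YEAR = "2025"  # assumed; the "Jul 6 ..." prefix carries no year

    def ts(stamp):
        """Parse a 'Jul 6 23:26:24.888820'-style journal prefix."""
        return datetime.strptime(f"{YEAR} {stamp}", FMT)

    start = ts("Jul 6 23:26:16.867702")   # first entry in this excerpt (systemd-tmpfiles)
    stop = ts("Jul 6 23:26:24.888820")    # "Journal stopped" after switching root

    print(f"initrd portion covered here: {(stop - start).total_seconds():.3f} s")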
Jul 6 23:26:25.913399 kernel: SELinux: policy capability network_peer_controls=1 Jul 6 23:26:25.913410 kernel: SELinux: policy capability open_perms=1 Jul 6 23:26:25.913419 kernel: SELinux: policy capability extended_socket_class=1 Jul 6 23:26:25.913432 kernel: SELinux: policy capability always_check_network=0 Jul 6 23:26:25.913440 kernel: SELinux: policy capability cgroup_seclabel=1 Jul 6 23:26:25.913452 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jul 6 23:26:25.913461 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jul 6 23:26:25.913474 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jul 6 23:26:25.913487 kernel: SELinux: policy capability userspace_initial_context=0 Jul 6 23:26:25.913496 kernel: audit: type=1403 audit(1751844385.062:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Jul 6 23:26:25.913506 systemd[1]: Successfully loaded SELinux policy in 44.089ms. Jul 6 23:26:25.913525 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.086ms. Jul 6 23:26:25.913538 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 6 23:26:25.913551 systemd[1]: Detected virtualization kvm. Jul 6 23:26:25.913560 systemd[1]: Detected architecture arm64. Jul 6 23:26:25.913570 systemd[1]: Detected first boot. Jul 6 23:26:25.913579 systemd[1]: Hostname set to . Jul 6 23:26:25.913588 systemd[1]: Initializing machine ID from VM UUID. Jul 6 23:26:25.913599 zram_generator::config[1091]: No configuration found. Jul 6 23:26:25.913609 kernel: NET: Registered PF_VSOCK protocol family Jul 6 23:26:25.913618 systemd[1]: Populated /etc with preset unit settings. Jul 6 23:26:25.913628 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Jul 6 23:26:25.913641 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jul 6 23:26:25.913661 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jul 6 23:26:25.913673 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jul 6 23:26:25.913683 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jul 6 23:26:25.913693 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jul 6 23:26:25.913702 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jul 6 23:26:25.913712 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jul 6 23:26:25.913730 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jul 6 23:26:25.913745 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jul 6 23:26:25.913759 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jul 6 23:26:25.913770 systemd[1]: Created slice user.slice - User and Session Slice. Jul 6 23:26:25.913783 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 6 23:26:25.913793 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 6 23:26:25.913803 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. 
Jul 6 23:26:25.913814 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jul 6 23:26:25.913824 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jul 6 23:26:25.913834 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 6 23:26:25.913844 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jul 6 23:26:25.913854 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jul 6 23:26:25.913864 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 6 23:26:25.913875 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jul 6 23:26:25.913886 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jul 6 23:26:25.913896 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jul 6 23:26:25.913905 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jul 6 23:26:25.913915 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 6 23:26:25.913924 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 6 23:26:25.913934 systemd[1]: Reached target slices.target - Slice Units. Jul 6 23:26:25.913944 systemd[1]: Reached target swap.target - Swaps. Jul 6 23:26:25.913953 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jul 6 23:26:25.913964 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jul 6 23:26:25.913974 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jul 6 23:26:25.913984 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 6 23:26:25.913994 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 6 23:26:25.914004 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 6 23:26:25.914014 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jul 6 23:26:25.914028 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jul 6 23:26:25.921102 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jul 6 23:26:25.921125 systemd[1]: Mounting media.mount - External Media Directory... Jul 6 23:26:25.921141 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jul 6 23:26:25.921151 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jul 6 23:26:25.921161 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jul 6 23:26:25.921172 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jul 6 23:26:25.921182 systemd[1]: Reached target machines.target - Containers. Jul 6 23:26:25.921192 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jul 6 23:26:25.921202 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 6 23:26:25.921213 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 6 23:26:25.921265 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jul 6 23:26:25.921280 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
Jul 6 23:26:25.921290 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 6 23:26:25.921300 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 6 23:26:25.921309 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jul 6 23:26:25.921319 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jul 6 23:26:25.921329 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jul 6 23:26:25.921339 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jul 6 23:26:25.921349 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jul 6 23:26:25.921360 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jul 6 23:26:25.921370 systemd[1]: Stopped systemd-fsck-usr.service. Jul 6 23:26:25.921380 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 6 23:26:25.921390 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 6 23:26:25.921400 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jul 6 23:26:25.921411 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 6 23:26:25.921421 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jul 6 23:26:25.921431 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jul 6 23:26:25.921442 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 6 23:26:25.921453 systemd[1]: verity-setup.service: Deactivated successfully. Jul 6 23:26:25.921463 systemd[1]: Stopped verity-setup.service. Jul 6 23:26:25.921473 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jul 6 23:26:25.921482 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jul 6 23:26:25.921494 systemd[1]: Mounted media.mount - External Media Directory. Jul 6 23:26:25.921503 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jul 6 23:26:25.921514 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jul 6 23:26:25.921523 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jul 6 23:26:25.921534 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 6 23:26:25.921545 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jul 6 23:26:25.921555 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jul 6 23:26:25.921564 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 6 23:26:25.921575 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 6 23:26:25.921585 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 6 23:26:25.921599 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 6 23:26:25.921610 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 6 23:26:25.921620 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jul 6 23:26:25.921630 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... 
Jul 6 23:26:25.921641 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jul 6 23:26:25.921651 systemd[1]: Reached target local-fs.target - Local File Systems. Jul 6 23:26:25.921661 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jul 6 23:26:25.921671 kernel: loop: module loaded Jul 6 23:26:25.921682 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jul 6 23:26:25.921699 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 6 23:26:25.921711 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jul 6 23:26:25.921735 kernel: fuse: init (API version 7.41) Jul 6 23:26:25.921750 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 6 23:26:25.921762 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jul 6 23:26:25.921772 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 6 23:26:25.921814 systemd-journald[1162]: Collecting audit messages is disabled. Jul 6 23:26:25.921853 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jul 6 23:26:25.921867 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jul 6 23:26:25.921879 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jul 6 23:26:25.921889 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jul 6 23:26:25.921898 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 6 23:26:25.921911 systemd-journald[1162]: Journal started Jul 6 23:26:25.921934 systemd-journald[1162]: Runtime Journal (/run/log/journal/03b6491ed6914f54a91c7fe06a22ceb8) is 8M, max 76.5M, 68.5M free. Jul 6 23:26:25.930093 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 6 23:26:25.610901 systemd[1]: Queued start job for default target multi-user.target. Jul 6 23:26:25.638300 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jul 6 23:26:25.638899 systemd[1]: systemd-journald.service: Deactivated successfully. Jul 6 23:26:25.931567 systemd[1]: Started systemd-journald.service - Journal Service. Jul 6 23:26:25.935268 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 6 23:26:25.938069 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jul 6 23:26:25.939282 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jul 6 23:26:25.941887 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jul 6 23:26:25.954083 kernel: loop0: detected capacity change from 0 to 207008 Jul 6 23:26:25.972990 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jul 6 23:26:25.976176 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 6 23:26:25.980417 kernel: ACPI: bus type drm_connector registered Jul 6 23:26:25.980960 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jul 6 23:26:25.984459 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... 
Jul 6 23:26:25.987220 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jul 6 23:26:25.989577 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jul 6 23:26:25.991465 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 6 23:26:25.991661 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 6 23:26:25.995633 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 6 23:26:26.009126 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jul 6 23:26:26.028501 systemd-journald[1162]: Time spent on flushing to /var/log/journal/03b6491ed6914f54a91c7fe06a22ceb8 is 32.519ms for 1167 entries. Jul 6 23:26:26.028501 systemd-journald[1162]: System Journal (/var/log/journal/03b6491ed6914f54a91c7fe06a22ceb8) is 8M, max 584.8M, 576.8M free. Jul 6 23:26:26.076558 systemd-journald[1162]: Received client request to flush runtime journal. Jul 6 23:26:26.076658 kernel: loop1: detected capacity change from 0 to 138376 Jul 6 23:26:26.042859 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jul 6 23:26:26.081055 kernel: loop2: detected capacity change from 0 to 107312 Jul 6 23:26:26.084129 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jul 6 23:26:26.106363 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jul 6 23:26:26.111273 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 6 23:26:26.123068 kernel: loop3: detected capacity change from 0 to 8 Jul 6 23:26:26.128527 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jul 6 23:26:26.143079 kernel: loop4: detected capacity change from 0 to 207008 Jul 6 23:26:26.160215 systemd-tmpfiles[1226]: ACLs are not supported, ignoring. Jul 6 23:26:26.160633 systemd-tmpfiles[1226]: ACLs are not supported, ignoring. Jul 6 23:26:26.170604 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 6 23:26:26.180096 kernel: loop5: detected capacity change from 0 to 138376 Jul 6 23:26:26.213067 kernel: loop6: detected capacity change from 0 to 107312 Jul 6 23:26:26.232244 kernel: loop7: detected capacity change from 0 to 8 Jul 6 23:26:26.232883 (sd-merge)[1230]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Jul 6 23:26:26.233899 (sd-merge)[1230]: Merged extensions into '/usr'. Jul 6 23:26:26.240926 systemd[1]: Reload requested from client PID 1190 ('systemd-sysext') (unit systemd-sysext.service)... Jul 6 23:26:26.241668 systemd[1]: Reloading... Jul 6 23:26:26.362072 zram_generator::config[1254]: No configuration found. Jul 6 23:26:26.453960 ldconfig[1186]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jul 6 23:26:26.495530 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 6 23:26:26.570027 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jul 6 23:26:26.570318 systemd[1]: Reloading finished in 327 ms. Jul 6 23:26:26.588900 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jul 6 23:26:26.590681 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. 
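The sd-merge step above overlays the extension images, containerd-flatcar, docker-flatcar, kubernetes (presumably found via the /etc/extensions/kubernetes.raw symlink written by Ignition) and oem-hetzner, onto /usr. As a rough sketch of where such images are discovered, assuming the usual systemd-sysext search directories (/etc/extensions, /run/extensions and /var/lib/extensions; the sysext documentation lists a few more under /usr):

    from pathlib import Path

    # Commonly used systemd-sysext search locations; listing them is enough to
    # see roughly what sd-merge had to work with.
    SEARCH_DIRS = ["/etc/extensions", "/run/extensions", "/var/lib/extensions"]

    def list_extensions(dirs=SEARCH_DIRS):
        found = []
        for d in map(Path, dirs):
            if not d.is_dir():
                continue
            for entry in sorted(d.iterdir()):
                if entry.suffix == ".raw" or entry.is_dir():
                    found.append(entry)
        return found

    for ext in list_extensions():
        print(ext)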
Jul 6 23:26:26.603210 systemd[1]: Starting ensure-sysext.service... Jul 6 23:26:26.606261 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 6 23:26:26.633095 systemd[1]: Reload requested from client PID 1294 ('systemctl') (unit ensure-sysext.service)... Jul 6 23:26:26.633226 systemd[1]: Reloading... Jul 6 23:26:26.647491 systemd-tmpfiles[1295]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jul 6 23:26:26.647530 systemd-tmpfiles[1295]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jul 6 23:26:26.647801 systemd-tmpfiles[1295]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jul 6 23:26:26.647983 systemd-tmpfiles[1295]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Jul 6 23:26:26.648612 systemd-tmpfiles[1295]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Jul 6 23:26:26.648873 systemd-tmpfiles[1295]: ACLs are not supported, ignoring. Jul 6 23:26:26.648928 systemd-tmpfiles[1295]: ACLs are not supported, ignoring. Jul 6 23:26:26.652561 systemd-tmpfiles[1295]: Detected autofs mount point /boot during canonicalization of boot. Jul 6 23:26:26.652575 systemd-tmpfiles[1295]: Skipping /boot Jul 6 23:26:26.661922 systemd-tmpfiles[1295]: Detected autofs mount point /boot during canonicalization of boot. Jul 6 23:26:26.661934 systemd-tmpfiles[1295]: Skipping /boot Jul 6 23:26:26.717067 zram_generator::config[1325]: No configuration found. Jul 6 23:26:26.789987 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 6 23:26:26.864064 systemd[1]: Reloading finished in 230 ms. Jul 6 23:26:26.890121 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jul 6 23:26:26.895408 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 6 23:26:26.905889 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jul 6 23:26:26.911262 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 6 23:26:26.915252 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jul 6 23:26:26.919172 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jul 6 23:26:26.921883 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 6 23:26:26.932298 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 6 23:26:26.934426 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jul 6 23:26:26.937442 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jul 6 23:26:26.947698 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 6 23:26:26.949343 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jul 6 23:26:26.954384 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jul 6 23:26:26.960147 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... 
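The "Duplicate line for path ..." warnings above (for /var/log near the top of this excerpt, and here for /var/lib/nfs/sm, /root, /var/log/journal and /var/lib/systemd) mean that two tmpfiles.d fragments declare the same path, and systemd-tmpfiles ignores the later declaration. A simplified offline version of that check, assuming the standard tmpfiles.d layout where the second whitespace-separated field of each non-comment line is the path (the real tool also applies per-basename override rules that are skipped here):

    from collections import defaultdict
    from pathlib import Path

    TMPFILES_DIRS = ["/etc/tmpfiles.d", "/run/tmpfiles.d", "/usr/lib/tmpfiles.d"]

    def find_duplicate_paths(dirs=TMPFILES_DIRS):
        """Map each declared path to the fragments (file:line) that declare it."""
        seen = defaultdict(list)
        for d in map(Path, dirs):
            if not d.is_dir():
                continue
            for conf in sorted(d.glob("*.conf")):
                for lineno, line in enumerate(conf.read_text().splitlines(), 1):
                    if line.lstrip().startswith("#"):
                        continue
                    fields = line.split()
                    if len(fields) >= 2:
                        seen[fields[1]].append(f"{conf}:{lineno}")
        return {path: where for path, where in seen.items() if len(where) > 1}

    for path, where in sorted(find_duplicate_paths().items()):
        print(path, "->", ", ".join(where))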
Jul 6 23:26:26.961085 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 6 23:26:26.961217 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 6 23:26:26.970160 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jul 6 23:26:26.973861 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 6 23:26:26.974022 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 6 23:26:26.974194 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 6 23:26:26.977903 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jul 6 23:26:26.992131 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jul 6 23:26:26.994238 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jul 6 23:26:26.994390 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jul 6 23:26:26.995088 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jul 6 23:26:27.000092 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jul 6 23:26:27.004589 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jul 6 23:26:27.015109 systemd[1]: Finished ensure-sysext.service. Jul 6 23:26:27.030426 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jul 6 23:26:27.030528 systemd-udevd[1367]: Using default interface naming scheme 'v255'. Jul 6 23:26:27.035461 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jul 6 23:26:27.038301 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jul 6 23:26:27.041300 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jul 6 23:26:27.042532 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jul 6 23:26:27.043114 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jul 6 23:26:27.046297 systemd[1]: modprobe@loop.service: Deactivated successfully. Jul 6 23:26:27.046513 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jul 6 23:26:27.047490 systemd[1]: modprobe@drm.service: Deactivated successfully. Jul 6 23:26:27.047646 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jul 6 23:26:27.057584 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jul 6 23:26:27.057673 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. 
Jul 6 23:26:27.057705 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jul 6 23:26:27.079978 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 6 23:26:27.086203 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jul 6 23:26:27.087080 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jul 6 23:26:27.092864 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jul 6 23:26:27.095816 augenrules[1413]: No rules Jul 6 23:26:27.098883 systemd[1]: audit-rules.service: Deactivated successfully. Jul 6 23:26:27.099123 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 6 23:26:27.198358 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jul 6 23:26:27.324136 systemd-networkd[1406]: lo: Link UP Jul 6 23:26:27.324149 systemd-networkd[1406]: lo: Gained carrier Jul 6 23:26:27.325661 systemd-networkd[1406]: Enumeration completed Jul 6 23:26:27.325796 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 6 23:26:27.328052 systemd-networkd[1406]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:26:27.328077 systemd-networkd[1406]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 6 23:26:27.329006 systemd-networkd[1406]: eth0: Link UP Jul 6 23:26:27.329189 systemd-networkd[1406]: eth0: Gained carrier Jul 6 23:26:27.329206 systemd-networkd[1406]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:26:27.329754 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jul 6 23:26:27.335832 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jul 6 23:26:27.368071 kernel: mousedev: PS/2 mouse device common for all mice Jul 6 23:26:27.376594 systemd-networkd[1406]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:26:27.376609 systemd-networkd[1406]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 6 23:26:27.378372 systemd-networkd[1406]: eth1: Link UP Jul 6 23:26:27.380197 systemd-networkd[1406]: eth1: Gained carrier Jul 6 23:26:27.380231 systemd-networkd[1406]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 6 23:26:27.387602 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jul 6 23:26:27.404120 systemd-networkd[1406]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 6 23:26:27.409029 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jul 6 23:26:27.410049 systemd[1]: Reached target time-set.target - System Time Set. Jul 6 23:26:27.417118 systemd-networkd[1406]: eth0: DHCPv4 address 91.99.177.85/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jul 6 23:26:27.425753 systemd-resolved[1365]: Positive Trust Anchors: Jul 6 23:26:27.426629 systemd-resolved[1365]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 6 23:26:27.426857 systemd-resolved[1365]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 6 23:26:27.433265 systemd-resolved[1365]: Using system hostname 'ci-4344-1-1-3-d8bdec45b1'. Jul 6 23:26:27.436329 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 6 23:26:27.437280 systemd[1]: Reached target network.target - Network. Jul 6 23:26:27.437896 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 6 23:26:27.439116 systemd[1]: Reached target sysinit.target - System Initialization. Jul 6 23:26:27.440120 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jul 6 23:26:27.442253 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jul 6 23:26:27.443559 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jul 6 23:26:27.445181 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jul 6 23:26:27.446048 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jul 6 23:26:27.446943 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jul 6 23:26:27.447078 systemd[1]: Reached target paths.target - Path Units. Jul 6 23:26:27.448255 systemd[1]: Reached target timers.target - Timer Units. Jul 6 23:26:27.452079 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jul 6 23:26:27.456923 systemd[1]: Starting docker.socket - Docker Socket for the API... Jul 6 23:26:27.464127 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jul 6 23:26:27.467369 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jul 6 23:26:27.468942 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jul 6 23:26:27.474765 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jul 6 23:26:27.477111 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jul 6 23:26:27.478500 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jul 6 23:26:27.482869 systemd[1]: Reached target sockets.target - Socket Units. Jul 6 23:26:27.484935 systemd[1]: Reached target basic.target - Basic System. Jul 6 23:26:27.486148 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jul 6 23:26:27.486183 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jul 6 23:26:27.489213 systemd[1]: Starting containerd.service - containerd container runtime... Jul 6 23:26:27.492485 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jul 6 23:26:27.495841 systemd[1]: Starting dbus.service - D-Bus System Message Bus... 
Jul 6 23:26:27.499334 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jul 6 23:26:27.507225 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jul 6 23:26:27.520198 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jul 6 23:26:27.521247 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jul 6 23:26:27.523766 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jul 6 23:26:27.528086 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jul 6 23:26:27.533237 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jul 6 23:26:27.537295 jq[1477]: false Jul 6 23:26:27.538012 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jul 6 23:26:27.546148 systemd[1]: Starting systemd-logind.service - User Login Management... Jul 6 23:26:27.548972 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jul 6 23:26:27.549463 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jul 6 23:26:27.553811 systemd[1]: Starting update-engine.service - Update Engine... Jul 6 23:26:27.559975 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jul 6 23:26:28.049741 systemd-timesyncd[1392]: Contacted time server 188.68.34.173:123 (3.flatcar.pool.ntp.org). Jul 6 23:26:28.049842 systemd-timesyncd[1392]: Initial clock synchronization to Sun 2025-07-06 23:26:28.049631 UTC. Jul 6 23:26:28.050199 systemd-resolved[1365]: Clock change detected. Flushing caches. Jul 6 23:26:28.055907 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jul 6 23:26:28.057797 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jul 6 23:26:28.058005 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jul 6 23:26:28.058257 systemd[1]: motdgen.service: Deactivated successfully. Jul 6 23:26:28.058472 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jul 6 23:26:28.070257 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jul 6 23:26:28.070520 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jul 6 23:26:28.072413 jq[1494]: true Jul 6 23:26:28.085296 coreos-metadata[1474]: Jul 06 23:26:28.085 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jul 6 23:26:28.094604 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jul 6 23:26:28.098225 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Jul 6 23:26:28.098283 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jul 6 23:26:28.098295 kernel: [drm] features: -context_init Jul 6 23:26:28.098324 coreos-metadata[1474]: Jul 06 23:26:28.096 INFO Fetch successful Jul 6 23:26:28.103413 coreos-metadata[1474]: Jul 06 23:26:28.098 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jul 6 23:26:28.110899 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Jul 6 23:26:28.113520 coreos-metadata[1474]: Jul 06 23:26:28.113 INFO Fetch successful Jul 6 23:26:28.120036 jq[1497]: true Jul 6 23:26:28.127081 tar[1496]: linux-arm64/LICENSE Jul 6 23:26:28.127081 tar[1496]: linux-arm64/helm Jul 6 23:26:28.145023 extend-filesystems[1479]: Found /dev/sda6 Jul 6 23:26:28.155791 dbus-daemon[1475]: [system] SELinux support is enabled Jul 6 23:26:28.155980 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 6 23:26:28.158578 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 6 23:26:28.158618 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jul 6 23:26:28.159817 extend-filesystems[1479]: Found /dev/sda9 Jul 6 23:26:28.160247 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jul 6 23:26:28.160268 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 6 23:26:28.164813 update_engine[1493]: I20250706 23:26:28.163177 1493 main.cc:92] Flatcar Update Engine starting Jul 6 23:26:28.167163 extend-filesystems[1479]: Checking size of /dev/sda9 Jul 6 23:26:28.172612 systemd[1]: Started update-engine.service - Update Engine. Jul 6 23:26:28.174079 update_engine[1493]: I20250706 23:26:28.172701 1493 update_check_scheduler.cc:74] Next update check in 6m51s Jul 6 23:26:28.183691 kernel: [drm] number of scanouts: 1 Jul 6 23:26:28.183758 kernel: [drm] number of cap sets: 0 Jul 6 23:26:28.190273 extend-filesystems[1479]: Resized partition /dev/sda9 Jul 6 23:26:28.196489 extend-filesystems[1531]: resize2fs 1.47.2 (1-Jan-2025) Jul 6 23:26:28.199236 (ntainerd)[1514]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 6 23:26:28.210847 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jul 6 23:26:28.210899 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Jul 6 23:26:28.225370 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 6 23:26:28.243655 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 6 23:26:28.248329 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jul 6 23:26:28.305688 bash[1550]: Updated "/home/core/.ssh/authorized_keys" Jul 6 23:26:28.318159 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 6 23:26:28.320733 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 6 23:26:28.324423 systemd[1]: Starting sshkeys.service... Jul 6 23:26:28.361693 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jul 6 23:26:28.373769 extend-filesystems[1531]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jul 6 23:26:28.373769 extend-filesystems[1531]: old_desc_blocks = 1, new_desc_blocks = 5 Jul 6 23:26:28.373769 extend-filesystems[1531]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jul 6 23:26:28.383490 extend-filesystems[1479]: Resized filesystem in /dev/sda9 Jul 6 23:26:28.378138 systemd[1]: extend-filesystems.service: Deactivated successfully. 
Jul 6 23:26:28.378328 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 6 23:26:28.386426 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jul 6 23:26:28.387401 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 6 23:26:28.401244 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jul 6 23:26:28.403942 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jul 6 23:26:28.483585 coreos-metadata[1567]: Jul 06 23:26:28.483 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jul 6 23:26:28.487947 coreos-metadata[1567]: Jul 06 23:26:28.487 INFO Fetch successful Jul 6 23:26:28.489963 unknown[1567]: wrote ssh authorized keys file for user: core Jul 6 23:26:28.545211 update-ssh-keys[1576]: Updated "/home/core/.ssh/authorized_keys" Jul 6 23:26:28.547499 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jul 6 23:26:28.556707 systemd[1]: Finished sshkeys.service. Jul 6 23:26:28.576061 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 6 23:26:28.624703 systemd-logind[1491]: New seat seat0. Jul 6 23:26:28.630924 systemd[1]: Started systemd-logind.service - User Login Management. Jul 6 23:26:28.667463 systemd-logind[1491]: Watching system buttons on /dev/input/event0 (Power Button) Jul 6 23:26:28.670686 containerd[1514]: time="2025-07-06T23:26:28Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 6 23:26:28.675445 containerd[1514]: time="2025-07-06T23:26:28.675344304Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Jul 6 23:26:28.689576 systemd-logind[1491]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jul 6 23:26:28.737520 containerd[1514]: time="2025-07-06T23:26:28.736956344Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.4µs" Jul 6 23:26:28.737723 containerd[1514]: time="2025-07-06T23:26:28.737702384Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 6 23:26:28.737783 containerd[1514]: time="2025-07-06T23:26:28.737770824Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740401624Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740442504Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740470384Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740529184Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740540464Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740786704Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740804064Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740816784Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740824624Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 6 23:26:28.740961 containerd[1514]: time="2025-07-06T23:26:28.740898424Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 6 23:26:28.742087 containerd[1514]: time="2025-07-06T23:26:28.741410584Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 6 23:26:28.742087 containerd[1514]: time="2025-07-06T23:26:28.741452024Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 6 23:26:28.742087 containerd[1514]: time="2025-07-06T23:26:28.741463184Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 6 23:26:28.743094 containerd[1514]: time="2025-07-06T23:26:28.743066584Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 6 23:26:28.750682 containerd[1514]: time="2025-07-06T23:26:28.747853304Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 6 23:26:28.750682 containerd[1514]: time="2025-07-06T23:26:28.747980424Z" level=info msg="metadata content store policy set" policy=shared Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.755890224Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756033504Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756049824Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756062384Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756076064Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756087504Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756104544Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756117184Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756128744Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756139144Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756150504Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756163464Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756303584Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 6 23:26:28.756924 containerd[1514]: time="2025-07-06T23:26:28.756324784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756390264Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756404024Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756421144Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756432744Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756445344Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756459144Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756472984Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756488584Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756502064Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756722304Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756739784Z" level=info msg="Start snapshots syncer" Jul 6 23:26:28.757312 containerd[1514]: time="2025-07-06T23:26:28.756772664Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 6 23:26:28.757798 containerd[1514]: time="2025-07-06T23:26:28.757754744Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 6 23:26:28.763159 containerd[1514]: time="2025-07-06T23:26:28.758940064Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 6 23:26:28.763159 containerd[1514]: time="2025-07-06T23:26:28.759063904Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 6 23:26:28.767493 containerd[1514]: time="2025-07-06T23:26:28.766439984Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 6 23:26:28.774488 containerd[1514]: time="2025-07-06T23:26:28.774453784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 6 23:26:28.776067 containerd[1514]: time="2025-07-06T23:26:28.775932264Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 6 23:26:28.776067 containerd[1514]: time="2025-07-06T23:26:28.775963944Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 6 23:26:28.776067 containerd[1514]: time="2025-07-06T23:26:28.775992464Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 6 23:26:28.776067 containerd[1514]: time="2025-07-06T23:26:28.776006344Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 6 23:26:28.776067 containerd[1514]: time="2025-07-06T23:26:28.776019984Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 6 23:26:28.776286 containerd[1514]: time="2025-07-06T23:26:28.776054784Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 6 23:26:28.776371 containerd[1514]: 
time="2025-07-06T23:26:28.776356264Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 6 23:26:28.776433 containerd[1514]: time="2025-07-06T23:26:28.776413024Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 6 23:26:28.778382 containerd[1514]: time="2025-07-06T23:26:28.776594944Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 6 23:26:28.778382 containerd[1514]: time="2025-07-06T23:26:28.776621984Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 6 23:26:28.778382 containerd[1514]: time="2025-07-06T23:26:28.776631544Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.778319464Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.780299304Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.780318744Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.780362144Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.780452584Z" level=info msg="runtime interface created" Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.780459504Z" level=info msg="created NRI interface" Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.780473424Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.780493904Z" level=info msg="Connect containerd service" Jul 6 23:26:28.780705 containerd[1514]: time="2025-07-06T23:26:28.780551464Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 6 23:26:28.786489 containerd[1514]: time="2025-07-06T23:26:28.786447704Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 6 23:26:28.814243 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jul 6 23:26:28.926417 locksmithd[1533]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 6 23:26:29.038216 containerd[1514]: time="2025-07-06T23:26:29.038157384Z" level=info msg="Start subscribing containerd event" Jul 6 23:26:29.038378 containerd[1514]: time="2025-07-06T23:26:29.038361704Z" level=info msg="Start recovering state" Jul 6 23:26:29.038696 containerd[1514]: time="2025-07-06T23:26:29.038678224Z" level=info msg="Start event monitor" Jul 6 23:26:29.038822 containerd[1514]: time="2025-07-06T23:26:29.038805544Z" level=info msg="Start cni network conf syncer for default" Jul 6 23:26:29.038882 containerd[1514]: time="2025-07-06T23:26:29.038870544Z" level=info msg="Start streaming server" Jul 6 23:26:29.039175 containerd[1514]: time="2025-07-06T23:26:29.039101584Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 6 23:26:29.039175 containerd[1514]: time="2025-07-06T23:26:29.039115024Z" level=info msg="runtime interface starting up..." Jul 6 23:26:29.039175 containerd[1514]: time="2025-07-06T23:26:29.039121944Z" level=info msg="starting plugins..." Jul 6 23:26:29.039175 containerd[1514]: time="2025-07-06T23:26:29.039142504Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 6 23:26:29.040597 containerd[1514]: time="2025-07-06T23:26:29.040572144Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 6 23:26:29.041219 containerd[1514]: time="2025-07-06T23:26:29.040934584Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 6 23:26:29.041841 containerd[1514]: time="2025-07-06T23:26:29.041659064Z" level=info msg="containerd successfully booted in 0.371371s" Jul 6 23:26:29.041792 systemd[1]: Started containerd.service - containerd container runtime. Jul 6 23:26:29.097347 tar[1496]: linux-arm64/README.md Jul 6 23:26:29.116047 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jul 6 23:26:29.406892 systemd-networkd[1406]: eth0: Gained IPv6LL Jul 6 23:26:29.412413 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 6 23:26:29.414173 systemd[1]: Reached target network-online.target - Network is Online. Jul 6 23:26:29.419873 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:26:29.421750 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 6 23:26:29.470773 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 6 23:26:29.810057 sshd_keygen[1517]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 6 23:26:29.834007 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 6 23:26:29.838792 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 6 23:26:29.856208 systemd-networkd[1406]: eth1: Gained IPv6LL Jul 6 23:26:29.866423 systemd[1]: issuegen.service: Deactivated successfully. Jul 6 23:26:29.866849 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 6 23:26:29.873734 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 6 23:26:29.897240 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 6 23:26:29.901591 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 6 23:26:29.904970 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jul 6 23:26:29.905913 systemd[1]: Reached target getty.target - Login Prompts. Jul 6 23:26:30.323817 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 6 23:26:30.325512 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 6 23:26:30.329437 systemd[1]: Startup finished in 2.321s (kernel) + 8.455s (initrd) + 4.833s (userspace) = 15.609s. Jul 6 23:26:30.334513 (kubelet)[1645]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 6 23:26:30.889153 kubelet[1645]: E0706 23:26:30.889099 1645 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 6 23:26:30.891883 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 6 23:26:30.892264 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 6 23:26:30.892709 systemd[1]: kubelet.service: Consumed 966ms CPU time, 257.1M memory peak. Jul 6 23:26:31.108906 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 6 23:26:31.110739 systemd[1]: Started sshd@0-91.99.177.85:22-139.178.89.65:37586.service - OpenSSH per-connection server daemon (139.178.89.65:37586). Jul 6 23:26:32.255965 sshd[1657]: Accepted publickey for core from 139.178.89.65 port 37586 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:26:32.259996 sshd-session[1657]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:26:32.273908 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 6 23:26:32.277943 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 6 23:26:32.283155 systemd-logind[1491]: New session 1 of user core. Jul 6 23:26:32.309708 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 6 23:26:32.312122 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 6 23:26:32.327567 (systemd)[1661]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 6 23:26:32.331652 systemd-logind[1491]: New session c1 of user core. Jul 6 23:26:32.471583 systemd[1661]: Queued start job for default target default.target. Jul 6 23:26:32.482586 systemd[1661]: Created slice app.slice - User Application Slice. Jul 6 23:26:32.482640 systemd[1661]: Reached target paths.target - Paths. Jul 6 23:26:32.483055 systemd[1661]: Reached target timers.target - Timers. Jul 6 23:26:32.485716 systemd[1661]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 6 23:26:32.499791 systemd[1661]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 6 23:26:32.499910 systemd[1661]: Reached target sockets.target - Sockets. Jul 6 23:26:32.499961 systemd[1661]: Reached target basic.target - Basic System. Jul 6 23:26:32.499996 systemd[1661]: Reached target default.target - Main User Target. Jul 6 23:26:32.500025 systemd[1661]: Startup finished in 156ms. Jul 6 23:26:32.500602 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 6 23:26:32.508971 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 6 23:26:33.278805 systemd[1]: Started sshd@1-91.99.177.85:22-139.178.89.65:37588.service - OpenSSH per-connection server daemon (139.178.89.65:37588). 
Jul 6 23:26:34.390284 sshd[1672]: Accepted publickey for core from 139.178.89.65 port 37588 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:26:34.392531 sshd-session[1672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:26:34.399600 systemd-logind[1491]: New session 2 of user core. Jul 6 23:26:34.402851 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 6 23:26:35.147712 sshd[1674]: Connection closed by 139.178.89.65 port 37588 Jul 6 23:26:35.147003 sshd-session[1672]: pam_unix(sshd:session): session closed for user core Jul 6 23:26:35.152651 systemd-logind[1491]: Session 2 logged out. Waiting for processes to exit. Jul 6 23:26:35.153434 systemd[1]: sshd@1-91.99.177.85:22-139.178.89.65:37588.service: Deactivated successfully. Jul 6 23:26:35.155498 systemd[1]: session-2.scope: Deactivated successfully. Jul 6 23:26:35.158021 systemd-logind[1491]: Removed session 2. Jul 6 23:26:35.337435 systemd[1]: Started sshd@2-91.99.177.85:22-139.178.89.65:37590.service - OpenSSH per-connection server daemon (139.178.89.65:37590). Jul 6 23:26:36.434136 sshd[1680]: Accepted publickey for core from 139.178.89.65 port 37590 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:26:36.436900 sshd-session[1680]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:26:36.444497 systemd-logind[1491]: New session 3 of user core. Jul 6 23:26:36.452948 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 6 23:26:37.177800 sshd[1682]: Connection closed by 139.178.89.65 port 37590 Jul 6 23:26:37.179010 sshd-session[1680]: pam_unix(sshd:session): session closed for user core Jul 6 23:26:37.186374 systemd[1]: sshd@2-91.99.177.85:22-139.178.89.65:37590.service: Deactivated successfully. Jul 6 23:26:37.189470 systemd[1]: session-3.scope: Deactivated successfully. Jul 6 23:26:37.191569 systemd-logind[1491]: Session 3 logged out. Waiting for processes to exit. Jul 6 23:26:37.193160 systemd-logind[1491]: Removed session 3. Jul 6 23:26:37.378146 systemd[1]: Started sshd@3-91.99.177.85:22-139.178.89.65:37596.service - OpenSSH per-connection server daemon (139.178.89.65:37596). Jul 6 23:26:38.471839 sshd[1688]: Accepted publickey for core from 139.178.89.65 port 37596 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:26:38.475362 sshd-session[1688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:26:38.481725 systemd-logind[1491]: New session 4 of user core. Jul 6 23:26:38.490011 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 6 23:26:39.219702 sshd[1690]: Connection closed by 139.178.89.65 port 37596 Jul 6 23:26:39.218898 sshd-session[1688]: pam_unix(sshd:session): session closed for user core Jul 6 23:26:39.223723 systemd[1]: sshd@3-91.99.177.85:22-139.178.89.65:37596.service: Deactivated successfully. Jul 6 23:26:39.225348 systemd[1]: session-4.scope: Deactivated successfully. Jul 6 23:26:39.228423 systemd-logind[1491]: Session 4 logged out. Waiting for processes to exit. Jul 6 23:26:39.229638 systemd-logind[1491]: Removed session 4. Jul 6 23:26:39.411511 systemd[1]: Started sshd@4-91.99.177.85:22-139.178.89.65:37606.service - OpenSSH per-connection server daemon (139.178.89.65:37606). 
Jul 6 23:26:40.531175 sshd[1696]: Accepted publickey for core from 139.178.89.65 port 37606 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:26:40.533334 sshd-session[1696]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:26:40.538491 systemd-logind[1491]: New session 5 of user core. Jul 6 23:26:40.546959 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 6 23:26:41.118325 sudo[1699]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 6 23:26:41.119094 sudo[1699]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 6 23:26:41.120545 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 6 23:26:41.123915 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:26:41.136733 sudo[1699]: pam_unix(sudo:session): session closed for user root Jul 6 23:26:41.300150 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:26:41.315735 sshd[1698]: Connection closed by 139.178.89.65 port 37606 Jul 6 23:26:41.316706 sshd-session[1696]: pam_unix(sshd:session): session closed for user core Jul 6 23:26:41.319030 (kubelet)[1709]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 6 23:26:41.323982 systemd[1]: sshd@4-91.99.177.85:22-139.178.89.65:37606.service: Deactivated successfully. Jul 6 23:26:41.327871 systemd[1]: session-5.scope: Deactivated successfully. Jul 6 23:26:41.329582 systemd-logind[1491]: Session 5 logged out. Waiting for processes to exit. Jul 6 23:26:41.332350 systemd-logind[1491]: Removed session 5. Jul 6 23:26:41.371725 kubelet[1709]: E0706 23:26:41.370343 1709 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 6 23:26:41.374642 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 6 23:26:41.374943 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 6 23:26:41.375702 systemd[1]: kubelet.service: Consumed 178ms CPU time, 105.8M memory peak. Jul 6 23:26:41.513072 systemd[1]: Started sshd@5-91.99.177.85:22-139.178.89.65:47118.service - OpenSSH per-connection server daemon (139.178.89.65:47118). Jul 6 23:26:42.633465 sshd[1719]: Accepted publickey for core from 139.178.89.65 port 47118 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:26:42.636072 sshd-session[1719]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:26:42.642766 systemd-logind[1491]: New session 6 of user core. Jul 6 23:26:42.652005 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jul 6 23:26:43.215107 sudo[1723]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 6 23:26:43.215788 sudo[1723]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 6 23:26:43.226223 sudo[1723]: pam_unix(sudo:session): session closed for user root Jul 6 23:26:43.233575 sudo[1722]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 6 23:26:43.233958 sudo[1722]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 6 23:26:43.247267 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 6 23:26:43.306751 augenrules[1745]: No rules Jul 6 23:26:43.308077 systemd[1]: audit-rules.service: Deactivated successfully. Jul 6 23:26:43.308307 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 6 23:26:43.310006 sudo[1722]: pam_unix(sudo:session): session closed for user root Jul 6 23:26:43.488468 sshd[1721]: Connection closed by 139.178.89.65 port 47118 Jul 6 23:26:43.489525 sshd-session[1719]: pam_unix(sshd:session): session closed for user core Jul 6 23:26:43.495304 systemd[1]: sshd@5-91.99.177.85:22-139.178.89.65:47118.service: Deactivated successfully. Jul 6 23:26:43.495474 systemd-logind[1491]: Session 6 logged out. Waiting for processes to exit. Jul 6 23:26:43.497559 systemd[1]: session-6.scope: Deactivated successfully. Jul 6 23:26:43.499594 systemd-logind[1491]: Removed session 6. Jul 6 23:26:43.682274 systemd[1]: Started sshd@6-91.99.177.85:22-139.178.89.65:47122.service - OpenSSH per-connection server daemon (139.178.89.65:47122). Jul 6 23:26:44.792620 sshd[1754]: Accepted publickey for core from 139.178.89.65 port 47122 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:26:44.794396 sshd-session[1754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:26:44.802749 systemd-logind[1491]: New session 7 of user core. Jul 6 23:26:44.813939 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 6 23:26:45.369998 sudo[1757]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 6 23:26:45.370294 sudo[1757]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 6 23:26:45.742454 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 6 23:26:45.754348 (dockerd)[1774]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 6 23:26:46.006824 dockerd[1774]: time="2025-07-06T23:26:46.006315744Z" level=info msg="Starting up" Jul 6 23:26:46.011802 dockerd[1774]: time="2025-07-06T23:26:46.011748624Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 6 23:26:46.076807 dockerd[1774]: time="2025-07-06T23:26:46.076720704Z" level=info msg="Loading containers: start." Jul 6 23:26:46.087708 kernel: Initializing XFRM netlink socket Jul 6 23:26:46.352765 systemd-networkd[1406]: docker0: Link UP Jul 6 23:26:46.358393 dockerd[1774]: time="2025-07-06T23:26:46.358328184Z" level=info msg="Loading containers: done." Jul 6 23:26:46.375023 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2566550252-merged.mount: Deactivated successfully. 
Jul 6 23:26:46.382333 dockerd[1774]: time="2025-07-06T23:26:46.382248664Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 6 23:26:46.382585 dockerd[1774]: time="2025-07-06T23:26:46.382370224Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Jul 6 23:26:46.382585 dockerd[1774]: time="2025-07-06T23:26:46.382525184Z" level=info msg="Initializing buildkit" Jul 6 23:26:46.417823 dockerd[1774]: time="2025-07-06T23:26:46.417728344Z" level=info msg="Completed buildkit initialization" Jul 6 23:26:46.431701 dockerd[1774]: time="2025-07-06T23:26:46.431446704Z" level=info msg="Daemon has completed initialization" Jul 6 23:26:46.431701 dockerd[1774]: time="2025-07-06T23:26:46.431521024Z" level=info msg="API listen on /run/docker.sock" Jul 6 23:26:46.432485 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 6 23:26:47.487021 containerd[1514]: time="2025-07-06T23:26:47.486948504Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.6\"" Jul 6 23:26:48.080275 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3239595896.mount: Deactivated successfully. Jul 6 23:26:49.980900 containerd[1514]: time="2025-07-06T23:26:49.980648384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:49.983135 containerd[1514]: time="2025-07-06T23:26:49.983057224Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.6: active requests=0, bytes read=26328286" Jul 6 23:26:49.984461 containerd[1514]: time="2025-07-06T23:26:49.984423544Z" level=info msg="ImageCreate event name:\"sha256:4ee56e04a4dd8fbc5a022e324327ae1f9b19bdaab8a79644d85d29b70d28e87a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:49.988165 containerd[1514]: time="2025-07-06T23:26:49.988085744Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0f5764551d7de4ef70489ff8a70f32df7dea00701f5545af089b60bc5ede4f6f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:49.990398 containerd[1514]: time="2025-07-06T23:26:49.990293704Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.6\" with image id \"sha256:4ee56e04a4dd8fbc5a022e324327ae1f9b19bdaab8a79644d85d29b70d28e87a\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.6\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0f5764551d7de4ef70489ff8a70f32df7dea00701f5545af089b60bc5ede4f6f\", size \"26324994\" in 2.50329636s" Jul 6 23:26:49.990398 containerd[1514]: time="2025-07-06T23:26:49.990360944Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.6\" returns image reference \"sha256:4ee56e04a4dd8fbc5a022e324327ae1f9b19bdaab8a79644d85d29b70d28e87a\"" Jul 6 23:26:49.991538 containerd[1514]: time="2025-07-06T23:26:49.991471544Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.6\"" Jul 6 23:26:51.471803 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 6 23:26:51.474410 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:26:51.639143 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 6 23:26:51.648061 (kubelet)[2043]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 6 23:26:51.701175 kubelet[2043]: E0706 23:26:51.701096 2043 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 6 23:26:51.703307 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 6 23:26:51.703442 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 6 23:26:51.703760 systemd[1]: kubelet.service: Consumed 169ms CPU time, 107.1M memory peak. Jul 6 23:26:52.007181 containerd[1514]: time="2025-07-06T23:26:52.007084064Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:52.008802 containerd[1514]: time="2025-07-06T23:26:52.008756424Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.6: active requests=0, bytes read=22529248" Jul 6 23:26:52.010772 containerd[1514]: time="2025-07-06T23:26:52.010701464Z" level=info msg="ImageCreate event name:\"sha256:3451c4b5bd601398c65e0579f1b720df4e0edde78f7f38e142f2b0be5e9bd038\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:52.014206 containerd[1514]: time="2025-07-06T23:26:52.014099424Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3425f29c94a77d74cb89f38413e6274277dcf5e2bc7ab6ae953578a91e9e8356\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:52.015105 containerd[1514]: time="2025-07-06T23:26:52.015038944Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.6\" with image id \"sha256:3451c4b5bd601398c65e0579f1b720df4e0edde78f7f38e142f2b0be5e9bd038\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.6\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3425f29c94a77d74cb89f38413e6274277dcf5e2bc7ab6ae953578a91e9e8356\", size \"24065018\" in 2.02296152s" Jul 6 23:26:52.015105 containerd[1514]: time="2025-07-06T23:26:52.015094064Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.6\" returns image reference \"sha256:3451c4b5bd601398c65e0579f1b720df4e0edde78f7f38e142f2b0be5e9bd038\"" Jul 6 23:26:52.015971 containerd[1514]: time="2025-07-06T23:26:52.015867464Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.6\"" Jul 6 23:26:53.519502 containerd[1514]: time="2025-07-06T23:26:53.519455824Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:53.521880 containerd[1514]: time="2025-07-06T23:26:53.521847184Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.6: active requests=0, bytes read=17484161" Jul 6 23:26:53.523541 containerd[1514]: time="2025-07-06T23:26:53.523512704Z" level=info msg="ImageCreate event name:\"sha256:3d72026a3748f31411df93e4aaa9c67944b7e0cc311c11eba2aae5e615213d5f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:53.527757 containerd[1514]: time="2025-07-06T23:26:53.527711704Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler@sha256:130f633cbd1d70e2f4655350153cb3fc469f4d5a6310b4f0b49d93fb2ba2132b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:53.529150 containerd[1514]: time="2025-07-06T23:26:53.529110304Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.6\" with image id \"sha256:3d72026a3748f31411df93e4aaa9c67944b7e0cc311c11eba2aae5e615213d5f\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.6\", repo digest \"registry.k8s.io/kube-scheduler@sha256:130f633cbd1d70e2f4655350153cb3fc469f4d5a6310b4f0b49d93fb2ba2132b\", size \"19019949\" in 1.51320572s" Jul 6 23:26:53.529150 containerd[1514]: time="2025-07-06T23:26:53.529145384Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.6\" returns image reference \"sha256:3d72026a3748f31411df93e4aaa9c67944b7e0cc311c11eba2aae5e615213d5f\"" Jul 6 23:26:53.529728 containerd[1514]: time="2025-07-06T23:26:53.529699504Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.6\"" Jul 6 23:26:54.551887 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2734637290.mount: Deactivated successfully. Jul 6 23:26:55.015695 containerd[1514]: time="2025-07-06T23:26:55.015508384Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:55.017580 containerd[1514]: time="2025-07-06T23:26:55.017476944Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.6: active requests=0, bytes read=27378432" Jul 6 23:26:55.018693 containerd[1514]: time="2025-07-06T23:26:55.018591264Z" level=info msg="ImageCreate event name:\"sha256:e29293ef7b817bb7b03ce7484edafe6ca0a7087e54074e7d7dcd3bd3c762eee9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:55.021505 containerd[1514]: time="2025-07-06T23:26:55.021443184Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:b13d9da413b983d130bf090b83fce12e1ccc704e95f366da743c18e964d9d7e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:55.022687 containerd[1514]: time="2025-07-06T23:26:55.022596984Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.6\" with image id \"sha256:e29293ef7b817bb7b03ce7484edafe6ca0a7087e54074e7d7dcd3bd3c762eee9\", repo tag \"registry.k8s.io/kube-proxy:v1.32.6\", repo digest \"registry.k8s.io/kube-proxy@sha256:b13d9da413b983d130bf090b83fce12e1ccc704e95f366da743c18e964d9d7e9\", size \"27377425\" in 1.49286508s" Jul 6 23:26:55.022687 containerd[1514]: time="2025-07-06T23:26:55.022642344Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.6\" returns image reference \"sha256:e29293ef7b817bb7b03ce7484edafe6ca0a7087e54074e7d7dcd3bd3c762eee9\"" Jul 6 23:26:55.023548 containerd[1514]: time="2025-07-06T23:26:55.023379504Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 6 23:26:55.645220 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1328561485.mount: Deactivated successfully. 
Jul 6 23:26:56.394322 containerd[1514]: time="2025-07-06T23:26:56.394068984Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:56.395366 containerd[1514]: time="2025-07-06T23:26:56.395315784Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714" Jul 6 23:26:56.397090 containerd[1514]: time="2025-07-06T23:26:56.397047944Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:56.402305 containerd[1514]: time="2025-07-06T23:26:56.402234504Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:26:56.403967 containerd[1514]: time="2025-07-06T23:26:56.403782504Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.38034728s" Jul 6 23:26:56.403967 containerd[1514]: time="2025-07-06T23:26:56.403841984Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Jul 6 23:26:56.404724 containerd[1514]: time="2025-07-06T23:26:56.404651304Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 6 23:26:56.921955 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2615023340.mount: Deactivated successfully. 
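Each "Pulled image ... in ..." entry above pairs a byte count with a wall-clock duration, so effective pull throughput can be read straight off the log. A small worked example using the coredns figures from the entry above (the numbers are copied from the log; the ≈11.7 MiB/s result is approximate):

    # Figures from the containerd entry above:
    # coredns/coredns:v1.11.3, size "16948420", pulled in 1.38034728s.
    size_bytes = 16_948_420
    duration_s = 1.38034728

    throughput = size_bytes / duration_s              # bytes per second
    print(f"{throughput / 1024 / 1024:.1f} MiB/s")    # ≈ 11.7 MiB/s for this pull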
Jul 6 23:26:56.929056 containerd[1514]: time="2025-07-06T23:26:56.928972464Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 6 23:26:56.930793 containerd[1514]: time="2025-07-06T23:26:56.930678304Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Jul 6 23:26:56.931825 containerd[1514]: time="2025-07-06T23:26:56.931728544Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 6 23:26:56.934975 containerd[1514]: time="2025-07-06T23:26:56.934904464Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 6 23:26:56.936348 containerd[1514]: time="2025-07-06T23:26:56.935751504Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 531.03532ms" Jul 6 23:26:56.936348 containerd[1514]: time="2025-07-06T23:26:56.935792824Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jul 6 23:26:56.936582 containerd[1514]: time="2025-07-06T23:26:56.936513984Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Jul 6 23:26:57.460314 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3727487957.mount: Deactivated successfully. 
Jul 6 23:27:00.043164 containerd[1514]: time="2025-07-06T23:27:00.042942424Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:00.044579 containerd[1514]: time="2025-07-06T23:27:00.044507784Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67812537" Jul 6 23:27:00.045771 containerd[1514]: time="2025-07-06T23:27:00.045698264Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:00.051430 containerd[1514]: time="2025-07-06T23:27:00.051333824Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:00.052927 containerd[1514]: time="2025-07-06T23:27:00.052775664Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 3.11619664s" Jul 6 23:27:00.052927 containerd[1514]: time="2025-07-06T23:27:00.052824784Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Jul 6 23:27:01.722300 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jul 6 23:27:01.726855 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:27:01.906826 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:27:01.916187 (kubelet)[2199]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 6 23:27:01.968458 kubelet[2199]: E0706 23:27:01.968355 2199 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 6 23:27:01.972070 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 6 23:27:01.972440 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 6 23:27:01.974852 systemd[1]: kubelet.service: Consumed 177ms CPU time, 105.1M memory peak. Jul 6 23:27:05.125810 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:27:05.126081 systemd[1]: kubelet.service: Consumed 177ms CPU time, 105.1M memory peak. Jul 6 23:27:05.128770 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:27:05.163829 systemd[1]: Reload requested from client PID 2214 ('systemctl') (unit session-7.scope)... Jul 6 23:27:05.163851 systemd[1]: Reloading... Jul 6 23:27:05.290721 zram_generator::config[2261]: No configuration found. Jul 6 23:27:05.381710 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 6 23:27:05.491698 systemd[1]: Reloading finished in 327 ms. 
Jul 6 23:27:05.541400 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jul 6 23:27:05.541483 systemd[1]: kubelet.service: Failed with result 'signal'. Jul 6 23:27:05.542767 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:27:05.542823 systemd[1]: kubelet.service: Consumed 107ms CPU time, 95M memory peak. Jul 6 23:27:05.544635 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:27:05.694418 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:27:05.704290 (kubelet)[2306]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 6 23:27:05.752147 kubelet[2306]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 6 23:27:05.752653 kubelet[2306]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 6 23:27:05.753788 kubelet[2306]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 6 23:27:05.753788 kubelet[2306]: I0706 23:27:05.752794 2306 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 6 23:27:06.367838 kubelet[2306]: I0706 23:27:06.367792 2306 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jul 6 23:27:06.368014 kubelet[2306]: I0706 23:27:06.368001 2306 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 6 23:27:06.368501 kubelet[2306]: I0706 23:27:06.368473 2306 server.go:954] "Client rotation is on, will bootstrap in background" Jul 6 23:27:06.407163 kubelet[2306]: E0706 23:27:06.407105 2306 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://91.99.177.85:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.99.177.85:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:27:06.412422 kubelet[2306]: I0706 23:27:06.412371 2306 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 6 23:27:06.425162 kubelet[2306]: I0706 23:27:06.425085 2306 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 6 23:27:06.429212 kubelet[2306]: I0706 23:27:06.429164 2306 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 6 23:27:06.430210 kubelet[2306]: I0706 23:27:06.430094 2306 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 6 23:27:06.430381 kubelet[2306]: I0706 23:27:06.430153 2306 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-1-1-3-d8bdec45b1","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 6 23:27:06.430509 kubelet[2306]: I0706 23:27:06.430438 2306 topology_manager.go:138] "Creating topology manager with none policy" Jul 6 23:27:06.430509 kubelet[2306]: I0706 23:27:06.430449 2306 container_manager_linux.go:304] "Creating device plugin manager" Jul 6 23:27:06.430750 kubelet[2306]: I0706 23:27:06.430725 2306 state_mem.go:36] "Initialized new in-memory state store" Jul 6 23:27:06.434388 kubelet[2306]: I0706 23:27:06.434354 2306 kubelet.go:446] "Attempting to sync node with API server" Jul 6 23:27:06.434388 kubelet[2306]: I0706 23:27:06.434385 2306 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 6 23:27:06.435791 kubelet[2306]: I0706 23:27:06.434414 2306 kubelet.go:352] "Adding apiserver pod source" Jul 6 23:27:06.435791 kubelet[2306]: I0706 23:27:06.434425 2306 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 6 23:27:06.442888 kubelet[2306]: W0706 23:27:06.442773 2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://91.99.177.85:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 91.99.177.85:6443: connect: connection refused Jul 6 23:27:06.442888 kubelet[2306]: E0706 23:27:06.442854 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://91.99.177.85:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.177.85:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:27:06.444781 kubelet[2306]: W0706 23:27:06.444711 
2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.99.177.85:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-1-1-3-d8bdec45b1&limit=500&resourceVersion=0": dial tcp 91.99.177.85:6443: connect: connection refused Jul 6 23:27:06.444856 kubelet[2306]: E0706 23:27:06.444793 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://91.99.177.85:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344-1-1-3-d8bdec45b1&limit=500&resourceVersion=0\": dial tcp 91.99.177.85:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:27:06.444960 kubelet[2306]: I0706 23:27:06.444934 2306 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 6 23:27:06.445789 kubelet[2306]: I0706 23:27:06.445763 2306 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 6 23:27:06.445995 kubelet[2306]: W0706 23:27:06.445984 2306 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jul 6 23:27:06.447764 kubelet[2306]: I0706 23:27:06.447744 2306 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 6 23:27:06.447869 kubelet[2306]: I0706 23:27:06.447861 2306 server.go:1287] "Started kubelet" Jul 6 23:27:06.449169 kubelet[2306]: I0706 23:27:06.449122 2306 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jul 6 23:27:06.451601 kubelet[2306]: I0706 23:27:06.451241 2306 server.go:479] "Adding debug handlers to kubelet server" Jul 6 23:27:06.458551 kubelet[2306]: I0706 23:27:06.458481 2306 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 6 23:27:06.458929 kubelet[2306]: I0706 23:27:06.458909 2306 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 6 23:27:06.461156 kubelet[2306]: I0706 23:27:06.459296 2306 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 6 23:27:06.461156 kubelet[2306]: E0706 23:27:06.459567 2306 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.177.85:6443/api/v1/namespaces/default/events\": dial tcp 91.99.177.85:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344-1-1-3-d8bdec45b1.184fcd39642bb7bc default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344-1-1-3-d8bdec45b1,UID:ci-4344-1-1-3-d8bdec45b1,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344-1-1-3-d8bdec45b1,},FirstTimestamp:2025-07-06 23:27:06.447828924 +0000 UTC m=+0.738855532,LastTimestamp:2025-07-06 23:27:06.447828924 +0000 UTC m=+0.738855532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344-1-1-3-d8bdec45b1,}" Jul 6 23:27:06.461156 kubelet[2306]: I0706 23:27:06.459979 2306 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 6 23:27:06.465089 kubelet[2306]: I0706 23:27:06.465055 2306 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 6 23:27:06.465901 kubelet[2306]: E0706 23:27:06.465790 2306 kubelet_node_status.go:466] 
"Error getting the current node from lister" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" Jul 6 23:27:06.466865 kubelet[2306]: E0706 23:27:06.466844 2306 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 6 23:27:06.467104 kubelet[2306]: E0706 23:27:06.467080 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.177.85:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-3-d8bdec45b1?timeout=10s\": dial tcp 91.99.177.85:6443: connect: connection refused" interval="200ms" Jul 6 23:27:06.467333 kubelet[2306]: I0706 23:27:06.467316 2306 factory.go:221] Registration of the systemd container factory successfully Jul 6 23:27:06.467498 kubelet[2306]: I0706 23:27:06.467481 2306 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 6 23:27:06.469686 kubelet[2306]: I0706 23:27:06.468989 2306 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 6 23:27:06.469686 kubelet[2306]: I0706 23:27:06.469065 2306 reconciler.go:26] "Reconciler: start to sync state" Jul 6 23:27:06.470846 kubelet[2306]: I0706 23:27:06.470825 2306 factory.go:221] Registration of the containerd container factory successfully Jul 6 23:27:06.481879 kubelet[2306]: I0706 23:27:06.481822 2306 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 6 23:27:06.483011 kubelet[2306]: I0706 23:27:06.482954 2306 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 6 23:27:06.483011 kubelet[2306]: I0706 23:27:06.482989 2306 status_manager.go:227] "Starting to sync pod status with apiserver" Jul 6 23:27:06.483011 kubelet[2306]: I0706 23:27:06.483010 2306 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jul 6 23:27:06.483011 kubelet[2306]: I0706 23:27:06.483017 2306 kubelet.go:2382] "Starting kubelet main sync loop" Jul 6 23:27:06.483164 kubelet[2306]: E0706 23:27:06.483061 2306 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 6 23:27:06.490799 kubelet[2306]: W0706 23:27:06.490727 2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.99.177.85:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.177.85:6443: connect: connection refused Jul 6 23:27:06.490878 kubelet[2306]: E0706 23:27:06.490815 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://91.99.177.85:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.177.85:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:27:06.490958 kubelet[2306]: W0706 23:27:06.490911 2306 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.99.177.85:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.177.85:6443: connect: connection refused Jul 6 23:27:06.491006 kubelet[2306]: E0706 23:27:06.490957 2306 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://91.99.177.85:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.177.85:6443: connect: connection refused" logger="UnhandledError" Jul 6 23:27:06.505535 kubelet[2306]: I0706 23:27:06.505490 2306 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 6 23:27:06.505535 kubelet[2306]: I0706 23:27:06.505509 2306 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 6 23:27:06.505535 kubelet[2306]: I0706 23:27:06.505529 2306 state_mem.go:36] "Initialized new in-memory state store" Jul 6 23:27:06.507704 kubelet[2306]: I0706 23:27:06.507627 2306 policy_none.go:49] "None policy: Start" Jul 6 23:27:06.507704 kubelet[2306]: I0706 23:27:06.507660 2306 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 6 23:27:06.507704 kubelet[2306]: I0706 23:27:06.507690 2306 state_mem.go:35] "Initializing new in-memory state store" Jul 6 23:27:06.515472 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jul 6 23:27:06.540743 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 6 23:27:06.546356 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
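The recurring "dial tcp 91.99.177.85:6443: connect: connection refused" errors above are the kubelet's informers and lease controller retrying before the kube-apiserver static pod is running. A minimal reachability probe, offered only as a sketch: the endpoint is the one from the log, the function is hypothetical, and it checks the TCP handshake rather than TLS or authentication:

    import socket
    import time

    HOST, PORT = "91.99.177.85", 6443   # API server endpoint seen in the log above

    def wait_for_api_server(host, port, timeout=1.0, interval=2.0, attempts=10):
        """Return True once a TCP connection to host:port succeeds."""
        for attempt in range(1, attempts + 1):
            try:
                with socket.create_connection((host, port), timeout=timeout):
                    return True
            except OSError as exc:   # e.g. ConnectionRefusedError while the control plane starts
                print(f"attempt {attempt}: {exc}")
                time.sleep(interval)
        return False

    if __name__ == "__main__":
        print("reachable" if wait_for_api_server(HOST, PORT) else "still unreachable")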
Jul 6 23:27:06.560073 kubelet[2306]: I0706 23:27:06.559599 2306 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 6 23:27:06.561606 kubelet[2306]: I0706 23:27:06.561571 2306 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 6 23:27:06.562695 kubelet[2306]: I0706 23:27:06.562524 2306 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 6 23:27:06.564569 kubelet[2306]: I0706 23:27:06.564496 2306 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 6 23:27:06.566199 kubelet[2306]: E0706 23:27:06.566093 2306 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jul 6 23:27:06.566324 kubelet[2306]: E0706 23:27:06.566233 2306 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344-1-1-3-d8bdec45b1\" not found" Jul 6 23:27:06.597256 systemd[1]: Created slice kubepods-burstable-podb204b96b2b12e15759d1708287da6c43.slice - libcontainer container kubepods-burstable-podb204b96b2b12e15759d1708287da6c43.slice. Jul 6 23:27:06.618083 kubelet[2306]: E0706 23:27:06.616928 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.622792 systemd[1]: Created slice kubepods-burstable-poddfd1d1b9bd9763be4828a279e619c79e.slice - libcontainer container kubepods-burstable-poddfd1d1b9bd9763be4828a279e619c79e.slice. Jul 6 23:27:06.634193 kubelet[2306]: E0706 23:27:06.633897 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.638290 systemd[1]: Created slice kubepods-burstable-podc012f9832c2f93172b62286bcb610cbb.slice - libcontainer container kubepods-burstable-podc012f9832c2f93172b62286bcb610cbb.slice. 
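The kubepods-burstable-pod<uid>.slice units created above come from the kubelet's systemd cgroup driver (cgroupDriver="systemd" earlier in the log): pods are nested under kubepods.slice by QoS class, with the pod UID appended and, as I understand the convention, any dashes in the UID escaped to underscores because "-" is systemd's slice separator. A sketch of that mapping; treat the escaping rule as an assumption, since the UIDs in this log contain no dashes to exercise it:

    def pod_slice_name(pod_uid: str, qos_class: str = "burstable") -> str:
        """Slice name the kubelet's systemd cgroup driver uses for a pod (sketch).

        Guaranteed pods sit directly under kubepods.slice, so they get no QoS
        segment; burstable and besteffort pods get an intermediate slice.
        """
        uid = pod_uid.replace("-", "_")   # assumed escaping; '-' separates slice levels
        if qos_class == "guaranteed":
            return f"kubepods-pod{uid}.slice"
        return f"kubepods-{qos_class}-pod{uid}.slice"

    # Matches the unit created in the log above:
    print(pod_slice_name("b204b96b2b12e15759d1708287da6c43"))
    # -> kubepods-burstable-podb204b96b2b12e15759d1708287da6c43.slice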
Jul 6 23:27:06.640957 kubelet[2306]: E0706 23:27:06.640926 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.666181 kubelet[2306]: I0706 23:27:06.666105 2306 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.667144 kubelet[2306]: E0706 23:27:06.667096 2306 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.177.85:6443/api/v1/nodes\": dial tcp 91.99.177.85:6443: connect: connection refused" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.668491 kubelet[2306]: E0706 23:27:06.668439 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.177.85:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-3-d8bdec45b1?timeout=10s\": dial tcp 91.99.177.85:6443: connect: connection refused" interval="400ms" Jul 6 23:27:06.670505 kubelet[2306]: I0706 23:27:06.670226 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-ca-certs\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.670505 kubelet[2306]: I0706 23:27:06.670265 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-kubeconfig\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.670505 kubelet[2306]: I0706 23:27:06.670304 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.670505 kubelet[2306]: I0706 23:27:06.670328 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dfd1d1b9bd9763be4828a279e619c79e-kubeconfig\") pod \"kube-scheduler-ci-4344-1-1-3-d8bdec45b1\" (UID: \"dfd1d1b9bd9763be4828a279e619c79e\") " pod="kube-system/kube-scheduler-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.670505 kubelet[2306]: I0706 23:27:06.670353 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.670757 kubelet[2306]: I0706 23:27:06.670372 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-k8s-certs\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " 
pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.670757 kubelet[2306]: I0706 23:27:06.670391 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c012f9832c2f93172b62286bcb610cbb-ca-certs\") pod \"kube-apiserver-ci-4344-1-1-3-d8bdec45b1\" (UID: \"c012f9832c2f93172b62286bcb610cbb\") " pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.670757 kubelet[2306]: I0706 23:27:06.670409 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c012f9832c2f93172b62286bcb610cbb-k8s-certs\") pod \"kube-apiserver-ci-4344-1-1-3-d8bdec45b1\" (UID: \"c012f9832c2f93172b62286bcb610cbb\") " pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.670757 kubelet[2306]: I0706 23:27:06.670428 2306 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c012f9832c2f93172b62286bcb610cbb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-1-1-3-d8bdec45b1\" (UID: \"c012f9832c2f93172b62286bcb610cbb\") " pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.870407 kubelet[2306]: I0706 23:27:06.870230 2306 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.871585 kubelet[2306]: E0706 23:27:06.870618 2306 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.177.85:6443/api/v1/nodes\": dial tcp 91.99.177.85:6443: connect: connection refused" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:06.919811 containerd[1514]: time="2025-07-06T23:27:06.919437575Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-1-1-3-d8bdec45b1,Uid:b204b96b2b12e15759d1708287da6c43,Namespace:kube-system,Attempt:0,}" Jul 6 23:27:06.937533 containerd[1514]: time="2025-07-06T23:27:06.936769571Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-1-1-3-d8bdec45b1,Uid:dfd1d1b9bd9763be4828a279e619c79e,Namespace:kube-system,Attempt:0,}" Jul 6 23:27:06.947229 containerd[1514]: time="2025-07-06T23:27:06.947094117Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344-1-1-3-d8bdec45b1,Uid:c012f9832c2f93172b62286bcb610cbb,Namespace:kube-system,Attempt:0,}" Jul 6 23:27:06.950335 containerd[1514]: time="2025-07-06T23:27:06.950278245Z" level=info msg="connecting to shim 3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6" address="unix:///run/containerd/s/0308bb234b90146c5af46f6afc0b1c660ea60916804c1dccf3c27fb448b2a8bb" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:06.993413 containerd[1514]: time="2025-07-06T23:27:06.993364483Z" level=info msg="connecting to shim 82845ac8f3fe88b6e7f60420764259ebadc0fae77b07279331e3f0cdfa0a09c8" address="unix:///run/containerd/s/342fe8f570715599960574d7e014b0280cded2b115e6a700147f1e8a0e4935bf" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:07.000845 systemd[1]: Started cri-containerd-3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6.scope - libcontainer container 3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6. 
Jul 6 23:27:07.020268 containerd[1514]: time="2025-07-06T23:27:07.020220118Z" level=info msg="connecting to shim 5a2a8ae0d146d87df91edd42240a80da54c6a94553ee243836989163d87ed5ea" address="unix:///run/containerd/s/9611cc7882d68f81f720bd0f8c783d94d4a73f762feb640ecb97e6f86a936990" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:07.037836 systemd[1]: Started cri-containerd-82845ac8f3fe88b6e7f60420764259ebadc0fae77b07279331e3f0cdfa0a09c8.scope - libcontainer container 82845ac8f3fe88b6e7f60420764259ebadc0fae77b07279331e3f0cdfa0a09c8. Jul 6 23:27:07.070619 kubelet[2306]: E0706 23:27:07.070561 2306 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.177.85:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344-1-1-3-d8bdec45b1?timeout=10s\": dial tcp 91.99.177.85:6443: connect: connection refused" interval="800ms" Jul 6 23:27:07.077405 systemd[1]: Started cri-containerd-5a2a8ae0d146d87df91edd42240a80da54c6a94553ee243836989163d87ed5ea.scope - libcontainer container 5a2a8ae0d146d87df91edd42240a80da54c6a94553ee243836989163d87ed5ea. Jul 6 23:27:07.083977 containerd[1514]: time="2025-07-06T23:27:07.083780348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344-1-1-3-d8bdec45b1,Uid:b204b96b2b12e15759d1708287da6c43,Namespace:kube-system,Attempt:0,} returns sandbox id \"3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6\"" Jul 6 23:27:07.100386 containerd[1514]: time="2025-07-06T23:27:07.100338809Z" level=info msg="CreateContainer within sandbox \"3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 6 23:27:07.114681 containerd[1514]: time="2025-07-06T23:27:07.114412266Z" level=info msg="Container c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:07.129443 containerd[1514]: time="2025-07-06T23:27:07.129306724Z" level=info msg="CreateContainer within sandbox \"3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76\"" Jul 6 23:27:07.130246 containerd[1514]: time="2025-07-06T23:27:07.130190728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344-1-1-3-d8bdec45b1,Uid:dfd1d1b9bd9763be4828a279e619c79e,Namespace:kube-system,Attempt:0,} returns sandbox id \"82845ac8f3fe88b6e7f60420764259ebadc0fae77b07279331e3f0cdfa0a09c8\"" Jul 6 23:27:07.131422 containerd[1514]: time="2025-07-06T23:27:07.131368186Z" level=info msg="StartContainer for \"c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76\"" Jul 6 23:27:07.134707 containerd[1514]: time="2025-07-06T23:27:07.134510062Z" level=info msg="CreateContainer within sandbox \"82845ac8f3fe88b6e7f60420764259ebadc0fae77b07279331e3f0cdfa0a09c8\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 6 23:27:07.136253 containerd[1514]: time="2025-07-06T23:27:07.135860289Z" level=info msg="connecting to shim c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76" address="unix:///run/containerd/s/0308bb234b90146c5af46f6afc0b1c660ea60916804c1dccf3c27fb448b2a8bb" protocol=ttrpc version=3 Jul 6 23:27:07.139433 containerd[1514]: time="2025-07-06T23:27:07.139396944Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4344-1-1-3-d8bdec45b1,Uid:c012f9832c2f93172b62286bcb610cbb,Namespace:kube-system,Attempt:0,} returns sandbox id \"5a2a8ae0d146d87df91edd42240a80da54c6a94553ee243836989163d87ed5ea\"" Jul 6 23:27:07.144576 containerd[1514]: time="2025-07-06T23:27:07.144534399Z" level=info msg="CreateContainer within sandbox \"5a2a8ae0d146d87df91edd42240a80da54c6a94553ee243836989163d87ed5ea\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 6 23:27:07.149977 containerd[1514]: time="2025-07-06T23:27:07.149895625Z" level=info msg="Container 67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:07.159807 containerd[1514]: time="2025-07-06T23:27:07.159765754Z" level=info msg="Container fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:07.162637 containerd[1514]: time="2025-07-06T23:27:07.162586294Z" level=info msg="CreateContainer within sandbox \"82845ac8f3fe88b6e7f60420764259ebadc0fae77b07279331e3f0cdfa0a09c8\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff\"" Jul 6 23:27:07.167192 containerd[1514]: time="2025-07-06T23:27:07.167123278Z" level=info msg="StartContainer for \"67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff\"" Jul 6 23:27:07.169939 containerd[1514]: time="2025-07-06T23:27:07.169893656Z" level=info msg="connecting to shim 67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff" address="unix:///run/containerd/s/342fe8f570715599960574d7e014b0280cded2b115e6a700147f1e8a0e4935bf" protocol=ttrpc version=3 Jul 6 23:27:07.170439 systemd[1]: Started cri-containerd-c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76.scope - libcontainer container c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76. Jul 6 23:27:07.180692 containerd[1514]: time="2025-07-06T23:27:07.180375735Z" level=info msg="CreateContainer within sandbox \"5a2a8ae0d146d87df91edd42240a80da54c6a94553ee243836989163d87ed5ea\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c\"" Jul 6 23:27:07.181628 containerd[1514]: time="2025-07-06T23:27:07.181475510Z" level=info msg="StartContainer for \"fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c\"" Jul 6 23:27:07.184168 containerd[1514]: time="2025-07-06T23:27:07.183274199Z" level=info msg="connecting to shim fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c" address="unix:///run/containerd/s/9611cc7882d68f81f720bd0f8c783d94d4a73f762feb640ecb97e6f86a936990" protocol=ttrpc version=3 Jul 6 23:27:07.195163 systemd[1]: Started cri-containerd-67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff.scope - libcontainer container 67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff. Jul 6 23:27:07.215844 systemd[1]: Started cri-containerd-fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c.scope - libcontainer container fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c. 
Jul 6 23:27:07.254910 containerd[1514]: time="2025-07-06T23:27:07.254868267Z" level=info msg="StartContainer for \"c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76\" returns successfully" Jul 6 23:27:07.274720 kubelet[2306]: I0706 23:27:07.274375 2306 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:07.276841 kubelet[2306]: E0706 23:27:07.276555 2306 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.177.85:6443/api/v1/nodes\": dial tcp 91.99.177.85:6443: connect: connection refused" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:07.289279 containerd[1514]: time="2025-07-06T23:27:07.289190968Z" level=info msg="StartContainer for \"67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff\" returns successfully" Jul 6 23:27:07.296869 containerd[1514]: time="2025-07-06T23:27:07.296824146Z" level=info msg="StartContainer for \"fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c\" returns successfully" Jul 6 23:27:07.510703 kubelet[2306]: E0706 23:27:07.510636 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:07.512204 kubelet[2306]: E0706 23:27:07.512172 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:07.515611 kubelet[2306]: E0706 23:27:07.515576 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:08.079067 kubelet[2306]: I0706 23:27:08.079033 2306 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:08.518942 kubelet[2306]: E0706 23:27:08.518910 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:08.520065 kubelet[2306]: E0706 23:27:08.520042 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:09.469619 kubelet[2306]: E0706 23:27:09.469588 2306 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:10.099024 kubelet[2306]: E0706 23:27:10.098979 2306 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344-1-1-3-d8bdec45b1\" not found" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:10.219994 kubelet[2306]: I0706 23:27:10.219755 2306 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:10.267090 kubelet[2306]: I0706 23:27:10.267037 2306 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:10.280695 kubelet[2306]: E0706 23:27:10.279283 2306 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 
23:27:10.281118 kubelet[2306]: I0706 23:27:10.280857 2306 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:10.283772 kubelet[2306]: E0706 23:27:10.283736 2306 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344-1-1-3-d8bdec45b1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:10.284209 kubelet[2306]: I0706 23:27:10.284185 2306 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:10.289508 kubelet[2306]: E0706 23:27:10.289472 2306 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344-1-1-3-d8bdec45b1\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:10.441184 kubelet[2306]: I0706 23:27:10.441027 2306 apiserver.go:52] "Watching apiserver" Jul 6 23:27:10.469386 kubelet[2306]: I0706 23:27:10.469331 2306 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 6 23:27:12.666784 systemd[1]: Reload requested from client PID 2578 ('systemctl') (unit session-7.scope)... Jul 6 23:27:12.667139 systemd[1]: Reloading... Jul 6 23:27:12.782710 zram_generator::config[2622]: No configuration found. Jul 6 23:27:12.868752 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 6 23:27:12.994778 systemd[1]: Reloading finished in 327 ms. Jul 6 23:27:13.030896 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:27:13.042380 systemd[1]: kubelet.service: Deactivated successfully. Jul 6 23:27:13.042797 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:27:13.042870 systemd[1]: kubelet.service: Consumed 1.192s CPU time, 127.9M memory peak. Jul 6 23:27:13.046847 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 6 23:27:13.197430 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 6 23:27:13.213191 (kubelet)[2667]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 6 23:27:13.277646 kubelet[2667]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 6 23:27:13.277646 kubelet[2667]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 6 23:27:13.277646 kubelet[2667]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jul 6 23:27:13.277646 kubelet[2667]: I0706 23:27:13.270345 2667 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 6 23:27:13.284012 kubelet[2667]: I0706 23:27:13.283933 2667 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Jul 6 23:27:13.284012 kubelet[2667]: I0706 23:27:13.283963 2667 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 6 23:27:13.284369 kubelet[2667]: I0706 23:27:13.284260 2667 server.go:954] "Client rotation is on, will bootstrap in background" Jul 6 23:27:13.285819 kubelet[2667]: I0706 23:27:13.285750 2667 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jul 6 23:27:13.288609 kubelet[2667]: I0706 23:27:13.288445 2667 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 6 23:27:13.297007 kubelet[2667]: I0706 23:27:13.296926 2667 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 6 23:27:13.304196 kubelet[2667]: I0706 23:27:13.304139 2667 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jul 6 23:27:13.304471 kubelet[2667]: I0706 23:27:13.304409 2667 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 6 23:27:13.304673 kubelet[2667]: I0706 23:27:13.304442 2667 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344-1-1-3-d8bdec45b1","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 6 23:27:13.304673 kubelet[2667]: I0706 23:27:13.304674 2667 topology_manager.go:138] "Creating topology manager with none policy" Jul 6 23:27:13.304878 kubelet[2667]: I0706 23:27:13.304684 2667 container_manager_linux.go:304] "Creating device plugin manager" Jul 6 23:27:13.304878 kubelet[2667]: I0706 23:27:13.304727 2667 state_mem.go:36] "Initialized new in-memory state store" Jul 6 23:27:13.304968 kubelet[2667]: I0706 
23:27:13.304882 2667 kubelet.go:446] "Attempting to sync node with API server" Jul 6 23:27:13.304968 kubelet[2667]: I0706 23:27:13.304897 2667 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 6 23:27:13.304968 kubelet[2667]: I0706 23:27:13.304917 2667 kubelet.go:352] "Adding apiserver pod source" Jul 6 23:27:13.305542 kubelet[2667]: I0706 23:27:13.305467 2667 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 6 23:27:13.308122 kubelet[2667]: I0706 23:27:13.308068 2667 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 6 23:27:13.310356 kubelet[2667]: I0706 23:27:13.310316 2667 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 6 23:27:13.312589 kubelet[2667]: I0706 23:27:13.312539 2667 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 6 23:27:13.312885 kubelet[2667]: I0706 23:27:13.312615 2667 server.go:1287] "Started kubelet" Jul 6 23:27:13.320699 kubelet[2667]: I0706 23:27:13.320429 2667 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Jul 6 23:27:13.323092 kubelet[2667]: I0706 23:27:13.323033 2667 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 6 23:27:13.323413 kubelet[2667]: I0706 23:27:13.323312 2667 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 6 23:27:13.330351 kubelet[2667]: I0706 23:27:13.330322 2667 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 6 23:27:13.335986 kubelet[2667]: I0706 23:27:13.335965 2667 server.go:479] "Adding debug handlers to kubelet server" Jul 6 23:27:13.347576 kubelet[2667]: I0706 23:27:13.347445 2667 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 6 23:27:13.349566 kubelet[2667]: I0706 23:27:13.349535 2667 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 6 23:27:13.350617 kubelet[2667]: E0706 23:27:13.349793 2667 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344-1-1-3-d8bdec45b1\" not found" Jul 6 23:27:13.350860 kubelet[2667]: I0706 23:27:13.350819 2667 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 6 23:27:13.351038 kubelet[2667]: I0706 23:27:13.351006 2667 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 6 23:27:13.351160 kubelet[2667]: I0706 23:27:13.351135 2667 reconciler.go:26] "Reconciler: start to sync state" Jul 6 23:27:13.360125 kubelet[2667]: I0706 23:27:13.359817 2667 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jul 6 23:27:13.360125 kubelet[2667]: I0706 23:27:13.359844 2667 status_manager.go:227] "Starting to sync pod status with apiserver" Jul 6 23:27:13.360125 kubelet[2667]: I0706 23:27:13.359897 2667 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Jul 6 23:27:13.360125 kubelet[2667]: I0706 23:27:13.359908 2667 kubelet.go:2382] "Starting kubelet main sync loop" Jul 6 23:27:13.360125 kubelet[2667]: E0706 23:27:13.359950 2667 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 6 23:27:13.365777 kubelet[2667]: I0706 23:27:13.365614 2667 factory.go:221] Registration of the systemd container factory successfully Jul 6 23:27:13.365777 kubelet[2667]: I0706 23:27:13.365881 2667 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 6 23:27:13.369309 kubelet[2667]: I0706 23:27:13.369206 2667 factory.go:221] Registration of the containerd container factory successfully Jul 6 23:27:13.372721 kubelet[2667]: E0706 23:27:13.372266 2667 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 6 23:27:13.420733 kubelet[2667]: I0706 23:27:13.420694 2667 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 6 23:27:13.420733 kubelet[2667]: I0706 23:27:13.420717 2667 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 6 23:27:13.420733 kubelet[2667]: I0706 23:27:13.420738 2667 state_mem.go:36] "Initialized new in-memory state store" Jul 6 23:27:13.420979 kubelet[2667]: I0706 23:27:13.420948 2667 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 6 23:27:13.421044 kubelet[2667]: I0706 23:27:13.420969 2667 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 6 23:27:13.421044 kubelet[2667]: I0706 23:27:13.420989 2667 policy_none.go:49] "None policy: Start" Jul 6 23:27:13.421044 kubelet[2667]: I0706 23:27:13.421000 2667 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 6 23:27:13.421044 kubelet[2667]: I0706 23:27:13.421010 2667 state_mem.go:35] "Initializing new in-memory state store" Jul 6 23:27:13.421156 kubelet[2667]: I0706 23:27:13.421109 2667 state_mem.go:75] "Updated machine memory state" Jul 6 23:27:13.429251 kubelet[2667]: I0706 23:27:13.428566 2667 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 6 23:27:13.429251 kubelet[2667]: I0706 23:27:13.428831 2667 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 6 23:27:13.429251 kubelet[2667]: I0706 23:27:13.428848 2667 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 6 23:27:13.429251 kubelet[2667]: I0706 23:27:13.429250 2667 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 6 23:27:13.432007 kubelet[2667]: E0706 23:27:13.431973 2667 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jul 6 23:27:13.461483 kubelet[2667]: I0706 23:27:13.461364 2667 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.461827 kubelet[2667]: I0706 23:27:13.461798 2667 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.462260 kubelet[2667]: I0706 23:27:13.462205 2667 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.533861 kubelet[2667]: I0706 23:27:13.533761 2667 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.549455 kubelet[2667]: I0706 23:27:13.549339 2667 kubelet_node_status.go:124] "Node was previously registered" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.549455 kubelet[2667]: I0706 23:27:13.549437 2667 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.552453 kubelet[2667]: I0706 23:27:13.552427 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-ca-certs\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.553803 kubelet[2667]: I0706 23:27:13.553758 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-flexvolume-dir\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.553803 kubelet[2667]: I0706 23:27:13.553805 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-k8s-certs\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.553949 kubelet[2667]: I0706 23:27:13.553831 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dfd1d1b9bd9763be4828a279e619c79e-kubeconfig\") pod \"kube-scheduler-ci-4344-1-1-3-d8bdec45b1\" (UID: \"dfd1d1b9bd9763be4828a279e619c79e\") " pod="kube-system/kube-scheduler-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.553949 kubelet[2667]: I0706 23:27:13.553856 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c012f9832c2f93172b62286bcb610cbb-ca-certs\") pod \"kube-apiserver-ci-4344-1-1-3-d8bdec45b1\" (UID: \"c012f9832c2f93172b62286bcb610cbb\") " pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.553949 kubelet[2667]: I0706 23:27:13.553875 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c012f9832c2f93172b62286bcb610cbb-k8s-certs\") pod \"kube-apiserver-ci-4344-1-1-3-d8bdec45b1\" (UID: \"c012f9832c2f93172b62286bcb610cbb\") " 
pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.553949 kubelet[2667]: I0706 23:27:13.553896 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c012f9832c2f93172b62286bcb610cbb-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344-1-1-3-d8bdec45b1\" (UID: \"c012f9832c2f93172b62286bcb610cbb\") " pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.553949 kubelet[2667]: I0706 23:27:13.553920 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-kubeconfig\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.554807 kubelet[2667]: I0706 23:27:13.553944 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/b204b96b2b12e15759d1708287da6c43-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344-1-1-3-d8bdec45b1\" (UID: \"b204b96b2b12e15759d1708287da6c43\") " pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:13.719881 update_engine[1493]: I20250706 23:27:13.719707 1493 update_attempter.cc:509] Updating boot flags... Jul 6 23:27:14.306799 kubelet[2667]: I0706 23:27:14.306747 2667 apiserver.go:52] "Watching apiserver" Jul 6 23:27:14.351203 kubelet[2667]: I0706 23:27:14.351135 2667 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 6 23:27:14.400642 kubelet[2667]: I0706 23:27:14.400422 2667 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:14.401397 kubelet[2667]: I0706 23:27:14.401243 2667 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:14.413772 kubelet[2667]: E0706 23:27:14.413740 2667 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344-1-1-3-d8bdec45b1\" already exists" pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:14.422488 kubelet[2667]: E0706 23:27:14.422391 2667 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344-1-1-3-d8bdec45b1\" already exists" pod="kube-system/kube-scheduler-ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:14.442757 kubelet[2667]: I0706 23:27:14.441719 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344-1-1-3-d8bdec45b1" podStartSLOduration=1.441698226 podStartE2EDuration="1.441698226s" podCreationTimestamp="2025-07-06 23:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:27:14.441555101 +0000 UTC m=+1.223215473" watchObservedRunningTime="2025-07-06 23:27:14.441698226 +0000 UTC m=+1.223358598" Jul 6 23:27:14.454019 kubelet[2667]: I0706 23:27:14.453776 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344-1-1-3-d8bdec45b1" podStartSLOduration=1.453755006 podStartE2EDuration="1.453755006s" podCreationTimestamp="2025-07-06 23:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:27:14.452550008 +0000 UTC m=+1.234210500" watchObservedRunningTime="2025-07-06 23:27:14.453755006 +0000 UTC m=+1.235415418" Jul 6 23:27:14.470025 kubelet[2667]: I0706 23:27:14.469958 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344-1-1-3-d8bdec45b1" podStartSLOduration=1.469940517 podStartE2EDuration="1.469940517s" podCreationTimestamp="2025-07-06 23:27:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:27:14.469904836 +0000 UTC m=+1.251565208" watchObservedRunningTime="2025-07-06 23:27:14.469940517 +0000 UTC m=+1.251600889" Jul 6 23:27:16.937026 kubelet[2667]: I0706 23:27:16.936985 2667 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 6 23:27:16.937981 containerd[1514]: time="2025-07-06T23:27:16.937920864Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 6 23:27:16.939234 kubelet[2667]: I0706 23:27:16.938175 2667 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 6 23:27:17.562272 systemd[1]: Created slice kubepods-besteffort-podf1b59d06_bea4_4942_adaf_cc1170253e58.slice - libcontainer container kubepods-besteffort-podf1b59d06_bea4_4942_adaf_cc1170253e58.slice. Jul 6 23:27:17.578020 kubelet[2667]: I0706 23:27:17.577983 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f1b59d06-bea4-4942-adaf-cc1170253e58-kube-proxy\") pod \"kube-proxy-xnk5r\" (UID: \"f1b59d06-bea4-4942-adaf-cc1170253e58\") " pod="kube-system/kube-proxy-xnk5r" Jul 6 23:27:17.578174 kubelet[2667]: I0706 23:27:17.578056 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f1b59d06-bea4-4942-adaf-cc1170253e58-xtables-lock\") pod \"kube-proxy-xnk5r\" (UID: \"f1b59d06-bea4-4942-adaf-cc1170253e58\") " pod="kube-system/kube-proxy-xnk5r" Jul 6 23:27:17.578174 kubelet[2667]: I0706 23:27:17.578075 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f1b59d06-bea4-4942-adaf-cc1170253e58-lib-modules\") pod \"kube-proxy-xnk5r\" (UID: \"f1b59d06-bea4-4942-adaf-cc1170253e58\") " pod="kube-system/kube-proxy-xnk5r" Jul 6 23:27:17.578174 kubelet[2667]: I0706 23:27:17.578091 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gkqz\" (UniqueName: \"kubernetes.io/projected/f1b59d06-bea4-4942-adaf-cc1170253e58-kube-api-access-2gkqz\") pod \"kube-proxy-xnk5r\" (UID: \"f1b59d06-bea4-4942-adaf-cc1170253e58\") " pod="kube-system/kube-proxy-xnk5r" Jul 6 23:27:17.876863 containerd[1514]: time="2025-07-06T23:27:17.875378621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xnk5r,Uid:f1b59d06-bea4-4942-adaf-cc1170253e58,Namespace:kube-system,Attempt:0,}" Jul 6 23:27:17.908686 containerd[1514]: time="2025-07-06T23:27:17.908625045Z" level=info msg="connecting to shim d9fb5353437e4f1db2dfceaa8c02b641eb0f6f9c639fa6824417286ac98ec941" address="unix:///run/containerd/s/5ab8cfe96a5658e75a3e800fb90d911ca241eead37bec374f4df50a66c7adc47" namespace=k8s.io 
protocol=ttrpc version=3 Jul 6 23:27:17.943893 systemd[1]: Started cri-containerd-d9fb5353437e4f1db2dfceaa8c02b641eb0f6f9c639fa6824417286ac98ec941.scope - libcontainer container d9fb5353437e4f1db2dfceaa8c02b641eb0f6f9c639fa6824417286ac98ec941. Jul 6 23:27:17.978128 containerd[1514]: time="2025-07-06T23:27:17.978046930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xnk5r,Uid:f1b59d06-bea4-4942-adaf-cc1170253e58,Namespace:kube-system,Attempt:0,} returns sandbox id \"d9fb5353437e4f1db2dfceaa8c02b641eb0f6f9c639fa6824417286ac98ec941\"" Jul 6 23:27:17.992310 containerd[1514]: time="2025-07-06T23:27:17.991964212Z" level=info msg="CreateContainer within sandbox \"d9fb5353437e4f1db2dfceaa8c02b641eb0f6f9c639fa6824417286ac98ec941\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 6 23:27:18.010557 containerd[1514]: time="2025-07-06T23:27:18.007979376Z" level=info msg="Container 3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:18.018841 containerd[1514]: time="2025-07-06T23:27:18.018773319Z" level=info msg="CreateContainer within sandbox \"d9fb5353437e4f1db2dfceaa8c02b641eb0f6f9c639fa6824417286ac98ec941\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19\"" Jul 6 23:27:18.020414 containerd[1514]: time="2025-07-06T23:27:18.019997909Z" level=info msg="StartContainer for \"3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19\"" Jul 6 23:27:18.024173 containerd[1514]: time="2025-07-06T23:27:18.024130890Z" level=info msg="connecting to shim 3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19" address="unix:///run/containerd/s/5ab8cfe96a5658e75a3e800fb90d911ca241eead37bec374f4df50a66c7adc47" protocol=ttrpc version=3 Jul 6 23:27:18.061064 systemd[1]: Started cri-containerd-3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19.scope - libcontainer container 3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19. Jul 6 23:27:18.106366 systemd[1]: Created slice kubepods-besteffort-podc1314fa3_6ef8_4ec3_8288_b7055796d2a0.slice - libcontainer container kubepods-besteffort-podc1314fa3_6ef8_4ec3_8288_b7055796d2a0.slice. 
Jul 6 23:27:18.144155 containerd[1514]: time="2025-07-06T23:27:18.144022051Z" level=info msg="StartContainer for \"3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19\" returns successfully" Jul 6 23:27:18.184108 kubelet[2667]: I0706 23:27:18.184005 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c1314fa3-6ef8-4ec3-8288-b7055796d2a0-var-lib-calico\") pod \"tigera-operator-747864d56d-qx56l\" (UID: \"c1314fa3-6ef8-4ec3-8288-b7055796d2a0\") " pod="tigera-operator/tigera-operator-747864d56d-qx56l" Jul 6 23:27:18.184108 kubelet[2667]: I0706 23:27:18.184093 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql2wk\" (UniqueName: \"kubernetes.io/projected/c1314fa3-6ef8-4ec3-8288-b7055796d2a0-kube-api-access-ql2wk\") pod \"tigera-operator-747864d56d-qx56l\" (UID: \"c1314fa3-6ef8-4ec3-8288-b7055796d2a0\") " pod="tigera-operator/tigera-operator-747864d56d-qx56l" Jul 6 23:27:18.411799 containerd[1514]: time="2025-07-06T23:27:18.410944116Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-qx56l,Uid:c1314fa3-6ef8-4ec3-8288-b7055796d2a0,Namespace:tigera-operator,Attempt:0,}" Jul 6 23:27:18.445127 containerd[1514]: time="2025-07-06T23:27:18.445076067Z" level=info msg="connecting to shim 1dfb1768ae90720710cb782faaffdedff9620d17ddedc9df0603d251045a2ed1" address="unix:///run/containerd/s/f9d7971e827750e14ff44b538e4820acbfccf95a64bd3ab9f2cbe296f7cecf6a" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:18.476075 systemd[1]: Started cri-containerd-1dfb1768ae90720710cb782faaffdedff9620d17ddedc9df0603d251045a2ed1.scope - libcontainer container 1dfb1768ae90720710cb782faaffdedff9620d17ddedc9df0603d251045a2ed1. Jul 6 23:27:18.543797 containerd[1514]: time="2025-07-06T23:27:18.543652310Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-qx56l,Uid:c1314fa3-6ef8-4ec3-8288-b7055796d2a0,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"1dfb1768ae90720710cb782faaffdedff9620d17ddedc9df0603d251045a2ed1\"" Jul 6 23:27:18.547114 containerd[1514]: time="2025-07-06T23:27:18.546895669Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 6 23:27:20.010351 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3044572955.mount: Deactivated successfully. 
Jul 6 23:27:20.355143 kubelet[2667]: I0706 23:27:20.354154 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-xnk5r" podStartSLOduration=3.354133545 podStartE2EDuration="3.354133545s" podCreationTimestamp="2025-07-06 23:27:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:27:18.43491006 +0000 UTC m=+5.216570432" watchObservedRunningTime="2025-07-06 23:27:20.354133545 +0000 UTC m=+7.135793877" Jul 6 23:27:20.545221 containerd[1514]: time="2025-07-06T23:27:20.543902049Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:20.545221 containerd[1514]: time="2025-07-06T23:27:20.545168037Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610" Jul 6 23:27:20.546125 containerd[1514]: time="2025-07-06T23:27:20.546090576Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:20.548761 containerd[1514]: time="2025-07-06T23:27:20.548724753Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:20.549485 containerd[1514]: time="2025-07-06T23:27:20.549444608Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 2.002506219s" Jul 6 23:27:20.549485 containerd[1514]: time="2025-07-06T23:27:20.549483329Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\"" Jul 6 23:27:20.553816 containerd[1514]: time="2025-07-06T23:27:20.553780101Z" level=info msg="CreateContainer within sandbox \"1dfb1768ae90720710cb782faaffdedff9620d17ddedc9df0603d251045a2ed1\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 6 23:27:20.565642 containerd[1514]: time="2025-07-06T23:27:20.564739976Z" level=info msg="Container 8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:20.573749 containerd[1514]: time="2025-07-06T23:27:20.573689727Z" level=info msg="CreateContainer within sandbox \"1dfb1768ae90720710cb782faaffdedff9620d17ddedc9df0603d251045a2ed1\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc\"" Jul 6 23:27:20.576692 containerd[1514]: time="2025-07-06T23:27:20.575707851Z" level=info msg="StartContainer for \"8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc\"" Jul 6 23:27:20.577280 containerd[1514]: time="2025-07-06T23:27:20.577240963Z" level=info msg="connecting to shim 8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc" address="unix:///run/containerd/s/f9d7971e827750e14ff44b538e4820acbfccf95a64bd3ab9f2cbe296f7cecf6a" protocol=ttrpc version=3 Jul 6 23:27:20.602949 systemd[1]: Started 
cri-containerd-8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc.scope - libcontainer container 8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc. Jul 6 23:27:20.648762 containerd[1514]: time="2025-07-06T23:27:20.648091121Z" level=info msg="StartContainer for \"8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc\" returns successfully" Jul 6 23:27:21.444684 kubelet[2667]: I0706 23:27:21.444544 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-qx56l" podStartSLOduration=1.439658993 podStartE2EDuration="3.444523985s" podCreationTimestamp="2025-07-06 23:27:18 +0000 UTC" firstStartedPulling="2025-07-06 23:27:18.545847323 +0000 UTC m=+5.327507695" lastFinishedPulling="2025-07-06 23:27:20.550712315 +0000 UTC m=+7.332372687" observedRunningTime="2025-07-06 23:27:21.444409263 +0000 UTC m=+8.226069635" watchObservedRunningTime="2025-07-06 23:27:21.444523985 +0000 UTC m=+8.226184397" Jul 6 23:27:26.919984 sudo[1757]: pam_unix(sudo:session): session closed for user root Jul 6 23:27:27.097728 sshd[1756]: Connection closed by 139.178.89.65 port 47122 Jul 6 23:27:27.099883 sshd-session[1754]: pam_unix(sshd:session): session closed for user core Jul 6 23:27:27.106513 systemd[1]: sshd@6-91.99.177.85:22-139.178.89.65:47122.service: Deactivated successfully. Jul 6 23:27:27.110715 systemd[1]: session-7.scope: Deactivated successfully. Jul 6 23:27:27.111199 systemd[1]: session-7.scope: Consumed 6.899s CPU time, 231.9M memory peak. Jul 6 23:27:27.113921 systemd-logind[1491]: Session 7 logged out. Waiting for processes to exit. Jul 6 23:27:27.117201 systemd-logind[1491]: Removed session 7. Jul 6 23:27:35.822922 systemd[1]: Created slice kubepods-besteffort-podf30bebb7_1b49_4cef_b4de_cd98773c316e.slice - libcontainer container kubepods-besteffort-podf30bebb7_1b49_4cef_b4de_cd98773c316e.slice. Jul 6 23:27:35.902279 kubelet[2667]: I0706 23:27:35.902235 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxxl\" (UniqueName: \"kubernetes.io/projected/f30bebb7-1b49-4cef-b4de-cd98773c316e-kube-api-access-wnxxl\") pod \"calico-typha-5bf8fb4678-v79n4\" (UID: \"f30bebb7-1b49-4cef-b4de-cd98773c316e\") " pod="calico-system/calico-typha-5bf8fb4678-v79n4" Jul 6 23:27:35.902607 kubelet[2667]: I0706 23:27:35.902286 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f30bebb7-1b49-4cef-b4de-cd98773c316e-tigera-ca-bundle\") pod \"calico-typha-5bf8fb4678-v79n4\" (UID: \"f30bebb7-1b49-4cef-b4de-cd98773c316e\") " pod="calico-system/calico-typha-5bf8fb4678-v79n4" Jul 6 23:27:35.902607 kubelet[2667]: I0706 23:27:35.902305 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/f30bebb7-1b49-4cef-b4de-cd98773c316e-typha-certs\") pod \"calico-typha-5bf8fb4678-v79n4\" (UID: \"f30bebb7-1b49-4cef-b4de-cd98773c316e\") " pod="calico-system/calico-typha-5bf8fb4678-v79n4" Jul 6 23:27:36.029171 systemd[1]: Created slice kubepods-besteffort-podd0430eb4_5462_44d9_8437_9bdd8f5b3f84.slice - libcontainer container kubepods-besteffort-podd0430eb4_5462_44d9_8437_9bdd8f5b3f84.slice. 
Jul 6 23:27:36.103806 kubelet[2667]: I0706 23:27:36.103066 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-node-certs\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.103806 kubelet[2667]: I0706 23:27:36.103115 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-var-lib-calico\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.103806 kubelet[2667]: I0706 23:27:36.103134 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-xtables-lock\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.103806 kubelet[2667]: I0706 23:27:36.103150 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v6q4\" (UniqueName: \"kubernetes.io/projected/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-kube-api-access-4v6q4\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.103806 kubelet[2667]: I0706 23:27:36.103171 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-cni-log-dir\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.104162 kubelet[2667]: I0706 23:27:36.103186 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-flexvol-driver-host\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.104162 kubelet[2667]: I0706 23:27:36.103209 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-var-run-calico\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.104162 kubelet[2667]: I0706 23:27:36.103226 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-tigera-ca-bundle\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.104162 kubelet[2667]: I0706 23:27:36.103243 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-cni-net-dir\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.104162 kubelet[2667]: I0706 23:27:36.103273 2667 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-cni-bin-dir\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.104416 kubelet[2667]: I0706 23:27:36.103289 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-lib-modules\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.104416 kubelet[2667]: I0706 23:27:36.103302 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/d0430eb4-5462-44d9-8437-9bdd8f5b3f84-policysync\") pod \"calico-node-gctx6\" (UID: \"d0430eb4-5462-44d9-8437-9bdd8f5b3f84\") " pod="calico-system/calico-node-gctx6" Jul 6 23:27:36.128265 containerd[1514]: time="2025-07-06T23:27:36.127846933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bf8fb4678-v79n4,Uid:f30bebb7-1b49-4cef-b4de-cd98773c316e,Namespace:calico-system,Attempt:0,}" Jul 6 23:27:36.161024 containerd[1514]: time="2025-07-06T23:27:36.160653184Z" level=info msg="connecting to shim 3d7c9840e22b3e6c508cbbc324ecbf99a8cb28b763f3e10fe74be22b3d1c1147" address="unix:///run/containerd/s/27372d996ce4d6feeba199454eba20a0eadff3cd05ebd5b27bf56542d573e8ab" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:36.191770 kubelet[2667]: E0706 23:27:36.191420 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwdth" podUID="6cccb974-c600-4611-9cf6-2ebb27d5d999" Jul 6 23:27:36.198109 systemd[1]: Started cri-containerd-3d7c9840e22b3e6c508cbbc324ecbf99a8cb28b763f3e10fe74be22b3d1c1147.scope - libcontainer container 3d7c9840e22b3e6c508cbbc324ecbf99a8cb28b763f3e10fe74be22b3d1c1147. Jul 6 23:27:36.208345 kubelet[2667]: E0706 23:27:36.208310 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.208345 kubelet[2667]: W0706 23:27:36.208334 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.208479 kubelet[2667]: E0706 23:27:36.208358 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.219544 kubelet[2667]: E0706 23:27:36.219499 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.219544 kubelet[2667]: W0706 23:27:36.219523 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.219544 kubelet[2667]: E0706 23:27:36.219543 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:36.306134 kubelet[2667]: I0706 23:27:36.305832 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6cccb974-c600-4611-9cf6-2ebb27d5d999-kubelet-dir\") pod \"csi-node-driver-xwdth\" (UID: \"6cccb974-c600-4611-9cf6-2ebb27d5d999\") " pod="calico-system/csi-node-driver-xwdth" Jul 6 23:27:36.307065 kubelet[2667]: I0706 23:27:36.307006 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcrcg\" (UniqueName: \"kubernetes.io/projected/6cccb974-c600-4611-9cf6-2ebb27d5d999-kube-api-access-fcrcg\") pod \"csi-node-driver-xwdth\" (UID: \"6cccb974-c600-4611-9cf6-2ebb27d5d999\") " pod="calico-system/csi-node-driver-xwdth" Jul 6 23:27:36.308338 kubelet[2667]: I0706 23:27:36.308071 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6cccb974-c600-4611-9cf6-2ebb27d5d999-varrun\") pod \"csi-node-driver-xwdth\" (UID: \"6cccb974-c600-4611-9cf6-2ebb27d5d999\") " pod="calico-system/csi-node-driver-xwdth" Jul 6 23:27:36.308338 kubelet[2667]: I0706 23:27:36.308281 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6cccb974-c600-4611-9cf6-2ebb27d5d999-socket-dir\") pod \"csi-node-driver-xwdth\" (UID: \"6cccb974-c600-4611-9cf6-2ebb27d5d999\") " pod="calico-system/csi-node-driver-xwdth" Jul 6 23:27:36.308539 kubelet[2667]: I0706 23:27:36.308485 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6cccb974-c600-4611-9cf6-2ebb27d5d999-registration-dir\") pod \"csi-node-driver-xwdth\" (UID: \"6cccb974-c600-4611-9cf6-2ebb27d5d999\") " pod="calico-system/csi-node-driver-xwdth" Jul 6 23:27:36.309858 kubelet[2667]: E0706 23:27:36.309804 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.309858 kubelet[2667]: W0706 23:27:36.309837 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.309858 kubelet[2667]: E0706 23:27:36.309848 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Jul 6 23:27:36.335888 containerd[1514]: time="2025-07-06T23:27:36.335845440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gctx6,Uid:d0430eb4-5462-44d9-8437-9bdd8f5b3f84,Namespace:calico-system,Attempt:0,}" Jul 6 23:27:36.347220 containerd[1514]: time="2025-07-06T23:27:36.347110206Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bf8fb4678-v79n4,Uid:f30bebb7-1b49-4cef-b4de-cd98773c316e,Namespace:calico-system,Attempt:0,} returns sandbox id \"3d7c9840e22b3e6c508cbbc324ecbf99a8cb28b763f3e10fe74be22b3d1c1147\"" Jul 6 23:27:36.350500 containerd[1514]: time="2025-07-06T23:27:36.350462591Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 6 23:27:36.376751 containerd[1514]: time="2025-07-06T23:27:36.375828145Z" level=info msg="connecting to shim 0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079" address="unix:///run/containerd/s/9b34b730dea15fb491c14c885345ddfbac870c6e1d7d709f99dc7fda4750edf3" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:36.410446 systemd[1]: Started cri-containerd-0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079.scope - libcontainer container 0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079. Jul 6 23:27:36.413055 kubelet[2667]: E0706 23:27:36.413014 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.413359 kubelet[2667]: W0706 23:27:36.413272 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.413359 kubelet[2667]: E0706 23:27:36.413317 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.414267 kubelet[2667]: E0706 23:27:36.414229 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.414267 kubelet[2667]: W0706 23:27:36.414250 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.414428 kubelet[2667]: E0706 23:27:36.414375 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.414625 kubelet[2667]: E0706 23:27:36.414612 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.414817 kubelet[2667]: W0706 23:27:36.414734 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.414817 kubelet[2667]: E0706 23:27:36.414764 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:36.415089 kubelet[2667]: E0706 23:27:36.415062 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.415089 kubelet[2667]: W0706 23:27:36.415076 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.415327 kubelet[2667]: E0706 23:27:36.415182 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.416252 kubelet[2667]: E0706 23:27:36.416234 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.416478 kubelet[2667]: W0706 23:27:36.416338 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.417267 kubelet[2667]: E0706 23:27:36.417218 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.417578 kubelet[2667]: E0706 23:27:36.417525 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.417578 kubelet[2667]: W0706 23:27:36.417540 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.417578 kubelet[2667]: E0706 23:27:36.417558 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.418022 kubelet[2667]: E0706 23:27:36.417996 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.418022 kubelet[2667]: W0706 23:27:36.418009 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.418193 kubelet[2667]: E0706 23:27:36.418115 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.418713 kubelet[2667]: E0706 23:27:36.418678 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.418713 kubelet[2667]: W0706 23:27:36.418693 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.418973 kubelet[2667]: E0706 23:27:36.418891 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:36.419225 kubelet[2667]: E0706 23:27:36.419202 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.419225 kubelet[2667]: W0706 23:27:36.419213 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.419353 kubelet[2667]: E0706 23:27:36.419312 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.419537 kubelet[2667]: E0706 23:27:36.419524 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.420078 kubelet[2667]: W0706 23:27:36.419597 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.420166 kubelet[2667]: E0706 23:27:36.420150 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.420361 kubelet[2667]: E0706 23:27:36.420350 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.420457 kubelet[2667]: W0706 23:27:36.420414 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.420525 kubelet[2667]: E0706 23:27:36.420504 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.420725 kubelet[2667]: E0706 23:27:36.420700 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.420725 kubelet[2667]: W0706 23:27:36.420712 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.420959 kubelet[2667]: E0706 23:27:36.420899 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.421115 kubelet[2667]: E0706 23:27:36.421093 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.421115 kubelet[2667]: W0706 23:27:36.421103 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.421312 kubelet[2667]: E0706 23:27:36.421250 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:36.422322 kubelet[2667]: E0706 23:27:36.422308 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.422472 kubelet[2667]: W0706 23:27:36.422387 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.422472 kubelet[2667]: E0706 23:27:36.422414 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.423268 kubelet[2667]: E0706 23:27:36.423242 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.423268 kubelet[2667]: W0706 23:27:36.423255 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.423394 kubelet[2667]: E0706 23:27:36.423354 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.423602 kubelet[2667]: E0706 23:27:36.423580 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.423602 kubelet[2667]: W0706 23:27:36.423591 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.424856 kubelet[2667]: E0706 23:27:36.424776 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.425211 kubelet[2667]: E0706 23:27:36.425182 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.425211 kubelet[2667]: W0706 23:27:36.425197 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.425343 kubelet[2667]: E0706 23:27:36.425332 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.425572 kubelet[2667]: E0706 23:27:36.425547 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.425572 kubelet[2667]: W0706 23:27:36.425559 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.425761 kubelet[2667]: E0706 23:27:36.425737 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:36.426006 kubelet[2667]: E0706 23:27:36.425979 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.426006 kubelet[2667]: W0706 23:27:36.425993 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.426684 kubelet[2667]: E0706 23:27:36.426623 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.427057 kubelet[2667]: E0706 23:27:36.427028 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.427057 kubelet[2667]: W0706 23:27:36.427044 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.427848 kubelet[2667]: E0706 23:27:36.427272 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.428130 kubelet[2667]: E0706 23:27:36.428050 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.428130 kubelet[2667]: W0706 23:27:36.428064 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.428382 kubelet[2667]: E0706 23:27:36.428219 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.428756 kubelet[2667]: E0706 23:27:36.428726 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.428756 kubelet[2667]: W0706 23:27:36.428742 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.429853 kubelet[2667]: E0706 23:27:36.429657 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.430007 kubelet[2667]: E0706 23:27:36.429983 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.430007 kubelet[2667]: W0706 23:27:36.429994 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.430135 kubelet[2667]: E0706 23:27:36.430080 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:36.430482 kubelet[2667]: E0706 23:27:36.430469 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.431071 kubelet[2667]: W0706 23:27:36.430554 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.431071 kubelet[2667]: E0706 23:27:36.430573 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.431744 kubelet[2667]: E0706 23:27:36.431545 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.431744 kubelet[2667]: W0706 23:27:36.431581 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.431744 kubelet[2667]: E0706 23:27:36.431593 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.452706 kubelet[2667]: E0706 23:27:36.452157 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:36.452706 kubelet[2667]: W0706 23:27:36.452181 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:36.452706 kubelet[2667]: E0706 23:27:36.452213 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:36.472018 containerd[1514]: time="2025-07-06T23:27:36.471826717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gctx6,Uid:d0430eb4-5462-44d9-8437-9bdd8f5b3f84,Namespace:calico-system,Attempt:0,} returns sandbox id \"0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079\"" Jul 6 23:27:37.854605 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1336134283.mount: Deactivated successfully. 
Jul 6 23:27:38.361357 kubelet[2667]: E0706 23:27:38.360949 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwdth" podUID="6cccb974-c600-4611-9cf6-2ebb27d5d999"
Jul 6 23:27:38.956335 containerd[1514]: time="2025-07-06T23:27:38.956248027Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 6 23:27:38.958733 containerd[1514]: time="2025-07-06T23:27:38.958290481Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207"
Jul 6 23:27:38.960004 containerd[1514]: time="2025-07-06T23:27:38.959952892Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 6 23:27:38.962721 containerd[1514]: time="2025-07-06T23:27:38.962658070Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 6 23:27:38.963456 containerd[1514]: time="2025-07-06T23:27:38.963424395Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 2.612923604s"
Jul 6 23:27:38.963576 containerd[1514]: time="2025-07-06T23:27:38.963559156Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\""
Jul 6 23:27:38.965140 containerd[1514]: time="2025-07-06T23:27:38.965053086Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\""
Jul 6 23:27:38.985969 containerd[1514]: time="2025-07-06T23:27:38.985923826Z" level=info msg="CreateContainer within sandbox \"3d7c9840e22b3e6c508cbbc324ecbf99a8cb28b763f3e10fe74be22b3d1c1147\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Jul 6 23:27:38.994714 containerd[1514]: time="2025-07-06T23:27:38.994514364Z" level=info msg="Container f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6: CDI devices from CRI Config.CDIDevices: []"
Jul 6 23:27:39.005287 containerd[1514]: time="2025-07-06T23:27:39.005246114Z" level=info msg="CreateContainer within sandbox \"3d7c9840e22b3e6c508cbbc324ecbf99a8cb28b763f3e10fe74be22b3d1c1147\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6\""
Jul 6 23:27:39.008510 containerd[1514]: time="2025-07-06T23:27:39.007627329Z" level=info msg="StartContainer for \"f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6\""
Jul 6 23:27:39.010006 containerd[1514]: time="2025-07-06T23:27:39.009953144Z" level=info msg="connecting to shim f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6" address="unix:///run/containerd/s/27372d996ce4d6feeba199454eba20a0eadff3cd05ebd5b27bf56542d573e8ab" protocol=ttrpc version=3
Jul 6 23:27:39.036895 systemd[1]: Started cri-containerd-f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6.scope - libcontainer container f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6.
Jul 6 23:27:39.097687 containerd[1514]: time="2025-07-06T23:27:39.097602895Z" level=info msg="StartContainer for \"f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6\" returns successfully"
Jul 6 23:27:39.518976 kubelet[2667]: E0706 23:27:39.518926 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Jul 6 23:27:39.518976 kubelet[2667]: W0706 23:27:39.518978 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Jul 6 23:27:39.519423 kubelet[2667]: E0706 23:27:39.519071 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Error: unexpected end of JSON input" Jul 6 23:27:39.520529 kubelet[2667]: E0706 23:27:39.520459 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.520529 kubelet[2667]: W0706 23:27:39.520472 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.520529 kubelet[2667]: E0706 23:27:39.520500 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.520855 kubelet[2667]: E0706 23:27:39.520703 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.520855 kubelet[2667]: W0706 23:27:39.520719 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.520855 kubelet[2667]: E0706 23:27:39.520729 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.521009 kubelet[2667]: E0706 23:27:39.520940 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.521009 kubelet[2667]: W0706 23:27:39.520949 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.521009 kubelet[2667]: E0706 23:27:39.520957 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.521237 kubelet[2667]: E0706 23:27:39.521149 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.521237 kubelet[2667]: W0706 23:27:39.521179 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.521237 kubelet[2667]: E0706 23:27:39.521190 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.521761 kubelet[2667]: E0706 23:27:39.521532 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.521761 kubelet[2667]: W0706 23:27:39.521559 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.521761 kubelet[2667]: E0706 23:27:39.521571 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:39.521761 kubelet[2667]: E0706 23:27:39.521763 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.521856 kubelet[2667]: W0706 23:27:39.521773 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.521856 kubelet[2667]: E0706 23:27:39.521781 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.521959 kubelet[2667]: E0706 23:27:39.521934 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.521959 kubelet[2667]: W0706 23:27:39.521949 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.521959 kubelet[2667]: E0706 23:27:39.521957 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.522267 kubelet[2667]: E0706 23:27:39.522214 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.522267 kubelet[2667]: W0706 23:27:39.522262 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.522402 kubelet[2667]: E0706 23:27:39.522273 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.522524 kubelet[2667]: E0706 23:27:39.522455 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.522524 kubelet[2667]: W0706 23:27:39.522469 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.522524 kubelet[2667]: E0706 23:27:39.522497 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.522855 kubelet[2667]: E0706 23:27:39.522724 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.522855 kubelet[2667]: W0706 23:27:39.522740 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.522855 kubelet[2667]: E0706 23:27:39.522774 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:39.523061 kubelet[2667]: E0706 23:27:39.522976 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.523061 kubelet[2667]: W0706 23:27:39.522997 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.523061 kubelet[2667]: E0706 23:27:39.523007 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.551807 kubelet[2667]: E0706 23:27:39.551771 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.551807 kubelet[2667]: W0706 23:27:39.551798 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.551991 kubelet[2667]: E0706 23:27:39.551848 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.552139 kubelet[2667]: E0706 23:27:39.552120 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.552685 kubelet[2667]: W0706 23:27:39.552228 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.552685 kubelet[2667]: E0706 23:27:39.552316 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.553999 kubelet[2667]: E0706 23:27:39.553963 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.553999 kubelet[2667]: W0706 23:27:39.553985 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.554129 kubelet[2667]: E0706 23:27:39.554082 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.554264 kubelet[2667]: E0706 23:27:39.554232 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.554264 kubelet[2667]: W0706 23:27:39.554243 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.554340 kubelet[2667]: E0706 23:27:39.554316 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:39.554449 kubelet[2667]: E0706 23:27:39.554429 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.554449 kubelet[2667]: W0706 23:27:39.554441 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.554503 kubelet[2667]: E0706 23:27:39.554461 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.554619 kubelet[2667]: E0706 23:27:39.554600 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.554619 kubelet[2667]: W0706 23:27:39.554611 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.554757 kubelet[2667]: E0706 23:27:39.554625 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.554984 kubelet[2667]: E0706 23:27:39.554958 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.554984 kubelet[2667]: W0706 23:27:39.554976 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.556780 kubelet[2667]: E0706 23:27:39.556740 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.557017 kubelet[2667]: E0706 23:27:39.556989 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.557062 kubelet[2667]: W0706 23:27:39.557007 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.557062 kubelet[2667]: E0706 23:27:39.557037 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.557206 kubelet[2667]: E0706 23:27:39.557186 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.557206 kubelet[2667]: W0706 23:27:39.557198 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.557297 kubelet[2667]: E0706 23:27:39.557257 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:39.558699 kubelet[2667]: E0706 23:27:39.557346 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.558699 kubelet[2667]: W0706 23:27:39.557357 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.558699 kubelet[2667]: E0706 23:27:39.557402 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.558699 kubelet[2667]: E0706 23:27:39.557530 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.558699 kubelet[2667]: W0706 23:27:39.557538 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.558699 kubelet[2667]: E0706 23:27:39.557551 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.558699 kubelet[2667]: E0706 23:27:39.557848 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.558699 kubelet[2667]: W0706 23:27:39.557859 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.558699 kubelet[2667]: E0706 23:27:39.557869 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.558699 kubelet[2667]: E0706 23:27:39.558140 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.558996 kubelet[2667]: W0706 23:27:39.558147 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.558996 kubelet[2667]: E0706 23:27:39.558252 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.560013 kubelet[2667]: E0706 23:27:39.559980 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.560013 kubelet[2667]: W0706 23:27:39.560001 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.560115 kubelet[2667]: E0706 23:27:39.560100 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:39.560175 kubelet[2667]: E0706 23:27:39.560155 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.560205 kubelet[2667]: W0706 23:27:39.560181 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.560205 kubelet[2667]: E0706 23:27:39.560198 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.560471 kubelet[2667]: E0706 23:27:39.560441 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.560471 kubelet[2667]: W0706 23:27:39.560454 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.560471 kubelet[2667]: E0706 23:27:39.560470 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.560735 kubelet[2667]: E0706 23:27:39.560717 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.560735 kubelet[2667]: W0706 23:27:39.560731 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.560804 kubelet[2667]: E0706 23:27:39.560742 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:39.562579 kubelet[2667]: E0706 23:27:39.562551 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:39.562579 kubelet[2667]: W0706 23:27:39.562569 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:39.562579 kubelet[2667]: E0706 23:27:39.562580 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.361155 kubelet[2667]: E0706 23:27:40.361104 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwdth" podUID="6cccb974-c600-4611-9cf6-2ebb27d5d999" Jul 6 23:27:40.498591 kubelet[2667]: I0706 23:27:40.498549 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:27:40.514273 containerd[1514]: time="2025-07-06T23:27:40.514177435Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:40.516459 containerd[1514]: time="2025-07-06T23:27:40.516372088Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Jul 6 23:27:40.517459 containerd[1514]: time="2025-07-06T23:27:40.517321374Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:40.520126 containerd[1514]: time="2025-07-06T23:27:40.520055870Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:40.520822 containerd[1514]: time="2025-07-06T23:27:40.520503752Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.554741261s" Jul 6 23:27:40.520822 containerd[1514]: time="2025-07-06T23:27:40.520540592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Jul 6 23:27:40.526365 containerd[1514]: time="2025-07-06T23:27:40.526322147Z" level=info msg="CreateContainer within sandbox \"0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 6 23:27:40.530990 kubelet[2667]: E0706 23:27:40.530559 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.530990 kubelet[2667]: W0706 23:27:40.530610 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.530990 kubelet[2667]: E0706 23:27:40.530730 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.532494 kubelet[2667]: E0706 23:27:40.531230 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532494 kubelet[2667]: W0706 23:27:40.531246 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532494 kubelet[2667]: E0706 23:27:40.531328 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.532494 kubelet[2667]: E0706 23:27:40.531534 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532494 kubelet[2667]: W0706 23:27:40.531546 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532494 kubelet[2667]: E0706 23:27:40.531556 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.532494 kubelet[2667]: E0706 23:27:40.531759 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532494 kubelet[2667]: W0706 23:27:40.531769 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532494 kubelet[2667]: E0706 23:27:40.531780 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.532494 kubelet[2667]: E0706 23:27:40.531948 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532743 kubelet[2667]: W0706 23:27:40.531956 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532743 kubelet[2667]: E0706 23:27:40.531965 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.532743 kubelet[2667]: E0706 23:27:40.532118 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532743 kubelet[2667]: W0706 23:27:40.532125 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532743 kubelet[2667]: E0706 23:27:40.532146 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.532743 kubelet[2667]: E0706 23:27:40.532285 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532743 kubelet[2667]: W0706 23:27:40.532302 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532743 kubelet[2667]: E0706 23:27:40.532311 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.532743 kubelet[2667]: E0706 23:27:40.532455 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532743 kubelet[2667]: W0706 23:27:40.532464 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532939 kubelet[2667]: E0706 23:27:40.532475 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.532939 kubelet[2667]: E0706 23:27:40.532691 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532939 kubelet[2667]: W0706 23:27:40.532701 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532939 kubelet[2667]: E0706 23:27:40.532711 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.532939 kubelet[2667]: E0706 23:27:40.532886 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.532939 kubelet[2667]: W0706 23:27:40.532895 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.532939 kubelet[2667]: E0706 23:27:40.532905 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.534182 kubelet[2667]: E0706 23:27:40.533062 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.534182 kubelet[2667]: W0706 23:27:40.533084 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.534182 kubelet[2667]: E0706 23:27:40.533094 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.534182 kubelet[2667]: E0706 23:27:40.533247 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.534182 kubelet[2667]: W0706 23:27:40.533255 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.534182 kubelet[2667]: E0706 23:27:40.533263 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.534182 kubelet[2667]: E0706 23:27:40.533420 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.534182 kubelet[2667]: W0706 23:27:40.533428 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.534182 kubelet[2667]: E0706 23:27:40.533438 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.534182 kubelet[2667]: E0706 23:27:40.533591 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.534462 kubelet[2667]: W0706 23:27:40.533607 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.534462 kubelet[2667]: E0706 23:27:40.533617 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.534462 kubelet[2667]: E0706 23:27:40.533967 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.534462 kubelet[2667]: W0706 23:27:40.533979 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.534462 kubelet[2667]: E0706 23:27:40.533991 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.543934 containerd[1514]: time="2025-07-06T23:27:40.543811130Z" level=info msg="Container 8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:40.559333 containerd[1514]: time="2025-07-06T23:27:40.558514536Z" level=info msg="CreateContainer within sandbox \"0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c\"" Jul 6 23:27:40.559827 containerd[1514]: time="2025-07-06T23:27:40.559754143Z" level=info msg="StartContainer for \"8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c\"" Jul 6 23:27:40.561876 kubelet[2667]: E0706 23:27:40.561755 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.562307 kubelet[2667]: W0706 23:27:40.562056 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.562459 kubelet[2667]: E0706 23:27:40.562390 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.563261 kubelet[2667]: E0706 23:27:40.563031 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.563261 kubelet[2667]: W0706 23:27:40.563049 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.563261 kubelet[2667]: E0706 23:27:40.563070 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.563600 kubelet[2667]: E0706 23:27:40.563532 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.563600 kubelet[2667]: W0706 23:27:40.563547 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.563600 kubelet[2667]: E0706 23:27:40.563562 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.564332 containerd[1514]: time="2025-07-06T23:27:40.564262010Z" level=info msg="connecting to shim 8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c" address="unix:///run/containerd/s/9b34b730dea15fb491c14c885345ddfbac870c6e1d7d709f99dc7fda4750edf3" protocol=ttrpc version=3 Jul 6 23:27:40.565767 kubelet[2667]: E0706 23:27:40.564619 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.565767 kubelet[2667]: W0706 23:27:40.565180 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.565767 kubelet[2667]: E0706 23:27:40.565211 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.566389 kubelet[2667]: E0706 23:27:40.566235 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.566389 kubelet[2667]: W0706 23:27:40.566257 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.566389 kubelet[2667]: E0706 23:27:40.566304 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.567184 kubelet[2667]: E0706 23:27:40.567073 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.567497 kubelet[2667]: W0706 23:27:40.567347 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.567497 kubelet[2667]: E0706 23:27:40.567399 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.568178 kubelet[2667]: E0706 23:27:40.568059 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.568421 kubelet[2667]: W0706 23:27:40.568076 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.568808 kubelet[2667]: E0706 23:27:40.568599 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.570047 kubelet[2667]: E0706 23:27:40.569900 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.570047 kubelet[2667]: W0706 23:27:40.569920 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.573042 kubelet[2667]: E0706 23:27:40.572875 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.576094 kubelet[2667]: E0706 23:27:40.575998 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.576094 kubelet[2667]: W0706 23:27:40.576033 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.577607 kubelet[2667]: E0706 23:27:40.577055 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.578211 kubelet[2667]: E0706 23:27:40.577790 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.578211 kubelet[2667]: W0706 23:27:40.577870 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.578211 kubelet[2667]: E0706 23:27:40.578188 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.578896 kubelet[2667]: E0706 23:27:40.578746 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.578896 kubelet[2667]: W0706 23:27:40.578767 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.578896 kubelet[2667]: E0706 23:27:40.578796 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.579185 kubelet[2667]: E0706 23:27:40.579171 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.579251 kubelet[2667]: W0706 23:27:40.579239 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.580530 kubelet[2667]: E0706 23:27:40.580373 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.580530 kubelet[2667]: W0706 23:27:40.580405 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.580937 kubelet[2667]: E0706 23:27:40.580771 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.581013 kubelet[2667]: E0706 23:27:40.580998 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.581648 kubelet[2667]: E0706 23:27:40.581510 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.581648 kubelet[2667]: W0706 23:27:40.581540 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.581648 kubelet[2667]: E0706 23:27:40.581559 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.582862 kubelet[2667]: E0706 23:27:40.582814 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.582862 kubelet[2667]: W0706 23:27:40.582834 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.582862 kubelet[2667]: E0706 23:27:40.582854 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.584037 kubelet[2667]: E0706 23:27:40.583921 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.584037 kubelet[2667]: W0706 23:27:40.583941 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.584037 kubelet[2667]: E0706 23:27:40.583966 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 6 23:27:40.585737 kubelet[2667]: E0706 23:27:40.584912 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.585737 kubelet[2667]: W0706 23:27:40.585029 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.585737 kubelet[2667]: E0706 23:27:40.585222 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.585864 kubelet[2667]: E0706 23:27:40.585786 2667 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 6 23:27:40.585864 kubelet[2667]: W0706 23:27:40.585798 2667 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 6 23:27:40.585864 kubelet[2667]: E0706 23:27:40.585811 2667 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 6 23:27:40.605887 systemd[1]: Started cri-containerd-8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c.scope - libcontainer container 8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c. Jul 6 23:27:40.663676 containerd[1514]: time="2025-07-06T23:27:40.663338554Z" level=info msg="StartContainer for \"8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c\" returns successfully" Jul 6 23:27:40.683042 systemd[1]: cri-containerd-8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c.scope: Deactivated successfully. Jul 6 23:27:40.690074 containerd[1514]: time="2025-07-06T23:27:40.690008351Z" level=info msg="received exit event container_id:\"8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c\" id:\"8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c\" pid:3386 exited_at:{seconds:1751844460 nanos:689515508}" Jul 6 23:27:40.690539 containerd[1514]: time="2025-07-06T23:27:40.690510394Z" level=info msg="TaskExit event in podsandbox handler container_id:\"8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c\" id:\"8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c\" pid:3386 exited_at:{seconds:1751844460 nanos:689515508}" Jul 6 23:27:40.716531 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c-rootfs.mount: Deactivated successfully. 
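The burst of driver-call.go and plugins.go messages above is kubelet's FlexVolume prober repeatedly executing /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init. That binary is not on the node yet, so the call produces no output and the JSON decode fails with "unexpected end of JSON input"; the flexvol-driver container 8697a422... that starts and exits above is the Calico init step that installs the driver, after which the probe should stop logging these errors. As a rough illustration (not the actual Calico uds driver), a FlexVolume driver is just an executable that answers each sub-command with a single JSON status object on stdout; a minimal Python stub under that assumption might look like:

#!/usr/bin/env python3
# Illustrative FlexVolume driver stub (an assumption for illustration,
# not the real Calico uds driver). kubelet invokes the driver binary with
# a sub-command such as "init" and expects one JSON object on stdout;
# an empty stdout is exactly what produces "unexpected end of JSON input"
# in driver-call.go above.
import json
import sys

def main() -> None:
    op = sys.argv[1] if len(sys.argv) > 1 else ""
    if op == "init":
        # Report success and declare that no separate attach step is needed.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
    else:
        # Unimplemented calls are answered with the "Not supported" status.
        print(json.dumps({"status": "Not supported"}))

if __name__ == "__main__":
    main()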
Jul 6 23:27:41.507237 containerd[1514]: time="2025-07-06T23:27:41.507190661Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 6 23:27:41.529290 kubelet[2667]: I0706 23:27:41.529205 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5bf8fb4678-v79n4" podStartSLOduration=3.914117283 podStartE2EDuration="6.529185062s" podCreationTimestamp="2025-07-06 23:27:35 +0000 UTC" firstStartedPulling="2025-07-06 23:27:36.349615465 +0000 UTC m=+23.131275837" lastFinishedPulling="2025-07-06 23:27:38.964683244 +0000 UTC m=+25.746343616" observedRunningTime="2025-07-06 23:27:39.517488173 +0000 UTC m=+26.299148545" watchObservedRunningTime="2025-07-06 23:27:41.529185062 +0000 UTC m=+28.310845434" Jul 6 23:27:42.360419 kubelet[2667]: E0706 23:27:42.360313 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwdth" podUID="6cccb974-c600-4611-9cf6-2ebb27d5d999" Jul 6 23:27:44.361238 kubelet[2667]: E0706 23:27:44.360976 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xwdth" podUID="6cccb974-c600-4611-9cf6-2ebb27d5d999" Jul 6 23:27:45.259694 containerd[1514]: time="2025-07-06T23:27:45.259494318Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:45.261958 containerd[1514]: time="2025-07-06T23:27:45.261875368Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Jul 6 23:27:45.263440 containerd[1514]: time="2025-07-06T23:27:45.263131894Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:45.266110 containerd[1514]: time="2025-07-06T23:27:45.266074106Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:45.266682 containerd[1514]: time="2025-07-06T23:27:45.266631789Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 3.759405848s" Jul 6 23:27:45.266772 containerd[1514]: time="2025-07-06T23:27:45.266726469Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Jul 6 23:27:45.270870 containerd[1514]: time="2025-07-06T23:27:45.270826607Z" level=info msg="CreateContainer within sandbox \"0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 6 23:27:45.279871 containerd[1514]: time="2025-07-06T23:27:45.279824765Z" level=info msg="Container 
c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:45.295362 containerd[1514]: time="2025-07-06T23:27:45.295291551Z" level=info msg="CreateContainer within sandbox \"0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23\"" Jul 6 23:27:45.297806 containerd[1514]: time="2025-07-06T23:27:45.296294315Z" level=info msg="StartContainer for \"c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23\"" Jul 6 23:27:45.299039 containerd[1514]: time="2025-07-06T23:27:45.298787646Z" level=info msg="connecting to shim c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23" address="unix:///run/containerd/s/9b34b730dea15fb491c14c885345ddfbac870c6e1d7d709f99dc7fda4750edf3" protocol=ttrpc version=3 Jul 6 23:27:45.330842 systemd[1]: Started cri-containerd-c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23.scope - libcontainer container c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23. Jul 6 23:27:45.378282 containerd[1514]: time="2025-07-06T23:27:45.378235145Z" level=info msg="StartContainer for \"c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23\" returns successfully" Jul 6 23:27:45.929153 containerd[1514]: time="2025-07-06T23:27:45.929086055Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 6 23:27:45.932335 systemd[1]: cri-containerd-c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23.scope: Deactivated successfully. Jul 6 23:27:45.933253 systemd[1]: cri-containerd-c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23.scope: Consumed 521ms CPU time, 186.1M memory peak, 165.8M written to disk. Jul 6 23:27:45.936569 containerd[1514]: time="2025-07-06T23:27:45.936511727Z" level=info msg="received exit event container_id:\"c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23\" id:\"c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23\" pid:3446 exited_at:{seconds:1751844465 nanos:936179925}" Jul 6 23:27:45.938108 containerd[1514]: time="2025-07-06T23:27:45.937954733Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23\" id:\"c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23\" pid:3446 exited_at:{seconds:1751844465 nanos:936179925}" Jul 6 23:27:45.962798 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23-rootfs.mount: Deactivated successfully. Jul 6 23:27:45.999341 kubelet[2667]: I0706 23:27:45.999242 2667 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jul 6 23:27:46.055611 systemd[1]: Created slice kubepods-burstable-pod27ad9e03_63e4_4da6_b469_a2bbc29ee17c.slice - libcontainer container kubepods-burstable-pod27ad9e03_63e4_4da6_b469_a2bbc29ee17c.slice. Jul 6 23:27:46.076675 systemd[1]: Created slice kubepods-besteffort-podf0eeabf7_71e7_42d8_936d_d80fd3cbb28d.slice - libcontainer container kubepods-besteffort-podf0eeabf7_71e7_42d8_936d_d80fd3cbb28d.slice. 
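The install-cni container (c7ef4998...) above is the Calico step that copies the CNI plugin binaries and configuration onto the node. containerd's config watcher fires as soon as /etc/cni/net.d/calico-kubeconfig is written, before a network config list exists, which is why it logs the transient "no network config found in /etc/cni/net.d: cni plugin not initialized" error; pod sandbox creation keeps failing below until calico-node is running and has written /var/lib/calico/nodename. For orientation, the file that eventually lands in /etc/cni/net.d is a CNI conflist roughly of this shape (an illustrative sketch; the field values are assumptions, not the exact file Calico generates):

import json

# Rough shape of the conflist that install-cni drops into /etc/cni/net.d.
# Values below are assumptions for illustration only.
conflist = {
    "name": "k8s-pod-network",
    "cniVersion": "0.3.1",
    "plugins": [
        {
            "type": "calico",
            "datastore_type": "kubernetes",
            "ipam": {"type": "calico-ipam"},
            "policy": {"type": "k8s"},
            "kubernetes": {"kubeconfig": "/etc/cni/net.d/calico-kubeconfig"},
        },
        {"type": "portmap", "snat": True, "capabilities": {"portMappings": True}},
    ],
}

# Written to the current directory here rather than /etc/cni/net.d.
with open("10-calico.conflist", "w") as f:
    json.dump(conflist, f, indent=2)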
Jul 6 23:27:46.089706 systemd[1]: Created slice kubepods-besteffort-pod0c22019d_9d01_4e51_b47d_bbbe518c9d06.slice - libcontainer container kubepods-besteffort-pod0c22019d_9d01_4e51_b47d_bbbe518c9d06.slice. Jul 6 23:27:46.101044 systemd[1]: Created slice kubepods-burstable-pod13ec9f29_2b5a_4173_8135_b818e53eadb3.slice - libcontainer container kubepods-burstable-pod13ec9f29_2b5a_4173_8135_b818e53eadb3.slice. Jul 6 23:27:46.109717 kubelet[2667]: I0706 23:27:46.109653 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9kph\" (UniqueName: \"kubernetes.io/projected/0c22019d-9d01-4e51-b47d-bbbe518c9d06-kube-api-access-h9kph\") pod \"calico-kube-controllers-695459cb77-m8dwz\" (UID: \"0c22019d-9d01-4e51-b47d-bbbe518c9d06\") " pod="calico-system/calico-kube-controllers-695459cb77-m8dwz" Jul 6 23:27:46.110079 kubelet[2667]: I0706 23:27:46.109746 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/13ec9f29-2b5a-4173-8135-b818e53eadb3-config-volume\") pod \"coredns-668d6bf9bc-4hbsb\" (UID: \"13ec9f29-2b5a-4173-8135-b818e53eadb3\") " pod="kube-system/coredns-668d6bf9bc-4hbsb" Jul 6 23:27:46.110079 kubelet[2667]: I0706 23:27:46.109779 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6fs2\" (UniqueName: \"kubernetes.io/projected/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d-kube-api-access-x6fs2\") pod \"calico-apiserver-59c54d7dc-sddts\" (UID: \"f0eeabf7-71e7-42d8-936d-d80fd3cbb28d\") " pod="calico-apiserver/calico-apiserver-59c54d7dc-sddts" Jul 6 23:27:46.110079 kubelet[2667]: I0706 23:27:46.109803 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cmsp\" (UniqueName: \"kubernetes.io/projected/27ad9e03-63e4-4da6-b469-a2bbc29ee17c-kube-api-access-4cmsp\") pod \"coredns-668d6bf9bc-bq59l\" (UID: \"27ad9e03-63e4-4da6-b469-a2bbc29ee17c\") " pod="kube-system/coredns-668d6bf9bc-bq59l" Jul 6 23:27:46.110079 kubelet[2667]: I0706 23:27:46.109826 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c22019d-9d01-4e51-b47d-bbbe518c9d06-tigera-ca-bundle\") pod \"calico-kube-controllers-695459cb77-m8dwz\" (UID: \"0c22019d-9d01-4e51-b47d-bbbe518c9d06\") " pod="calico-system/calico-kube-controllers-695459cb77-m8dwz" Jul 6 23:27:46.110636 systemd[1]: Created slice kubepods-besteffort-pod6645371b_fc64_4f6a_96ea_92b175539b84.slice - libcontainer container kubepods-besteffort-pod6645371b_fc64_4f6a_96ea_92b175539b84.slice. 
Jul 6 23:27:46.111576 kubelet[2667]: I0706 23:27:46.110648 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj6kk\" (UniqueName: \"kubernetes.io/projected/13ec9f29-2b5a-4173-8135-b818e53eadb3-kube-api-access-dj6kk\") pod \"coredns-668d6bf9bc-4hbsb\" (UID: \"13ec9f29-2b5a-4173-8135-b818e53eadb3\") " pod="kube-system/coredns-668d6bf9bc-4hbsb" Jul 6 23:27:46.111576 kubelet[2667]: I0706 23:27:46.110706 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/27ad9e03-63e4-4da6-b469-a2bbc29ee17c-config-volume\") pod \"coredns-668d6bf9bc-bq59l\" (UID: \"27ad9e03-63e4-4da6-b469-a2bbc29ee17c\") " pod="kube-system/coredns-668d6bf9bc-bq59l" Jul 6 23:27:46.111576 kubelet[2667]: I0706 23:27:46.110730 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d-calico-apiserver-certs\") pod \"calico-apiserver-59c54d7dc-sddts\" (UID: \"f0eeabf7-71e7-42d8-936d-d80fd3cbb28d\") " pod="calico-apiserver/calico-apiserver-59c54d7dc-sddts" Jul 6 23:27:46.123985 systemd[1]: Created slice kubepods-besteffort-podc35acb70_7389_4854_97f5_ab45ab94f737.slice - libcontainer container kubepods-besteffort-podc35acb70_7389_4854_97f5_ab45ab94f737.slice. Jul 6 23:27:46.136212 systemd[1]: Created slice kubepods-besteffort-pod1210fc34_5b9a_432b_b15e_04a9d1351cf5.slice - libcontainer container kubepods-besteffort-pod1210fc34_5b9a_432b_b15e_04a9d1351cf5.slice. Jul 6 23:27:46.150487 systemd[1]: Created slice kubepods-besteffort-pod12a2e46a_f038_4c8d_8ff0_480027547cbb.slice - libcontainer container kubepods-besteffort-pod12a2e46a_f038_4c8d_8ff0_480027547cbb.slice. 
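The VerifyControllerAttachedVolume entries in this stretch are kubelet's volume manager confirming, for each newly scheduled pod, that its declared volumes (projected service-account tokens, ConfigMaps, Secrets) are ready to mount; each UniqueName combines the volume plugin, the pod UID, and the volume name. As a hedged sketch of where those names come from, the volumes section of a pod such as calico-apiserver-59c54d7dc-sddts would look roughly like this (the secret and token names match the log above; the surrounding structure is an assumption for illustration):

import json

# Illustrative pod-spec volumes fragment; only the volume names are taken
# from the reconciler entries above, the rest is assumed for illustration.
volumes = [
    {"name": "calico-apiserver-certs",
     "secret": {"secretName": "calico-apiserver-certs"}},
    {"name": "kube-api-access-x6fs2",
     "projected": {"sources": [{"serviceAccountToken": {"path": "token"}}]}},
]
print(json.dumps({"spec": {"volumes": volumes}}, indent=2))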
Jul 6 23:27:46.213013 kubelet[2667]: I0706 23:27:46.211830 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1210fc34-5b9a-432b-b15e-04a9d1351cf5-whisker-backend-key-pair\") pod \"whisker-64d64cb859-582kb\" (UID: \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\") " pod="calico-system/whisker-64d64cb859-582kb" Jul 6 23:27:46.213013 kubelet[2667]: I0706 23:27:46.211977 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c35acb70-7389-4854-97f5-ab45ab94f737-calico-apiserver-certs\") pod \"calico-apiserver-59c54d7dc-bp45t\" (UID: \"c35acb70-7389-4854-97f5-ab45ab94f737\") " pod="calico-apiserver/calico-apiserver-59c54d7dc-bp45t" Jul 6 23:27:46.213013 kubelet[2667]: I0706 23:27:46.212036 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/6645371b-fc64-4f6a-96ea-92b175539b84-calico-apiserver-certs\") pod \"calico-apiserver-5c7f48b6db-cvkmp\" (UID: \"6645371b-fc64-4f6a-96ea-92b175539b84\") " pod="calico-apiserver/calico-apiserver-5c7f48b6db-cvkmp" Jul 6 23:27:46.213013 kubelet[2667]: I0706 23:27:46.212075 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a2e46a-f038-4c8d-8ff0-480027547cbb-config\") pod \"goldmane-768f4c5c69-r6dz8\" (UID: \"12a2e46a-f038-4c8d-8ff0-480027547cbb\") " pod="calico-system/goldmane-768f4c5c69-r6dz8" Jul 6 23:27:46.213013 kubelet[2667]: I0706 23:27:46.212123 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1210fc34-5b9a-432b-b15e-04a9d1351cf5-whisker-ca-bundle\") pod \"whisker-64d64cb859-582kb\" (UID: \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\") " pod="calico-system/whisker-64d64cb859-582kb" Jul 6 23:27:46.213815 kubelet[2667]: I0706 23:27:46.212186 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8x9r\" (UniqueName: \"kubernetes.io/projected/12a2e46a-f038-4c8d-8ff0-480027547cbb-kube-api-access-g8x9r\") pod \"goldmane-768f4c5c69-r6dz8\" (UID: \"12a2e46a-f038-4c8d-8ff0-480027547cbb\") " pod="calico-system/goldmane-768f4c5c69-r6dz8" Jul 6 23:27:46.213815 kubelet[2667]: I0706 23:27:46.212262 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p58qf\" (UniqueName: \"kubernetes.io/projected/c35acb70-7389-4854-97f5-ab45ab94f737-kube-api-access-p58qf\") pod \"calico-apiserver-59c54d7dc-bp45t\" (UID: \"c35acb70-7389-4854-97f5-ab45ab94f737\") " pod="calico-apiserver/calico-apiserver-59c54d7dc-bp45t" Jul 6 23:27:46.213950 kubelet[2667]: I0706 23:27:46.213833 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/12a2e46a-f038-4c8d-8ff0-480027547cbb-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-r6dz8\" (UID: \"12a2e46a-f038-4c8d-8ff0-480027547cbb\") " pod="calico-system/goldmane-768f4c5c69-r6dz8" Jul 6 23:27:46.214022 kubelet[2667]: I0706 23:27:46.213971 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl4kt\" (UniqueName: 
\"kubernetes.io/projected/6645371b-fc64-4f6a-96ea-92b175539b84-kube-api-access-vl4kt\") pod \"calico-apiserver-5c7f48b6db-cvkmp\" (UID: \"6645371b-fc64-4f6a-96ea-92b175539b84\") " pod="calico-apiserver/calico-apiserver-5c7f48b6db-cvkmp" Jul 6 23:27:46.214022 kubelet[2667]: I0706 23:27:46.214010 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrrg\" (UniqueName: \"kubernetes.io/projected/1210fc34-5b9a-432b-b15e-04a9d1351cf5-kube-api-access-ngrrg\") pod \"whisker-64d64cb859-582kb\" (UID: \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\") " pod="calico-system/whisker-64d64cb859-582kb" Jul 6 23:27:46.214102 kubelet[2667]: I0706 23:27:46.214045 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/12a2e46a-f038-4c8d-8ff0-480027547cbb-goldmane-key-pair\") pod \"goldmane-768f4c5c69-r6dz8\" (UID: \"12a2e46a-f038-4c8d-8ff0-480027547cbb\") " pod="calico-system/goldmane-768f4c5c69-r6dz8" Jul 6 23:27:46.367926 containerd[1514]: time="2025-07-06T23:27:46.367885150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bq59l,Uid:27ad9e03-63e4-4da6-b469-a2bbc29ee17c,Namespace:kube-system,Attempt:0,}" Jul 6 23:27:46.371542 systemd[1]: Created slice kubepods-besteffort-pod6cccb974_c600_4611_9cf6_2ebb27d5d999.slice - libcontainer container kubepods-besteffort-pod6cccb974_c600_4611_9cf6_2ebb27d5d999.slice. Jul 6 23:27:46.378245 containerd[1514]: time="2025-07-06T23:27:46.378207711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwdth,Uid:6cccb974-c600-4611-9cf6-2ebb27d5d999,Namespace:calico-system,Attempt:0,}" Jul 6 23:27:46.388080 containerd[1514]: time="2025-07-06T23:27:46.388026550Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59c54d7dc-sddts,Uid:f0eeabf7-71e7-42d8-936d-d80fd3cbb28d,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:27:46.407022 containerd[1514]: time="2025-07-06T23:27:46.406701665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-695459cb77-m8dwz,Uid:0c22019d-9d01-4e51-b47d-bbbe518c9d06,Namespace:calico-system,Attempt:0,}" Jul 6 23:27:46.408173 containerd[1514]: time="2025-07-06T23:27:46.408101430Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4hbsb,Uid:13ec9f29-2b5a-4173-8135-b818e53eadb3,Namespace:kube-system,Attempt:0,}" Jul 6 23:27:46.422224 containerd[1514]: time="2025-07-06T23:27:46.422176287Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f48b6db-cvkmp,Uid:6645371b-fc64-4f6a-96ea-92b175539b84,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:27:46.433216 containerd[1514]: time="2025-07-06T23:27:46.432941370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59c54d7dc-bp45t,Uid:c35acb70-7389-4854-97f5-ab45ab94f737,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:27:46.450968 containerd[1514]: time="2025-07-06T23:27:46.450929802Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64d64cb859-582kb,Uid:1210fc34-5b9a-432b-b15e-04a9d1351cf5,Namespace:calico-system,Attempt:0,}" Jul 6 23:27:46.456742 containerd[1514]: time="2025-07-06T23:27:46.456693945Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-r6dz8,Uid:12a2e46a-f038-4c8d-8ff0-480027547cbb,Namespace:calico-system,Attempt:0,}" Jul 6 23:27:46.559642 containerd[1514]: time="2025-07-06T23:27:46.559584396Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 6 23:27:46.629926 containerd[1514]: time="2025-07-06T23:27:46.629796197Z" level=error msg="Failed to destroy network for sandbox \"35f5d04c21614968fd3fae2e2ae3b50cdf71676ec8e80ac9176360e692a31604\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.632543 containerd[1514]: time="2025-07-06T23:27:46.632430648Z" level=error msg="Failed to destroy network for sandbox \"cae35a3f36ceb5cfd2a3de307f28f54fdc39024a0802f51220a981f46552f2bf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.637365 containerd[1514]: time="2025-07-06T23:27:46.637157507Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bq59l,Uid:27ad9e03-63e4-4da6-b469-a2bbc29ee17c,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"35f5d04c21614968fd3fae2e2ae3b50cdf71676ec8e80ac9176360e692a31604\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.637971 kubelet[2667]: E0706 23:27:46.637915 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35f5d04c21614968fd3fae2e2ae3b50cdf71676ec8e80ac9176360e692a31604\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.638108 kubelet[2667]: E0706 23:27:46.637995 2667 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35f5d04c21614968fd3fae2e2ae3b50cdf71676ec8e80ac9176360e692a31604\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bq59l" Jul 6 23:27:46.638108 kubelet[2667]: E0706 23:27:46.638016 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"35f5d04c21614968fd3fae2e2ae3b50cdf71676ec8e80ac9176360e692a31604\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-bq59l" Jul 6 23:27:46.638108 kubelet[2667]: E0706 23:27:46.638053 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-bq59l_kube-system(27ad9e03-63e4-4da6-b469-a2bbc29ee17c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-bq59l_kube-system(27ad9e03-63e4-4da6-b469-a2bbc29ee17c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"35f5d04c21614968fd3fae2e2ae3b50cdf71676ec8e80ac9176360e692a31604\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-668d6bf9bc-bq59l" podUID="27ad9e03-63e4-4da6-b469-a2bbc29ee17c" Jul 6 23:27:46.639486 containerd[1514]: time="2025-07-06T23:27:46.638774153Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwdth,Uid:6cccb974-c600-4611-9cf6-2ebb27d5d999,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cae35a3f36ceb5cfd2a3de307f28f54fdc39024a0802f51220a981f46552f2bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.641299 kubelet[2667]: E0706 23:27:46.640191 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cae35a3f36ceb5cfd2a3de307f28f54fdc39024a0802f51220a981f46552f2bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.641299 kubelet[2667]: E0706 23:27:46.640389 2667 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cae35a3f36ceb5cfd2a3de307f28f54fdc39024a0802f51220a981f46552f2bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xwdth" Jul 6 23:27:46.641299 kubelet[2667]: E0706 23:27:46.640412 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cae35a3f36ceb5cfd2a3de307f28f54fdc39024a0802f51220a981f46552f2bf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xwdth" Jul 6 23:27:46.643002 kubelet[2667]: E0706 23:27:46.640974 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-xwdth_calico-system(6cccb974-c600-4611-9cf6-2ebb27d5d999)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-xwdth_calico-system(6cccb974-c600-4611-9cf6-2ebb27d5d999)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cae35a3f36ceb5cfd2a3de307f28f54fdc39024a0802f51220a981f46552f2bf\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xwdth" podUID="6cccb974-c600-4611-9cf6-2ebb27d5d999" Jul 6 23:27:46.689990 containerd[1514]: time="2025-07-06T23:27:46.689856437Z" level=error msg="Failed to destroy network for sandbox \"4bd06377dbb05e806454f02ae1feb84f2fa3ee0b3b69316b6ba4fd861897bf4b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.696747 containerd[1514]: time="2025-07-06T23:27:46.696695385Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59c54d7dc-sddts,Uid:f0eeabf7-71e7-42d8-936d-d80fd3cbb28d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"4bd06377dbb05e806454f02ae1feb84f2fa3ee0b3b69316b6ba4fd861897bf4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.697087 containerd[1514]: time="2025-07-06T23:27:46.697053626Z" level=error msg="Failed to destroy network for sandbox \"279ad21e25d44fa1571cb13f6a7f1aea344a5cf83aa876a90d52459e78ed01a6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.697330 kubelet[2667]: E0706 23:27:46.697282 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bd06377dbb05e806454f02ae1feb84f2fa3ee0b3b69316b6ba4fd861897bf4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.697399 kubelet[2667]: E0706 23:27:46.697347 2667 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bd06377dbb05e806454f02ae1feb84f2fa3ee0b3b69316b6ba4fd861897bf4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59c54d7dc-sddts" Jul 6 23:27:46.697399 kubelet[2667]: E0706 23:27:46.697368 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4bd06377dbb05e806454f02ae1feb84f2fa3ee0b3b69316b6ba4fd861897bf4b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59c54d7dc-sddts" Jul 6 23:27:46.697446 kubelet[2667]: E0706 23:27:46.697413 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59c54d7dc-sddts_calico-apiserver(f0eeabf7-71e7-42d8-936d-d80fd3cbb28d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59c54d7dc-sddts_calico-apiserver(f0eeabf7-71e7-42d8-936d-d80fd3cbb28d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4bd06377dbb05e806454f02ae1feb84f2fa3ee0b3b69316b6ba4fd861897bf4b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59c54d7dc-sddts" podUID="f0eeabf7-71e7-42d8-936d-d80fd3cbb28d" Jul 6 23:27:46.699364 containerd[1514]: time="2025-07-06T23:27:46.699279475Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f48b6db-cvkmp,Uid:6645371b-fc64-4f6a-96ea-92b175539b84,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"279ad21e25d44fa1571cb13f6a7f1aea344a5cf83aa876a90d52459e78ed01a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Jul 6 23:27:46.699903 kubelet[2667]: E0706 23:27:46.699862 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"279ad21e25d44fa1571cb13f6a7f1aea344a5cf83aa876a90d52459e78ed01a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.699985 kubelet[2667]: E0706 23:27:46.699919 2667 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"279ad21e25d44fa1571cb13f6a7f1aea344a5cf83aa876a90d52459e78ed01a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c7f48b6db-cvkmp" Jul 6 23:27:46.699985 kubelet[2667]: E0706 23:27:46.699937 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"279ad21e25d44fa1571cb13f6a7f1aea344a5cf83aa876a90d52459e78ed01a6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5c7f48b6db-cvkmp" Jul 6 23:27:46.699985 kubelet[2667]: E0706 23:27:46.699970 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5c7f48b6db-cvkmp_calico-apiserver(6645371b-fc64-4f6a-96ea-92b175539b84)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5c7f48b6db-cvkmp_calico-apiserver(6645371b-fc64-4f6a-96ea-92b175539b84)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"279ad21e25d44fa1571cb13f6a7f1aea344a5cf83aa876a90d52459e78ed01a6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5c7f48b6db-cvkmp" podUID="6645371b-fc64-4f6a-96ea-92b175539b84" Jul 6 23:27:46.701085 containerd[1514]: time="2025-07-06T23:27:46.701029842Z" level=error msg="Failed to destroy network for sandbox \"585a9a05927d147d56a7afdd7bb41975c39a8d5fc1a7d5fa7f37f29bb47b1f0d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.702157 containerd[1514]: time="2025-07-06T23:27:46.702088446Z" level=error msg="Failed to destroy network for sandbox \"677389c438b7840ecdf1dfb0f5514ae9fa7c6794d29527bef562d2be0ce63b3c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.707068 containerd[1514]: time="2025-07-06T23:27:46.705822141Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-695459cb77-m8dwz,Uid:0c22019d-9d01-4e51-b47d-bbbe518c9d06,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"585a9a05927d147d56a7afdd7bb41975c39a8d5fc1a7d5fa7f37f29bb47b1f0d\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.707556 kubelet[2667]: E0706 23:27:46.707352 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"585a9a05927d147d56a7afdd7bb41975c39a8d5fc1a7d5fa7f37f29bb47b1f0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.707645 containerd[1514]: time="2025-07-06T23:27:46.707487148Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4hbsb,Uid:13ec9f29-2b5a-4173-8135-b818e53eadb3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"677389c438b7840ecdf1dfb0f5514ae9fa7c6794d29527bef562d2be0ce63b3c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.707745 kubelet[2667]: E0706 23:27:46.707574 2667 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"585a9a05927d147d56a7afdd7bb41975c39a8d5fc1a7d5fa7f37f29bb47b1f0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-695459cb77-m8dwz" Jul 6 23:27:46.707745 kubelet[2667]: E0706 23:27:46.707723 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"585a9a05927d147d56a7afdd7bb41975c39a8d5fc1a7d5fa7f37f29bb47b1f0d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-695459cb77-m8dwz" Jul 6 23:27:46.709106 kubelet[2667]: E0706 23:27:46.707793 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-695459cb77-m8dwz_calico-system(0c22019d-9d01-4e51-b47d-bbbe518c9d06)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-695459cb77-m8dwz_calico-system(0c22019d-9d01-4e51-b47d-bbbe518c9d06)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"585a9a05927d147d56a7afdd7bb41975c39a8d5fc1a7d5fa7f37f29bb47b1f0d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-695459cb77-m8dwz" podUID="0c22019d-9d01-4e51-b47d-bbbe518c9d06" Jul 6 23:27:46.709106 kubelet[2667]: E0706 23:27:46.707962 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"677389c438b7840ecdf1dfb0f5514ae9fa7c6794d29527bef562d2be0ce63b3c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.709106 kubelet[2667]: E0706 23:27:46.708846 2667 kuberuntime_sandbox.go:72] 
"Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"677389c438b7840ecdf1dfb0f5514ae9fa7c6794d29527bef562d2be0ce63b3c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-4hbsb" Jul 6 23:27:46.711860 kubelet[2667]: E0706 23:27:46.708873 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"677389c438b7840ecdf1dfb0f5514ae9fa7c6794d29527bef562d2be0ce63b3c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-4hbsb" Jul 6 23:27:46.711860 kubelet[2667]: E0706 23:27:46.708914 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-4hbsb_kube-system(13ec9f29-2b5a-4173-8135-b818e53eadb3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-4hbsb_kube-system(13ec9f29-2b5a-4173-8135-b818e53eadb3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"677389c438b7840ecdf1dfb0f5514ae9fa7c6794d29527bef562d2be0ce63b3c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-4hbsb" podUID="13ec9f29-2b5a-4173-8135-b818e53eadb3" Jul 6 23:27:46.740186 containerd[1514]: time="2025-07-06T23:27:46.740126798Z" level=error msg="Failed to destroy network for sandbox \"af6097706ea7987af5e2052a91316539febb1d2065ae6d6d04d1294859a65d4c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.742182 containerd[1514]: time="2025-07-06T23:27:46.742096886Z" level=error msg="Failed to destroy network for sandbox \"8c3a3a88b6d227cb902b3ef30c397e59c7a9aa68d6047644824e5a4116a69eb9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.743449 containerd[1514]: time="2025-07-06T23:27:46.743351131Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-r6dz8,Uid:12a2e46a-f038-4c8d-8ff0-480027547cbb,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"af6097706ea7987af5e2052a91316539febb1d2065ae6d6d04d1294859a65d4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.743966 kubelet[2667]: E0706 23:27:46.743927 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af6097706ea7987af5e2052a91316539febb1d2065ae6d6d04d1294859a65d4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.744152 kubelet[2667]: E0706 23:27:46.744070 2667 
kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af6097706ea7987af5e2052a91316539febb1d2065ae6d6d04d1294859a65d4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-r6dz8" Jul 6 23:27:46.744152 kubelet[2667]: E0706 23:27:46.744115 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"af6097706ea7987af5e2052a91316539febb1d2065ae6d6d04d1294859a65d4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-r6dz8" Jul 6 23:27:46.744554 kubelet[2667]: E0706 23:27:46.744349 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-r6dz8_calico-system(12a2e46a-f038-4c8d-8ff0-480027547cbb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-r6dz8_calico-system(12a2e46a-f038-4c8d-8ff0-480027547cbb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"af6097706ea7987af5e2052a91316539febb1d2065ae6d6d04d1294859a65d4c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-r6dz8" podUID="12a2e46a-f038-4c8d-8ff0-480027547cbb" Jul 6 23:27:46.746049 containerd[1514]: time="2025-07-06T23:27:46.745065378Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-64d64cb859-582kb,Uid:1210fc34-5b9a-432b-b15e-04a9d1351cf5,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c3a3a88b6d227cb902b3ef30c397e59c7a9aa68d6047644824e5a4116a69eb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.746447 kubelet[2667]: E0706 23:27:46.746234 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c3a3a88b6d227cb902b3ef30c397e59c7a9aa68d6047644824e5a4116a69eb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.746447 kubelet[2667]: E0706 23:27:46.746284 2667 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c3a3a88b6d227cb902b3ef30c397e59c7a9aa68d6047644824e5a4116a69eb9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-64d64cb859-582kb" Jul 6 23:27:46.746447 kubelet[2667]: E0706 23:27:46.746301 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c3a3a88b6d227cb902b3ef30c397e59c7a9aa68d6047644824e5a4116a69eb9\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-64d64cb859-582kb" Jul 6 23:27:46.746579 kubelet[2667]: E0706 23:27:46.746341 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-64d64cb859-582kb_calico-system(1210fc34-5b9a-432b-b15e-04a9d1351cf5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-64d64cb859-582kb_calico-system(1210fc34-5b9a-432b-b15e-04a9d1351cf5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c3a3a88b6d227cb902b3ef30c397e59c7a9aa68d6047644824e5a4116a69eb9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-64d64cb859-582kb" podUID="1210fc34-5b9a-432b-b15e-04a9d1351cf5" Jul 6 23:27:46.748736 containerd[1514]: time="2025-07-06T23:27:46.748690673Z" level=error msg="Failed to destroy network for sandbox \"0a5cdf1b4c459b02175d3574d763224e7f069af4a755976d2d67645466479062\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.750569 containerd[1514]: time="2025-07-06T23:27:46.750484720Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59c54d7dc-bp45t,Uid:c35acb70-7389-4854-97f5-ab45ab94f737,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a5cdf1b4c459b02175d3574d763224e7f069af4a755976d2d67645466479062\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.750965 kubelet[2667]: E0706 23:27:46.750916 2667 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a5cdf1b4c459b02175d3574d763224e7f069af4a755976d2d67645466479062\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 6 23:27:46.751279 kubelet[2667]: E0706 23:27:46.751117 2667 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a5cdf1b4c459b02175d3574d763224e7f069af4a755976d2d67645466479062\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59c54d7dc-bp45t" Jul 6 23:27:46.751279 kubelet[2667]: E0706 23:27:46.751140 2667 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a5cdf1b4c459b02175d3574d763224e7f069af4a755976d2d67645466479062\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59c54d7dc-bp45t" Jul 6 23:27:46.751680 kubelet[2667]: E0706 23:27:46.751182 2667 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"CreatePodSandbox\" for \"calico-apiserver-59c54d7dc-bp45t_calico-apiserver(c35acb70-7389-4854-97f5-ab45ab94f737)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59c54d7dc-bp45t_calico-apiserver(c35acb70-7389-4854-97f5-ab45ab94f737)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0a5cdf1b4c459b02175d3574d763224e7f069af4a755976d2d67645466479062\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59c54d7dc-bp45t" podUID="c35acb70-7389-4854-97f5-ab45ab94f737" Jul 6 23:27:53.997910 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount376211574.mount: Deactivated successfully. Jul 6 23:27:54.027962 containerd[1514]: time="2025-07-06T23:27:54.027908877Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Jul 6 23:27:54.032413 containerd[1514]: time="2025-07-06T23:27:54.032341768Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 7.472637891s" Jul 6 23:27:54.032813 containerd[1514]: time="2025-07-06T23:27:54.032607928Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Jul 6 23:27:54.034425 containerd[1514]: time="2025-07-06T23:27:54.034374133Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:54.036597 containerd[1514]: time="2025-07-06T23:27:54.036525018Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:54.037382 containerd[1514]: time="2025-07-06T23:27:54.037349620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:54.048676 containerd[1514]: time="2025-07-06T23:27:54.048582566Z" level=info msg="CreateContainer within sandbox \"0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 6 23:27:54.069931 containerd[1514]: time="2025-07-06T23:27:54.069841897Z" level=info msg="Container 0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:54.085486 containerd[1514]: time="2025-07-06T23:27:54.085383654Z" level=info msg="CreateContainer within sandbox \"0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\"" Jul 6 23:27:54.086493 containerd[1514]: time="2025-07-06T23:27:54.086386977Z" level=info msg="StartContainer for \"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\"" Jul 6 23:27:54.088469 containerd[1514]: time="2025-07-06T23:27:54.088392381Z" level=info msg="connecting 
to shim 0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32" address="unix:///run/containerd/s/9b34b730dea15fb491c14c885345ddfbac870c6e1d7d709f99dc7fda4750edf3" protocol=ttrpc version=3 Jul 6 23:27:54.143836 systemd[1]: Started cri-containerd-0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32.scope - libcontainer container 0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32. Jul 6 23:27:54.203315 containerd[1514]: time="2025-07-06T23:27:54.203262136Z" level=info msg="StartContainer for \"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" returns successfully" Jul 6 23:27:54.351821 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 6 23:27:54.351947 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. Jul 6 23:27:54.581967 kubelet[2667]: I0706 23:27:54.580870 2667 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1210fc34-5b9a-432b-b15e-04a9d1351cf5-whisker-backend-key-pair\") pod \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\" (UID: \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\") " Jul 6 23:27:54.581967 kubelet[2667]: I0706 23:27:54.581387 2667 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngrrg\" (UniqueName: \"kubernetes.io/projected/1210fc34-5b9a-432b-b15e-04a9d1351cf5-kube-api-access-ngrrg\") pod \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\" (UID: \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\") " Jul 6 23:27:54.581967 kubelet[2667]: I0706 23:27:54.581422 2667 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1210fc34-5b9a-432b-b15e-04a9d1351cf5-whisker-ca-bundle\") pod \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\" (UID: \"1210fc34-5b9a-432b-b15e-04a9d1351cf5\") " Jul 6 23:27:54.596814 kubelet[2667]: I0706 23:27:54.596717 2667 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1210fc34-5b9a-432b-b15e-04a9d1351cf5-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "1210fc34-5b9a-432b-b15e-04a9d1351cf5" (UID: "1210fc34-5b9a-432b-b15e-04a9d1351cf5"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jul 6 23:27:54.602584 kubelet[2667]: I0706 23:27:54.602129 2667 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1210fc34-5b9a-432b-b15e-04a9d1351cf5-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "1210fc34-5b9a-432b-b15e-04a9d1351cf5" (UID: "1210fc34-5b9a-432b-b15e-04a9d1351cf5"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 6 23:27:54.604247 kubelet[2667]: I0706 23:27:54.604205 2667 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1210fc34-5b9a-432b-b15e-04a9d1351cf5-kube-api-access-ngrrg" (OuterVolumeSpecName: "kube-api-access-ngrrg") pod "1210fc34-5b9a-432b-b15e-04a9d1351cf5" (UID: "1210fc34-5b9a-432b-b15e-04a9d1351cf5"). InnerVolumeSpecName "kube-api-access-ngrrg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 6 23:27:54.610776 systemd[1]: Removed slice kubepods-besteffort-pod1210fc34_5b9a_432b_b15e_04a9d1351cf5.slice - libcontainer container kubepods-besteffort-pod1210fc34_5b9a_432b_b15e_04a9d1351cf5.slice.
Jul 6 23:27:54.635102 kubelet[2667]: I0706 23:27:54.635034 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-gctx6" podStartSLOduration=2.075676768 podStartE2EDuration="19.635011206s" podCreationTimestamp="2025-07-06 23:27:35 +0000 UTC" firstStartedPulling="2025-07-06 23:27:36.473891372 +0000 UTC m=+23.255551744" lastFinishedPulling="2025-07-06 23:27:54.03322585 +0000 UTC m=+40.814886182" observedRunningTime="2025-07-06 23:27:54.633579563 +0000 UTC m=+41.415239975" watchObservedRunningTime="2025-07-06 23:27:54.635011206 +0000 UTC m=+41.416671618" Jul 6 23:27:54.682802 kubelet[2667]: I0706 23:27:54.682212 2667 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngrrg\" (UniqueName: \"kubernetes.io/projected/1210fc34-5b9a-432b-b15e-04a9d1351cf5-kube-api-access-ngrrg\") on node \"ci-4344-1-1-3-d8bdec45b1\" DevicePath \"\"" Jul 6 23:27:54.682802 kubelet[2667]: I0706 23:27:54.682275 2667 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1210fc34-5b9a-432b-b15e-04a9d1351cf5-whisker-ca-bundle\") on node \"ci-4344-1-1-3-d8bdec45b1\" DevicePath \"\"" Jul 6 23:27:54.682802 kubelet[2667]: I0706 23:27:54.682303 2667 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1210fc34-5b9a-432b-b15e-04a9d1351cf5-whisker-backend-key-pair\") on node \"ci-4344-1-1-3-d8bdec45b1\" DevicePath \"\"" Jul 6 23:27:54.728441 systemd[1]: Created slice kubepods-besteffort-pod0f45c0fb_23cb_4b21_8bd9_1ed17aa8a372.slice - libcontainer container kubepods-besteffort-pod0f45c0fb_23cb_4b21_8bd9_1ed17aa8a372.slice. Jul 6 23:27:54.739072 kubelet[2667]: I0706 23:27:54.739020 2667 status_manager.go:890] "Failed to get status for pod" podUID="0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372" pod="calico-system/whisker-6cb7d4d7d9-tp4hb" err="pods \"whisker-6cb7d4d7d9-tp4hb\" is forbidden: User \"system:node:ci-4344-1-1-3-d8bdec45b1\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4344-1-1-3-d8bdec45b1' and this object" Jul 6 23:27:54.783656 kubelet[2667]: I0706 23:27:54.783489 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372-whisker-backend-key-pair\") pod \"whisker-6cb7d4d7d9-tp4hb\" (UID: \"0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372\") " pod="calico-system/whisker-6cb7d4d7d9-tp4hb" Jul 6 23:27:54.783656 kubelet[2667]: I0706 23:27:54.783550 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372-whisker-ca-bundle\") pod \"whisker-6cb7d4d7d9-tp4hb\" (UID: \"0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372\") " pod="calico-system/whisker-6cb7d4d7d9-tp4hb" Jul 6 23:27:54.783656 kubelet[2667]: I0706 23:27:54.783576 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqcbj\" (UniqueName: \"kubernetes.io/projected/0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372-kube-api-access-sqcbj\") pod \"whisker-6cb7d4d7d9-tp4hb\" (UID: \"0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372\") " pod="calico-system/whisker-6cb7d4d7d9-tp4hb" Jul 6 23:27:55.004074 systemd[1]: 
var-lib-kubelet-pods-1210fc34\x2d5b9a\x2d432b\x2db15e\x2d04a9d1351cf5-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dngrrg.mount: Deactivated successfully. Jul 6 23:27:55.004174 systemd[1]: var-lib-kubelet-pods-1210fc34\x2d5b9a\x2d432b\x2db15e\x2d04a9d1351cf5-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 6 23:27:55.033788 containerd[1514]: time="2025-07-06T23:27:55.033708593Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6cb7d4d7d9-tp4hb,Uid:0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372,Namespace:calico-system,Attempt:0,}" Jul 6 23:27:55.238586 systemd-networkd[1406]: calidad075ca99b: Link UP Jul 6 23:27:55.239339 systemd-networkd[1406]: calidad075ca99b: Gained carrier Jul 6 23:27:55.257290 containerd[1514]: 2025-07-06 23:27:55.066 [INFO][3803] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:27:55.257290 containerd[1514]: 2025-07-06 23:27:55.110 [INFO][3803] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0 whisker-6cb7d4d7d9- calico-system 0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372 875 0 2025-07-06 23:27:54 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6cb7d4d7d9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 whisker-6cb7d4d7d9-tp4hb eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calidad075ca99b [] [] }} ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Namespace="calico-system" Pod="whisker-6cb7d4d7d9-tp4hb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-" Jul 6 23:27:55.257290 containerd[1514]: 2025-07-06 23:27:55.110 [INFO][3803] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Namespace="calico-system" Pod="whisker-6cb7d4d7d9-tp4hb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" Jul 6 23:27:55.257290 containerd[1514]: 2025-07-06 23:27:55.162 [INFO][3816] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" HandleID="k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.162 [INFO][3816] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" HandleID="k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ab7f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"whisker-6cb7d4d7d9-tp4hb", "timestamp":"2025-07-06 23:27:55.16219468 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.162 [INFO][3816] ipam/ipam_plugin.go 353: 
About to acquire host-wide IPAM lock. Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.162 [INFO][3816] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.162 [INFO][3816] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.179 [INFO][3816] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.189 [INFO][3816] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.199 [INFO][3816] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.202 [INFO][3816] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258022 containerd[1514]: 2025-07-06 23:27:55.205 [INFO][3816] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258364 containerd[1514]: 2025-07-06 23:27:55.205 [INFO][3816] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258364 containerd[1514]: 2025-07-06 23:27:55.207 [INFO][3816] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8 Jul 6 23:27:55.258364 containerd[1514]: 2025-07-06 23:27:55.214 [INFO][3816] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258364 containerd[1514]: 2025-07-06 23:27:55.224 [INFO][3816] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.65/26] block=192.168.75.64/26 handle="k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258364 containerd[1514]: 2025-07-06 23:27:55.224 [INFO][3816] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.65/26] handle="k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:55.258364 containerd[1514]: 2025-07-06 23:27:55.224 [INFO][3816] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:27:55.258364 containerd[1514]: 2025-07-06 23:27:55.224 [INFO][3816] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.65/26] IPv6=[] ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" HandleID="k8s-pod-network.bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" Jul 6 23:27:55.258514 containerd[1514]: 2025-07-06 23:27:55.229 [INFO][3803] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Namespace="calico-system" Pod="whisker-6cb7d4d7d9-tp4hb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0", GenerateName:"whisker-6cb7d4d7d9-", Namespace:"calico-system", SelfLink:"", UID:"0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6cb7d4d7d9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"whisker-6cb7d4d7d9-tp4hb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.75.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calidad075ca99b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:27:55.258514 containerd[1514]: 2025-07-06 23:27:55.229 [INFO][3803] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.65/32] ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Namespace="calico-system" Pod="whisker-6cb7d4d7d9-tp4hb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" Jul 6 23:27:55.258682 containerd[1514]: 2025-07-06 23:27:55.230 [INFO][3803] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidad075ca99b ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Namespace="calico-system" Pod="whisker-6cb7d4d7d9-tp4hb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" Jul 6 23:27:55.258682 containerd[1514]: 2025-07-06 23:27:55.240 [INFO][3803] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Namespace="calico-system" Pod="whisker-6cb7d4d7d9-tp4hb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" Jul 6 23:27:55.258730 containerd[1514]: 2025-07-06 23:27:55.241 [INFO][3803] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Namespace="calico-system" 
Pod="whisker-6cb7d4d7d9-tp4hb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0", GenerateName:"whisker-6cb7d4d7d9-", Namespace:"calico-system", SelfLink:"", UID:"0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6cb7d4d7d9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8", Pod:"whisker-6cb7d4d7d9-tp4hb", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.75.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calidad075ca99b", MAC:"e2:78:76:1c:80:22", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:27:55.258794 containerd[1514]: 2025-07-06 23:27:55.253 [INFO][3803] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" Namespace="calico-system" Pod="whisker-6cb7d4d7d9-tp4hb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-whisker--6cb7d4d7d9--tp4hb-eth0" Jul 6 23:27:55.305344 containerd[1514]: time="2025-07-06T23:27:55.305287641Z" level=info msg="connecting to shim bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8" address="unix:///run/containerd/s/84d1da03587149a3b2a987ea16394b79e421db7e0cdeb37deb726f10fb283b0f" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:55.345961 systemd[1]: Started cri-containerd-bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8.scope - libcontainer container bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8. 
Jul 6 23:27:55.366171 kubelet[2667]: I0706 23:27:55.366111 2667 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1210fc34-5b9a-432b-b15e-04a9d1351cf5" path="/var/lib/kubelet/pods/1210fc34-5b9a-432b-b15e-04a9d1351cf5/volumes" Jul 6 23:27:55.434658 containerd[1514]: time="2025-07-06T23:27:55.434572370Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6cb7d4d7d9-tp4hb,Uid:0f45c0fb-23cb-4b21-8bd9-1ed17aa8a372,Namespace:calico-system,Attempt:0,} returns sandbox id \"bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8\"" Jul 6 23:27:55.437461 containerd[1514]: time="2025-07-06T23:27:55.436799455Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 6 23:27:55.611235 kubelet[2667]: I0706 23:27:55.610631 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:27:56.830922 systemd-networkd[1406]: calidad075ca99b: Gained IPv6LL Jul 6 23:27:57.342875 containerd[1514]: time="2025-07-06T23:27:57.342166846Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Jul 6 23:27:57.342875 containerd[1514]: time="2025-07-06T23:27:57.342346046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:57.346385 containerd[1514]: time="2025-07-06T23:27:57.346082374Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:57.349689 containerd[1514]: time="2025-07-06T23:27:57.348163138Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:27:57.350018 containerd[1514]: time="2025-07-06T23:27:57.349311940Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 1.912452005s" Jul 6 23:27:57.350018 containerd[1514]: time="2025-07-06T23:27:57.349941101Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Jul 6 23:27:57.355586 containerd[1514]: time="2025-07-06T23:27:57.355540992Z" level=info msg="CreateContainer within sandbox \"bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 6 23:27:57.372433 containerd[1514]: time="2025-07-06T23:27:57.372385386Z" level=info msg="Container 48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:57.455796 containerd[1514]: time="2025-07-06T23:27:57.455741669Z" level=info msg="CreateContainer within sandbox \"bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696\"" Jul 6 23:27:57.457960 containerd[1514]: time="2025-07-06T23:27:57.457359353Z" level=info msg="StartContainer for 
\"48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696\"" Jul 6 23:27:57.460744 containerd[1514]: time="2025-07-06T23:27:57.460651559Z" level=info msg="connecting to shim 48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696" address="unix:///run/containerd/s/84d1da03587149a3b2a987ea16394b79e421db7e0cdeb37deb726f10fb283b0f" protocol=ttrpc version=3 Jul 6 23:27:57.493884 systemd[1]: Started cri-containerd-48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696.scope - libcontainer container 48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696. Jul 6 23:27:57.557805 containerd[1514]: time="2025-07-06T23:27:57.557711550Z" level=info msg="StartContainer for \"48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696\" returns successfully" Jul 6 23:27:57.561955 containerd[1514]: time="2025-07-06T23:27:57.561919318Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 6 23:27:58.361641 containerd[1514]: time="2025-07-06T23:27:58.361564967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-695459cb77-m8dwz,Uid:0c22019d-9d01-4e51-b47d-bbbe518c9d06,Namespace:calico-system,Attempt:0,}" Jul 6 23:27:58.541302 systemd-networkd[1406]: cali00699b07115: Link UP Jul 6 23:27:58.542967 systemd-networkd[1406]: cali00699b07115: Gained carrier Jul 6 23:27:58.565959 containerd[1514]: 2025-07-06 23:27:58.410 [INFO][4033] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:27:58.565959 containerd[1514]: 2025-07-06 23:27:58.431 [INFO][4033] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0 calico-kube-controllers-695459cb77- calico-system 0c22019d-9d01-4e51-b47d-bbbe518c9d06 811 0 2025-07-06 23:27:36 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:695459cb77 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 calico-kube-controllers-695459cb77-m8dwz eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali00699b07115 [] [] }} ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Namespace="calico-system" Pod="calico-kube-controllers-695459cb77-m8dwz" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-" Jul 6 23:27:58.565959 containerd[1514]: 2025-07-06 23:27:58.431 [INFO][4033] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Namespace="calico-system" Pod="calico-kube-controllers-695459cb77-m8dwz" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" Jul 6 23:27:58.565959 containerd[1514]: 2025-07-06 23:27:58.469 [INFO][4047] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" HandleID="k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.470 [INFO][4047] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" HandleID="k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b6d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"calico-kube-controllers-695459cb77-m8dwz", "timestamp":"2025-07-06 23:27:58.469980566 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.470 [INFO][4047] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.470 [INFO][4047] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.470 [INFO][4047] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.488 [INFO][4047] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.495 [INFO][4047] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.504 [INFO][4047] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.507 [INFO][4047] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.566402 containerd[1514]: 2025-07-06 23:27:58.511 [INFO][4047] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.568065 containerd[1514]: 2025-07-06 23:27:58.511 [INFO][4047] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.568065 containerd[1514]: 2025-07-06 23:27:58.515 [INFO][4047] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88 Jul 6 23:27:58.568065 containerd[1514]: 2025-07-06 23:27:58.522 [INFO][4047] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.568065 containerd[1514]: 2025-07-06 23:27:58.534 [INFO][4047] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.66/26] block=192.168.75.64/26 handle="k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.568065 containerd[1514]: 2025-07-06 23:27:58.534 [INFO][4047] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.66/26] handle="k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:58.568065 containerd[1514]: 2025-07-06 23:27:58.534 
[INFO][4047] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 6 23:27:58.568065 containerd[1514]: 2025-07-06 23:27:58.534 [INFO][4047] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.66/26] IPv6=[] ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" HandleID="k8s-pod-network.08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" Jul 6 23:27:58.568499 containerd[1514]: 2025-07-06 23:27:58.537 [INFO][4033] cni-plugin/k8s.go 418: Populated endpoint ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Namespace="calico-system" Pod="calico-kube-controllers-695459cb77-m8dwz" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0", GenerateName:"calico-kube-controllers-695459cb77-", Namespace:"calico-system", SelfLink:"", UID:"0c22019d-9d01-4e51-b47d-bbbe518c9d06", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"695459cb77", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"calico-kube-controllers-695459cb77-m8dwz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.75.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali00699b07115", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:27:58.568585 containerd[1514]: 2025-07-06 23:27:58.537 [INFO][4033] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.66/32] ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Namespace="calico-system" Pod="calico-kube-controllers-695459cb77-m8dwz" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" Jul 6 23:27:58.568585 containerd[1514]: 2025-07-06 23:27:58.537 [INFO][4033] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali00699b07115 ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Namespace="calico-system" Pod="calico-kube-controllers-695459cb77-m8dwz" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" Jul 6 23:27:58.568585 containerd[1514]: 2025-07-06 23:27:58.542 [INFO][4033] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Namespace="calico-system" Pod="calico-kube-controllers-695459cb77-m8dwz" 
WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" Jul 6 23:27:58.568646 containerd[1514]: 2025-07-06 23:27:58.544 [INFO][4033] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Namespace="calico-system" Pod="calico-kube-controllers-695459cb77-m8dwz" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0", GenerateName:"calico-kube-controllers-695459cb77-", Namespace:"calico-system", SelfLink:"", UID:"0c22019d-9d01-4e51-b47d-bbbe518c9d06", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"695459cb77", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88", Pod:"calico-kube-controllers-695459cb77-m8dwz", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.75.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali00699b07115", MAC:"72:ee:a3:6b:ac:0e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:27:58.568713 containerd[1514]: 2025-07-06 23:27:58.557 [INFO][4033] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" Namespace="calico-system" Pod="calico-kube-controllers-695459cb77-m8dwz" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--kube--controllers--695459cb77--m8dwz-eth0" Jul 6 23:27:58.607999 containerd[1514]: time="2025-07-06T23:27:58.607891261Z" level=info msg="connecting to shim 08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88" address="unix:///run/containerd/s/0abe4d7f4e996f47a40ef579558bfd17dbbc3cb141296e76a3b045975b6069be" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:58.651863 systemd[1]: Started cri-containerd-08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88.scope - libcontainer container 08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88. 
Jul 6 23:27:58.708229 containerd[1514]: time="2025-07-06T23:27:58.707872605Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-695459cb77-m8dwz,Uid:0c22019d-9d01-4e51-b47d-bbbe518c9d06,Namespace:calico-system,Attempt:0,} returns sandbox id \"08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88\"" Jul 6 23:27:59.363054 containerd[1514]: time="2025-07-06T23:27:59.362557491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59c54d7dc-sddts,Uid:f0eeabf7-71e7-42d8-936d-d80fd3cbb28d,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:27:59.363054 containerd[1514]: time="2025-07-06T23:27:59.362900012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4hbsb,Uid:13ec9f29-2b5a-4173-8135-b818e53eadb3,Namespace:kube-system,Attempt:0,}" Jul 6 23:27:59.559872 systemd-networkd[1406]: cali2121a5177ab: Link UP Jul 6 23:27:59.563610 systemd-networkd[1406]: cali2121a5177ab: Gained carrier Jul 6 23:27:59.587182 containerd[1514]: 2025-07-06 23:27:59.403 [INFO][4129] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:27:59.587182 containerd[1514]: 2025-07-06 23:27:59.436 [INFO][4129] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0 calico-apiserver-59c54d7dc- calico-apiserver f0eeabf7-71e7-42d8-936d-d80fd3cbb28d 817 0 2025-07-06 23:27:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59c54d7dc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 calico-apiserver-59c54d7dc-sddts eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2121a5177ab [] [] }} ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-sddts" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-" Jul 6 23:27:59.587182 containerd[1514]: 2025-07-06 23:27:59.436 [INFO][4129] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-sddts" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:27:59.587182 containerd[1514]: 2025-07-06 23:27:59.481 [INFO][4153] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.481 [INFO][4153] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2f50), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"calico-apiserver-59c54d7dc-sddts", 
"timestamp":"2025-07-06 23:27:59.481217896 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.481 [INFO][4153] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.481 [INFO][4153] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.481 [INFO][4153] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.496 [INFO][4153] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.504 [INFO][4153] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.512 [INFO][4153] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.517 [INFO][4153] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587463 containerd[1514]: 2025-07-06 23:27:59.522 [INFO][4153] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587870 containerd[1514]: 2025-07-06 23:27:59.522 [INFO][4153] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587870 containerd[1514]: 2025-07-06 23:27:59.529 [INFO][4153] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d Jul 6 23:27:59.587870 containerd[1514]: 2025-07-06 23:27:59.537 [INFO][4153] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587870 containerd[1514]: 2025-07-06 23:27:59.547 [INFO][4153] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.67/26] block=192.168.75.64/26 handle="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587870 containerd[1514]: 2025-07-06 23:27:59.547 [INFO][4153] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.67/26] handle="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.587870 containerd[1514]: 2025-07-06 23:27:59.547 [INFO][4153] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:27:59.587870 containerd[1514]: 2025-07-06 23:27:59.547 [INFO][4153] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.67/26] IPv6=[] ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:27:59.588063 containerd[1514]: 2025-07-06 23:27:59.554 [INFO][4129] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-sddts" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0", GenerateName:"calico-apiserver-59c54d7dc-", Namespace:"calico-apiserver", SelfLink:"", UID:"f0eeabf7-71e7-42d8-936d-d80fd3cbb28d", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59c54d7dc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"calico-apiserver-59c54d7dc-sddts", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.75.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2121a5177ab", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:27:59.588126 containerd[1514]: 2025-07-06 23:27:59.554 [INFO][4129] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.67/32] ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-sddts" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:27:59.588126 containerd[1514]: 2025-07-06 23:27:59.554 [INFO][4129] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2121a5177ab ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-sddts" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:27:59.588126 containerd[1514]: 2025-07-06 23:27:59.565 [INFO][4129] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-sddts" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:27:59.588187 containerd[1514]: 2025-07-06 23:27:59.566 [INFO][4129] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-sddts" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0", GenerateName:"calico-apiserver-59c54d7dc-", Namespace:"calico-apiserver", SelfLink:"", UID:"f0eeabf7-71e7-42d8-936d-d80fd3cbb28d", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59c54d7dc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d", Pod:"calico-apiserver-59c54d7dc-sddts", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.75.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2121a5177ab", MAC:"36:cf:49:93:7f:22", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:27:59.588235 containerd[1514]: 2025-07-06 23:27:59.585 [INFO][4129] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-sddts" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:27:59.644532 containerd[1514]: time="2025-07-06T23:27:59.643878457Z" level=info msg="connecting to shim ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" address="unix:///run/containerd/s/c25141bf31792c0528f2b22f4b18f026b37bb59382a0dd0b99018f7ec92278e2" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:59.695840 systemd[1]: Started cri-containerd-ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d.scope - libcontainer container ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d. 
Jul 6 23:27:59.704445 systemd-networkd[1406]: cali60625f68efb: Link UP Jul 6 23:27:59.707578 systemd-networkd[1406]: cali60625f68efb: Gained carrier Jul 6 23:27:59.737853 containerd[1514]: 2025-07-06 23:27:59.407 [INFO][4123] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:27:59.737853 containerd[1514]: 2025-07-06 23:27:59.434 [INFO][4123] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0 coredns-668d6bf9bc- kube-system 13ec9f29-2b5a-4173-8135-b818e53eadb3 819 0 2025-07-06 23:27:18 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 coredns-668d6bf9bc-4hbsb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali60625f68efb [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Namespace="kube-system" Pod="coredns-668d6bf9bc-4hbsb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-" Jul 6 23:27:59.737853 containerd[1514]: 2025-07-06 23:27:59.434 [INFO][4123] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Namespace="kube-system" Pod="coredns-668d6bf9bc-4hbsb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" Jul 6 23:27:59.737853 containerd[1514]: 2025-07-06 23:27:59.488 [INFO][4147] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" HandleID="k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.489 [INFO][4147] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" HandleID="k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c36a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"coredns-668d6bf9bc-4hbsb", "timestamp":"2025-07-06 23:27:59.488447789 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.489 [INFO][4147] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.547 [INFO][4147] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.547 [INFO][4147] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.596 [INFO][4147] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.605 [INFO][4147] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.625 [INFO][4147] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.632 [INFO][4147] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738199 containerd[1514]: 2025-07-06 23:27:59.640 [INFO][4147] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738548 containerd[1514]: 2025-07-06 23:27:59.643 [INFO][4147] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738548 containerd[1514]: 2025-07-06 23:27:59.648 [INFO][4147] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa Jul 6 23:27:59.738548 containerd[1514]: 2025-07-06 23:27:59.664 [INFO][4147] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738548 containerd[1514]: 2025-07-06 23:27:59.685 [INFO][4147] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.68/26] block=192.168.75.64/26 handle="k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738548 containerd[1514]: 2025-07-06 23:27:59.685 [INFO][4147] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.68/26] handle="k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:27:59.738548 containerd[1514]: 2025-07-06 23:27:59.685 [INFO][4147] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:27:59.738548 containerd[1514]: 2025-07-06 23:27:59.685 [INFO][4147] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.68/26] IPv6=[] ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" HandleID="k8s-pod-network.8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" Jul 6 23:27:59.739324 containerd[1514]: 2025-07-06 23:27:59.699 [INFO][4123] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Namespace="kube-system" Pod="coredns-668d6bf9bc-4hbsb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"13ec9f29-2b5a-4173-8135-b818e53eadb3", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"coredns-668d6bf9bc-4hbsb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.75.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali60625f68efb", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:27:59.739324 containerd[1514]: 2025-07-06 23:27:59.699 [INFO][4123] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.68/32] ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Namespace="kube-system" Pod="coredns-668d6bf9bc-4hbsb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" Jul 6 23:27:59.739324 containerd[1514]: 2025-07-06 23:27:59.700 [INFO][4123] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali60625f68efb ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Namespace="kube-system" Pod="coredns-668d6bf9bc-4hbsb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" Jul 6 23:27:59.739324 containerd[1514]: 2025-07-06 23:27:59.707 [INFO][4123] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-4hbsb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" Jul 6 23:27:59.739324 containerd[1514]: 2025-07-06 23:27:59.709 [INFO][4123] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Namespace="kube-system" Pod="coredns-668d6bf9bc-4hbsb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"13ec9f29-2b5a-4173-8135-b818e53eadb3", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa", Pod:"coredns-668d6bf9bc-4hbsb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.75.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali60625f68efb", MAC:"a2:88:35:22:91:87", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:27:59.739324 containerd[1514]: 2025-07-06 23:27:59.730 [INFO][4123] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" Namespace="kube-system" Pod="coredns-668d6bf9bc-4hbsb" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--4hbsb-eth0" Jul 6 23:27:59.774283 containerd[1514]: time="2025-07-06T23:27:59.774242603Z" level=info msg="connecting to shim 8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa" address="unix:///run/containerd/s/94c69a26a1c8095a9a59371f7a99b61c0b771794725d77788358618f1ee7a31a" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:27:59.839598 containerd[1514]: time="2025-07-06T23:27:59.839542515Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59c54d7dc-sddts,Uid:f0eeabf7-71e7-42d8-936d-d80fd3cbb28d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\"" Jul 6 23:27:59.856880 systemd[1]: Started cri-containerd-8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa.scope - libcontainer container 
8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa. Jul 6 23:27:59.924825 containerd[1514]: time="2025-07-06T23:27:59.924557782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-4hbsb,Uid:13ec9f29-2b5a-4173-8135-b818e53eadb3,Namespace:kube-system,Attempt:0,} returns sandbox id \"8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa\"" Jul 6 23:27:59.934688 containerd[1514]: time="2025-07-06T23:27:59.934568040Z" level=info msg="CreateContainer within sandbox \"8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 6 23:27:59.951148 containerd[1514]: time="2025-07-06T23:27:59.951098628Z" level=info msg="Container 78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:27:59.957587 containerd[1514]: time="2025-07-06T23:27:59.957511799Z" level=info msg="CreateContainer within sandbox \"8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc\"" Jul 6 23:27:59.960898 containerd[1514]: time="2025-07-06T23:27:59.959984324Z" level=info msg="StartContainer for \"78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc\"" Jul 6 23:27:59.961461 containerd[1514]: time="2025-07-06T23:27:59.961407486Z" level=info msg="connecting to shim 78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc" address="unix:///run/containerd/s/94c69a26a1c8095a9a59371f7a99b61c0b771794725d77788358618f1ee7a31a" protocol=ttrpc version=3 Jul 6 23:27:59.966831 systemd-networkd[1406]: cali00699b07115: Gained IPv6LL Jul 6 23:27:59.983970 systemd[1]: Started cri-containerd-78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc.scope - libcontainer container 78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc. 
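Each pod in this section goes through the same containerd sequence: connect to a shim over ttrpc at a unix:///run/containerd/s/… address, RunPodSandbox returns a sandbox id, CreateContainer returns a container id inside that sandbox, and StartContainer reports success. When reading a dump like this it helps to pair those ids up mechanically; the stand-alone sketch below does that with nothing but the standard library, matching only the exact phrasing used in these containerd entries (so it is tied to this log format, not to any containerd API).

// crilog_sketch.go — a small helper for reading a journal dump like this one:
// it pairs each "RunPodSandbox ... returns sandbox id" entry with the pod
// name, and each "CreateContainer within sandbox ... returns container id"
// entry with its sandbox. The regexes target the escaped-quote phrasing seen
// in these captured lines and nothing else.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	sandboxRe   = regexp.MustCompile(`RunPodSandbox for &PodSandboxMetadata\{Name:([^,]+),.*returns sandbox id \\"([0-9a-f]+)\\"`)
	containerRe = regexp.MustCompile(`CreateContainer within sandbox \\"([0-9a-f]+)\\".*returns container id \\"([0-9a-f]+)\\"`)
)

func main() {
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		line := sc.Text()
		if m := sandboxRe.FindStringSubmatch(line); m != nil {
			fmt.Printf("pod %-40s sandbox %s\n", m[1], m[2])
		}
		if m := containerRe.FindStringSubmatch(line); m != nil {
			fmt.Printf("sandbox %s -> container %s\n", m[1], m[2])
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "scan:", err)
	}
}

Feeding it the captured log on stdin would report, for example, that sandbox 8b4dbbd6… for coredns-668d6bf9bc-4hbsb maps to container 78ee6895…, which is the pairing the entries above spell out across several lines.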
Jul 6 23:28:00.025526 containerd[1514]: time="2025-07-06T23:28:00.025434794Z" level=info msg="StartContainer for \"78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc\" returns successfully" Jul 6 23:28:00.361007 containerd[1514]: time="2025-07-06T23:28:00.360958538Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bq59l,Uid:27ad9e03-63e4-4da6-b469-a2bbc29ee17c,Namespace:kube-system,Attempt:0,}" Jul 6 23:28:00.361413 containerd[1514]: time="2025-07-06T23:28:00.360961858Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f48b6db-cvkmp,Uid:6645371b-fc64-4f6a-96ea-92b175539b84,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:28:00.544881 systemd-networkd[1406]: cali9bec3db93d1: Link UP Jul 6 23:28:00.546176 systemd-networkd[1406]: cali9bec3db93d1: Gained carrier Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.404 [INFO][4315] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.420 [INFO][4315] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0 coredns-668d6bf9bc- kube-system 27ad9e03-63e4-4da6-b469-a2bbc29ee17c 807 0 2025-07-06 23:27:18 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 coredns-668d6bf9bc-bq59l eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9bec3db93d1 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Namespace="kube-system" Pod="coredns-668d6bf9bc-bq59l" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.421 [INFO][4315] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Namespace="kube-system" Pod="coredns-668d6bf9bc-bq59l" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.472 [INFO][4340] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" HandleID="k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.472 [INFO][4340] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" HandleID="k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b870), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"coredns-668d6bf9bc-bq59l", "timestamp":"2025-07-06 23:28:00.472162638 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 
23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.472 [INFO][4340] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.472 [INFO][4340] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.472 [INFO][4340] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.484 [INFO][4340] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.491 [INFO][4340] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.498 [INFO][4340] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.501 [INFO][4340] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.505 [INFO][4340] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.505 [INFO][4340] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.509 [INFO][4340] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9 Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.518 [INFO][4340] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.529 [INFO][4340] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.69/26] block=192.168.75.64/26 handle="k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.530 [INFO][4340] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.69/26] handle="k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.530 [INFO][4340] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:28:00.573735 containerd[1514]: 2025-07-06 23:28:00.530 [INFO][4340] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.69/26] IPv6=[] ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" HandleID="k8s-pod-network.e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" Jul 6 23:28:00.574572 containerd[1514]: 2025-07-06 23:28:00.540 [INFO][4315] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Namespace="kube-system" Pod="coredns-668d6bf9bc-bq59l" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"27ad9e03-63e4-4da6-b469-a2bbc29ee17c", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"coredns-668d6bf9bc-bq59l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.75.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9bec3db93d1", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:00.574572 containerd[1514]: 2025-07-06 23:28:00.540 [INFO][4315] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.69/32] ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Namespace="kube-system" Pod="coredns-668d6bf9bc-bq59l" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" Jul 6 23:28:00.574572 containerd[1514]: 2025-07-06 23:28:00.540 [INFO][4315] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9bec3db93d1 ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Namespace="kube-system" Pod="coredns-668d6bf9bc-bq59l" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" Jul 6 23:28:00.574572 containerd[1514]: 2025-07-06 23:28:00.547 [INFO][4315] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Namespace="kube-system" 
Pod="coredns-668d6bf9bc-bq59l" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" Jul 6 23:28:00.574572 containerd[1514]: 2025-07-06 23:28:00.549 [INFO][4315] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Namespace="kube-system" Pod="coredns-668d6bf9bc-bq59l" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"27ad9e03-63e4-4da6-b469-a2bbc29ee17c", ResourceVersion:"807", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 18, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9", Pod:"coredns-668d6bf9bc-bq59l", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.75.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9bec3db93d1", MAC:"16:44:2a:7d:4e:64", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:00.574572 containerd[1514]: 2025-07-06 23:28:00.570 [INFO][4315] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" Namespace="kube-system" Pod="coredns-668d6bf9bc-bq59l" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-coredns--668d6bf9bc--bq59l-eth0" Jul 6 23:28:00.605058 containerd[1514]: time="2025-07-06T23:28:00.604418772Z" level=info msg="connecting to shim e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9" address="unix:///run/containerd/s/03b93e043bbc0699ba9e3ecb858729d68f6ec44b682ef0d559e676d054fbb00d" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:28:00.653089 systemd[1]: Started cri-containerd-e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9.scope - libcontainer container e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9. 
Jul 6 23:28:00.676875 systemd-networkd[1406]: califce12591438: Link UP Jul 6 23:28:00.680148 kubelet[2667]: I0706 23:28:00.677593 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-4hbsb" podStartSLOduration=42.67727089 podStartE2EDuration="42.67727089s" podCreationTimestamp="2025-07-06 23:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:28:00.675328727 +0000 UTC m=+47.456989099" watchObservedRunningTime="2025-07-06 23:28:00.67727089 +0000 UTC m=+47.458931222" Jul 6 23:28:00.678370 systemd-networkd[1406]: califce12591438: Gained carrier Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.413 [INFO][4325] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.430 [INFO][4325] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0 calico-apiserver-5c7f48b6db- calico-apiserver 6645371b-fc64-4f6a-96ea-92b175539b84 813 0 2025-07-06 23:27:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c7f48b6db projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 calico-apiserver-5c7f48b6db-cvkmp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] califce12591438 [] [] }} ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-cvkmp" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.430 [INFO][4325] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-cvkmp" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.472 [INFO][4345] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" HandleID="k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.475 [INFO][4345] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" HandleID="k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2f60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"calico-apiserver-5c7f48b6db-cvkmp", "timestamp":"2025-07-06 23:28:00.472609119 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), 
IntendedUse:"Workload"} Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.475 [INFO][4345] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.530 [INFO][4345] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.530 [INFO][4345] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.585 [INFO][4345] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.595 [INFO][4345] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.613 [INFO][4345] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.619 [INFO][4345] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.625 [INFO][4345] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.625 [INFO][4345] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.628 [INFO][4345] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.643 [INFO][4345] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.659 [INFO][4345] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.70/26] block=192.168.75.64/26 handle="k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.659 [INFO][4345] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.70/26] handle="k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.659 [INFO][4345] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:28:00.708786 containerd[1514]: 2025-07-06 23:28:00.659 [INFO][4345] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.70/26] IPv6=[] ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" HandleID="k8s-pod-network.3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" Jul 6 23:28:00.709339 containerd[1514]: 2025-07-06 23:28:00.664 [INFO][4325] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-cvkmp" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0", GenerateName:"calico-apiserver-5c7f48b6db-", Namespace:"calico-apiserver", SelfLink:"", UID:"6645371b-fc64-4f6a-96ea-92b175539b84", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c7f48b6db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"calico-apiserver-5c7f48b6db-cvkmp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.75.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califce12591438", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:00.709339 containerd[1514]: 2025-07-06 23:28:00.665 [INFO][4325] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.70/32] ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-cvkmp" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" Jul 6 23:28:00.709339 containerd[1514]: 2025-07-06 23:28:00.665 [INFO][4325] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califce12591438 ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-cvkmp" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" Jul 6 23:28:00.709339 containerd[1514]: 2025-07-06 23:28:00.679 [INFO][4325] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-cvkmp" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" Jul 6 23:28:00.709339 containerd[1514]: 2025-07-06 23:28:00.681 
[INFO][4325] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-cvkmp" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0", GenerateName:"calico-apiserver-5c7f48b6db-", Namespace:"calico-apiserver", SelfLink:"", UID:"6645371b-fc64-4f6a-96ea-92b175539b84", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c7f48b6db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d", Pod:"calico-apiserver-5c7f48b6db-cvkmp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.75.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"califce12591438", MAC:"66:80:d1:a3:ea:70", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:00.709339 containerd[1514]: 2025-07-06 23:28:00.704 [INFO][4325] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-cvkmp" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--cvkmp-eth0" Jul 6 23:28:00.789688 containerd[1514]: time="2025-07-06T23:28:00.788857951Z" level=info msg="connecting to shim 3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d" address="unix:///run/containerd/s/a2647b6a4139d82d1961d5e81149081247dc9dbc1ebd62e03f51a68377b5b7ee" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:28:00.823331 systemd[1]: Started cri-containerd-3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d.scope - libcontainer container 3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d. 
Jul 6 23:28:00.829228 containerd[1514]: time="2025-07-06T23:28:00.828953816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-bq59l,Uid:27ad9e03-63e4-4da6-b469-a2bbc29ee17c,Namespace:kube-system,Attempt:0,} returns sandbox id \"e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9\"" Jul 6 23:28:00.846239 containerd[1514]: time="2025-07-06T23:28:00.846074604Z" level=info msg="CreateContainer within sandbox \"e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 6 23:28:00.856553 containerd[1514]: time="2025-07-06T23:28:00.856376661Z" level=info msg="Container 3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:00.864164 systemd-networkd[1406]: cali2121a5177ab: Gained IPv6LL Jul 6 23:28:00.873014 containerd[1514]: time="2025-07-06T23:28:00.872910967Z" level=info msg="CreateContainer within sandbox \"e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62\"" Jul 6 23:28:00.880454 containerd[1514]: time="2025-07-06T23:28:00.880226499Z" level=info msg="StartContainer for \"3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62\"" Jul 6 23:28:00.885443 containerd[1514]: time="2025-07-06T23:28:00.885403908Z" level=info msg="connecting to shim 3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62" address="unix:///run/containerd/s/03b93e043bbc0699ba9e3ecb858729d68f6ec44b682ef0d559e676d054fbb00d" protocol=ttrpc version=3 Jul 6 23:28:00.929044 systemd[1]: Started cri-containerd-3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62.scope - libcontainer container 3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62. 
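At this point both CoreDNS pods (192.168.75.68 and 192.168.75.69, per the IPAM results above) have been created and started. A quick way to confirm they actually answer is to point a resolver straight at one pod IP on port 53 and to probe the metrics port 9153; the hedged sketch below does that with the standard library. The query name, and the assumption that the pod IPs are routable from wherever this runs (for instance, on the node itself), are mine rather than the log's.

// coredns_probe.go — a hedged check, assuming 192.168.75.69 (claimed above)
// is reachable from here and that the cluster uses the default cluster.local
// domain. It sends one DNS query to the pod and one TCP probe to its
// metrics port (9153, per the WorkloadEndpoint dump).
package main

import (
	"context"
	"fmt"
	"net"
	"time"
)

func main() {
	const podIP = "192.168.75.69"

	r := &net.Resolver{
		PreferGo: true,
		Dial: func(ctx context.Context, network, _ string) (net.Conn, error) {
			d := net.Dialer{Timeout: 2 * time.Second}
			return d.DialContext(ctx, network, net.JoinHostPort(podIP, "53"))
		},
	}
	ctx, cancel := context.WithTimeout(context.Background(), 3*time.Second)
	defer cancel()
	addrs, err := r.LookupHost(ctx, "kubernetes.default.svc.cluster.local")
	fmt.Println("dns:", addrs, err)

	c, err := net.DialTimeout("tcp", net.JoinHostPort(podIP, "9153"), 2*time.Second)
	if err == nil {
		c.Close()
	}
	fmt.Println("metrics port open:", err == nil)
}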
Jul 6 23:28:01.049128 containerd[1514]: time="2025-07-06T23:28:01.048988008Z" level=info msg="StartContainer for \"3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62\" returns successfully" Jul 6 23:28:01.272340 containerd[1514]: time="2025-07-06T23:28:01.272288587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f48b6db-cvkmp,Uid:6645371b-fc64-4f6a-96ea-92b175539b84,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d\"" Jul 6 23:28:01.364758 containerd[1514]: time="2025-07-06T23:28:01.364589407Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59c54d7dc-bp45t,Uid:c35acb70-7389-4854-97f5-ab45ab94f737,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:28:01.366481 containerd[1514]: time="2025-07-06T23:28:01.366127930Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwdth,Uid:6cccb974-c600-4611-9cf6-2ebb27d5d999,Namespace:calico-system,Attempt:0,}" Jul 6 23:28:01.368115 containerd[1514]: time="2025-07-06T23:28:01.367019131Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-r6dz8,Uid:12a2e46a-f038-4c8d-8ff0-480027547cbb,Namespace:calico-system,Attempt:0,}" Jul 6 23:28:01.632017 systemd-networkd[1406]: cali9bec3db93d1: Gained IPv6LL Jul 6 23:28:01.683741 systemd-networkd[1406]: cali45af94bc807: Link UP Jul 6 23:28:01.684013 systemd-networkd[1406]: cali45af94bc807: Gained carrier Jul 6 23:28:01.694831 systemd-networkd[1406]: cali60625f68efb: Gained IPv6LL Jul 6 23:28:01.716040 kubelet[2667]: I0706 23:28:01.715964 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-bq59l" podStartSLOduration=43.715946541 podStartE2EDuration="43.715946541s" podCreationTimestamp="2025-07-06 23:27:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:28:01.704742364 +0000 UTC m=+48.486402736" watchObservedRunningTime="2025-07-06 23:28:01.715946541 +0000 UTC m=+48.497606913" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.467 [INFO][4523] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.497 [INFO][4523] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0 calico-apiserver-59c54d7dc- calico-apiserver c35acb70-7389-4854-97f5-ab45ab94f737 818 0 2025-07-06 23:27:29 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59c54d7dc projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 calico-apiserver-59c54d7dc-bp45t eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali45af94bc807 [] [] }} ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-bp45t" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.497 [INFO][4523] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" 
Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-bp45t" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.578 [INFO][4569] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.578 [INFO][4569] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002caff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"calico-apiserver-59c54d7dc-bp45t", "timestamp":"2025-07-06 23:28:01.578300412 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.578 [INFO][4569] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.578 [INFO][4569] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.578 [INFO][4569] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.597 [INFO][4569] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.623 [INFO][4569] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.634 [INFO][4569] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.639 [INFO][4569] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.645 [INFO][4569] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.645 [INFO][4569] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.648 [INFO][4569] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112 Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.656 [INFO][4569] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 
handle="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.667 [INFO][4569] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.71/26] block=192.168.75.64/26 handle="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.667 [INFO][4569] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.71/26] handle="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.667 [INFO][4569] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 6 23:28:01.725058 containerd[1514]: 2025-07-06 23:28:01.667 [INFO][4569] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.71/26] IPv6=[] ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:01.727480 containerd[1514]: 2025-07-06 23:28:01.674 [INFO][4523] cni-plugin/k8s.go 418: Populated endpoint ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-bp45t" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0", GenerateName:"calico-apiserver-59c54d7dc-", Namespace:"calico-apiserver", SelfLink:"", UID:"c35acb70-7389-4854-97f5-ab45ab94f737", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59c54d7dc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"calico-apiserver-59c54d7dc-bp45t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.75.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali45af94bc807", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:01.727480 containerd[1514]: 2025-07-06 23:28:01.675 [INFO][4523] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.71/32] ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-bp45t" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:01.727480 containerd[1514]: 2025-07-06 
23:28:01.676 [INFO][4523] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali45af94bc807 ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-bp45t" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:01.727480 containerd[1514]: 2025-07-06 23:28:01.678 [INFO][4523] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-bp45t" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:01.727480 containerd[1514]: 2025-07-06 23:28:01.681 [INFO][4523] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-bp45t" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0", GenerateName:"calico-apiserver-59c54d7dc-", Namespace:"calico-apiserver", SelfLink:"", UID:"c35acb70-7389-4854-97f5-ab45ab94f737", ResourceVersion:"818", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59c54d7dc", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112", Pod:"calico-apiserver-59c54d7dc-bp45t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.75.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali45af94bc807", MAC:"06:5d:da:ff:51:b8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:01.727480 containerd[1514]: 2025-07-06 23:28:01.718 [INFO][4523] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Namespace="calico-apiserver" Pod="calico-apiserver-59c54d7dc-bp45t" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:01.792591 containerd[1514]: time="2025-07-06T23:28:01.792544618Z" level=info msg="connecting to shim 55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" address="unix:///run/containerd/s/c3580d82361683f8d748275da254f3e4cb9252ce20aa345c026e60a569ff0ffb" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:28:01.830024 systemd-networkd[1406]: calidab8520f841: Link UP Jul 6 23:28:01.830783 systemd-networkd[1406]: 
calidab8520f841: Gained carrier Jul 6 23:28:01.871873 systemd[1]: Started cri-containerd-55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112.scope - libcontainer container 55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112. Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.478 [INFO][4532] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.522 [INFO][4532] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0 csi-node-driver- calico-system 6cccb974-c600-4611-9cf6-2ebb27d5d999 709 0 2025-07-06 23:27:36 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 csi-node-driver-xwdth eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calidab8520f841 [] [] }} ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Namespace="calico-system" Pod="csi-node-driver-xwdth" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.522 [INFO][4532] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Namespace="calico-system" Pod="csi-node-driver-xwdth" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.578 [INFO][4575] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" HandleID="k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.580 [INFO][4575] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" HandleID="k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002caf60), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"csi-node-driver-xwdth", "timestamp":"2025-07-06 23:28:01.578209612 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.580 [INFO][4575] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.667 [INFO][4575] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.668 [INFO][4575] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.698 [INFO][4575] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.724 [INFO][4575] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.746 [INFO][4575] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.755 [INFO][4575] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.765 [INFO][4575] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.765 [INFO][4575] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.783 [INFO][4575] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.795 [INFO][4575] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.813 [INFO][4575] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.72/26] block=192.168.75.64/26 handle="k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.813 [INFO][4575] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.72/26] handle="k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.813 [INFO][4575] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:28:01.874788 containerd[1514]: 2025-07-06 23:28:01.813 [INFO][4575] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.72/26] IPv6=[] ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" HandleID="k8s-pod-network.3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" Jul 6 23:28:01.876924 containerd[1514]: 2025-07-06 23:28:01.825 [INFO][4532] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Namespace="calico-system" Pod="csi-node-driver-xwdth" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6cccb974-c600-4611-9cf6-2ebb27d5d999", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"csi-node-driver-xwdth", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.75.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidab8520f841", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:01.876924 containerd[1514]: 2025-07-06 23:28:01.826 [INFO][4532] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.72/32] ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Namespace="calico-system" Pod="csi-node-driver-xwdth" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" Jul 6 23:28:01.876924 containerd[1514]: 2025-07-06 23:28:01.826 [INFO][4532] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidab8520f841 ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Namespace="calico-system" Pod="csi-node-driver-xwdth" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" Jul 6 23:28:01.876924 containerd[1514]: 2025-07-06 23:28:01.830 [INFO][4532] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Namespace="calico-system" Pod="csi-node-driver-xwdth" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" Jul 6 23:28:01.876924 containerd[1514]: 2025-07-06 23:28:01.837 [INFO][4532] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Namespace="calico-system" Pod="csi-node-driver-xwdth" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6cccb974-c600-4611-9cf6-2ebb27d5d999", ResourceVersion:"709", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e", Pod:"csi-node-driver-xwdth", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.75.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calidab8520f841", MAC:"be:08:08:df:37:85", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:01.876924 containerd[1514]: 2025-07-06 23:28:01.866 [INFO][4532] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" Namespace="calico-system" Pod="csi-node-driver-xwdth" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-csi--node--driver--xwdth-eth0" Jul 6 23:28:01.946085 systemd-networkd[1406]: cali369d48a0181: Link UP Jul 6 23:28:01.947241 systemd-networkd[1406]: cali369d48a0181: Gained carrier Jul 6 23:28:01.961875 containerd[1514]: time="2025-07-06T23:28:01.961832475Z" level=info msg="connecting to shim 3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e" address="unix:///run/containerd/s/fc54eabd9530bc2b9114a354e98b7e41c9e3651a8a0fdd07940a1accfee3e6f3" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.456 [INFO][4536] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.488 [INFO][4536] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0 goldmane-768f4c5c69- calico-system 12a2e46a-f038-4c8d-8ff0-480027547cbb 816 0 2025-07-06 23:27:35 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 goldmane-768f4c5c69-r6dz8 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali369d48a0181 [] [] }} 
ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Namespace="calico-system" Pod="goldmane-768f4c5c69-r6dz8" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.488 [INFO][4536] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Namespace="calico-system" Pod="goldmane-768f4c5c69-r6dz8" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.592 [INFO][4564] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" HandleID="k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.593 [INFO][4564] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" HandleID="k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003179d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"goldmane-768f4c5c69-r6dz8", "timestamp":"2025-07-06 23:28:01.592809674 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.593 [INFO][4564] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.814 [INFO][4564] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.814 [INFO][4564] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.845 [INFO][4564] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.867 [INFO][4564] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.880 [INFO][4564] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.891 [INFO][4564] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.897 [INFO][4564] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.897 [INFO][4564] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.902 [INFO][4564] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4 Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.911 [INFO][4564] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.926 [INFO][4564] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.73/26] block=192.168.75.64/26 handle="k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.926 [INFO][4564] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.73/26] handle="k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.927 [INFO][4564] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:28:01.977478 containerd[1514]: 2025-07-06 23:28:01.927 [INFO][4564] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.73/26] IPv6=[] ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" HandleID="k8s-pod-network.55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" Jul 6 23:28:01.978550 containerd[1514]: 2025-07-06 23:28:01.939 [INFO][4536] cni-plugin/k8s.go 418: Populated endpoint ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Namespace="calico-system" Pod="goldmane-768f4c5c69-r6dz8" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"12a2e46a-f038-4c8d-8ff0-480027547cbb", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"goldmane-768f4c5c69-r6dz8", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.75.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali369d48a0181", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:01.978550 containerd[1514]: 2025-07-06 23:28:01.939 [INFO][4536] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.73/32] ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Namespace="calico-system" Pod="goldmane-768f4c5c69-r6dz8" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" Jul 6 23:28:01.978550 containerd[1514]: 2025-07-06 23:28:01.939 [INFO][4536] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali369d48a0181 ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Namespace="calico-system" Pod="goldmane-768f4c5c69-r6dz8" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" Jul 6 23:28:01.978550 containerd[1514]: 2025-07-06 23:28:01.946 [INFO][4536] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Namespace="calico-system" Pod="goldmane-768f4c5c69-r6dz8" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" Jul 6 23:28:01.978550 containerd[1514]: 2025-07-06 23:28:01.948 [INFO][4536] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" 
Namespace="calico-system" Pod="goldmane-768f4c5c69-r6dz8" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"12a2e46a-f038-4c8d-8ff0-480027547cbb", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 27, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4", Pod:"goldmane-768f4c5c69-r6dz8", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.75.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali369d48a0181", MAC:"ce:06:13:04:49:93", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:01.978550 containerd[1514]: 2025-07-06 23:28:01.971 [INFO][4536] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" Namespace="calico-system" Pod="goldmane-768f4c5c69-r6dz8" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-goldmane--768f4c5c69--r6dz8-eth0" Jul 6 23:28:01.991910 containerd[1514]: time="2025-07-06T23:28:01.991838760Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59c54d7dc-bp45t,Uid:c35acb70-7389-4854-97f5-ab45ab94f737,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\"" Jul 6 23:28:01.993639 containerd[1514]: time="2025-07-06T23:28:01.992883722Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:02.000644 containerd[1514]: time="2025-07-06T23:28:02.000608934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Jul 6 23:28:02.005838 containerd[1514]: time="2025-07-06T23:28:02.005800301Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:02.007996 systemd[1]: Started cri-containerd-3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e.scope - libcontainer container 3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e. 
Jul 6 23:28:02.020847 containerd[1514]: time="2025-07-06T23:28:02.020807203Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:02.024516 containerd[1514]: time="2025-07-06T23:28:02.024454728Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 4.462407769s" Jul 6 23:28:02.025919 containerd[1514]: time="2025-07-06T23:28:02.025884210Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Jul 6 23:28:02.033933 containerd[1514]: time="2025-07-06T23:28:02.033494381Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 6 23:28:02.041512 containerd[1514]: time="2025-07-06T23:28:02.041373872Z" level=info msg="CreateContainer within sandbox \"bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 6 23:28:02.057727 containerd[1514]: time="2025-07-06T23:28:02.057615095Z" level=info msg="Container 50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:02.076949 containerd[1514]: time="2025-07-06T23:28:02.076848722Z" level=info msg="connecting to shim 55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4" address="unix:///run/containerd/s/aeda64f88ef02a7173252274719e45345fc9cdbecf875fe9635f1abbb73b4a50" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:28:02.082446 containerd[1514]: time="2025-07-06T23:28:02.082410210Z" level=info msg="CreateContainer within sandbox \"bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03\"" Jul 6 23:28:02.086411 containerd[1514]: time="2025-07-06T23:28:02.086374776Z" level=info msg="StartContainer for \"50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03\"" Jul 6 23:28:02.090672 containerd[1514]: time="2025-07-06T23:28:02.090622982Z" level=info msg="connecting to shim 50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03" address="unix:///run/containerd/s/84d1da03587149a3b2a987ea16394b79e421db7e0cdeb37deb726f10fb283b0f" protocol=ttrpc version=3 Jul 6 23:28:02.091410 containerd[1514]: time="2025-07-06T23:28:02.091384303Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xwdth,Uid:6cccb974-c600-4611-9cf6-2ebb27d5d999,Namespace:calico-system,Attempt:0,} returns sandbox id \"3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e\"" Jul 6 23:28:02.107900 systemd[1]: Started cri-containerd-55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4.scope - libcontainer container 55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4. 
Jul 6 23:28:02.117898 systemd[1]: Started cri-containerd-50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03.scope - libcontainer container 50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03. Jul 6 23:28:02.143916 systemd-networkd[1406]: califce12591438: Gained IPv6LL Jul 6 23:28:02.194602 containerd[1514]: time="2025-07-06T23:28:02.194426210Z" level=info msg="StartContainer for \"50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03\" returns successfully" Jul 6 23:28:02.200162 containerd[1514]: time="2025-07-06T23:28:02.199822937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-r6dz8,Uid:12a2e46a-f038-4c8d-8ff0-480027547cbb,Namespace:calico-system,Attempt:0,} returns sandbox id \"55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4\"" Jul 6 23:28:02.381105 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2065725686.mount: Deactivated successfully. Jul 6 23:28:02.704246 kubelet[2667]: I0706 23:28:02.703566 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6cb7d4d7d9-tp4hb" podStartSLOduration=2.110649855 podStartE2EDuration="8.703545375s" podCreationTimestamp="2025-07-06 23:27:54 +0000 UTC" firstStartedPulling="2025-07-06 23:27:55.436316734 +0000 UTC m=+42.217977066" lastFinishedPulling="2025-07-06 23:28:02.029212134 +0000 UTC m=+48.810872586" observedRunningTime="2025-07-06 23:28:02.702661214 +0000 UTC m=+49.484321626" watchObservedRunningTime="2025-07-06 23:28:02.703545375 +0000 UTC m=+49.485205747" Jul 6 23:28:02.782923 systemd-networkd[1406]: cali45af94bc807: Gained IPv6LL Jul 6 23:28:02.911020 systemd-networkd[1406]: calidab8520f841: Gained IPv6LL Jul 6 23:28:03.616608 systemd-networkd[1406]: cali369d48a0181: Gained IPv6LL Jul 6 23:28:04.309000 kubelet[2667]: I0706 23:28:04.308799 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:28:05.528066 systemd-networkd[1406]: vxlan.calico: Link UP Jul 6 23:28:05.529701 systemd-networkd[1406]: vxlan.calico: Gained carrier Jul 6 23:28:05.579895 containerd[1514]: time="2025-07-06T23:28:05.579844939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:05.582375 containerd[1514]: time="2025-07-06T23:28:05.582285336Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Jul 6 23:28:05.583489 containerd[1514]: time="2025-07-06T23:28:05.583445135Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:05.589772 containerd[1514]: time="2025-07-06T23:28:05.589541330Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:05.590629 containerd[1514]: time="2025-07-06T23:28:05.590280929Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 3.556747708s" Jul 6 
23:28:05.590629 containerd[1514]: time="2025-07-06T23:28:05.590331849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Jul 6 23:28:05.593651 containerd[1514]: time="2025-07-06T23:28:05.593599686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 6 23:28:05.622361 containerd[1514]: time="2025-07-06T23:28:05.622155018Z" level=info msg="CreateContainer within sandbox \"08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 6 23:28:05.648840 containerd[1514]: time="2025-07-06T23:28:05.648788713Z" level=info msg="Container 9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:05.656742 containerd[1514]: time="2025-07-06T23:28:05.656689465Z" level=info msg="CreateContainer within sandbox \"08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\"" Jul 6 23:28:05.657814 containerd[1514]: time="2025-07-06T23:28:05.657771824Z" level=info msg="StartContainer for \"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\"" Jul 6 23:28:05.659580 containerd[1514]: time="2025-07-06T23:28:05.659527783Z" level=info msg="connecting to shim 9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c" address="unix:///run/containerd/s/0abe4d7f4e996f47a40ef579558bfd17dbbc3cb141296e76a3b045975b6069be" protocol=ttrpc version=3 Jul 6 23:28:05.693051 systemd[1]: Started cri-containerd-9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c.scope - libcontainer container 9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c. 
Jul 6 23:28:05.777830 containerd[1514]: time="2025-07-06T23:28:05.777788470Z" level=info msg="StartContainer for \"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" returns successfully" Jul 6 23:28:06.035720 kubelet[2667]: I0706 23:28:06.033766 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:28:06.154951 containerd[1514]: time="2025-07-06T23:28:06.154889433Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"66e1208f0bf0b491dc734f6aef037aeb544d65833602624c71335d2399484281\" pid:5022 exit_status:1 exited_at:{seconds:1751844486 nanos:153900394}" Jul 6 23:28:06.266822 containerd[1514]: time="2025-07-06T23:28:06.266772569Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"a0d4f8475856fa0913e0523cb96a1daaf361dcf8dfce6de57918d00b6cc3a978\" pid:5049 exited_at:{seconds:1751844486 nanos:266211450}" Jul 6 23:28:06.746335 kubelet[2667]: I0706 23:28:06.746153 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-695459cb77-m8dwz" podStartSLOduration=23.863578088 podStartE2EDuration="30.746127324s" podCreationTimestamp="2025-07-06 23:27:36 +0000 UTC" firstStartedPulling="2025-07-06 23:27:58.71031517 +0000 UTC m=+45.491975542" lastFinishedPulling="2025-07-06 23:28:05.592864406 +0000 UTC m=+52.374524778" observedRunningTime="2025-07-06 23:28:06.743420807 +0000 UTC m=+53.525081219" watchObservedRunningTime="2025-07-06 23:28:06.746127324 +0000 UTC m=+53.527787776" Jul 6 23:28:06.785721 containerd[1514]: time="2025-07-06T23:28:06.785604527Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"ef2879a77c311c697c205507cf5919847999ddc6025b78babb848fd0e02c6d31\" pid:5075 exited_at:{seconds:1751844486 nanos:785079768}" Jul 6 23:28:06.942914 systemd-networkd[1406]: vxlan.calico: Gained IPv6LL Jul 6 23:28:09.421433 containerd[1514]: time="2025-07-06T23:28:09.421320347Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:09.423420 containerd[1514]: time="2025-07-06T23:28:09.423331145Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149" Jul 6 23:28:09.424711 containerd[1514]: time="2025-07-06T23:28:09.423844384Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:09.426814 containerd[1514]: time="2025-07-06T23:28:09.426760142Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:09.427887 containerd[1514]: time="2025-07-06T23:28:09.427790501Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 3.834139335s" Jul 6 23:28:09.428047 containerd[1514]: 
time="2025-07-06T23:28:09.428028141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 6 23:28:09.429506 containerd[1514]: time="2025-07-06T23:28:09.429463180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 6 23:28:09.435643 containerd[1514]: time="2025-07-06T23:28:09.435462455Z" level=info msg="CreateContainer within sandbox \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 6 23:28:09.450368 containerd[1514]: time="2025-07-06T23:28:09.449422163Z" level=info msg="Container 9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:09.465360 containerd[1514]: time="2025-07-06T23:28:09.465278629Z" level=info msg="CreateContainer within sandbox \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\"" Jul 6 23:28:09.466837 containerd[1514]: time="2025-07-06T23:28:09.466768388Z" level=info msg="StartContainer for \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\"" Jul 6 23:28:09.470113 containerd[1514]: time="2025-07-06T23:28:09.470058185Z" level=info msg="connecting to shim 9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d" address="unix:///run/containerd/s/c25141bf31792c0528f2b22f4b18f026b37bb59382a0dd0b99018f7ec92278e2" protocol=ttrpc version=3 Jul 6 23:28:09.499052 systemd[1]: Started cri-containerd-9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d.scope - libcontainer container 9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d. 
Jul 6 23:28:09.557254 containerd[1514]: time="2025-07-06T23:28:09.557210951Z" level=info msg="StartContainer for \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" returns successfully" Jul 6 23:28:09.752777 kubelet[2667]: I0706 23:28:09.752625 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59c54d7dc-sddts" podStartSLOduration=31.172898937 podStartE2EDuration="40.752605504s" podCreationTimestamp="2025-07-06 23:27:29 +0000 UTC" firstStartedPulling="2025-07-06 23:27:59.849519373 +0000 UTC m=+46.631179705" lastFinishedPulling="2025-07-06 23:28:09.4292259 +0000 UTC m=+56.210886272" observedRunningTime="2025-07-06 23:28:09.749473506 +0000 UTC m=+56.531133998" watchObservedRunningTime="2025-07-06 23:28:09.752605504 +0000 UTC m=+56.534265876" Jul 6 23:28:09.804820 containerd[1514]: time="2025-07-06T23:28:09.804763939Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:09.806406 containerd[1514]: time="2025-07-06T23:28:09.806328018Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 6 23:28:09.809631 containerd[1514]: time="2025-07-06T23:28:09.809497095Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 379.967515ms" Jul 6 23:28:09.809631 containerd[1514]: time="2025-07-06T23:28:09.809575495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 6 23:28:09.811720 containerd[1514]: time="2025-07-06T23:28:09.811534493Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 6 23:28:09.817630 containerd[1514]: time="2025-07-06T23:28:09.816879449Z" level=info msg="CreateContainer within sandbox \"3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 6 23:28:09.833988 containerd[1514]: time="2025-07-06T23:28:09.833923754Z" level=info msg="Container 5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:09.842260 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3760469843.mount: Deactivated successfully. 
Jul 6 23:28:09.888518 containerd[1514]: time="2025-07-06T23:28:09.888469068Z" level=info msg="CreateContainer within sandbox \"3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264\"" Jul 6 23:28:09.890642 containerd[1514]: time="2025-07-06T23:28:09.889873106Z" level=info msg="StartContainer for \"5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264\"" Jul 6 23:28:09.895585 containerd[1514]: time="2025-07-06T23:28:09.895493702Z" level=info msg="connecting to shim 5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264" address="unix:///run/containerd/s/a2647b6a4139d82d1961d5e81149081247dc9dbc1ebd62e03f51a68377b5b7ee" protocol=ttrpc version=3 Jul 6 23:28:09.928897 systemd[1]: Started cri-containerd-5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264.scope - libcontainer container 5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264. Jul 6 23:28:09.984823 containerd[1514]: time="2025-07-06T23:28:09.984652585Z" level=info msg="StartContainer for \"5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264\" returns successfully" Jul 6 23:28:10.210650 containerd[1514]: time="2025-07-06T23:28:10.209155799Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:10.210650 containerd[1514]: time="2025-07-06T23:28:10.209905798Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 6 23:28:10.214007 containerd[1514]: time="2025-07-06T23:28:10.213963995Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 402.103742ms" Jul 6 23:28:10.214007 containerd[1514]: time="2025-07-06T23:28:10.214006474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 6 23:28:10.215929 containerd[1514]: time="2025-07-06T23:28:10.215884713Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 6 23:28:10.217007 containerd[1514]: time="2025-07-06T23:28:10.216971072Z" level=info msg="CreateContainer within sandbox \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 6 23:28:10.237624 containerd[1514]: time="2025-07-06T23:28:10.236947775Z" level=info msg="Container e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:10.250709 containerd[1514]: time="2025-07-06T23:28:10.250294564Z" level=info msg="CreateContainer within sandbox \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\"" Jul 6 23:28:10.252074 containerd[1514]: time="2025-07-06T23:28:10.252028163Z" level=info msg="StartContainer for \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\"" Jul 6 23:28:10.254656 containerd[1514]: 
time="2025-07-06T23:28:10.254625561Z" level=info msg="connecting to shim e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374" address="unix:///run/containerd/s/c3580d82361683f8d748275da254f3e4cb9252ce20aa345c026e60a569ff0ffb" protocol=ttrpc version=3 Jul 6 23:28:10.278472 systemd[1]: Started cri-containerd-e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374.scope - libcontainer container e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374. Jul 6 23:28:10.354620 containerd[1514]: time="2025-07-06T23:28:10.354585078Z" level=info msg="StartContainer for \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" returns successfully" Jul 6 23:28:10.784086 kubelet[2667]: I0706 23:28:10.783809 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c7f48b6db-cvkmp" podStartSLOduration=31.251189504 podStartE2EDuration="39.783789361s" podCreationTimestamp="2025-07-06 23:27:31 +0000 UTC" firstStartedPulling="2025-07-06 23:28:01.278482637 +0000 UTC m=+48.060143009" lastFinishedPulling="2025-07-06 23:28:09.811082494 +0000 UTC m=+56.592742866" observedRunningTime="2025-07-06 23:28:10.757804183 +0000 UTC m=+57.539464555" watchObservedRunningTime="2025-07-06 23:28:10.783789361 +0000 UTC m=+57.565449733" Jul 6 23:28:11.752114 kubelet[2667]: I0706 23:28:11.751645 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:28:11.752114 kubelet[2667]: I0706 23:28:11.751646 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:28:11.753038 kubelet[2667]: I0706 23:28:11.752486 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:28:12.144699 containerd[1514]: time="2025-07-06T23:28:12.143949620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:12.146473 containerd[1514]: time="2025-07-06T23:28:12.146424858Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Jul 6 23:28:12.148158 containerd[1514]: time="2025-07-06T23:28:12.147611217Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:12.151487 containerd[1514]: time="2025-07-06T23:28:12.150793935Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:12.151487 containerd[1514]: time="2025-07-06T23:28:12.151325014Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 1.935396741s" Jul 6 23:28:12.151487 containerd[1514]: time="2025-07-06T23:28:12.151364654Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Jul 6 23:28:12.154387 containerd[1514]: time="2025-07-06T23:28:12.154361452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 6 23:28:12.157570 containerd[1514]: 
time="2025-07-06T23:28:12.156786370Z" level=info msg="CreateContainer within sandbox \"3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 6 23:28:12.178878 containerd[1514]: time="2025-07-06T23:28:12.178836513Z" level=info msg="Container ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:12.197046 containerd[1514]: time="2025-07-06T23:28:12.197001578Z" level=info msg="CreateContainer within sandbox \"3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781\"" Jul 6 23:28:12.198868 containerd[1514]: time="2025-07-06T23:28:12.198828297Z" level=info msg="StartContainer for \"ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781\"" Jul 6 23:28:12.200457 containerd[1514]: time="2025-07-06T23:28:12.200421936Z" level=info msg="connecting to shim ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781" address="unix:///run/containerd/s/fc54eabd9530bc2b9114a354e98b7e41c9e3651a8a0fdd07940a1accfee3e6f3" protocol=ttrpc version=3 Jul 6 23:28:12.244075 systemd[1]: Started cri-containerd-ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781.scope - libcontainer container ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781. Jul 6 23:28:12.434759 containerd[1514]: time="2025-07-06T23:28:12.434338112Z" level=info msg="StartContainer for \"ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781\" returns successfully" Jul 6 23:28:12.608894 kubelet[2667]: I0706 23:28:12.607425 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59c54d7dc-bp45t" podStartSLOduration=35.388288709 podStartE2EDuration="43.607406416s" podCreationTimestamp="2025-07-06 23:27:29 +0000 UTC" firstStartedPulling="2025-07-06 23:28:01.995911927 +0000 UTC m=+48.777572299" lastFinishedPulling="2025-07-06 23:28:10.215029634 +0000 UTC m=+56.996690006" observedRunningTime="2025-07-06 23:28:10.787002758 +0000 UTC m=+57.568663090" watchObservedRunningTime="2025-07-06 23:28:12.607406416 +0000 UTC m=+59.389066828" Jul 6 23:28:12.777957 kubelet[2667]: I0706 23:28:12.777039 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:28:15.784583 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1077467988.mount: Deactivated successfully. Jul 6 23:28:16.053635 kubelet[2667]: I0706 23:28:16.053344 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:28:16.066482 containerd[1514]: time="2025-07-06T23:28:16.066346626Z" level=info msg="StopContainer for \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" with timeout 30 (s)" Jul 6 23:28:16.069887 containerd[1514]: time="2025-07-06T23:28:16.069812624Z" level=info msg="Stop container \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" with signal terminated" Jul 6 23:28:16.147857 systemd[1]: Created slice kubepods-besteffort-podc12feb6e_6e7f_4280_b0b6_9a4138a27d03.slice - libcontainer container kubepods-besteffort-podc12feb6e_6e7f_4280_b0b6_9a4138a27d03.slice. Jul 6 23:28:16.231534 systemd[1]: cri-containerd-e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374.scope: Deactivated successfully. 
Jul 6 23:28:16.252834 containerd[1514]: time="2025-07-06T23:28:16.252651135Z" level=info msg="received exit event container_id:\"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" id:\"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" pid:5174 exit_status:1 exited_at:{seconds:1751844496 nanos:252385375}" Jul 6 23:28:16.254081 containerd[1514]: time="2025-07-06T23:28:16.253965134Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" id:\"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" pid:5174 exit_status:1 exited_at:{seconds:1751844496 nanos:252385375}" Jul 6 23:28:16.275178 kubelet[2667]: I0706 23:28:16.274980 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqjsx\" (UniqueName: \"kubernetes.io/projected/c12feb6e-6e7f-4280-b0b6-9a4138a27d03-kube-api-access-tqjsx\") pod \"calico-apiserver-5c7f48b6db-4wqhl\" (UID: \"c12feb6e-6e7f-4280-b0b6-9a4138a27d03\") " pod="calico-apiserver/calico-apiserver-5c7f48b6db-4wqhl" Jul 6 23:28:16.276823 kubelet[2667]: I0706 23:28:16.276746 2667 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c12feb6e-6e7f-4280-b0b6-9a4138a27d03-calico-apiserver-certs\") pod \"calico-apiserver-5c7f48b6db-4wqhl\" (UID: \"c12feb6e-6e7f-4280-b0b6-9a4138a27d03\") " pod="calico-apiserver/calico-apiserver-5c7f48b6db-4wqhl" Jul 6 23:28:16.308001 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374-rootfs.mount: Deactivated successfully. Jul 6 23:28:16.458892 containerd[1514]: time="2025-07-06T23:28:16.458839469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f48b6db-4wqhl,Uid:c12feb6e-6e7f-4280-b0b6-9a4138a27d03,Namespace:calico-apiserver,Attempt:0,}" Jul 6 23:28:16.497461 containerd[1514]: time="2025-07-06T23:28:16.497414842Z" level=info msg="StopContainer for \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" returns successfully" Jul 6 23:28:16.498223 containerd[1514]: time="2025-07-06T23:28:16.498157882Z" level=info msg="StopPodSandbox for \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\"" Jul 6 23:28:16.498871 containerd[1514]: time="2025-07-06T23:28:16.498740761Z" level=info msg="Container to stop \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 6 23:28:16.525992 systemd[1]: cri-containerd-55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112.scope: Deactivated successfully. Jul 6 23:28:16.532367 containerd[1514]: time="2025-07-06T23:28:16.531730618Z" level=info msg="TaskExit event in podsandbox handler container_id:\"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" id:\"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" pid:4631 exit_status:137 exited_at:{seconds:1751844496 nanos:530632499}" Jul 6 23:28:16.628467 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112-rootfs.mount: Deactivated successfully. 
Jul 6 23:28:16.639238 containerd[1514]: time="2025-07-06T23:28:16.639157782Z" level=info msg="shim disconnected" id=55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112 namespace=k8s.io Jul 6 23:28:16.657828 containerd[1514]: time="2025-07-06T23:28:16.639201102Z" level=warning msg="cleaning up after shim disconnected" id=55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112 namespace=k8s.io Jul 6 23:28:16.657828 containerd[1514]: time="2025-07-06T23:28:16.657821209Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 6 23:28:16.754931 systemd-networkd[1406]: calic91edc58875: Link UP Jul 6 23:28:16.755075 systemd-networkd[1406]: calic91edc58875: Gained carrier Jul 6 23:28:16.803300 containerd[1514]: time="2025-07-06T23:28:16.803234946Z" level=info msg="received exit event sandbox_id:\"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" exit_status:137 exited_at:{seconds:1751844496 nanos:530632499}" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.564 [INFO][5293] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0 calico-apiserver-5c7f48b6db- calico-apiserver c12feb6e-6e7f-4280-b0b6-9a4138a27d03 1084 0 2025-07-06 23:28:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5c7f48b6db projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344-1-1-3-d8bdec45b1 calico-apiserver-5c7f48b6db-4wqhl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calic91edc58875 [] [] }} ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-4wqhl" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.564 [INFO][5293] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-4wqhl" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.666 [INFO][5326] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" HandleID="k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.666 [INFO][5326] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" HandleID="k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400032adc0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344-1-1-3-d8bdec45b1", "pod":"calico-apiserver-5c7f48b6db-4wqhl", "timestamp":"2025-07-06 23:28:16.665760083 +0000 UTC"}, Hostname:"ci-4344-1-1-3-d8bdec45b1", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, 
HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.667 [INFO][5326] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.667 [INFO][5326] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.667 [INFO][5326] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344-1-1-3-d8bdec45b1' Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.693 [INFO][5326] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.701 [INFO][5326] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.707 [INFO][5326] ipam/ipam.go 511: Trying affinity for 192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.710 [INFO][5326] ipam/ipam.go 158: Attempting to load block cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.716 [INFO][5326] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.75.64/26 host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.717 [INFO][5326] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.75.64/26 handle="k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.722 [INFO][5326] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4 Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.731 [INFO][5326] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.75.64/26 handle="k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.746 [INFO][5326] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.75.74/26] block=192.168.75.64/26 handle="k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.746 [INFO][5326] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.75.74/26] handle="k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" host="ci-4344-1-1-3-d8bdec45b1" Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.746 [INFO][5326] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 6 23:28:16.812190 containerd[1514]: 2025-07-06 23:28:16.746 [INFO][5326] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.75.74/26] IPv6=[] ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" HandleID="k8s-pod-network.0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" Jul 6 23:28:16.813034 containerd[1514]: 2025-07-06 23:28:16.750 [INFO][5293] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-4wqhl" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0", GenerateName:"calico-apiserver-5c7f48b6db-", Namespace:"calico-apiserver", SelfLink:"", UID:"c12feb6e-6e7f-4280-b0b6-9a4138a27d03", ResourceVersion:"1084", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 28, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c7f48b6db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"", Pod:"calico-apiserver-5c7f48b6db-4wqhl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.75.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic91edc58875", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:16.813034 containerd[1514]: 2025-07-06 23:28:16.750 [INFO][5293] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.75.74/32] ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-4wqhl" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" Jul 6 23:28:16.813034 containerd[1514]: 2025-07-06 23:28:16.750 [INFO][5293] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic91edc58875 ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-4wqhl" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" Jul 6 23:28:16.813034 containerd[1514]: 2025-07-06 23:28:16.755 [INFO][5293] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-4wqhl" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" Jul 6 23:28:16.813034 containerd[1514]: 2025-07-06 23:28:16.756 
[INFO][5293] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-4wqhl" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0", GenerateName:"calico-apiserver-5c7f48b6db-", Namespace:"calico-apiserver", SelfLink:"", UID:"c12feb6e-6e7f-4280-b0b6-9a4138a27d03", ResourceVersion:"1084", Generation:0, CreationTimestamp:time.Date(2025, time.July, 6, 23, 28, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5c7f48b6db", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344-1-1-3-d8bdec45b1", ContainerID:"0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4", Pod:"calico-apiserver-5c7f48b6db-4wqhl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.75.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calic91edc58875", MAC:"9a:2f:ec:77:ac:6e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 6 23:28:16.813034 containerd[1514]: 2025-07-06 23:28:16.801 [INFO][5293] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" Namespace="calico-apiserver" Pod="calico-apiserver-5c7f48b6db-4wqhl" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--5c7f48b6db--4wqhl-eth0" Jul 6 23:28:16.872702 containerd[1514]: time="2025-07-06T23:28:16.872627858Z" level=info msg="connecting to shim 0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4" address="unix:///run/containerd/s/5920588faab81fba5a05001a06508091e60bfc85e5c778d32606a1006b44de14" namespace=k8s.io protocol=ttrpc version=3 Jul 6 23:28:16.942845 systemd[1]: Started cri-containerd-0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4.scope - libcontainer container 0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4. 
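
The MAC recorded on the WorkloadEndpoint above, 9a:2f:ec:77:ac:6e, is a locally administered unicast address, which is what a generated (rather than vendor-assigned) interface address should look like. A quick check of the two low-order bits of the first octet (illustration only, not Calico's own generation code):

    mac = "9a:2f:ec:77:ac:6e"            # MAC from the endpoint written above
    first_octet = int(mac.split(":")[0], 16)

    print(bool(first_octet & 0x02))      # True: locally administered bit set
    print(bool(first_octet & 0x01))      # False: unicast, not multicast
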
Jul 6 23:28:17.008340 systemd-networkd[1406]: cali45af94bc807: Link DOWN Jul 6 23:28:17.008349 systemd-networkd[1406]: cali45af94bc807: Lost carrier Jul 6 23:28:17.151192 containerd[1514]: time="2025-07-06T23:28:17.151142624Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:17.152952 containerd[1514]: time="2025-07-06T23:28:17.152874783Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Jul 6 23:28:17.154933 containerd[1514]: time="2025-07-06T23:28:17.154414902Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:17.158847 containerd[1514]: time="2025-07-06T23:28:17.158730859Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:17.160659 containerd[1514]: time="2025-07-06T23:28:17.160449378Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 5.005658727s" Jul 6 23:28:17.160659 containerd[1514]: time="2025-07-06T23:28:17.160489578Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Jul 6 23:28:17.163490 containerd[1514]: time="2025-07-06T23:28:17.163444056Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 6 23:28:17.167852 containerd[1514]: time="2025-07-06T23:28:17.167018893Z" level=info msg="CreateContainer within sandbox \"55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 6 23:28:17.179593 containerd[1514]: time="2025-07-06T23:28:17.179530964Z" level=info msg="Container 842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:17.222796 containerd[1514]: time="2025-07-06T23:28:17.222279335Z" level=info msg="CreateContainer within sandbox \"55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\"" Jul 6 23:28:17.225698 containerd[1514]: time="2025-07-06T23:28:17.223878694Z" level=info msg="StartContainer for \"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\"" Jul 6 23:28:17.225698 containerd[1514]: time="2025-07-06T23:28:17.225538813Z" level=info msg="connecting to shim 842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc" address="unix:///run/containerd/s/aeda64f88ef02a7173252274719e45345fc9cdbecf875fe9635f1abbb73b4a50" protocol=ttrpc version=3 Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:16.971 [INFO][5368] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:16.973 [INFO][5368] 
cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" iface="eth0" netns="/var/run/netns/cni-5509e6f1-fa7d-d0a0-524b-5d2eb9a9c75e" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:16.975 [INFO][5368] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" iface="eth0" netns="/var/run/netns/cni-5509e6f1-fa7d-d0a0-524b-5d2eb9a9c75e" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.019 [INFO][5368] cni-plugin/dataplane_linux.go 604: Deleted device in netns. ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" after=45.096289ms iface="eth0" netns="/var/run/netns/cni-5509e6f1-fa7d-d0a0-524b-5d2eb9a9c75e" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.019 [INFO][5368] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.019 [INFO][5368] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.094 [INFO][5409] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.095 [INFO][5409] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.095 [INFO][5409] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.213 [INFO][5409] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.213 [INFO][5409] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.238 [INFO][5409] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 6 23:28:17.247048 containerd[1514]: 2025-07-06 23:28:17.241 [INFO][5368] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:28:17.248721 containerd[1514]: time="2025-07-06T23:28:17.248646837Z" level=info msg="TearDown network for sandbox \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" successfully" Jul 6 23:28:17.248912 containerd[1514]: time="2025-07-06T23:28:17.248877437Z" level=info msg="StopPodSandbox for \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" returns successfully" Jul 6 23:28:17.256650 systemd[1]: Started cri-containerd-842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc.scope - libcontainer container 842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc. Jul 6 23:28:17.312879 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112-shm.mount: Deactivated successfully. Jul 6 23:28:17.312972 systemd[1]: run-netns-cni\x2d5509e6f1\x2dfa7d\x2dd0a0\x2d524b\x2d5d2eb9a9c75e.mount: Deactivated successfully. Jul 6 23:28:17.348915 containerd[1514]: time="2025-07-06T23:28:17.348256209Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5c7f48b6db-4wqhl,Uid:c12feb6e-6e7f-4280-b0b6-9a4138a27d03,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4\"" Jul 6 23:28:17.357907 containerd[1514]: time="2025-07-06T23:28:17.357476442Z" level=info msg="CreateContainer within sandbox \"0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 6 23:28:17.377960 containerd[1514]: time="2025-07-06T23:28:17.377916188Z" level=info msg="Container 91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:17.386632 kubelet[2667]: I0706 23:28:17.386370 2667 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p58qf\" (UniqueName: \"kubernetes.io/projected/c35acb70-7389-4854-97f5-ab45ab94f737-kube-api-access-p58qf\") pod \"c35acb70-7389-4854-97f5-ab45ab94f737\" (UID: \"c35acb70-7389-4854-97f5-ab45ab94f737\") " Jul 6 23:28:17.389486 kubelet[2667]: I0706 23:28:17.389463 2667 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c35acb70-7389-4854-97f5-ab45ab94f737-calico-apiserver-certs\") pod \"c35acb70-7389-4854-97f5-ab45ab94f737\" (UID: \"c35acb70-7389-4854-97f5-ab45ab94f737\") " Jul 6 23:28:17.396036 kubelet[2667]: I0706 23:28:17.395970 2667 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c35acb70-7389-4854-97f5-ab45ab94f737-kube-api-access-p58qf" (OuterVolumeSpecName: "kube-api-access-p58qf") pod "c35acb70-7389-4854-97f5-ab45ab94f737" (UID: "c35acb70-7389-4854-97f5-ab45ab94f737"). InnerVolumeSpecName "kube-api-access-p58qf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 6 23:28:17.398849 systemd[1]: var-lib-kubelet-pods-c35acb70\x2d7389\x2d4854\x2d97f5\x2dab45ab94f737-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dp58qf.mount: Deactivated successfully. 
Jul 6 23:28:17.402111 kubelet[2667]: I0706 23:28:17.402067 2667 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c35acb70-7389-4854-97f5-ab45ab94f737-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "c35acb70-7389-4854-97f5-ab45ab94f737" (UID: "c35acb70-7389-4854-97f5-ab45ab94f737"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 6 23:28:17.403088 containerd[1514]: time="2025-07-06T23:28:17.402929571Z" level=info msg="CreateContainer within sandbox \"0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643\"" Jul 6 23:28:17.405147 containerd[1514]: time="2025-07-06T23:28:17.404640850Z" level=info msg="StartContainer for \"91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643\"" Jul 6 23:28:17.406655 systemd[1]: var-lib-kubelet-pods-c35acb70\x2d7389\x2d4854\x2d97f5\x2dab45ab94f737-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. Jul 6 23:28:17.408439 containerd[1514]: time="2025-07-06T23:28:17.408322167Z" level=info msg="connecting to shim 91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643" address="unix:///run/containerd/s/5920588faab81fba5a05001a06508091e60bfc85e5c778d32606a1006b44de14" protocol=ttrpc version=3 Jul 6 23:28:17.441870 systemd[1]: Started cri-containerd-91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643.scope - libcontainer container 91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643. Jul 6 23:28:17.491252 kubelet[2667]: I0706 23:28:17.490783 2667 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p58qf\" (UniqueName: \"kubernetes.io/projected/c35acb70-7389-4854-97f5-ab45ab94f737-kube-api-access-p58qf\") on node \"ci-4344-1-1-3-d8bdec45b1\" DevicePath \"\"" Jul 6 23:28:17.491252 kubelet[2667]: I0706 23:28:17.490819 2667 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c35acb70-7389-4854-97f5-ab45ab94f737-calico-apiserver-certs\") on node \"ci-4344-1-1-3-d8bdec45b1\" DevicePath \"\"" Jul 6 23:28:17.509550 containerd[1514]: time="2025-07-06T23:28:17.509513338Z" level=info msg="StartContainer for \"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" returns successfully" Jul 6 23:28:17.535520 containerd[1514]: time="2025-07-06T23:28:17.535468400Z" level=info msg="StartContainer for \"91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643\" returns successfully" Jul 6 23:28:17.811024 kubelet[2667]: I0706 23:28:17.810768 2667 scope.go:117] "RemoveContainer" containerID="e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374" Jul 6 23:28:17.813938 containerd[1514]: time="2025-07-06T23:28:17.813836449Z" level=info msg="RemoveContainer for \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\"" Jul 6 23:28:17.817575 systemd[1]: Removed slice kubepods-besteffort-podc35acb70_7389_4854_97f5_ab45ab94f737.slice - libcontainer container kubepods-besteffort-podc35acb70_7389_4854_97f5_ab45ab94f737.slice. 
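
The mount units that systemd deactivates here (run-netns-cni\x2d… and var-lib-kubelet-pods-…\x7esecret-…) are just escaped filesystem paths: systemd turns "/" into "-" and escapes literal "-" and "~" as \x2d and \x7e. A small decoder written for reading this log (the helper name is mine, not a systemd API):

    import re

    def unescape_unit(name: str) -> str:
        """Reverse systemd's unit-name escaping: drop the type suffix,
        turn '-' back into '/', then decode \\xNN byte escapes."""
        name = name.rsplit(".", 1)[0]        # drop ".mount"
        name = name.replace("-", "/")
        return re.sub(r"\\x([0-9a-fA-F]{2})",
                      lambda m: chr(int(m.group(1), 16)), name)

    unit = (r"var-lib-kubelet-pods-c35acb70\x2d7389\x2d4854\x2d97f5\x2dab45ab94f737"
            r"-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount")
    print("/" + unescape_unit(unit))
    # /var/lib/kubelet/pods/c35acb70-7389-4854-97f5-ab45ab94f737/volumes/kubernetes.io~secret/calico-apiserver-certs
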
Jul 6 23:28:17.836747 containerd[1514]: time="2025-07-06T23:28:17.836442634Z" level=info msg="RemoveContainer for \"e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374\" returns successfully" Jul 6 23:28:17.936256 kubelet[2667]: I0706 23:28:17.935826 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-r6dz8" podStartSLOduration=27.976624852 podStartE2EDuration="42.935807085s" podCreationTimestamp="2025-07-06 23:27:35 +0000 UTC" firstStartedPulling="2025-07-06 23:28:02.203698023 +0000 UTC m=+48.985358355" lastFinishedPulling="2025-07-06 23:28:17.162880216 +0000 UTC m=+63.944540588" observedRunningTime="2025-07-06 23:28:17.882889122 +0000 UTC m=+64.664549494" watchObservedRunningTime="2025-07-06 23:28:17.935807085 +0000 UTC m=+64.717467457" Jul 6 23:28:18.078889 systemd-networkd[1406]: calic91edc58875: Gained IPv6LL Jul 6 23:28:18.486068 containerd[1514]: time="2025-07-06T23:28:18.486024196Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"e152c2c08195e9f8804bb5474a192c55ffbd2acf2c9b6a43cc5a208386373372\" pid:5522 exit_status:1 exited_at:{seconds:1751844498 nanos:484820237}" Jul 6 23:28:19.125366 containerd[1514]: time="2025-07-06T23:28:19.123916212Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:19.125625 containerd[1514]: time="2025-07-06T23:28:19.125595971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366" Jul 6 23:28:19.126584 containerd[1514]: time="2025-07-06T23:28:19.126545971Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:19.129692 containerd[1514]: time="2025-07-06T23:28:19.129628329Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 6 23:28:19.130764 containerd[1514]: time="2025-07-06T23:28:19.130720328Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 1.967240233s" Jul 6 23:28:19.130764 containerd[1514]: time="2025-07-06T23:28:19.130761528Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\"" Jul 6 23:28:19.135463 containerd[1514]: time="2025-07-06T23:28:19.135433085Z" level=info msg="CreateContainer within sandbox \"3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 6 23:28:19.151928 containerd[1514]: time="2025-07-06T23:28:19.150740795Z" level=info msg="Container 628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:28:19.164227 containerd[1514]: 
time="2025-07-06T23:28:19.163934426Z" level=info msg="CreateContainer within sandbox \"3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472\"" Jul 6 23:28:19.166174 containerd[1514]: time="2025-07-06T23:28:19.166067705Z" level=info msg="StartContainer for \"628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472\"" Jul 6 23:28:19.170358 containerd[1514]: time="2025-07-06T23:28:19.170319342Z" level=info msg="connecting to shim 628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472" address="unix:///run/containerd/s/fc54eabd9530bc2b9114a354e98b7e41c9e3651a8a0fdd07940a1accfee3e6f3" protocol=ttrpc version=3 Jul 6 23:28:19.195817 containerd[1514]: time="2025-07-06T23:28:19.195761286Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"e4afd155facdd3dde1b6ddc11f3bf916137ee94f7cd394df6f7f6f707c8baacd\" pid:5551 exit_status:1 exited_at:{seconds:1751844499 nanos:192767527}" Jul 6 23:28:19.216160 systemd[1]: Started cri-containerd-628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472.scope - libcontainer container 628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472. Jul 6 23:28:19.283282 containerd[1514]: time="2025-07-06T23:28:19.283221589Z" level=info msg="StartContainer for \"628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472\" returns successfully" Jul 6 23:28:19.363682 kubelet[2667]: I0706 23:28:19.363618 2667 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c35acb70-7389-4854-97f5-ab45ab94f737" path="/var/lib/kubelet/pods/c35acb70-7389-4854-97f5-ab45ab94f737/volumes" Jul 6 23:28:19.498088 kubelet[2667]: I0706 23:28:19.498037 2667 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 6 23:28:19.498231 kubelet[2667]: I0706 23:28:19.498101 2667 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 6 23:28:19.859391 kubelet[2667]: I0706 23:28:19.859290 2667 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 6 23:28:19.893757 kubelet[2667]: I0706 23:28:19.893690 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-xwdth" podStartSLOduration=26.855454611 podStartE2EDuration="43.893651231s" podCreationTimestamp="2025-07-06 23:27:36 +0000 UTC" firstStartedPulling="2025-07-06 23:28:02.093912307 +0000 UTC m=+48.875572639" lastFinishedPulling="2025-07-06 23:28:19.132108887 +0000 UTC m=+65.913769259" observedRunningTime="2025-07-06 23:28:19.890552633 +0000 UTC m=+66.672213005" watchObservedRunningTime="2025-07-06 23:28:19.893651231 +0000 UTC m=+66.675311603" Jul 6 23:28:19.894243 kubelet[2667]: I0706 23:28:19.894145 2667 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-5c7f48b6db-4wqhl" podStartSLOduration=3.894135351 podStartE2EDuration="3.894135351s" podCreationTimestamp="2025-07-06 23:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-06 23:28:17.996540564 +0000 UTC m=+64.778200976" watchObservedRunningTime="2025-07-06 
23:28:19.894135351 +0000 UTC m=+66.675795683" Jul 6 23:28:20.369845 containerd[1514]: time="2025-07-06T23:28:20.369799768Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"f37eab8303f4da262bbbe9fdc8bd40d16a0b1530b8ce02a73bfacf076c224ddd\" pid:5604 exit_status:1 exited_at:{seconds:1751844500 nanos:368706449}" Jul 6 23:28:22.213778 containerd[1514]: time="2025-07-06T23:28:22.213729103Z" level=info msg="StopContainer for \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" with timeout 30 (s)" Jul 6 23:28:22.215275 containerd[1514]: time="2025-07-06T23:28:22.215207502Z" level=info msg="Stop container \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" with signal terminated" Jul 6 23:28:22.260780 systemd[1]: cri-containerd-9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d.scope: Deactivated successfully. Jul 6 23:28:22.262767 systemd[1]: cri-containerd-9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d.scope: Consumed 1.750s CPU time, 61.5M memory peak. Jul 6 23:28:22.266495 containerd[1514]: time="2025-07-06T23:28:22.266454911Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" id:\"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" pid:5104 exit_status:1 exited_at:{seconds:1751844502 nanos:266099031}" Jul 6 23:28:22.266867 containerd[1514]: time="2025-07-06T23:28:22.266800311Z" level=info msg="received exit event container_id:\"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" id:\"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" pid:5104 exit_status:1 exited_at:{seconds:1751844502 nanos:266099031}" Jul 6 23:28:22.316626 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d-rootfs.mount: Deactivated successfully. Jul 6 23:28:22.330293 containerd[1514]: time="2025-07-06T23:28:22.330210233Z" level=info msg="StopContainer for \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" returns successfully" Jul 6 23:28:22.332995 containerd[1514]: time="2025-07-06T23:28:22.332927351Z" level=info msg="StopPodSandbox for \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\"" Jul 6 23:28:22.334058 containerd[1514]: time="2025-07-06T23:28:22.333936830Z" level=info msg="Container to stop \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 6 23:28:22.353394 systemd[1]: cri-containerd-ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d.scope: Deactivated successfully. Jul 6 23:28:22.353661 systemd[1]: cri-containerd-ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d.scope: Consumed 33ms CPU time, 4.4M memory peak, 2.2M read from disk. 
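
The kubelet pod_startup_latency_tracker entries above can be cross-checked by hand: the E2E duration is the gap from podCreationTimestamp to watchObservedRunningTime, and the numbers are consistent with the SLO duration being that gap minus the time spent pulling images. A short verification against the csi-node-driver-xwdth entry (arithmetic on the logged values; the field meanings are inferred from the numbers, not quoted from kubelet source):

    from datetime import datetime, timezone

    def ts(s: str) -> datetime:
        # Timestamps as printed above, e.g. "2025-07-06 23:28:19.893651231 +0000 UTC";
        # truncated to microseconds because datetime has no nanosecond field.
        date, clock, *_ = s.split()
        return datetime.fromisoformat(f"{date} {clock[:15]}").replace(tzinfo=timezone.utc)

    created   = ts("2025-07-06 23:27:36 +0000 UTC")
    pull_from = ts("2025-07-06 23:28:02.093912307 +0000 UTC")
    pull_to   = ts("2025-07-06 23:28:19.132108887 +0000 UTC")
    running   = ts("2025-07-06 23:28:19.893651231 +0000 UTC")

    e2e = (running - created).total_seconds()
    slo = e2e - (pull_to - pull_from).total_seconds()
    print(e2e)   # 43.893651  (log: podStartE2EDuration=43.893651231s)
    print(slo)   # ~26.855455 (log: podStartSLOduration=26.855454611, within rounding)
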
Jul 6 23:28:22.357929 containerd[1514]: time="2025-07-06T23:28:22.357872616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" id:\"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" pid:4217 exit_status:137 exited_at:{seconds:1751844502 nanos:356515537}" Jul 6 23:28:22.396217 containerd[1514]: time="2025-07-06T23:28:22.396171193Z" level=info msg="shim disconnected" id=ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d namespace=k8s.io Jul 6 23:28:22.396562 containerd[1514]: time="2025-07-06T23:28:22.396203433Z" level=warning msg="cleaning up after shim disconnected" id=ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d namespace=k8s.io Jul 6 23:28:22.396562 containerd[1514]: time="2025-07-06T23:28:22.396445553Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 6 23:28:22.400568 containerd[1514]: time="2025-07-06T23:28:22.398701111Z" level=info msg="received exit event sandbox_id:\"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" exit_status:137 exited_at:{seconds:1751844502 nanos:356515537}" Jul 6 23:28:22.399506 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d-rootfs.mount: Deactivated successfully. Jul 6 23:28:22.407358 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d-shm.mount: Deactivated successfully. Jul 6 23:28:22.490093 systemd-networkd[1406]: cali2121a5177ab: Link DOWN Jul 6 23:28:22.490104 systemd-networkd[1406]: cali2121a5177ab: Lost carrier Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.486 [INFO][5690] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.487 [INFO][5690] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" iface="eth0" netns="/var/run/netns/cni-cf4c6973-e82b-1097-693d-26884d9158ae" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.488 [INFO][5690] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" iface="eth0" netns="/var/run/netns/cni-cf4c6973-e82b-1097-693d-26884d9158ae" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.496 [INFO][5690] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" after=9.289474ms iface="eth0" netns="/var/run/netns/cni-cf4c6973-e82b-1097-693d-26884d9158ae" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.496 [INFO][5690] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.496 [INFO][5690] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.539 [INFO][5702] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.540 [INFO][5702] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.540 [INFO][5702] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.615 [INFO][5702] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.615 [INFO][5702] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.617 [INFO][5702] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 6 23:28:22.622138 containerd[1514]: 2025-07-06 23:28:22.620 [INFO][5690] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:28:22.624071 containerd[1514]: time="2025-07-06T23:28:22.624007456Z" level=info msg="TearDown network for sandbox \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" successfully" Jul 6 23:28:22.624071 containerd[1514]: time="2025-07-06T23:28:22.624046576Z" level=info msg="StopPodSandbox for \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" returns successfully" Jul 6 23:28:22.630611 systemd[1]: run-netns-cni\x2dcf4c6973\x2de82b\x2d1097\x2d693d\x2d26884d9158ae.mount: Deactivated successfully. 
Jul 6 23:28:22.737711 kubelet[2667]: I0706 23:28:22.737291 2667 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d-calico-apiserver-certs\") pod \"f0eeabf7-71e7-42d8-936d-d80fd3cbb28d\" (UID: \"f0eeabf7-71e7-42d8-936d-d80fd3cbb28d\") " Jul 6 23:28:22.737711 kubelet[2667]: I0706 23:28:22.737376 2667 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6fs2\" (UniqueName: \"kubernetes.io/projected/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d-kube-api-access-x6fs2\") pod \"f0eeabf7-71e7-42d8-936d-d80fd3cbb28d\" (UID: \"f0eeabf7-71e7-42d8-936d-d80fd3cbb28d\") " Jul 6 23:28:22.744586 systemd[1]: var-lib-kubelet-pods-f0eeabf7\x2d71e7\x2d42d8\x2d936d\x2dd80fd3cbb28d-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dx6fs2.mount: Deactivated successfully. Jul 6 23:28:22.747980 kubelet[2667]: I0706 23:28:22.747547 2667 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "f0eeabf7-71e7-42d8-936d-d80fd3cbb28d" (UID: "f0eeabf7-71e7-42d8-936d-d80fd3cbb28d"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 6 23:28:22.747980 kubelet[2667]: I0706 23:28:22.746923 2667 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d-kube-api-access-x6fs2" (OuterVolumeSpecName: "kube-api-access-x6fs2") pod "f0eeabf7-71e7-42d8-936d-d80fd3cbb28d" (UID: "f0eeabf7-71e7-42d8-936d-d80fd3cbb28d"). InnerVolumeSpecName "kube-api-access-x6fs2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 6 23:28:22.838383 kubelet[2667]: I0706 23:28:22.838214 2667 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x6fs2\" (UniqueName: \"kubernetes.io/projected/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d-kube-api-access-x6fs2\") on node \"ci-4344-1-1-3-d8bdec45b1\" DevicePath \"\"" Jul 6 23:28:22.838383 kubelet[2667]: I0706 23:28:22.838330 2667 reconciler_common.go:299] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d-calico-apiserver-certs\") on node \"ci-4344-1-1-3-d8bdec45b1\" DevicePath \"\"" Jul 6 23:28:22.874699 kubelet[2667]: I0706 23:28:22.874551 2667 scope.go:117] "RemoveContainer" containerID="9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d" Jul 6 23:28:22.879615 containerd[1514]: time="2025-07-06T23:28:22.879407982Z" level=info msg="RemoveContainer for \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\"" Jul 6 23:28:22.885934 systemd[1]: Removed slice kubepods-besteffort-podf0eeabf7_71e7_42d8_936d_d80fd3cbb28d.slice - libcontainer container kubepods-besteffort-podf0eeabf7_71e7_42d8_936d_d80fd3cbb28d.slice. Jul 6 23:28:22.886060 systemd[1]: kubepods-besteffort-podf0eeabf7_71e7_42d8_936d_d80fd3cbb28d.slice: Consumed 1.783s CPU time, 62M memory peak, 2.2M read from disk. 
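
The slice names that systemd removes in these lines are derived mechanically from the pod's QoS class and UID: with the systemd cgroup driver, a BestEffort pod lands in kubepods-besteffort-pod<UID>.slice, with the dashes of the UID replaced by underscores. A tiny helper (my naming, not kubelet code) that reproduces the name seen above:

    def besteffort_slice(pod_uid: str) -> str:
        # kubepods-besteffort-pod<uid with '-' -> '_'>.slice, as logged by systemd above
        return f"kubepods-besteffort-pod{pod_uid.replace('-', '_')}.slice"

    print(besteffort_slice("f0eeabf7-71e7-42d8-936d-d80fd3cbb28d"))
    # kubepods-besteffort-podf0eeabf7_71e7_42d8_936d_d80fd3cbb28d.slice
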
Jul 6 23:28:22.888781 containerd[1514]: time="2025-07-06T23:28:22.888019857Z" level=info msg="RemoveContainer for \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" returns successfully" Jul 6 23:28:22.889823 kubelet[2667]: I0706 23:28:22.889407 2667 scope.go:117] "RemoveContainer" containerID="9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d" Jul 6 23:28:22.892355 containerd[1514]: time="2025-07-06T23:28:22.892274895Z" level=error msg="ContainerStatus for \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\": not found" Jul 6 23:28:22.892705 kubelet[2667]: E0706 23:28:22.892557 2667 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\": not found" containerID="9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d" Jul 6 23:28:22.901698 kubelet[2667]: I0706 23:28:22.900686 2667 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d"} err="failed to get container status \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\": rpc error: code = NotFound desc = an error occurred when try to find container \"9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d\": not found" Jul 6 23:28:23.314482 systemd[1]: var-lib-kubelet-pods-f0eeabf7\x2d71e7\x2d42d8\x2d936d\x2dd80fd3cbb28d-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
Jul 6 23:28:23.366163 kubelet[2667]: I0706 23:28:23.366112 2667 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0eeabf7-71e7-42d8-936d-d80fd3cbb28d" path="/var/lib/kubelet/pods/f0eeabf7-71e7-42d8-936d-d80fd3cbb28d/volumes" Jul 6 23:28:34.384930 containerd[1514]: time="2025-07-06T23:28:34.384837207Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"23332967edfdccabe35d4c65f49ce06455bdb207f637d68b526b2ef5b4b971eb\" pid:5742 exited_at:{seconds:1751844514 nanos:384526927}" Jul 6 23:28:36.393765 containerd[1514]: time="2025-07-06T23:28:36.393641453Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"816a7a1bb0cb9e5dcee1dec98c1a92a98fa6e3cb0815a62b1aea89ae48c33387\" pid:5762 exited_at:{seconds:1751844516 nanos:392849373}" Jul 6 23:28:36.770525 containerd[1514]: time="2025-07-06T23:28:36.770479413Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"09b53c71120bbd4105832b033a33861471f4d2c8b4a2f987f15d50f47a6fb3a9\" pid:5787 exited_at:{seconds:1751844516 nanos:770158253}" Jul 6 23:28:49.985030 containerd[1514]: time="2025-07-06T23:28:49.984968853Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"944db624358ad9bd020364682220e5be615febe3673f4e3b0d830603143c4508\" pid:5819 exited_at:{seconds:1751844529 nanos:983954733}" Jul 6 23:28:58.016095 containerd[1514]: time="2025-07-06T23:28:58.015867413Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"3c951826b05df3d022152cffd5b15eff39e724cdd67e2e10e547800045abae41\" pid:5843 exited_at:{seconds:1751844538 nanos:15496534}" Jul 6 23:29:06.256046 containerd[1514]: time="2025-07-06T23:29:06.255903137Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"fac3feb67d4c5867c56ea95967f0a0374735b1711ce41a4da363b22ada961bc7\" pid:5868 exited_at:{seconds:1751844546 nanos:255486697}" Jul 6 23:29:06.750755 containerd[1514]: time="2025-07-06T23:29:06.750616063Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"733832ba87c455ec1bbd6b268bd282f1c79ff12128e5c47835476892af7f362e\" pid:5893 exited_at:{seconds:1751844546 nanos:748894704}" Jul 6 23:29:13.339449 containerd[1514]: time="2025-07-06T23:29:13.339382442Z" level=info msg="StopPodSandbox for \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\"" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.385 [WARNING][5911] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.385 [INFO][5911] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.385 [INFO][5911] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" iface="eth0" netns="" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.385 [INFO][5911] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.385 [INFO][5911] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.411 [INFO][5921] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.412 [INFO][5921] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.412 [INFO][5921] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.428 [WARNING][5921] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.428 [INFO][5921] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.431 [INFO][5921] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 6 23:29:13.436182 containerd[1514]: 2025-07-06 23:29:13.433 [INFO][5911] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:29:13.436580 containerd[1514]: time="2025-07-06T23:29:13.436265742Z" level=info msg="TearDown network for sandbox \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" successfully" Jul 6 23:29:13.436580 containerd[1514]: time="2025-07-06T23:29:13.436320982Z" level=info msg="StopPodSandbox for \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" returns successfully" Jul 6 23:29:13.437462 containerd[1514]: time="2025-07-06T23:29:13.437345382Z" level=info msg="RemovePodSandbox for \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\"" Jul 6 23:29:13.437462 containerd[1514]: time="2025-07-06T23:29:13.437405742Z" level=info msg="Forcibly stopping sandbox \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\"" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.497 [WARNING][5935] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.497 [INFO][5935] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.497 [INFO][5935] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" iface="eth0" netns="" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.497 [INFO][5935] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.497 [INFO][5935] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.524 [INFO][5943] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.525 [INFO][5943] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.525 [INFO][5943] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.537 [WARNING][5943] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.537 [INFO][5943] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" HandleID="k8s-pod-network.55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--bp45t-eth0" Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.540 [INFO][5943] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 6 23:29:13.544399 containerd[1514]: 2025-07-06 23:29:13.542 [INFO][5935] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112" Jul 6 23:29:13.545585 containerd[1514]: time="2025-07-06T23:29:13.544540720Z" level=info msg="TearDown network for sandbox \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" successfully" Jul 6 23:29:13.546900 containerd[1514]: time="2025-07-06T23:29:13.546856319Z" level=info msg="Ensure that sandbox 55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112 in task-service has been cleanup successfully" Jul 6 23:29:13.551865 containerd[1514]: time="2025-07-06T23:29:13.551792078Z" level=info msg="RemovePodSandbox \"55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112\" returns successfully" Jul 6 23:29:13.552648 containerd[1514]: time="2025-07-06T23:29:13.552576358Z" level=info msg="StopPodSandbox for \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\"" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.595 [WARNING][5957] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.596 [INFO][5957] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.596 [INFO][5957] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" iface="eth0" netns="" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.596 [INFO][5957] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.596 [INFO][5957] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.620 [INFO][5964] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.620 [INFO][5964] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.620 [INFO][5964] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.632 [WARNING][5964] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.632 [INFO][5964] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.634 [INFO][5964] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 6 23:29:13.639184 containerd[1514]: 2025-07-06 23:29:13.636 [INFO][5957] cni-plugin/k8s.go 653: Teardown processing complete. 
ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:29:13.639184 containerd[1514]: time="2025-07-06T23:29:13.638105741Z" level=info msg="TearDown network for sandbox \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" successfully" Jul 6 23:29:13.639184 containerd[1514]: time="2025-07-06T23:29:13.638132901Z" level=info msg="StopPodSandbox for \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" returns successfully" Jul 6 23:29:13.639184 containerd[1514]: time="2025-07-06T23:29:13.639133020Z" level=info msg="RemovePodSandbox for \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\"" Jul 6 23:29:13.639184 containerd[1514]: time="2025-07-06T23:29:13.639180020Z" level=info msg="Forcibly stopping sandbox \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\"" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.692 [WARNING][5978] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" WorkloadEndpoint="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.692 [INFO][5978] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.692 [INFO][5978] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" iface="eth0" netns="" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.692 [INFO][5978] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.692 [INFO][5978] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.714 [INFO][5985] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.714 [INFO][5985] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.714 [INFO][5985] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.730 [WARNING][5985] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.731 [INFO][5985] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" HandleID="k8s-pod-network.ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Workload="ci--4344--1--1--3--d8bdec45b1-k8s-calico--apiserver--59c54d7dc--sddts-eth0" Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.733 [INFO][5985] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 6 23:29:13.738217 containerd[1514]: 2025-07-06 23:29:13.736 [INFO][5978] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d" Jul 6 23:29:13.739304 containerd[1514]: time="2025-07-06T23:29:13.738847040Z" level=info msg="TearDown network for sandbox \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" successfully" Jul 6 23:29:13.741723 containerd[1514]: time="2025-07-06T23:29:13.741424719Z" level=info msg="Ensure that sandbox ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d in task-service has been cleanup successfully" Jul 6 23:29:13.745380 containerd[1514]: time="2025-07-06T23:29:13.745344879Z" level=info msg="RemovePodSandbox \"ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d\" returns successfully" Jul 6 23:29:19.944157 containerd[1514]: time="2025-07-06T23:29:19.944115473Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"7def950b05771e09abc8e831cb512e9c26d503be2a325330bb0fc3308d51439a\" pid:6006 exited_at:{seconds:1751844559 nanos:943801713}" Jul 6 23:29:34.231843 containerd[1514]: time="2025-07-06T23:29:34.231514758Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"72e4d836c3b3cf6c5bafbb1891290343bdf4f016be110e04fc526a315acaa031\" pid:6037 exited_at:{seconds:1751844574 nanos:231236438}" Jul 6 23:29:36.249483 containerd[1514]: time="2025-07-06T23:29:36.249422765Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"6b1bb22fdf475025a855212fdd796e59c9ba7bf42dcfea343ad029bd9cc45391\" pid:6058 exited_at:{seconds:1751844576 nanos:249072525}" Jul 6 23:29:36.754288 containerd[1514]: time="2025-07-06T23:29:36.754248727Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"5c0c393b2ce0911aa273565e0d9a221b2bca3e62ca5566f5ee534247dccc3f53\" pid:6097 exited_at:{seconds:1751844576 nanos:753478167}" Jul 6 23:29:49.954544 containerd[1514]: time="2025-07-06T23:29:49.954476020Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"13002d2e0519b6aaafb40df7ca7ba35db8d597dbb48d225c65a05d4dacdff2d1\" pid:6131 exited_at:{seconds:1751844589 nanos:953728420}" Jul 6 23:29:57.960829 containerd[1514]: time="2025-07-06T23:29:57.960768712Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"68b3391d05447362e32082b8fb745a4f5cd3daeeb8b5bbf031aa68669e24a2cd\" pid:6153 exited_at:{seconds:1751844597 nanos:960416872}" Jul 6 23:30:06.265594 containerd[1514]: time="2025-07-06T23:30:06.265256696Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"af6cd4fd2f5171fbd97ea4e059422a1be3e9ab1937d1439b3b17c564abeffb36\" pid:6178 exited_at:{seconds:1751844606 nanos:264539216}" Jul 6 23:30:06.750353 containerd[1514]: time="2025-07-06T23:30:06.750313796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"ea50a7ae286a91365a7668284695ad863bd57dec01bc5a0d9a9e4bc10d765e34\" pid:6203 exited_at:{seconds:1751844606 nanos:749957756}" Jul 6 23:30:19.942195 containerd[1514]: time="2025-07-06T23:30:19.941743600Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"f88832e044c90f95f48376219a4bb2790e9128bc0444dda83691bfb9093ab23b\" pid:6228 exited_at:{seconds:1751844619 nanos:941037600}" Jul 6 23:30:34.236559 containerd[1514]: time="2025-07-06T23:30:34.236360026Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"7d3d64ec539e38159f6c2372101797b86547060309d27d2979b4f9fed92b55d6\" pid:6251 exited_at:{seconds:1751844634 nanos:235312466}" Jul 6 23:30:36.257151 containerd[1514]: time="2025-07-06T23:30:36.257088677Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"0a1ab757df66991e3aa5106836269232d2d415d26a43ea550e85ae14e540c322\" pid:6273 exited_at:{seconds:1751844636 nanos:256596517}" Jul 6 23:30:36.752886 containerd[1514]: time="2025-07-06T23:30:36.752821021Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"bbf7bd5afd4b42e34b2bbb487eee6659e81d1bf90277adc9f5f599cbe0d84304\" pid:6298 exited_at:{seconds:1751844636 nanos:752437741}" Jul 6 23:30:49.955101 containerd[1514]: time="2025-07-06T23:30:49.955049544Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"f107f3cec6edda6fbba5e012f6deaec794a693d9f960603bfec5f580c003dd90\" pid:6327 exited_at:{seconds:1751844649 nanos:954678864}" Jul 6 23:30:57.960176 containerd[1514]: time="2025-07-06T23:30:57.959956781Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"438f8e12fac438c1deda5d2487e3435e1665cece129e32f965ab636840d9d6b7\" pid:6347 exited_at:{seconds:1751844657 nanos:959559381}" Jul 6 23:31:06.255922 containerd[1514]: time="2025-07-06T23:31:06.255863835Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"3d4446e6f59d4b69327ce7d27887f797c7101913bce97cc571e5454decd614c5\" pid:6371 exited_at:{seconds:1751844666 nanos:255364995}" Jul 6 23:31:06.752951 containerd[1514]: time="2025-07-06T23:31:06.752910421Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" 
id:\"7c31e85fdc26429c940a8770346b5e334b20ff5866ca2dda1ab19146b65fc4e1\" pid:6393 exited_at:{seconds:1751844666 nanos:752505141}" Jul 6 23:31:19.952758 containerd[1514]: time="2025-07-06T23:31:19.952692270Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"fe2c3154520c3438ccdd3ad93cba5a0ff962aceb05d8d800e36de134e3927919\" pid:6441 exited_at:{seconds:1751844679 nanos:952137230}" Jul 6 23:31:34.236234 containerd[1514]: time="2025-07-06T23:31:34.235930813Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"80b3961199500c9bbb022745be7f069d08dcb610a11e445fe6a6a2e17bfc2e44\" pid:6464 exited_at:{seconds:1751844694 nanos:235504174}" Jul 6 23:31:36.253695 containerd[1514]: time="2025-07-06T23:31:36.253600397Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"4b0339244a195e62b5497f7bdbce0a6849e5ad57b6af8911573d87dcf239ad09\" pid:6486 exited_at:{seconds:1751844696 nanos:253187077}" Jul 6 23:31:36.759491 containerd[1514]: time="2025-07-06T23:31:36.759247703Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"57e632d0a1d7d419e415d7e005914929fb4fe03b0efe9ed4b1d0cf938829cb91\" pid:6509 exited_at:{seconds:1751844696 nanos:758954343}" Jul 6 23:31:49.957362 containerd[1514]: time="2025-07-06T23:31:49.957281530Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"03d549df5950b3ef2a753caa4e54c8c1495b5de756ebdc4be5d1ec9adbe9ea46\" pid:6537 exited_at:{seconds:1751844709 nanos:956215690}" Jul 6 23:31:57.959913 containerd[1514]: time="2025-07-06T23:31:57.959750636Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"f8de0e585c49f074390edf453882c0fda85a9c4bbb88eb317ed624b48b933654\" pid:6560 exited_at:{seconds:1751844717 nanos:959331436}" Jul 6 23:32:06.248131 containerd[1514]: time="2025-07-06T23:32:06.248081112Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"6e55fac3690c509bd08a08201c7011a6f3da1a9cf42d476ba72e1c39aa912882\" pid:6586 exited_at:{seconds:1751844726 nanos:247717312}" Jul 6 23:32:06.341224 systemd[1]: Started sshd@7-91.99.177.85:22-139.178.89.65:39392.service - OpenSSH per-connection server daemon (139.178.89.65:39392). 
Jul 6 23:32:06.753413 containerd[1514]: time="2025-07-06T23:32:06.753362578Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"a64482fbcf089d29cc6a4bf52cc8a79f7d1ce87eec02ec42fc82493baa1c75a8\" pid:6614 exited_at:{seconds:1751844726 nanos:753059938}" Jul 6 23:32:07.084141 containerd[1514]: time="2025-07-06T23:32:07.083941583Z" level=warning msg="container event discarded" container=3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6 type=CONTAINER_CREATED_EVENT Jul 6 23:32:07.095407 containerd[1514]: time="2025-07-06T23:32:07.095286262Z" level=warning msg="container event discarded" container=3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6 type=CONTAINER_STARTED_EVENT Jul 6 23:32:07.138803 containerd[1514]: time="2025-07-06T23:32:07.138622377Z" level=warning msg="container event discarded" container=c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76 type=CONTAINER_CREATED_EVENT Jul 6 23:32:07.138803 containerd[1514]: time="2025-07-06T23:32:07.138776537Z" level=warning msg="container event discarded" container=82845ac8f3fe88b6e7f60420764259ebadc0fae77b07279331e3f0cdfa0a09c8 type=CONTAINER_CREATED_EVENT Jul 6 23:32:07.138803 containerd[1514]: time="2025-07-06T23:32:07.138798217Z" level=warning msg="container event discarded" container=82845ac8f3fe88b6e7f60420764259ebadc0fae77b07279331e3f0cdfa0a09c8 type=CONTAINER_STARTED_EVENT Jul 6 23:32:07.150584 containerd[1514]: time="2025-07-06T23:32:07.150385696Z" level=warning msg="container event discarded" container=5a2a8ae0d146d87df91edd42240a80da54c6a94553ee243836989163d87ed5ea type=CONTAINER_CREATED_EVENT Jul 6 23:32:07.150584 containerd[1514]: time="2025-07-06T23:32:07.150528816Z" level=warning msg="container event discarded" container=5a2a8ae0d146d87df91edd42240a80da54c6a94553ee243836989163d87ed5ea type=CONTAINER_STARTED_EVENT Jul 6 23:32:07.171950 containerd[1514]: time="2025-07-06T23:32:07.171837094Z" level=warning msg="container event discarded" container=67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff type=CONTAINER_CREATED_EVENT Jul 6 23:32:07.189445 containerd[1514]: time="2025-07-06T23:32:07.189321652Z" level=warning msg="container event discarded" container=fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c type=CONTAINER_CREATED_EVENT Jul 6 23:32:07.254818 containerd[1514]: time="2025-07-06T23:32:07.254709885Z" level=warning msg="container event discarded" container=c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76 type=CONTAINER_STARTED_EVENT Jul 6 23:32:07.298162 containerd[1514]: time="2025-07-06T23:32:07.298040080Z" level=warning msg="container event discarded" container=67c7723d1156f61c05d90af66428ebad654a61e3c6ae4602c9a125448dfb18ff type=CONTAINER_STARTED_EVENT Jul 6 23:32:07.298162 containerd[1514]: time="2025-07-06T23:32:07.298116520Z" level=warning msg="container event discarded" container=fed893dbf1fbe5a88a52d671aa161d1765f1b49183b2a3e5a172155462581b2c type=CONTAINER_STARTED_EVENT Jul 6 23:32:07.444012 sshd[6599]: Accepted publickey for core from 139.178.89.65 port 39392 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:07.447179 sshd-session[6599]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:07.454767 systemd-logind[1491]: New session 8 of user core. Jul 6 23:32:07.460035 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jul 6 23:32:08.279184 sshd[6623]: Connection closed by 139.178.89.65 port 39392 Jul 6 23:32:08.279700 sshd-session[6599]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:08.285578 systemd[1]: sshd@7-91.99.177.85:22-139.178.89.65:39392.service: Deactivated successfully. Jul 6 23:32:08.287871 systemd[1]: session-8.scope: Deactivated successfully. Jul 6 23:32:08.289492 systemd-logind[1491]: Session 8 logged out. Waiting for processes to exit. Jul 6 23:32:08.292260 systemd-logind[1491]: Removed session 8. Jul 6 23:32:13.472971 systemd[1]: Started sshd@8-91.99.177.85:22-139.178.89.65:55342.service - OpenSSH per-connection server daemon (139.178.89.65:55342). Jul 6 23:32:14.585702 sshd[6638]: Accepted publickey for core from 139.178.89.65 port 55342 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:14.587330 sshd-session[6638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:14.595706 systemd-logind[1491]: New session 9 of user core. Jul 6 23:32:14.599996 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 6 23:32:15.419437 sshd[6640]: Connection closed by 139.178.89.65 port 55342 Jul 6 23:32:15.420923 sshd-session[6638]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:15.428410 systemd[1]: sshd@8-91.99.177.85:22-139.178.89.65:55342.service: Deactivated successfully. Jul 6 23:32:15.428474 systemd-logind[1491]: Session 9 logged out. Waiting for processes to exit. Jul 6 23:32:15.432788 systemd[1]: session-9.scope: Deactivated successfully. Jul 6 23:32:15.438505 systemd-logind[1491]: Removed session 9. Jul 6 23:32:15.610080 systemd[1]: Started sshd@9-91.99.177.85:22-139.178.89.65:55354.service - OpenSSH per-connection server daemon (139.178.89.65:55354). Jul 6 23:32:16.714365 sshd[6653]: Accepted publickey for core from 139.178.89.65 port 55354 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:16.716860 sshd-session[6653]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:16.725071 systemd-logind[1491]: New session 10 of user core. Jul 6 23:32:16.731008 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 6 23:32:17.592704 sshd[6655]: Connection closed by 139.178.89.65 port 55354 Jul 6 23:32:17.592006 sshd-session[6653]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:17.598079 systemd[1]: sshd@9-91.99.177.85:22-139.178.89.65:55354.service: Deactivated successfully. Jul 6 23:32:17.602836 systemd[1]: session-10.scope: Deactivated successfully. Jul 6 23:32:17.605173 systemd-logind[1491]: Session 10 logged out. Waiting for processes to exit. Jul 6 23:32:17.607441 systemd-logind[1491]: Removed session 10. Jul 6 23:32:17.784967 systemd[1]: Started sshd@10-91.99.177.85:22-139.178.89.65:55366.service - OpenSSH per-connection server daemon (139.178.89.65:55366). 
Jul 6 23:32:17.989196 containerd[1514]: time="2025-07-06T23:32:17.989081102Z" level=warning msg="container event discarded" container=d9fb5353437e4f1db2dfceaa8c02b641eb0f6f9c639fa6824417286ac98ec941 type=CONTAINER_CREATED_EVENT Jul 6 23:32:17.989196 containerd[1514]: time="2025-07-06T23:32:17.989194222Z" level=warning msg="container event discarded" container=d9fb5353437e4f1db2dfceaa8c02b641eb0f6f9c639fa6824417286ac98ec941 type=CONTAINER_STARTED_EVENT Jul 6 23:32:18.028559 containerd[1514]: time="2025-07-06T23:32:18.028464297Z" level=warning msg="container event discarded" container=3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19 type=CONTAINER_CREATED_EVENT Jul 6 23:32:18.152269 containerd[1514]: time="2025-07-06T23:32:18.152190204Z" level=warning msg="container event discarded" container=3582477ddf8753ca863b333c5aec9dbdd3accb6a41bd9e23fcc90f7fa08bcd19 type=CONTAINER_STARTED_EVENT Jul 6 23:32:18.554836 containerd[1514]: time="2025-07-06T23:32:18.554725801Z" level=warning msg="container event discarded" container=1dfb1768ae90720710cb782faaffdedff9620d17ddedc9df0603d251045a2ed1 type=CONTAINER_CREATED_EVENT Jul 6 23:32:18.554836 containerd[1514]: time="2025-07-06T23:32:18.554816801Z" level=warning msg="container event discarded" container=1dfb1768ae90720710cb782faaffdedff9620d17ddedc9df0603d251045a2ed1 type=CONTAINER_STARTED_EVENT Jul 6 23:32:18.904085 sshd[6665]: Accepted publickey for core from 139.178.89.65 port 55366 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:18.906452 sshd-session[6665]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:18.917226 systemd-logind[1491]: New session 11 of user core. Jul 6 23:32:18.924962 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 6 23:32:19.756331 sshd[6673]: Connection closed by 139.178.89.65 port 55366 Jul 6 23:32:19.756910 sshd-session[6665]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:19.763231 systemd-logind[1491]: Session 11 logged out. Waiting for processes to exit. Jul 6 23:32:19.764194 systemd[1]: sshd@10-91.99.177.85:22-139.178.89.65:55366.service: Deactivated successfully. Jul 6 23:32:19.767407 systemd[1]: session-11.scope: Deactivated successfully. Jul 6 23:32:19.772133 systemd-logind[1491]: Removed session 11. Jul 6 23:32:19.964097 containerd[1514]: time="2025-07-06T23:32:19.964040171Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"299888c783a3c94d981bb877b153fea6c9ca73e67b345ef38d7d7605a7da9c69\" pid:6697 exited_at:{seconds:1751844739 nanos:963481531}" Jul 6 23:32:20.582944 containerd[1514]: time="2025-07-06T23:32:20.582862345Z" level=warning msg="container event discarded" container=8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc type=CONTAINER_CREATED_EVENT Jul 6 23:32:20.657245 containerd[1514]: time="2025-07-06T23:32:20.657164778Z" level=warning msg="container event discarded" container=8e083feeb9d7a32080503be29d78ce5b5b552a02970c2ebc00f43540dc4749dc type=CONTAINER_STARTED_EVENT Jul 6 23:32:24.946798 systemd[1]: Started sshd@11-91.99.177.85:22-139.178.89.65:44614.service - OpenSSH per-connection server daemon (139.178.89.65:44614). 
Jul 6 23:32:26.055138 sshd[6707]: Accepted publickey for core from 139.178.89.65 port 44614 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:26.057945 sshd-session[6707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:26.064952 systemd-logind[1491]: New session 12 of user core. Jul 6 23:32:26.072928 systemd[1]: Started session-12.scope - Session 12 of User core. Jul 6 23:32:26.890825 sshd[6709]: Connection closed by 139.178.89.65 port 44614 Jul 6 23:32:26.891894 sshd-session[6707]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:26.900212 systemd[1]: sshd@11-91.99.177.85:22-139.178.89.65:44614.service: Deactivated successfully. Jul 6 23:32:26.905161 systemd[1]: session-12.scope: Deactivated successfully. Jul 6 23:32:26.908104 systemd-logind[1491]: Session 12 logged out. Waiting for processes to exit. Jul 6 23:32:26.910735 systemd-logind[1491]: Removed session 12. Jul 6 23:32:32.086373 systemd[1]: Started sshd@12-91.99.177.85:22-139.178.89.65:34778.service - OpenSSH per-connection server daemon (139.178.89.65:34778). Jul 6 23:32:33.209885 sshd[6720]: Accepted publickey for core from 139.178.89.65 port 34778 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:33.213580 sshd-session[6720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:33.222201 systemd-logind[1491]: New session 13 of user core. Jul 6 23:32:33.224846 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 6 23:32:34.055710 sshd[6722]: Connection closed by 139.178.89.65 port 34778 Jul 6 23:32:34.056900 sshd-session[6720]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:34.062760 systemd[1]: sshd@12-91.99.177.85:22-139.178.89.65:34778.service: Deactivated successfully. Jul 6 23:32:34.065383 systemd[1]: session-13.scope: Deactivated successfully. Jul 6 23:32:34.066936 systemd-logind[1491]: Session 13 logged out. Waiting for processes to exit. Jul 6 23:32:34.069137 systemd-logind[1491]: Removed session 13. 
Jul 6 23:32:34.234772 containerd[1514]: time="2025-07-06T23:32:34.234725453Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"84fbb320f34be0a0bde966f9273e57dbaf517bd05b59ed8afd4b535175e2a6d9\" pid:6745 exited_at:{seconds:1751844754 nanos:234314613}" Jul 6 23:32:36.270480 containerd[1514]: time="2025-07-06T23:32:36.270390797Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"4a5021d5866b4aa7d480ec1c7302a34a5a9436c6b9994da96f1cd5b88b4be243\" pid:6767 exited_at:{seconds:1751844756 nanos:270072877}" Jul 6 23:32:36.357761 containerd[1514]: time="2025-07-06T23:32:36.357571467Z" level=warning msg="container event discarded" container=3d7c9840e22b3e6c508cbbc324ecbf99a8cb28b763f3e10fe74be22b3d1c1147 type=CONTAINER_CREATED_EVENT Jul 6 23:32:36.357761 containerd[1514]: time="2025-07-06T23:32:36.357715307Z" level=warning msg="container event discarded" container=3d7c9840e22b3e6c508cbbc324ecbf99a8cb28b763f3e10fe74be22b3d1c1147 type=CONTAINER_STARTED_EVENT Jul 6 23:32:36.484865 containerd[1514]: time="2025-07-06T23:32:36.481937094Z" level=warning msg="container event discarded" container=0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079 type=CONTAINER_CREATED_EVENT Jul 6 23:32:36.484865 containerd[1514]: time="2025-07-06T23:32:36.484857134Z" level=warning msg="container event discarded" container=0a9883a4e9231cb1e98292a314943ddd3dde0e2016ccd784f689910c90b7d079 type=CONTAINER_STARTED_EVENT Jul 6 23:32:36.758717 containerd[1514]: time="2025-07-06T23:32:36.758538265Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"580e0cf246cb025c84d9daf6ed5955c3f65efef8be42dc3183b557759d92e805\" pid:6790 exited_at:{seconds:1751844756 nanos:757971025}" Jul 6 23:32:39.014272 containerd[1514]: time="2025-07-06T23:32:39.014164625Z" level=warning msg="container event discarded" container=f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6 type=CONTAINER_CREATED_EVENT Jul 6 23:32:39.106175 containerd[1514]: time="2025-07-06T23:32:39.106004415Z" level=warning msg="container event discarded" container=f61ab91b75b35d9fbc3dbcebef1ca18e19950170caff506093684e0833ffa8a6 type=CONTAINER_STARTED_EVENT Jul 6 23:32:39.252637 systemd[1]: Started sshd@13-91.99.177.85:22-139.178.89.65:34780.service - OpenSSH per-connection server daemon (139.178.89.65:34780). Jul 6 23:32:40.374623 sshd[6800]: Accepted publickey for core from 139.178.89.65 port 34780 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:40.376660 sshd-session[6800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:40.384907 systemd-logind[1491]: New session 14 of user core. Jul 6 23:32:40.390886 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jul 6 23:32:40.568175 containerd[1514]: time="2025-07-06T23:32:40.568108380Z" level=warning msg="container event discarded" container=8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c type=CONTAINER_CREATED_EVENT Jul 6 23:32:40.672594 containerd[1514]: time="2025-07-06T23:32:40.672444649Z" level=warning msg="container event discarded" container=8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c type=CONTAINER_STARTED_EVENT Jul 6 23:32:40.847132 containerd[1514]: time="2025-07-06T23:32:40.847067510Z" level=warning msg="container event discarded" container=8697a422f63a13fc240920efa92257f0c28336099db919bebe1b1b4ab72d0e5c type=CONTAINER_STOPPED_EVENT Jul 6 23:32:41.244298 sshd[6802]: Connection closed by 139.178.89.65 port 34780 Jul 6 23:32:41.245036 sshd-session[6800]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:41.251527 systemd[1]: sshd@13-91.99.177.85:22-139.178.89.65:34780.service: Deactivated successfully. Jul 6 23:32:41.255848 systemd[1]: session-14.scope: Deactivated successfully. Jul 6 23:32:41.258387 systemd-logind[1491]: Session 14 logged out. Waiting for processes to exit. Jul 6 23:32:41.262502 systemd-logind[1491]: Removed session 14. Jul 6 23:32:41.428891 systemd[1]: Started sshd@14-91.99.177.85:22-139.178.89.65:35426.service - OpenSSH per-connection server daemon (139.178.89.65:35426). Jul 6 23:32:42.529648 sshd[6821]: Accepted publickey for core from 139.178.89.65 port 35426 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:42.533754 sshd-session[6821]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:42.543835 systemd-logind[1491]: New session 15 of user core. Jul 6 23:32:42.549982 systemd[1]: Started session-15.scope - Session 15 of User core. Jul 6 23:32:43.517322 sshd[6823]: Connection closed by 139.178.89.65 port 35426 Jul 6 23:32:43.520907 sshd-session[6821]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:43.526467 systemd-logind[1491]: Session 15 logged out. Waiting for processes to exit. Jul 6 23:32:43.527089 systemd[1]: sshd@14-91.99.177.85:22-139.178.89.65:35426.service: Deactivated successfully. Jul 6 23:32:43.531608 systemd[1]: session-15.scope: Deactivated successfully. Jul 6 23:32:43.539882 systemd-logind[1491]: Removed session 15. Jul 6 23:32:43.706374 systemd[1]: Started sshd@15-91.99.177.85:22-139.178.89.65:35434.service - OpenSSH per-connection server daemon (139.178.89.65:35434). Jul 6 23:32:44.813692 sshd[6833]: Accepted publickey for core from 139.178.89.65 port 35434 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:44.815352 sshd-session[6833]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:44.827749 systemd-logind[1491]: New session 16 of user core. Jul 6 23:32:44.832879 systemd[1]: Started session-16.scope - Session 16 of User core. 
Jul 6 23:32:45.304340 containerd[1514]: time="2025-07-06T23:32:45.304240196Z" level=warning msg="container event discarded" container=c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23 type=CONTAINER_CREATED_EVENT Jul 6 23:32:45.387812 containerd[1514]: time="2025-07-06T23:32:45.387723948Z" level=warning msg="container event discarded" container=c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23 type=CONTAINER_STARTED_EVENT Jul 6 23:32:46.084396 containerd[1514]: time="2025-07-06T23:32:46.084318433Z" level=warning msg="container event discarded" container=c7ef4998f39dc730245741239f70074b6eb967eee069f0940933add2d49efa23 type=CONTAINER_STOPPED_EVENT Jul 6 23:32:46.674812 sshd[6835]: Connection closed by 139.178.89.65 port 35434 Jul 6 23:32:46.676526 sshd-session[6833]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:46.682472 systemd[1]: sshd@15-91.99.177.85:22-139.178.89.65:35434.service: Deactivated successfully. Jul 6 23:32:46.685557 systemd[1]: session-16.scope: Deactivated successfully. Jul 6 23:32:46.687399 systemd-logind[1491]: Session 16 logged out. Waiting for processes to exit. Jul 6 23:32:46.690606 systemd-logind[1491]: Removed session 16. Jul 6 23:32:46.866380 systemd[1]: Started sshd@16-91.99.177.85:22-139.178.89.65:35448.service - OpenSSH per-connection server daemon (139.178.89.65:35448). Jul 6 23:32:47.998222 sshd[6868]: Accepted publickey for core from 139.178.89.65 port 35448 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:48.000846 sshd-session[6868]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:48.008539 systemd-logind[1491]: New session 17 of user core. Jul 6 23:32:48.013897 systemd[1]: Started session-17.scope - Session 17 of User core. Jul 6 23:32:48.983778 sshd[6870]: Connection closed by 139.178.89.65 port 35448 Jul 6 23:32:48.984405 sshd-session[6868]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:48.989417 systemd[1]: sshd@16-91.99.177.85:22-139.178.89.65:35448.service: Deactivated successfully. Jul 6 23:32:48.989823 systemd-logind[1491]: Session 17 logged out. Waiting for processes to exit. Jul 6 23:32:48.993210 systemd[1]: session-17.scope: Deactivated successfully. Jul 6 23:32:48.995559 systemd-logind[1491]: Removed session 17. Jul 6 23:32:49.178483 systemd[1]: Started sshd@17-91.99.177.85:22-139.178.89.65:35454.service - OpenSSH per-connection server daemon (139.178.89.65:35454). Jul 6 23:32:49.955341 containerd[1514]: time="2025-07-06T23:32:49.955290142Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"3150535e7a6f43ace058c987c5dff0edae3624ff7d2e8299554f53c385d6cfa4\" pid:6895 exited_at:{seconds:1751844769 nanos:954742422}" Jul 6 23:32:50.291973 sshd[6881]: Accepted publickey for core from 139.178.89.65 port 35454 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:50.294158 sshd-session[6881]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:50.300355 systemd-logind[1491]: New session 18 of user core. Jul 6 23:32:50.316116 systemd[1]: Started session-18.scope - Session 18 of User core. Jul 6 23:32:51.130381 sshd[6906]: Connection closed by 139.178.89.65 port 35454 Jul 6 23:32:51.131328 sshd-session[6881]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:51.137348 systemd[1]: sshd@17-91.99.177.85:22-139.178.89.65:35454.service: Deactivated successfully. 
Jul 6 23:32:51.141529 systemd[1]: session-18.scope: Deactivated successfully. Jul 6 23:32:51.143371 systemd-logind[1491]: Session 18 logged out. Waiting for processes to exit. Jul 6 23:32:51.145475 systemd-logind[1491]: Removed session 18. Jul 6 23:32:54.093884 containerd[1514]: time="2025-07-06T23:32:54.093762382Z" level=warning msg="container event discarded" container=0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32 type=CONTAINER_CREATED_EVENT Jul 6 23:32:54.212348 containerd[1514]: time="2025-07-06T23:32:54.212185130Z" level=warning msg="container event discarded" container=0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32 type=CONTAINER_STARTED_EVENT Jul 6 23:32:55.445646 containerd[1514]: time="2025-07-06T23:32:55.445475479Z" level=warning msg="container event discarded" container=bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8 type=CONTAINER_CREATED_EVENT Jul 6 23:32:55.445646 containerd[1514]: time="2025-07-06T23:32:55.445601599Z" level=warning msg="container event discarded" container=bd8ff8cd55b8e0de8574a38cee131001bf2fc85cac7d8ed496b67977a3e70ba8 type=CONTAINER_STARTED_EVENT Jul 6 23:32:56.327767 systemd[1]: Started sshd@18-91.99.177.85:22-139.178.89.65:39200.service - OpenSSH per-connection server daemon (139.178.89.65:39200). Jul 6 23:32:57.434605 sshd[6920]: Accepted publickey for core from 139.178.89.65 port 39200 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:32:57.437485 sshd-session[6920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:32:57.445756 systemd-logind[1491]: New session 19 of user core. Jul 6 23:32:57.449900 systemd[1]: Started session-19.scope - Session 19 of User core. Jul 6 23:32:57.462378 containerd[1514]: time="2025-07-06T23:32:57.462324304Z" level=warning msg="container event discarded" container=48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696 type=CONTAINER_CREATED_EVENT Jul 6 23:32:57.566946 containerd[1514]: time="2025-07-06T23:32:57.566843213Z" level=warning msg="container event discarded" container=48c14b53c49ae1e181bdbe9d8ed3f68e5d9a7ba57a8e6a5b75080df4d5efd696 type=CONTAINER_STARTED_EVENT Jul 6 23:32:57.963192 containerd[1514]: time="2025-07-06T23:32:57.963122371Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"6a6d5dae8ffe0578a628f56acb3812c9d694be47f08093ae8749dbabc0ffe611\" pid:6935 exited_at:{seconds:1751844777 nanos:962379011}" Jul 6 23:32:58.263492 sshd[6922]: Connection closed by 139.178.89.65 port 39200 Jul 6 23:32:58.264984 sshd-session[6920]: pam_unix(sshd:session): session closed for user core Jul 6 23:32:58.270394 systemd[1]: sshd@18-91.99.177.85:22-139.178.89.65:39200.service: Deactivated successfully. Jul 6 23:32:58.272983 systemd[1]: session-19.scope: Deactivated successfully. Jul 6 23:32:58.274523 systemd-logind[1491]: Session 19 logged out. Waiting for processes to exit. Jul 6 23:32:58.277440 systemd-logind[1491]: Removed session 19. 
Jul 6 23:32:58.719055 containerd[1514]: time="2025-07-06T23:32:58.718800611Z" level=warning msg="container event discarded" container=08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88 type=CONTAINER_CREATED_EVENT Jul 6 23:32:58.719055 containerd[1514]: time="2025-07-06T23:32:58.718861091Z" level=warning msg="container event discarded" container=08174e3ac0afce8dde602378a7e96a4a24ce820f65640fa5f0842da16ae61d88 type=CONTAINER_STARTED_EVENT Jul 6 23:32:59.849950 containerd[1514]: time="2025-07-06T23:32:59.849745811Z" level=warning msg="container event discarded" container=ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d type=CONTAINER_CREATED_EVENT Jul 6 23:32:59.849950 containerd[1514]: time="2025-07-06T23:32:59.849800251Z" level=warning msg="container event discarded" container=ed7c6d4c03fb6932e88fd0e005b1044d3f49a7d624765e1e0356b34fcffa447d type=CONTAINER_STARTED_EVENT Jul 6 23:32:59.935380 containerd[1514]: time="2025-07-06T23:32:59.935145362Z" level=warning msg="container event discarded" container=8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa type=CONTAINER_CREATED_EVENT Jul 6 23:32:59.935380 containerd[1514]: time="2025-07-06T23:32:59.935280082Z" level=warning msg="container event discarded" container=8b4dbbd65d0daa57785b28c0b3004572770c11b458b31d1ad9a5e1856fa490fa type=CONTAINER_STARTED_EVENT Jul 6 23:32:59.966933 containerd[1514]: time="2025-07-06T23:32:59.966842238Z" level=warning msg="container event discarded" container=78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc type=CONTAINER_CREATED_EVENT Jul 6 23:33:00.035358 containerd[1514]: time="2025-07-06T23:33:00.035262351Z" level=warning msg="container event discarded" container=78ee68956a189acb3aefc352fb26e4d16d33d42a09cce4e0d672fb2627dc7fbc type=CONTAINER_STARTED_EVENT Jul 6 23:33:00.840076 containerd[1514]: time="2025-07-06T23:33:00.839837666Z" level=warning msg="container event discarded" container=e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9 type=CONTAINER_CREATED_EVENT Jul 6 23:33:00.840076 containerd[1514]: time="2025-07-06T23:33:00.839955506Z" level=warning msg="container event discarded" container=e881a8da7016b08506ebc34507d845ff403639f0e25af7102f14e4695e1637d9 type=CONTAINER_STARTED_EVENT Jul 6 23:33:00.880313 containerd[1514]: time="2025-07-06T23:33:00.880217061Z" level=warning msg="container event discarded" container=3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62 type=CONTAINER_CREATED_EVENT Jul 6 23:33:01.058194 containerd[1514]: time="2025-07-06T23:33:01.058063002Z" level=warning msg="container event discarded" container=3d50532046f8bfbe720cb04d05319c390b2480788be3a45d05ef800075839f62 type=CONTAINER_STARTED_EVENT Jul 6 23:33:01.282873 containerd[1514]: time="2025-07-06T23:33:01.282735939Z" level=warning msg="container event discarded" container=3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d type=CONTAINER_CREATED_EVENT Jul 6 23:33:01.282873 containerd[1514]: time="2025-07-06T23:33:01.282816579Z" level=warning msg="container event discarded" container=3bf438596f4542a3c55fbe9562f5c390820402c941821db38aa37e716a08d56d type=CONTAINER_STARTED_EVENT Jul 6 23:33:02.002141 containerd[1514]: time="2025-07-06T23:33:02.002011382Z" level=warning msg="container event discarded" container=55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112 type=CONTAINER_CREATED_EVENT Jul 6 23:33:02.002141 containerd[1514]: time="2025-07-06T23:33:02.002104182Z" level=warning msg="container event discarded" 
container=55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112 type=CONTAINER_STARTED_EVENT Jul 6 23:33:02.093221 containerd[1514]: time="2025-07-06T23:33:02.087488773Z" level=warning msg="container event discarded" container=50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03 type=CONTAINER_CREATED_EVENT Jul 6 23:33:02.105555 containerd[1514]: time="2025-07-06T23:33:02.105456371Z" level=warning msg="container event discarded" container=3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e type=CONTAINER_CREATED_EVENT Jul 6 23:33:02.105555 containerd[1514]: time="2025-07-06T23:33:02.105529611Z" level=warning msg="container event discarded" container=3aea7c17fd5bc9d6661a3eab9ed3d52346ded1507128e53d356e5378a071080e type=CONTAINER_STARTED_EVENT Jul 6 23:33:02.199196 containerd[1514]: time="2025-07-06T23:33:02.199039561Z" level=warning msg="container event discarded" container=50d5dd518fb6e5e6d87c8ad9ce045fe8bab11157f03f0d14b392c6b9a3920c03 type=CONTAINER_STARTED_EVENT Jul 6 23:33:02.210802 containerd[1514]: time="2025-07-06T23:33:02.210657240Z" level=warning msg="container event discarded" container=55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4 type=CONTAINER_CREATED_EVENT Jul 6 23:33:02.210802 containerd[1514]: time="2025-07-06T23:33:02.210757280Z" level=warning msg="container event discarded" container=55967b677c733dd4e032c9c202f26d3c8a0bf05826f0f9d58b0b1da4a06d50a4 type=CONTAINER_STARTED_EVENT Jul 6 23:33:03.453933 systemd[1]: Started sshd@19-91.99.177.85:22-139.178.89.65:36510.service - OpenSSH per-connection server daemon (139.178.89.65:36510). Jul 6 23:33:04.562637 sshd[6955]: Accepted publickey for core from 139.178.89.65 port 36510 ssh2: RSA SHA256:3osEBBBp4JZnYlmjNq6bOg4+EAdCjLWXmHbLkgxlNTk Jul 6 23:33:04.565724 sshd-session[6955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 6 23:33:04.571789 systemd-logind[1491]: New session 20 of user core. Jul 6 23:33:04.578983 systemd[1]: Started session-20.scope - Session 20 of User core. Jul 6 23:33:05.397705 sshd[6957]: Connection closed by 139.178.89.65 port 36510 Jul 6 23:33:05.397029 sshd-session[6955]: pam_unix(sshd:session): session closed for user core Jul 6 23:33:05.403952 systemd[1]: sshd@19-91.99.177.85:22-139.178.89.65:36510.service: Deactivated successfully. Jul 6 23:33:05.407236 systemd[1]: session-20.scope: Deactivated successfully. Jul 6 23:33:05.408457 systemd-logind[1491]: Session 20 logged out. Waiting for processes to exit. Jul 6 23:33:05.411554 systemd-logind[1491]: Removed session 20. 
Jul 6 23:33:05.666063 containerd[1514]: time="2025-07-06T23:33:05.665850353Z" level=warning msg="container event discarded" container=9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c type=CONTAINER_CREATED_EVENT Jul 6 23:33:05.786039 containerd[1514]: time="2025-07-06T23:33:05.785931340Z" level=warning msg="container event discarded" container=9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c type=CONTAINER_STARTED_EVENT Jul 6 23:33:06.275330 containerd[1514]: time="2025-07-06T23:33:06.275203568Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0d73d6901e869e7caea6321810ee8eb5fca1374e273d1fd43bb5e8a831a94d32\" id:\"a88ca011f0e65d10350ec21b726a7123135e64404121e6643852cc3a11d76304\" pid:6979 exited_at:{seconds:1751844786 nanos:274621248}" Jul 6 23:33:06.752312 containerd[1514]: time="2025-07-06T23:33:06.752209917Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9deb0d27fc93321c04f8333fe47cd7c4be33356d857b4c2fb91b4d905d5a998c\" id:\"486dbb6a1183b5e4f106ad29fbdc6ec85c0747fe58d8abb787e8644d2cfc4bd1\" pid:7001 exited_at:{seconds:1751844786 nanos:751826958}" Jul 6 23:33:09.473948 containerd[1514]: time="2025-07-06T23:33:09.473760588Z" level=warning msg="container event discarded" container=9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d type=CONTAINER_CREATED_EVENT Jul 6 23:33:09.565690 containerd[1514]: time="2025-07-06T23:33:09.565589179Z" level=warning msg="container event discarded" container=9204ed692475d60b065414c0a60b3b7f336c68dfbe0e7b4c45a0e64b51d2c65d type=CONTAINER_STARTED_EVENT Jul 6 23:33:09.897936 containerd[1514]: time="2025-07-06T23:33:09.897719423Z" level=warning msg="container event discarded" container=5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264 type=CONTAINER_CREATED_EVENT Jul 6 23:33:09.994183 containerd[1514]: time="2025-07-06T23:33:09.994102773Z" level=warning msg="container event discarded" container=5533d9c55479925d3b8ec4313db7d88b407ac35567cc2b5420f05835e19ad264 type=CONTAINER_STARTED_EVENT Jul 6 23:33:10.259992 containerd[1514]: time="2025-07-06T23:33:10.259873985Z" level=warning msg="container event discarded" container=e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374 type=CONTAINER_CREATED_EVENT Jul 6 23:33:10.363373 containerd[1514]: time="2025-07-06T23:33:10.363300534Z" level=warning msg="container event discarded" container=e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374 type=CONTAINER_STARTED_EVENT Jul 6 23:33:12.199964 containerd[1514]: time="2025-07-06T23:33:12.199863299Z" level=warning msg="container event discarded" container=ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781 type=CONTAINER_CREATED_EVENT Jul 6 23:33:12.428191 containerd[1514]: time="2025-07-06T23:33:12.428104195Z" level=warning msg="container event discarded" container=ec203f06b1c8bc5dc059e34bbd79c22f9b433ca200dc95051fdfc3d4435bd781 type=CONTAINER_STARTED_EVENT Jul 6 23:33:16.507143 containerd[1514]: time="2025-07-06T23:33:16.506999961Z" level=warning msg="container event discarded" container=e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374 type=CONTAINER_STOPPED_EVENT Jul 6 23:33:16.817129 containerd[1514]: time="2025-07-06T23:33:16.816793448Z" level=warning msg="container event discarded" container=55153120b2e39f4ba457796f1115af3f0e6595b1afc0c17662e7bb6f4ca1b112 type=CONTAINER_STOPPED_EVENT Jul 6 23:33:17.231544 containerd[1514]: time="2025-07-06T23:33:17.231426364Z" level=warning msg="container event discarded" 
container=842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc type=CONTAINER_CREATED_EVENT Jul 6 23:33:17.358391 containerd[1514]: time="2025-07-06T23:33:17.358284671Z" level=warning msg="container event discarded" container=0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4 type=CONTAINER_CREATED_EVENT Jul 6 23:33:17.358391 containerd[1514]: time="2025-07-06T23:33:17.358367431Z" level=warning msg="container event discarded" container=0c769176bd95aaeb8905ebf1893084f7c48b0ac804baa82a0952ce20fda9dfd4 type=CONTAINER_STARTED_EVENT Jul 6 23:33:17.406500 containerd[1514]: time="2025-07-06T23:33:17.406365586Z" level=warning msg="container event discarded" container=91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643 type=CONTAINER_CREATED_EVENT Jul 6 23:33:17.519314 containerd[1514]: time="2025-07-06T23:33:17.519105334Z" level=warning msg="container event discarded" container=842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc type=CONTAINER_STARTED_EVENT Jul 6 23:33:17.544578 containerd[1514]: time="2025-07-06T23:33:17.544467691Z" level=warning msg="container event discarded" container=91e48aec8b780a84322e07a16631ff0ca136e91cb7cf6d889988426a20cfa643 type=CONTAINER_STARTED_EVENT Jul 6 23:33:17.847898 containerd[1514]: time="2025-07-06T23:33:17.847781059Z" level=warning msg="container event discarded" container=e18182e798b5983978030d6b14fd9e58448397f934a7b45122748556832dc374 type=CONTAINER_DELETED_EVENT Jul 6 23:33:19.172387 containerd[1514]: time="2025-07-06T23:33:19.172284118Z" level=warning msg="container event discarded" container=628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472 type=CONTAINER_CREATED_EVENT Jul 6 23:33:19.292800 containerd[1514]: time="2025-07-06T23:33:19.292715346Z" level=warning msg="container event discarded" container=628f3156c9349d2ca7017e029521b87c93b8949fcd61ce62b588c9cb8f153472 type=CONTAINER_STARTED_EVENT Jul 6 23:33:19.732244 update_engine[1493]: I20250706 23:33:19.732131 1493 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jul 6 23:33:19.732244 update_engine[1493]: I20250706 23:33:19.732214 1493 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jul 6 23:33:19.732791 update_engine[1493]: I20250706 23:33:19.732570 1493 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jul 6 23:33:19.733319 update_engine[1493]: I20250706 23:33:19.733259 1493 omaha_request_params.cc:62] Current group set to beta Jul 6 23:33:19.734634 update_engine[1493]: I20250706 23:33:19.734560 1493 update_attempter.cc:499] Already updated boot flags. Skipping. Jul 6 23:33:19.734634 update_engine[1493]: I20250706 23:33:19.734602 1493 update_attempter.cc:643] Scheduling an action processor start. 
Jul 6 23:33:19.734634 update_engine[1493]: I20250706 23:33:19.734625 1493 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jul 6 23:33:19.737237 update_engine[1493]: I20250706 23:33:19.737193 1493 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jul 6 23:33:19.737333 update_engine[1493]: I20250706 23:33:19.737288 1493 omaha_request_action.cc:271] Posting an Omaha request to disabled Jul 6 23:33:19.737333 update_engine[1493]: I20250706 23:33:19.737296 1493 omaha_request_action.cc:272] Request: Jul 6 23:33:19.737333 update_engine[1493]: Jul 6 23:33:19.737333 update_engine[1493]: Jul 6 23:33:19.737333 update_engine[1493]: Jul 6 23:33:19.737333 update_engine[1493]: Jul 6 23:33:19.737333 update_engine[1493]: Jul 6 23:33:19.737333 update_engine[1493]: Jul 6 23:33:19.737333 update_engine[1493]: Jul 6 23:33:19.737333 update_engine[1493]: Jul 6 23:33:19.737333 update_engine[1493]: I20250706 23:33:19.737303 1493 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 6 23:33:19.744105 update_engine[1493]: I20250706 23:33:19.743870 1493 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 6 23:33:19.744831 update_engine[1493]: I20250706 23:33:19.744786 1493 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 6 23:33:19.745714 update_engine[1493]: E20250706 23:33:19.745596 1493 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 6 23:33:19.745823 update_engine[1493]: I20250706 23:33:19.745805 1493 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jul 6 23:33:19.748927 locksmithd[1533]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jul 6 23:33:19.947585 containerd[1514]: time="2025-07-06T23:33:19.947532236Z" level=info msg="TaskExit event in podsandbox handler container_id:\"842037bfe3a47b926b97ada3ba3f4d51d779206365ac692fca888d6c2fc39afc\" id:\"53048b66bb1455fb91de1e33253c1fc27b467c4e714095bce8e7eb495504da6a\" pid:7028 exited_at:{seconds:1751844799 nanos:946349996}" Jul 6 23:33:21.644571 systemd[1]: cri-containerd-c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76.scope: Deactivated successfully. Jul 6 23:33:21.646263 systemd[1]: cri-containerd-c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76.scope: Consumed 6.579s CPU time, 67.3M memory peak, 4.1M read from disk. Jul 6 23:33:21.652942 containerd[1514]: time="2025-07-06T23:33:21.652582095Z" level=info msg="received exit event container_id:\"c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76\" id:\"c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76\" pid:2493 exit_status:1 exited_at:{seconds:1751844801 nanos:651713055}" Jul 6 23:33:21.653510 containerd[1514]: time="2025-07-06T23:33:21.653446655Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76\" id:\"c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76\" pid:2493 exit_status:1 exited_at:{seconds:1751844801 nanos:651713055}" Jul 6 23:33:21.686263 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76-rootfs.mount: Deactivated successfully. 
Jul 6 23:33:21.830198 kubelet[2667]: I0706 23:33:21.829998 2667 scope.go:117] "RemoveContainer" containerID="c0bbb527259b66f18382d4109286042c4a92739751fe41f500a0db6d31beeb76" Jul 6 23:33:21.833507 containerd[1514]: time="2025-07-06T23:33:21.833448476Z" level=info msg="CreateContainer within sandbox \"3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jul 6 23:33:21.846379 containerd[1514]: time="2025-07-06T23:33:21.846098834Z" level=info msg="Container 94af7a954265012c4c320664274ebfb74a190bb969ca9ec6fd7f66b7dd3f01b0: CDI devices from CRI Config.CDIDevices: []" Jul 6 23:33:21.859720 containerd[1514]: time="2025-07-06T23:33:21.859646713Z" level=info msg="CreateContainer within sandbox \"3822a28e27ca4024c847c82a29e10798d049c7b3f8cda8c64b6fef49e74372b6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"94af7a954265012c4c320664274ebfb74a190bb969ca9ec6fd7f66b7dd3f01b0\"" Jul 6 23:33:21.860340 containerd[1514]: time="2025-07-06T23:33:21.860282713Z" level=info msg="StartContainer for \"94af7a954265012c4c320664274ebfb74a190bb969ca9ec6fd7f66b7dd3f01b0\"" Jul 6 23:33:21.861813 containerd[1514]: time="2025-07-06T23:33:21.861738153Z" level=info msg="connecting to shim 94af7a954265012c4c320664274ebfb74a190bb969ca9ec6fd7f66b7dd3f01b0" address="unix:///run/containerd/s/0308bb234b90146c5af46f6afc0b1c660ea60916804c1dccf3c27fb448b2a8bb" protocol=ttrpc version=3 Jul 6 23:33:21.890951 systemd[1]: Started cri-containerd-94af7a954265012c4c320664274ebfb74a190bb969ca9ec6fd7f66b7dd3f01b0.scope - libcontainer container 94af7a954265012c4c320664274ebfb74a190bb969ca9ec6fd7f66b7dd3f01b0. Jul 6 23:33:21.947467 containerd[1514]: time="2025-07-06T23:33:21.947248904Z" level=info msg="StartContainer for \"94af7a954265012c4c320664274ebfb74a190bb969ca9ec6fd7f66b7dd3f01b0\" returns successfully"