Jul 15 23:15:54.818621 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jul 15 23:15:54.818647 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue Jul 15 22:00:45 -00 2025
Jul 15 23:15:54.818658 kernel: KASLR enabled
Jul 15 23:15:54.818664 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Jul 15 23:15:54.818670 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Jul 15 23:15:54.818675 kernel: random: crng init done
Jul 15 23:15:54.818682 kernel: secureboot: Secure boot disabled
Jul 15 23:15:54.818688 kernel: ACPI: Early table checksum verification disabled
Jul 15 23:15:54.818694 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Jul 15 23:15:54.818700 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Jul 15 23:15:54.818709 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818714 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818720 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818726 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818733 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818741 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818748 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818754 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818760 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:15:54.818766 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Jul 15 23:15:54.818773 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jul 15 23:15:54.818779 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jul 15 23:15:54.818785 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Jul 15 23:15:54.818791 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Jul 15 23:15:54.818798 kernel: Zone ranges:
Jul 15 23:15:54.818807 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jul 15 23:15:54.818813 kernel: DMA32 empty
Jul 15 23:15:54.818819 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Jul 15 23:15:54.818825 kernel: Device empty
Jul 15 23:15:54.818831 kernel: Movable zone start for each node
Jul 15 23:15:54.818838 kernel: Early memory node ranges
Jul 15 23:15:54.818844 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Jul 15 23:15:54.818850 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Jul 15 23:15:54.818856 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Jul 15 23:15:54.818862 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Jul 15 23:15:54.818868 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Jul 15 23:15:54.818874 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Jul 15 23:15:54.818881 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Jul 15 23:15:54.818888 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Jul 15 23:15:54.818895 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Jul 15 23:15:54.818903 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Jul 15 23:15:54.818910 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Jul 15 23:15:54.818917 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jul 15 23:15:54.818925 kernel: psci: probing for conduit method from ACPI.
Jul 15 23:15:54.818943 kernel: psci: PSCIv1.1 detected in firmware.
Jul 15 23:15:54.818950 kernel: psci: Using standard PSCI v0.2 function IDs
Jul 15 23:15:54.818957 kernel: psci: Trusted OS migration not required
Jul 15 23:15:54.818963 kernel: psci: SMC Calling Convention v1.1
Jul 15 23:15:54.818970 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jul 15 23:15:54.818976 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jul 15 23:15:54.818983 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jul 15 23:15:54.818990 kernel: pcpu-alloc: [0] 0 [0] 1
Jul 15 23:15:54.818996 kernel: Detected PIPT I-cache on CPU0
Jul 15 23:15:54.819003 kernel: CPU features: detected: GIC system register CPU interface
Jul 15 23:15:54.819012 kernel: CPU features: detected: Spectre-v4
Jul 15 23:15:54.819018 kernel: CPU features: detected: Spectre-BHB
Jul 15 23:15:54.819025 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jul 15 23:15:54.819031 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jul 15 23:15:54.819038 kernel: CPU features: detected: ARM erratum 1418040
Jul 15 23:15:54.819044 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jul 15 23:15:54.819051 kernel: alternatives: applying boot alternatives
Jul 15 23:15:54.819059 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=6efbcbd16e8e41b645be9f8e34b328753e37d282675200dab08e504f8e58a578
Jul 15 23:15:54.819066 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jul 15 23:15:54.819072 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jul 15 23:15:54.819081 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jul 15 23:15:54.819088 kernel: Fallback order for Node 0: 0
Jul 15 23:15:54.819094 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Jul 15 23:15:54.819101 kernel: Policy zone: Normal
Jul 15 23:15:54.819107 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jul 15 23:15:54.819114 kernel: software IO TLB: area num 2.
Jul 15 23:15:54.819121 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jul 15 23:15:54.819127 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jul 15 23:15:54.819134 kernel: rcu: Preemptible hierarchical RCU implementation.
Jul 15 23:15:54.819141 kernel: rcu: RCU event tracing is enabled.
Jul 15 23:15:54.819148 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jul 15 23:15:54.819155 kernel: Trampoline variant of Tasks RCU enabled.
Jul 15 23:15:54.819162 kernel: Tracing variant of Tasks RCU enabled.
Jul 15 23:15:54.819169 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jul 15 23:15:54.819175 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jul 15 23:15:54.819182 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 23:15:54.819188 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 23:15:54.819195 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jul 15 23:15:54.819201 kernel: GICv3: 256 SPIs implemented
Jul 15 23:15:54.819208 kernel: GICv3: 0 Extended SPIs implemented
Jul 15 23:15:54.819249 kernel: Root IRQ handler: gic_handle_irq
Jul 15 23:15:54.819260 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jul 15 23:15:54.819267 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jul 15 23:15:54.819273 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jul 15 23:15:54.819283 kernel: ITS [mem 0x08080000-0x0809ffff]
Jul 15 23:15:54.819290 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Jul 15 23:15:54.819297 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Jul 15 23:15:54.819304 kernel: GICv3: using LPI property table @0x0000000100120000
Jul 15 23:15:54.819310 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Jul 15 23:15:54.819317 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jul 15 23:15:54.819324 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jul 15 23:15:54.819330 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jul 15 23:15:54.819366 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jul 15 23:15:54.819374 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jul 15 23:15:54.819381 kernel: Console: colour dummy device 80x25
Jul 15 23:15:54.819391 kernel: ACPI: Core revision 20240827
Jul 15 23:15:54.819399 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jul 15 23:15:54.819406 kernel: pid_max: default: 32768 minimum: 301
Jul 15 23:15:54.819413 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jul 15 23:15:54.819420 kernel: landlock: Up and running.
Jul 15 23:15:54.819427 kernel: SELinux: Initializing.
Jul 15 23:15:54.819434 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jul 15 23:15:54.819441 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jul 15 23:15:54.819448 kernel: rcu: Hierarchical SRCU implementation.
Jul 15 23:15:54.819456 kernel: rcu: Max phase no-delay instances is 400.
Jul 15 23:15:54.819463 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jul 15 23:15:54.819470 kernel: Remapping and enabling EFI services.
Jul 15 23:15:54.819478 kernel: smp: Bringing up secondary CPUs ...
Jul 15 23:15:54.819484 kernel: Detected PIPT I-cache on CPU1
Jul 15 23:15:54.819491 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jul 15 23:15:54.819498 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Jul 15 23:15:54.819505 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jul 15 23:15:54.819512 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jul 15 23:15:54.819520 kernel: smp: Brought up 1 node, 2 CPUs
Jul 15 23:15:54.819533 kernel: SMP: Total of 2 processors activated.
Jul 15 23:15:54.819540 kernel: CPU: All CPU(s) started at EL1
Jul 15 23:15:54.819549 kernel: CPU features: detected: 32-bit EL0 Support
Jul 15 23:15:54.819556 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jul 15 23:15:54.819564 kernel: CPU features: detected: Common not Private translations
Jul 15 23:15:54.822583 kernel: CPU features: detected: CRC32 instructions
Jul 15 23:15:54.822640 kernel: CPU features: detected: Enhanced Virtualization Traps
Jul 15 23:15:54.822657 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jul 15 23:15:54.822665 kernel: CPU features: detected: LSE atomic instructions
Jul 15 23:15:54.822673 kernel: CPU features: detected: Privileged Access Never
Jul 15 23:15:54.822682 kernel: CPU features: detected: RAS Extension Support
Jul 15 23:15:54.822690 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jul 15 23:15:54.822698 kernel: alternatives: applying system-wide alternatives
Jul 15 23:15:54.822706 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Jul 15 23:15:54.822714 kernel: Memory: 3859044K/4096000K available (11136K kernel code, 2436K rwdata, 9076K rodata, 39488K init, 1038K bss, 215476K reserved, 16384K cma-reserved)
Jul 15 23:15:54.822785 kernel: devtmpfs: initialized
Jul 15 23:15:54.822798 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jul 15 23:15:54.822805 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jul 15 23:15:54.822813 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jul 15 23:15:54.822820 kernel: 0 pages in range for non-PLT usage
Jul 15 23:15:54.822827 kernel: 508432 pages in range for PLT usage
Jul 15 23:15:54.822835 kernel: pinctrl core: initialized pinctrl subsystem
Jul 15 23:15:54.822842 kernel: SMBIOS 3.0.0 present.
Jul 15 23:15:54.822849 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Jul 15 23:15:54.822857 kernel: DMI: Memory slots populated: 1/1
Jul 15 23:15:54.822865 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jul 15 23:15:54.822873 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Jul 15 23:15:54.822880 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jul 15 23:15:54.822888 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jul 15 23:15:54.822895 kernel: audit: initializing netlink subsys (disabled)
Jul 15 23:15:54.822903 kernel: audit: type=2000 audit(0.013:1): state=initialized audit_enabled=0 res=1
Jul 15 23:15:54.822910 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jul 15 23:15:54.822917 kernel: cpuidle: using governor menu
Jul 15 23:15:54.822924 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jul 15 23:15:54.822979 kernel: ASID allocator initialised with 32768 entries
Jul 15 23:15:54.822988 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jul 15 23:15:54.822995 kernel: Serial: AMBA PL011 UART driver
Jul 15 23:15:54.823003 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jul 15 23:15:54.823010 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jul 15 23:15:54.823018 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jul 15 23:15:54.823025 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jul 15 23:15:54.823032 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jul 15 23:15:54.823039 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jul 15 23:15:54.823077 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jul 15 23:15:54.823093 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jul 15 23:15:54.823129 kernel: ACPI: Added _OSI(Module Device)
Jul 15 23:15:54.823139 kernel: ACPI: Added _OSI(Processor Device)
Jul 15 23:15:54.823149 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jul 15 23:15:54.823157 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jul 15 23:15:54.823164 kernel: ACPI: Interpreter enabled
Jul 15 23:15:54.823172 kernel: ACPI: Using GIC for interrupt routing
Jul 15 23:15:54.823179 kernel: ACPI: MCFG table detected, 1 entries
Jul 15 23:15:54.823190 kernel: ACPI: CPU0 has been hot-added
Jul 15 23:15:54.823197 kernel: ACPI: CPU1 has been hot-added
Jul 15 23:15:54.823205 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jul 15 23:15:54.823212 kernel: printk: legacy console [ttyAMA0] enabled
Jul 15 23:15:54.823220 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jul 15 23:15:54.823421 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jul 15 23:15:54.823503 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jul 15 23:15:54.823747 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jul 15 23:15:54.823825 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jul 15 23:15:54.823962 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jul 15 23:15:54.823997 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jul 15 23:15:54.824006 kernel: PCI host bridge to bus 0000:00
Jul 15 23:15:54.824096 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jul 15 23:15:54.824154 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jul 15 23:15:54.824207 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jul 15 23:15:54.824270 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jul 15 23:15:54.824356 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jul 15 23:15:54.824502 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Jul 15 23:15:54.825850 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Jul 15 23:15:54.826052 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Jul 15 23:15:54.826146 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.826222 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Jul 15 23:15:54.826282 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 23:15:54.826342 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Jul 15 23:15:54.826464 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jul 15 23:15:54.826547 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.826649 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Jul 15 23:15:54.826778 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 23:15:54.826902 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Jul 15 23:15:54.826994 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.827060 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Jul 15 23:15:54.827120 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 23:15:54.827187 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Jul 15 23:15:54.827311 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jul 15 23:15:54.827405 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.827474 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Jul 15 23:15:54.827534 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 23:15:54.831711 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Jul 15 23:15:54.831840 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jul 15 23:15:54.831915 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.832032 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Jul 15 23:15:54.832098 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 23:15:54.832169 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Jul 15 23:15:54.832230 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jul 15 23:15:54.832302 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.832363 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Jul 15 23:15:54.832423 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 23:15:54.832482 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Jul 15 23:15:54.832542 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jul 15 23:15:54.832658 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.832727 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Jul 15 23:15:54.832788 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 23:15:54.832846 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Jul 15 23:15:54.832909 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Jul 15 23:15:54.832991 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.833053 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Jul 15 23:15:54.833117 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 23:15:54.833176 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Jul 15 23:15:54.833245 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:15:54.833305 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Jul 15 23:15:54.833368 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 23:15:54.833428 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Jul 15 23:15:54.833498 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Jul 15 23:15:54.833562 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Jul 15 23:15:54.835803 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 15 23:15:54.835907 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Jul 15 23:15:54.835995 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jul 15 23:15:54.836067 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jul 15 23:15:54.836140 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jul 15 23:15:54.836214 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Jul 15 23:15:54.836285 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Jul 15 23:15:54.836351 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Jul 15 23:15:54.836413 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jul 15 23:15:54.836486 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jul 15 23:15:54.836549 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jul 15 23:15:54.836651 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jul 15 23:15:54.836722 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jul 15 23:15:54.836795 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Jul 15 23:15:54.836858 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Jul 15 23:15:54.836922 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jul 15 23:15:54.837049 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 15 23:15:54.837115 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Jul 15 23:15:54.837183 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Jul 15 23:15:54.837249 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jul 15 23:15:54.837317 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jul 15 23:15:54.837379 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jul 15 23:15:54.837437 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jul 15 23:15:54.837500 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jul 15 23:15:54.837559 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jul 15 23:15:54.838792 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jul 15 23:15:54.838894 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jul 15 23:15:54.838977 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jul 15 23:15:54.839041 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jul 15 23:15:54.839105 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jul 15 23:15:54.839165 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jul 15 23:15:54.839226 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jul 15 23:15:54.839303 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jul 15 23:15:54.839372 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jul 15 23:15:54.839430 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Jul 15 23:15:54.839494 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jul 15 23:15:54.839553 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jul 15 23:15:54.839731 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jul 15 23:15:54.839825 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jul 15 23:15:54.839891 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Jul 15 23:15:54.840016 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Jul 15 23:15:54.840092 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jul 15 23:15:54.840152 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jul 15 23:15:54.840210 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jul 15 23:15:54.840275 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jul 15 23:15:54.840340 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jul 15 23:15:54.840400 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jul 15 23:15:54.840464 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Jul 15 23:15:54.840533 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Jul 15 23:15:54.840613 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Jul 15 23:15:54.840677 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Jul 15 23:15:54.840740 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Jul 15 23:15:54.840798 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Jul 15 23:15:54.840866 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Jul 15 23:15:54.840925 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Jul 15 23:15:54.841021 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Jul 15 23:15:54.841174 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Jul 15 23:15:54.841253 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Jul 15 23:15:54.841314 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Jul 15 23:15:54.841375 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Jul 15 23:15:54.841442 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Jul 15 23:15:54.841507 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Jul 15 23:15:54.841565 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Jul 15 23:15:54.841650 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Jul 15 23:15:54.841712 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Jul 15 23:15:54.841777 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Jul 15 23:15:54.841837 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Jul 15 23:15:54.841899 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Jul 15 23:15:54.841987 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Jul 15 23:15:54.842054 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Jul 15 23:15:54.842114 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Jul 15 23:15:54.842177 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Jul 15 23:15:54.842240 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Jul 15 23:15:54.842301 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Jul 15 23:15:54.842362 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Jul 15 23:15:54.842485 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Jul 15 23:15:54.842560 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Jul 15 23:15:54.842731 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Jul 15 23:15:54.842804 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Jul 15 23:15:54.842867 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Jul 15 23:15:54.842951 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Jul 15 23:15:54.843026 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Jul 15 23:15:54.843088 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Jul 15 23:15:54.843151 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Jul 15 23:15:54.843210 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Jul 15 23:15:54.843278 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Jul 15 23:15:54.843352 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Jul 15 23:15:54.843425 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Jul 15 23:15:54.843501 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Jul 15 23:15:54.843569 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 23:15:54.843686 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jul 15 23:15:54.843754 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Jul 15 23:15:54.843820 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Jul 15 23:15:54.843894 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Jul 15 23:15:54.843994 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 23:15:54.844073 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jul 15 23:15:54.844140 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Jul 15 23:15:54.844324 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Jul 15 23:15:54.844409 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Jul 15 23:15:54.844480 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Jul 15 23:15:54.844548 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 23:15:54.844660 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jul 15 23:15:54.844738 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Jul 15 23:15:54.844810 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Jul 15 23:15:54.844903 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Jul 15 23:15:54.845003 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 23:15:54.845087 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jul 15 23:15:54.845156 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Jul 15 23:15:54.845224 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Jul 15 23:15:54.845306 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Jul 15 23:15:54.845376 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 23:15:54.845444 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jul 15 23:15:54.845513 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Jul 15 23:15:54.845580 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Jul 15 23:15:54.845674 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Jul 15 23:15:54.845746 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Jul 15 23:15:54.845818 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 23:15:54.845887 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jul 15 23:15:54.846019 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Jul 15 23:15:54.846093 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jul 15 23:15:54.846172 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Jul 15 23:15:54.846244 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Jul 15 23:15:54.846315 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Jul 15 23:15:54.846392 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 23:15:54.846461 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jul 15 23:15:54.846540 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Jul 15 23:15:54.846640 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Jul 15 23:15:54.846721 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 23:15:54.846789 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jul 15 23:15:54.846848 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Jul 15 23:15:54.846909 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Jul 15 23:15:54.846991 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 23:15:54.847056 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Jul 15 23:15:54.847153 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Jul 15 23:15:54.847219 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Jul 15 23:15:54.847282 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Jul 15 23:15:54.847337 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Jul 15 23:15:54.847391 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Jul 15 23:15:54.847462 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jul 15 23:15:54.847526 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Jul 15 23:15:54.847581 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Jul 15 23:15:54.847679 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Jul 15 23:15:54.847736 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Jul 15 23:15:54.847789 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Jul 15 23:15:54.847854 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Jul 15 23:15:54.847910 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Jul 15 23:15:54.848008 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Jul 15 23:15:54.848085 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Jul 15 23:15:54.848140 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Jul 15 23:15:54.848195 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Jul 15 23:15:54.848258 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Jul 15 23:15:54.848315 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Jul 15 23:15:54.848369 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Jul 15 23:15:54.848432 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Jul 15 23:15:54.848495 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Jul 15 23:15:54.848549 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jul 15 23:15:54.848657 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Jul
15 23:15:54.848723 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jul 15 23:15:54.848782 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jul 15 23:15:54.848914 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Jul 15 23:15:54.849007 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jul 15 23:15:54.849072 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jul 15 23:15:54.849136 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Jul 15 23:15:54.849192 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jul 15 23:15:54.849249 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jul 15 23:15:54.849258 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jul 15 23:15:54.849267 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jul 15 23:15:54.849274 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jul 15 23:15:54.849283 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jul 15 23:15:54.849291 kernel: iommu: Default domain type: Translated Jul 15 23:15:54.849298 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jul 15 23:15:54.849306 kernel: efivars: Registered efivars operations Jul 15 23:15:54.849313 kernel: vgaarb: loaded Jul 15 23:15:54.849321 kernel: clocksource: Switched to clocksource arch_sys_counter Jul 15 23:15:54.849328 kernel: VFS: Disk quotas dquot_6.6.0 Jul 15 23:15:54.849336 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 15 23:15:54.849344 kernel: pnp: PnP ACPI init Jul 15 23:15:54.849423 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jul 15 23:15:54.849435 kernel: pnp: PnP ACPI: found 1 devices Jul 15 23:15:54.849442 kernel: NET: Registered PF_INET protocol family Jul 15 23:15:54.849449 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jul 15 23:15:54.849457 kernel: 
tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jul 15 23:15:54.849465 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 15 23:15:54.849472 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 15 23:15:54.849480 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jul 15 23:15:54.849489 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jul 15 23:15:54.849497 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 15 23:15:54.849505 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 15 23:15:54.849512 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 15 23:15:54.849585 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jul 15 23:15:54.849614 kernel: PCI: CLS 0 bytes, default 64 Jul 15 23:15:54.849621 kernel: kvm [1]: HYP mode not available Jul 15 23:15:54.849629 kernel: Initialise system trusted keyrings Jul 15 23:15:54.849637 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jul 15 23:15:54.849647 kernel: Key type asymmetric registered Jul 15 23:15:54.849655 kernel: Asymmetric key parser 'x509' registered Jul 15 23:15:54.849662 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jul 15 23:15:54.849670 kernel: io scheduler mq-deadline registered Jul 15 23:15:54.849678 kernel: io scheduler kyber registered Jul 15 23:15:54.849685 kernel: io scheduler bfq registered Jul 15 23:15:54.849694 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jul 15 23:15:54.849767 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Jul 15 23:15:54.849831 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Jul 15 23:15:54.849894 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.849974 kernel: pcieport 0000:00:02.1: PME: Signaling 
with IRQ 51 Jul 15 23:15:54.850068 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Jul 15 23:15:54.850132 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.850196 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Jul 15 23:15:54.850256 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Jul 15 23:15:54.850317 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.850381 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Jul 15 23:15:54.850447 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Jul 15 23:15:54.850508 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.850573 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Jul 15 23:15:54.850669 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Jul 15 23:15:54.850731 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.850795 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Jul 15 23:15:54.850858 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Jul 15 23:15:54.850917 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.851036 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Jul 15 23:15:54.851106 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Jul 15 23:15:54.851165 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.851231 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Jul 15 
23:15:54.851293 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Jul 15 23:15:54.851353 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.851364 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jul 15 23:15:54.851428 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Jul 15 23:15:54.851490 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Jul 15 23:15:54.851550 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:15:54.851560 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jul 15 23:15:54.851569 kernel: ACPI: button: Power Button [PWRB] Jul 15 23:15:54.851579 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jul 15 23:15:54.851803 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jul 15 23:15:54.851879 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Jul 15 23:15:54.851891 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 15 23:15:54.851903 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jul 15 23:15:54.851988 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Jul 15 23:15:54.852001 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Jul 15 23:15:54.852009 kernel: thunder_xcv, ver 1.0 Jul 15 23:15:54.852016 kernel: thunder_bgx, ver 1.0 Jul 15 23:15:54.852024 kernel: nicpf, ver 1.0 Jul 15 23:15:54.852032 kernel: nicvf, ver 1.0 Jul 15 23:15:54.852113 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jul 15 23:15:54.852175 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-07-15T23:15:54 UTC (1752621354) Jul 15 23:15:54.852185 kernel: hid: raw HID events driver (C) Jiri Kosina Jul 15 23:15:54.852193 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jul 15 
23:15:54.852200 kernel: watchdog: NMI not fully supported Jul 15 23:15:54.852208 kernel: watchdog: Hard watchdog permanently disabled Jul 15 23:15:54.852216 kernel: NET: Registered PF_INET6 protocol family Jul 15 23:15:54.852223 kernel: Segment Routing with IPv6 Jul 15 23:15:54.852231 kernel: In-situ OAM (IOAM) with IPv6 Jul 15 23:15:54.852239 kernel: NET: Registered PF_PACKET protocol family Jul 15 23:15:54.852248 kernel: Key type dns_resolver registered Jul 15 23:15:54.852255 kernel: registered taskstats version 1 Jul 15 23:15:54.852263 kernel: Loading compiled-in X.509 certificates Jul 15 23:15:54.852271 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: 2e049b1166d7080a2074348abe7e86e115624bdd' Jul 15 23:15:54.852278 kernel: Demotion targets for Node 0: null Jul 15 23:15:54.852286 kernel: Key type .fscrypt registered Jul 15 23:15:54.852294 kernel: Key type fscrypt-provisioning registered Jul 15 23:15:54.852301 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 15 23:15:54.852311 kernel: ima: Allocated hash algorithm: sha1 Jul 15 23:15:54.852318 kernel: ima: No architecture policies found Jul 15 23:15:54.852326 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jul 15 23:15:54.852334 kernel: clk: Disabling unused clocks Jul 15 23:15:54.852342 kernel: PM: genpd: Disabling unused power domains Jul 15 23:15:54.852349 kernel: Warning: unable to open an initial console. Jul 15 23:15:54.852357 kernel: Freeing unused kernel memory: 39488K Jul 15 23:15:54.852364 kernel: Run /init as init process Jul 15 23:15:54.852372 kernel: with arguments: Jul 15 23:15:54.852380 kernel: /init Jul 15 23:15:54.852389 kernel: with environment: Jul 15 23:15:54.852396 kernel: HOME=/ Jul 15 23:15:54.852404 kernel: TERM=linux Jul 15 23:15:54.852412 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 15 23:15:54.852421 systemd[1]: Successfully made /usr/ read-only. 
Jul 15 23:15:54.852432 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 15 23:15:54.852440 systemd[1]: Detected virtualization kvm. Jul 15 23:15:54.852449 systemd[1]: Detected architecture arm64. Jul 15 23:15:54.852457 systemd[1]: Running in initrd. Jul 15 23:15:54.852465 systemd[1]: No hostname configured, using default hostname. Jul 15 23:15:54.852473 systemd[1]: Hostname set to . Jul 15 23:15:54.852481 systemd[1]: Initializing machine ID from VM UUID. Jul 15 23:15:54.852489 systemd[1]: Queued start job for default target initrd.target. Jul 15 23:15:54.852497 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 23:15:54.852505 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 15 23:15:54.852516 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 15 23:15:54.852524 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 15 23:15:54.852532 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 15 23:15:54.852541 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 15 23:15:54.852550 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 15 23:15:54.852558 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 15 23:15:54.852566 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
Jul 15 23:15:54.852576 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 15 23:15:54.852585 systemd[1]: Reached target paths.target - Path Units. Jul 15 23:15:54.852652 systemd[1]: Reached target slices.target - Slice Units. Jul 15 23:15:54.852661 systemd[1]: Reached target swap.target - Swaps. Jul 15 23:15:54.852669 systemd[1]: Reached target timers.target - Timer Units. Jul 15 23:15:54.852677 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 15 23:15:54.852689 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 15 23:15:54.852697 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 15 23:15:54.852707 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 15 23:15:54.852715 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 15 23:15:54.852724 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 15 23:15:54.852732 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 15 23:15:54.852740 systemd[1]: Reached target sockets.target - Socket Units. Jul 15 23:15:54.852748 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 15 23:15:54.852756 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 15 23:15:54.852764 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 15 23:15:54.852773 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 15 23:15:54.852783 systemd[1]: Starting systemd-fsck-usr.service... Jul 15 23:15:54.852792 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 15 23:15:54.852800 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Jul 15 23:15:54.852808 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 23:15:54.852816 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 15 23:15:54.852825 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 15 23:15:54.852834 systemd[1]: Finished systemd-fsck-usr.service. Jul 15 23:15:54.852843 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 15 23:15:54.852881 systemd-journald[244]: Collecting audit messages is disabled. Jul 15 23:15:54.852904 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 15 23:15:54.852912 kernel: Bridge firewalling registered Jul 15 23:15:54.852921 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 15 23:15:54.852944 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 15 23:15:54.852954 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 23:15:54.852962 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 15 23:15:54.852972 systemd-journald[244]: Journal started Jul 15 23:15:54.852995 systemd-journald[244]: Runtime Journal (/run/log/journal/275fba7bf51443ffbbfe9a8953517fa2) is 8M, max 76.5M, 68.5M free. Jul 15 23:15:54.799574 systemd-modules-load[245]: Inserted module 'overlay' Jul 15 23:15:54.823665 systemd-modules-load[245]: Inserted module 'br_netfilter' Jul 15 23:15:54.858676 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 15 23:15:54.860094 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jul 15 23:15:54.864633 systemd[1]: Started systemd-journald.service - Journal Service. 
Jul 15 23:15:54.875083 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 15 23:15:54.882676 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 15 23:15:54.891226 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 15 23:15:54.899329 systemd-tmpfiles[269]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 15 23:15:54.903188 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 15 23:15:54.906126 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 15 23:15:54.911707 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 15 23:15:54.913428 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jul 15 23:15:54.941107 dracut-cmdline[283]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=6efbcbd16e8e41b645be9f8e34b328753e37d282675200dab08e504f8e58a578 Jul 15 23:15:54.962200 systemd-resolved[280]: Positive Trust Anchors: Jul 15 23:15:54.964564 systemd-resolved[280]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 15 23:15:54.964632 systemd-resolved[280]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 15 23:15:54.977748 systemd-resolved[280]: Defaulting to hostname 'linux'. Jul 15 23:15:54.979967 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 15 23:15:54.981228 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 15 23:15:55.059651 kernel: SCSI subsystem initialized Jul 15 23:15:55.066623 kernel: Loading iSCSI transport class v2.0-870. Jul 15 23:15:55.074642 kernel: iscsi: registered transport (tcp) Jul 15 23:15:55.089825 kernel: iscsi: registered transport (qla4xxx) Jul 15 23:15:55.089989 kernel: QLogic iSCSI HBA Driver Jul 15 23:15:55.115207 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 15 23:15:55.145884 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 15 23:15:55.149667 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 15 23:15:55.209149 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 15 23:15:55.211705 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
Jul 15 23:15:55.278653 kernel: raid6: neonx8 gen() 15480 MB/s Jul 15 23:15:55.294651 kernel: raid6: neonx4 gen() 13289 MB/s Jul 15 23:15:55.311659 kernel: raid6: neonx2 gen() 13010 MB/s Jul 15 23:15:55.328662 kernel: raid6: neonx1 gen() 10303 MB/s Jul 15 23:15:55.345646 kernel: raid6: int64x8 gen() 6839 MB/s Jul 15 23:15:55.362645 kernel: raid6: int64x4 gen() 7300 MB/s Jul 15 23:15:55.381013 kernel: raid6: int64x2 gen() 6021 MB/s Jul 15 23:15:55.396660 kernel: raid6: int64x1 gen() 5009 MB/s Jul 15 23:15:55.396754 kernel: raid6: using algorithm neonx8 gen() 15480 MB/s Jul 15 23:15:55.413649 kernel: raid6: .... xor() 11947 MB/s, rmw enabled Jul 15 23:15:55.413727 kernel: raid6: using neon recovery algorithm Jul 15 23:15:55.418844 kernel: xor: measuring software checksum speed Jul 15 23:15:55.418898 kernel: 8regs : 20222 MB/sec Jul 15 23:15:55.419641 kernel: 32regs : 20598 MB/sec Jul 15 23:15:55.419698 kernel: arm64_neon : 22392 MB/sec Jul 15 23:15:55.419725 kernel: xor: using function: arm64_neon (22392 MB/sec) Jul 15 23:15:55.482661 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 15 23:15:55.492376 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 15 23:15:55.495450 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 15 23:15:55.529542 systemd-udevd[492]: Using default interface naming scheme 'v255'. Jul 15 23:15:55.535487 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 15 23:15:55.541312 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 15 23:15:55.576481 dracut-pre-trigger[501]: rd.md=0: removing MD RAID activation Jul 15 23:15:55.616375 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 15 23:15:55.619613 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 15 23:15:55.692119 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
Jul 15 23:15:55.696434 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 15 23:15:55.832618 kernel: ACPI: bus type USB registered Jul 15 23:15:55.842850 kernel: usbcore: registered new interface driver usbfs Jul 15 23:15:55.842936 kernel: usbcore: registered new interface driver hub Jul 15 23:15:55.842954 kernel: usbcore: registered new device driver usb Jul 15 23:15:55.849345 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jul 15 23:15:55.849549 kernel: scsi host0: Virtio SCSI HBA Jul 15 23:15:55.852640 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Jul 15 23:15:55.854638 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jul 15 23:15:55.863319 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 15 23:15:55.863768 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 23:15:55.865668 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 23:15:55.870089 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 23:15:55.871766 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. 
Jul 15 23:15:55.890820 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 15 23:15:55.891079 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jul 15 23:15:55.893605 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jul 15 23:15:55.894717 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 15 23:15:55.899666 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jul 15 23:15:55.899767 kernel: sd 0:0:0:1: Power-on or device reset occurred Jul 15 23:15:55.899877 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jul 15 23:15:55.900024 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jul 15 23:15:55.900110 kernel: sd 0:0:0:1: [sda] Write Protect is off Jul 15 23:15:55.900185 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Jul 15 23:15:55.900260 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jul 15 23:15:55.900620 kernel: sr 0:0:0:0: Power-on or device reset occurred Jul 15 23:15:55.901652 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Jul 15 23:15:55.901850 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jul 15 23:15:55.902955 kernel: hub 1-0:1.0: USB hub found Jul 15 23:15:55.903855 kernel: hub 1-0:1.0: 4 ports detected Jul 15 23:15:55.904379 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 23:15:55.905541 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jul 15 23:15:55.906612 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jul 15 23:15:55.907624 kernel: hub 2-0:1.0: USB hub found Jul 15 23:15:55.907797 kernel: hub 2-0:1.0: 4 ports detected Jul 15 23:15:55.911729 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. 
Jul 15 23:15:55.911787 kernel: GPT:17805311 != 80003071 Jul 15 23:15:55.911798 kernel: GPT:Alternate GPT header not at the end of the disk. Jul 15 23:15:55.911807 kernel: GPT:17805311 != 80003071 Jul 15 23:15:55.911816 kernel: GPT: Use GNU Parted to correct GPT errors. Jul 15 23:15:55.911825 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 23:15:55.914714 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Jul 15 23:15:55.980750 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jul 15 23:15:55.996835 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jul 15 23:15:56.006994 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 15 23:15:56.015410 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Jul 15 23:15:56.017146 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jul 15 23:15:56.020646 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 15 23:15:56.033568 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 15 23:15:56.036719 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 15 23:15:56.038575 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 15 23:15:56.039311 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 15 23:15:56.041880 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 15 23:15:56.053659 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 23:15:56.053912 disk-uuid[598]: Primary Header is updated. Jul 15 23:15:56.053912 disk-uuid[598]: Secondary Entries is updated. Jul 15 23:15:56.053912 disk-uuid[598]: Secondary Header is updated. Jul 15 23:15:56.078768 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Jul 15 23:15:56.141997 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jul 15 23:15:56.274670 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jul 15 23:15:56.275911 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jul 15 23:15:56.276133 kernel: usbcore: registered new interface driver usbhid Jul 15 23:15:56.276645 kernel: usbhid: USB HID core driver Jul 15 23:15:56.379624 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jul 15 23:15:56.506623 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jul 15 23:15:56.558643 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jul 15 23:15:57.085832 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 23:15:57.086024 disk-uuid[601]: The operation has completed successfully. Jul 15 23:15:57.159983 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 15 23:15:57.160918 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 15 23:15:57.188577 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 15 23:15:57.214582 sh[624]: Success Jul 15 23:15:57.230747 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 15 23:15:57.230819 kernel: device-mapper: uevent: version 1.0.3 Jul 15 23:15:57.232633 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 15 23:15:57.244618 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jul 15 23:15:57.310188 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
Jul 15 23:15:57.312471 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 15 23:15:57.326537 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jul 15 23:15:57.342653 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 15 23:15:57.343576 kernel: BTRFS: device fsid e70e9257-c19d-4e0a-b2ee-631da7d0eb2b devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (636) Jul 15 23:15:57.346783 kernel: BTRFS info (device dm-0): first mount of filesystem e70e9257-c19d-4e0a-b2ee-631da7d0eb2b Jul 15 23:15:57.346855 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jul 15 23:15:57.346867 kernel: BTRFS info (device dm-0): using free-space-tree Jul 15 23:15:57.358091 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 15 23:15:57.359748 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 15 23:15:57.361629 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 15 23:15:57.362564 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 15 23:15:57.365691 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 15 23:15:57.405868 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (670) Jul 15 23:15:57.405937 kernel: BTRFS info (device sda6): first mount of filesystem b155db48-94d7-40af-bc6d-97d496102c15 Jul 15 23:15:57.406648 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jul 15 23:15:57.407697 kernel: BTRFS info (device sda6): using free-space-tree Jul 15 23:15:57.418725 kernel: BTRFS info (device sda6): last unmount of filesystem b155db48-94d7-40af-bc6d-97d496102c15 Jul 15 23:15:57.422062 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Jul 15 23:15:57.424153 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jul 15 23:15:57.559687 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 23:15:57.562393 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 23:15:57.582733 ignition[714]: Ignition 2.21.0
Jul 15 23:15:57.583383 ignition[714]: Stage: fetch-offline
Jul 15 23:15:57.583786 ignition[714]: no configs at "/usr/lib/ignition/base.d"
Jul 15 23:15:57.583795 ignition[714]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:15:57.584057 ignition[714]: parsed url from cmdline: ""
Jul 15 23:15:57.584061 ignition[714]: no config URL provided
Jul 15 23:15:57.584066 ignition[714]: reading system config file "/usr/lib/ignition/user.ign"
Jul 15 23:15:57.584074 ignition[714]: no config at "/usr/lib/ignition/user.ign"
Jul 15 23:15:57.584080 ignition[714]: failed to fetch config: resource requires networking
Jul 15 23:15:57.587899 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 23:15:57.584277 ignition[714]: Ignition finished successfully
Jul 15 23:15:57.602838 systemd-networkd[811]: lo: Link UP
Jul 15 23:15:57.602855 systemd-networkd[811]: lo: Gained carrier
Jul 15 23:15:57.604846 systemd-networkd[811]: Enumeration completed
Jul 15 23:15:57.605096 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 23:15:57.606155 systemd-networkd[811]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:15:57.606159 systemd-networkd[811]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 23:15:57.606562 systemd-networkd[811]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:15:57.606565 systemd-networkd[811]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 23:15:57.606889 systemd-networkd[811]: eth0: Link UP
Jul 15 23:15:57.606892 systemd-networkd[811]: eth0: Gained carrier
Jul 15 23:15:57.606899 systemd-networkd[811]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:15:57.607458 systemd[1]: Reached target network.target - Network.
Jul 15 23:15:57.611143 systemd-networkd[811]: eth1: Link UP
Jul 15 23:15:57.611147 systemd-networkd[811]: eth1: Gained carrier
Jul 15 23:15:57.611162 systemd-networkd[811]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:15:57.611974 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jul 15 23:15:57.641224 ignition[815]: Ignition 2.21.0
Jul 15 23:15:57.641324 ignition[815]: Stage: fetch
Jul 15 23:15:57.642755 systemd-networkd[811]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Jul 15 23:15:57.641544 ignition[815]: no configs at "/usr/lib/ignition/base.d"
Jul 15 23:15:57.641554 ignition[815]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:15:57.641683 ignition[815]: parsed url from cmdline: ""
Jul 15 23:15:57.641686 ignition[815]: no config URL provided
Jul 15 23:15:57.641692 ignition[815]: reading system config file "/usr/lib/ignition/user.ign"
Jul 15 23:15:57.641699 ignition[815]: no config at "/usr/lib/ignition/user.ign"
Jul 15 23:15:57.641820 ignition[815]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Jul 15 23:15:57.643478 ignition[815]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Jul 15 23:15:57.670835 systemd-networkd[811]: eth0: DHCPv4 address 91.99.212.32/32, gateway 172.31.1.1 acquired from 172.31.1.1
Jul 15 23:15:57.845127 ignition[815]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Jul 15 23:15:57.852282 ignition[815]: GET result: OK
Jul 15 23:15:57.852628 ignition[815]: parsing config with SHA512: 9285cfdfe012b0e189105bf2a48eb24d980de91affc3d54cc46f7b2b52aee7768ccf31faadd1bbac3675158875e62369bc5e48de90bd85d913c0a91711fa1bfe
Jul 15 23:15:57.859930 unknown[815]: fetched base config from "system"
Jul 15 23:15:57.860476 unknown[815]: fetched base config from "system"
Jul 15 23:15:57.860944 ignition[815]: fetch: fetch complete
Jul 15 23:15:57.860484 unknown[815]: fetched user config from "hetzner"
Jul 15 23:15:57.860952 ignition[815]: fetch: fetch passed
Jul 15 23:15:57.861028 ignition[815]: Ignition finished successfully
Jul 15 23:15:57.865412 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jul 15 23:15:57.870741 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jul 15 23:15:57.901159 ignition[823]: Ignition 2.21.0
Jul 15 23:15:57.901179 ignition[823]: Stage: kargs
Jul 15 23:15:57.901360 ignition[823]: no configs at "/usr/lib/ignition/base.d"
Jul 15 23:15:57.901369 ignition[823]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:15:57.902323 ignition[823]: kargs: kargs passed
Jul 15 23:15:57.902378 ignition[823]: Ignition finished successfully
Jul 15 23:15:57.905669 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jul 15 23:15:57.909810 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jul 15 23:15:57.940575 ignition[830]: Ignition 2.21.0
Jul 15 23:15:57.940626 ignition[830]: Stage: disks
Jul 15 23:15:57.940825 ignition[830]: no configs at "/usr/lib/ignition/base.d"
Jul 15 23:15:57.940838 ignition[830]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:15:57.944262 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jul 15 23:15:57.942010 ignition[830]: disks: disks passed
Jul 15 23:15:57.944972 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jul 15 23:15:57.942085 ignition[830]: Ignition finished successfully
Jul 15 23:15:57.946116 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jul 15 23:15:57.947024 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 23:15:57.948025 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 23:15:57.948887 systemd[1]: Reached target basic.target - Basic System.
Jul 15 23:15:57.950702 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jul 15 23:15:57.984014 systemd-fsck[839]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Jul 15 23:15:57.988080 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jul 15 23:15:57.990529 systemd[1]: Mounting sysroot.mount - /sysroot...
Jul 15 23:15:58.084614 kernel: EXT4-fs (sda9): mounted filesystem db08fdf6-07fd-45a1-bb3b-a7d0399d70fd r/w with ordered data mode. Quota mode: none.
Jul 15 23:15:58.086008 systemd[1]: Mounted sysroot.mount - /sysroot.
Jul 15 23:15:58.087281 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jul 15 23:15:58.089847 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 23:15:58.092708 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jul 15 23:15:58.096934 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Jul 15 23:15:58.097540 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jul 15 23:15:58.097575 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 23:15:58.110113 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jul 15 23:15:58.111535 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jul 15 23:15:58.133714 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (847)
Jul 15 23:15:58.138274 kernel: BTRFS info (device sda6): first mount of filesystem b155db48-94d7-40af-bc6d-97d496102c15
Jul 15 23:15:58.138346 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jul 15 23:15:58.140837 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 23:15:58.155016 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 23:15:58.181296 initrd-setup-root[874]: cut: /sysroot/etc/passwd: No such file or directory
Jul 15 23:15:58.189506 initrd-setup-root[881]: cut: /sysroot/etc/group: No such file or directory
Jul 15 23:15:58.191506 coreos-metadata[849]: Jul 15 23:15:58.190 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Jul 15 23:15:58.193755 coreos-metadata[849]: Jul 15 23:15:58.193 INFO Fetch successful
Jul 15 23:15:58.194336 coreos-metadata[849]: Jul 15 23:15:58.193 INFO wrote hostname ci-4372-0-1-n-21be50a87e to /sysroot/etc/hostname
Jul 15 23:15:58.198321 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 15 23:15:58.200352 initrd-setup-root[888]: cut: /sysroot/etc/shadow: No such file or directory
Jul 15 23:15:58.204386 initrd-setup-root[896]: cut: /sysroot/etc/gshadow: No such file or directory
Jul 15 23:15:58.327368 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jul 15 23:15:58.331841 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jul 15 23:15:58.335328 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jul 15 23:15:58.354060 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jul 15 23:15:58.355045 kernel: BTRFS info (device sda6): last unmount of filesystem b155db48-94d7-40af-bc6d-97d496102c15
Jul 15 23:15:58.377654 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jul 15 23:15:58.389104 ignition[964]: INFO : Ignition 2.21.0
Jul 15 23:15:58.389104 ignition[964]: INFO : Stage: mount
Jul 15 23:15:58.390442 ignition[964]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 23:15:58.390442 ignition[964]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:15:58.391740 ignition[964]: INFO : mount: mount passed
Jul 15 23:15:58.391740 ignition[964]: INFO : Ignition finished successfully
Jul 15 23:15:58.393131 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jul 15 23:15:58.395659 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jul 15 23:15:58.422604 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 23:15:58.449649 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (976)
Jul 15 23:15:58.451737 kernel: BTRFS info (device sda6): first mount of filesystem b155db48-94d7-40af-bc6d-97d496102c15
Jul 15 23:15:58.451862 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jul 15 23:15:58.451927 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 23:15:58.460548 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 23:15:58.497681 ignition[994]: INFO : Ignition 2.21.0
Jul 15 23:15:58.497681 ignition[994]: INFO : Stage: files
Jul 15 23:15:58.497681 ignition[994]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 23:15:58.497681 ignition[994]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:15:58.502725 ignition[994]: DEBUG : files: compiled without relabeling support, skipping
Jul 15 23:15:58.505802 ignition[994]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jul 15 23:15:58.505802 ignition[994]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jul 15 23:15:58.510022 ignition[994]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jul 15 23:15:58.511129 ignition[994]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jul 15 23:15:58.512339 unknown[994]: wrote ssh authorized keys file for user: core
Jul 15 23:15:58.515180 ignition[994]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jul 15 23:15:58.516279 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Jul 15 23:15:58.516279 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Jul 15 23:15:58.609760 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jul 15 23:15:58.834666 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Jul 15 23:15:58.834666 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jul 15 23:15:58.837985 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jul 15 23:15:58.837985 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 23:15:58.837985 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 23:15:58.837985 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 23:15:58.837985 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 23:15:58.837985 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 23:15:58.837985 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 23:15:58.847421 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 23:15:58.847421 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 23:15:58.847421 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Jul 15 23:15:58.847421 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Jul 15 23:15:58.847421 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Jul 15 23:15:58.847421 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Jul 15 23:15:59.267877 systemd-networkd[811]: eth1: Gained IPv6LL
Jul 15 23:15:59.331824 systemd-networkd[811]: eth0: Gained IPv6LL
Jul 15 23:15:59.598333 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jul 15 23:16:00.854022 ignition[994]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Jul 15 23:16:00.854022 ignition[994]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jul 15 23:16:00.858522 ignition[994]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 23:16:00.860794 ignition[994]: INFO : files: files passed
Jul 15 23:16:00.860794 ignition[994]: INFO : Ignition finished successfully
Jul 15 23:16:00.862420 systemd[1]: Finished ignition-files.service - Ignition (files).
Jul 15 23:16:00.867096 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jul 15 23:16:00.871939 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jul 15 23:16:00.890393 systemd[1]: ignition-quench.service: Deactivated successfully.
Jul 15 23:16:00.890513 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jul 15 23:16:00.898252 initrd-setup-root-after-ignition[1022]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 23:16:00.898252 initrd-setup-root-after-ignition[1022]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 23:16:00.900810 initrd-setup-root-after-ignition[1026]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 23:16:00.901690 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 23:16:00.902867 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jul 15 23:16:00.904710 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jul 15 23:16:00.959688 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jul 15 23:16:00.959917 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jul 15 23:16:00.962736 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jul 15 23:16:00.964321 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jul 15 23:16:00.965269 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jul 15 23:16:00.966163 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jul 15 23:16:00.997631 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 23:16:01.000018 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jul 15 23:16:01.030230 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jul 15 23:16:01.031619 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 23:16:01.033014 systemd[1]: Stopped target timers.target - Timer Units.
Jul 15 23:16:01.033548 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jul 15 23:16:01.033702 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 23:16:01.035704 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jul 15 23:16:01.036317 systemd[1]: Stopped target basic.target - Basic System.
Jul 15 23:16:01.037850 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jul 15 23:16:01.039252 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 23:16:01.040569 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jul 15 23:16:01.042085 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jul 15 23:16:01.043636 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jul 15 23:16:01.045265 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 15 23:16:01.048069 systemd[1]: Stopped target sysinit.target - System Initialization.
Jul 15 23:16:01.049445 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jul 15 23:16:01.051304 systemd[1]: Stopped target swap.target - Swaps.
Jul 15 23:16:01.052182 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jul 15 23:16:01.052398 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jul 15 23:16:01.054255 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jul 15 23:16:01.055504 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 23:16:01.056553 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jul 15 23:16:01.059658 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 23:16:01.060393 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jul 15 23:16:01.060521 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jul 15 23:16:01.062446 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jul 15 23:16:01.062650 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 23:16:01.064804 systemd[1]: ignition-files.service: Deactivated successfully.
Jul 15 23:16:01.065032 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jul 15 23:16:01.066512 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Jul 15 23:16:01.066644 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 15 23:16:01.068446 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jul 15 23:16:01.071866 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jul 15 23:16:01.074696 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jul 15 23:16:01.075540 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 23:16:01.077084 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jul 15 23:16:01.077776 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 15 23:16:01.085003 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jul 15 23:16:01.087638 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jul 15 23:16:01.095795 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jul 15 23:16:01.104635 ignition[1046]: INFO : Ignition 2.21.0
Jul 15 23:16:01.108994 ignition[1046]: INFO : Stage: umount
Jul 15 23:16:01.108994 ignition[1046]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 23:16:01.108994 ignition[1046]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:16:01.108994 ignition[1046]: INFO : umount: umount passed
Jul 15 23:16:01.108994 ignition[1046]: INFO : Ignition finished successfully
Jul 15 23:16:01.113068 systemd[1]: ignition-mount.service: Deactivated successfully.
Jul 15 23:16:01.113176 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jul 15 23:16:01.116414 systemd[1]: ignition-disks.service: Deactivated successfully.
Jul 15 23:16:01.116531 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jul 15 23:16:01.120395 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jul 15 23:16:01.120501 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jul 15 23:16:01.122345 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jul 15 23:16:01.122419 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jul 15 23:16:01.124839 systemd[1]: Stopped target network.target - Network.
Jul 15 23:16:01.127143 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jul 15 23:16:01.127226 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 23:16:01.128220 systemd[1]: Stopped target paths.target - Path Units.
Jul 15 23:16:01.129369 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jul 15 23:16:01.132665 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 23:16:01.136538 systemd[1]: Stopped target slices.target - Slice Units.
Jul 15 23:16:01.139563 systemd[1]: Stopped target sockets.target - Socket Units.
Jul 15 23:16:01.140345 systemd[1]: iscsid.socket: Deactivated successfully.
Jul 15 23:16:01.140479 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jul 15 23:16:01.143520 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jul 15 23:16:01.143571 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jul 15 23:16:01.144527 systemd[1]: ignition-setup.service: Deactivated successfully.
Jul 15 23:16:01.144630 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jul 15 23:16:01.147452 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jul 15 23:16:01.147512 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jul 15 23:16:01.148460 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jul 15 23:16:01.150247 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jul 15 23:16:01.152551 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jul 15 23:16:01.153362 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jul 15 23:16:01.156072 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jul 15 23:16:01.156211 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jul 15 23:16:01.157169 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jul 15 23:16:01.157274 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jul 15 23:16:01.160859 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Jul 15 23:16:01.161640 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jul 15 23:16:01.161717 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 23:16:01.165789 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jul 15 23:16:01.166142 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jul 15 23:16:01.166317 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jul 15 23:16:01.168961 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Jul 15 23:16:01.169774 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jul 15 23:16:01.170486 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jul 15 23:16:01.170546 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 23:16:01.172441 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jul 15 23:16:01.172999 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jul 15 23:16:01.173058 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 23:16:01.173705 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jul 15 23:16:01.173747 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jul 15 23:16:01.174383 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jul 15 23:16:01.174420 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jul 15 23:16:01.175137 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 23:16:01.183010 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jul 15 23:16:01.197683 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jul 15 23:16:01.198848 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 23:16:01.200657 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jul 15 23:16:01.200724 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jul 15 23:16:01.202177 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jul 15 23:16:01.202242 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 23:16:01.203447 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jul 15 23:16:01.203515 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jul 15 23:16:01.204394 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jul 15 23:16:01.204444 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jul 15 23:16:01.206241 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jul 15 23:16:01.206296 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 15 23:16:01.209003 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jul 15 23:16:01.212704 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jul 15 23:16:01.212792 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 23:16:01.214208 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jul 15 23:16:01.214260 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 23:16:01.216708 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jul 15 23:16:01.216758 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 23:16:01.217794 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jul 15 23:16:01.217836 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 23:16:01.218511 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 23:16:01.218548 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 23:16:01.220465 systemd[1]: network-cleanup.service: Deactivated successfully.
Jul 15 23:16:01.224179 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jul 15 23:16:01.234119 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jul 15 23:16:01.234253 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jul 15 23:16:01.235161 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jul 15 23:16:01.237648 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jul 15 23:16:01.257639 systemd[1]: Switching root.
Jul 15 23:16:01.286890 systemd-journald[244]: Journal stopped
Jul 15 23:16:02.359619 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Jul 15 23:16:02.359702 kernel: SELinux: policy capability network_peer_controls=1
Jul 15 23:16:02.359714 kernel: SELinux: policy capability open_perms=1
Jul 15 23:16:02.359724 kernel: SELinux: policy capability extended_socket_class=1
Jul 15 23:16:02.359733 kernel: SELinux: policy capability always_check_network=0
Jul 15 23:16:02.359745 kernel: SELinux: policy capability cgroup_seclabel=1
Jul 15 23:16:02.359756 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jul 15 23:16:02.359765 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jul 15 23:16:02.359778 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jul 15 23:16:02.359789 kernel: SELinux: policy capability userspace_initial_context=0
Jul 15 23:16:02.359798 kernel: audit: type=1403 audit(1752621361.428:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jul 15 23:16:02.359808 systemd[1]: Successfully loaded SELinux policy in 57.169ms.
Jul 15 23:16:02.359825 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.567ms.
Jul 15 23:16:02.359836 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jul 15 23:16:02.359848 systemd[1]: Detected virtualization kvm.
Jul 15 23:16:02.359858 systemd[1]: Detected architecture arm64.
Jul 15 23:16:02.359881 systemd[1]: Detected first boot.
Jul 15 23:16:02.359895 systemd[1]: Hostname set to .
Jul 15 23:16:02.359905 systemd[1]: Initializing machine ID from VM UUID.
Jul 15 23:16:02.359915 zram_generator::config[1089]: No configuration found.
Jul 15 23:16:02.359929 kernel: NET: Registered PF_VSOCK protocol family
Jul 15 23:16:02.359939 systemd[1]: Populated /etc with preset unit settings.
Jul 15 23:16:02.359952 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Jul 15 23:16:02.359962 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jul 15 23:16:02.359972 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jul 15 23:16:02.359982 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jul 15 23:16:02.359991 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jul 15 23:16:02.360001 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jul 15 23:16:02.360011 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jul 15 23:16:02.360020 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jul 15 23:16:02.360036 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jul 15 23:16:02.360047 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jul 15 23:16:02.360058 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jul 15 23:16:02.360068 systemd[1]: Created slice user.slice - User and Session Slice.
Jul 15 23:16:02.360077 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 23:16:02.360087 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 23:16:02.360099 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jul 15 23:16:02.360109 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jul 15 23:16:02.360119 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jul 15 23:16:02.360130 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 15 23:16:02.360140 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Jul 15 23:16:02.360152 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 23:16:02.360163 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 15 23:16:02.360173 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jul 15 23:16:02.360183 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jul 15 23:16:02.360193 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jul 15 23:16:02.360209 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jul 15 23:16:02.360219 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 23:16:02.360229 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 23:16:02.360239 systemd[1]: Reached target slices.target - Slice Units.
Jul 15 23:16:02.360249 systemd[1]: Reached target swap.target - Swaps.
Jul 15 23:16:02.360259 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jul 15 23:16:02.360270 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jul 15 23:16:02.360279 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jul 15 23:16:02.360289 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 23:16:02.360302 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 15 23:16:02.360312 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 23:16:02.360322 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jul 15 23:16:02.360333 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jul 15 23:16:02.360343 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jul 15 23:16:02.360353 systemd[1]: Mounting media.mount - External Media Directory...
Jul 15 23:16:02.360364 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jul 15 23:16:02.360375 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jul 15 23:16:02.360386 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jul 15 23:16:02.360397 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jul 15 23:16:02.360407 systemd[1]: Reached target machines.target - Containers.
Jul 15 23:16:02.360417 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jul 15 23:16:02.360428 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 23:16:02.360437 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 15 23:16:02.360449 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jul 15 23:16:02.360459 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 23:16:02.360469 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 23:16:02.360479 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 23:16:02.360488 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jul 15 23:16:02.360499 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 23:16:02.360509 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jul 15 23:16:02.360519 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jul 15 23:16:02.360529 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jul 15 23:16:02.360541 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jul 15 23:16:02.360551 systemd[1]: Stopped systemd-fsck-usr.service.
Jul 15 23:16:02.360562 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 23:16:02.360572 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 15 23:16:02.360583 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 15 23:16:02.364755 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 23:16:02.368659 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jul 15 23:16:02.368688 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jul 15 23:16:02.368701 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 23:16:02.368715 systemd[1]: verity-setup.service: Deactivated successfully.
Jul 15 23:16:02.368726 systemd[1]: Stopped verity-setup.service.
Jul 15 23:16:02.368737 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jul 15 23:16:02.368747 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jul 15 23:16:02.368758 systemd[1]: Mounted media.mount - External Media Directory.
Jul 15 23:16:02.368768 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jul 15 23:16:02.368780 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jul 15 23:16:02.368790 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jul 15 23:16:02.368802 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 23:16:02.368816 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jul 15 23:16:02.368827 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jul 15 23:16:02.368838 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 23:16:02.368852 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 23:16:02.368904 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 23:16:02.368918 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 23:16:02.368931 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 15 23:16:02.368941 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jul 15 23:16:02.368954 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 23:16:02.368965 kernel: fuse: init (API version 7.41)
Jul 15 23:16:02.368978 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jul 15 23:16:02.368990 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jul 15 23:16:02.369000 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jul 15 23:16:02.369011 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jul 15 23:16:02.369021 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 23:16:02.369032 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jul 15 23:16:02.369044 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jul 15 23:16:02.369056 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 23:16:02.369067 kernel: loop: module loaded
Jul 15 23:16:02.369076 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jul 15 23:16:02.369087 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 23:16:02.369099 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jul 15 23:16:02.369110 kernel: ACPI: bus type drm_connector registered
Jul 15 23:16:02.369122 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jul 15 23:16:02.369132 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jul 15 23:16:02.369176 systemd-journald[1157]: Collecting audit messages is disabled.
Jul 15 23:16:02.369205 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 23:16:02.369216 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 23:16:02.369229 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jul 15 23:16:02.369239 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jul 15 23:16:02.369252 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 23:16:02.369266 systemd-journald[1157]: Journal started
Jul 15 23:16:02.369292 systemd-journald[1157]: Runtime Journal (/run/log/journal/275fba7bf51443ffbbfe9a8953517fa2) is 8M, max 76.5M, 68.5M free.
Jul 15 23:16:02.377679 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 23:16:02.377758 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 23:16:02.009108 systemd[1]: Queued start job for default target multi-user.target.
Jul 15 23:16:02.015504 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jul 15 23:16:02.016025 systemd[1]: systemd-journald.service: Deactivated successfully.
Jul 15 23:16:02.379049 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 15 23:16:02.380264 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jul 15 23:16:02.382669 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 23:16:02.402810 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jul 15 23:16:02.412338 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jul 15 23:16:02.413845 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 23:16:02.417894 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jul 15 23:16:02.422351 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jul 15 23:16:02.423109 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 23:16:02.429264 systemd-tmpfiles[1185]: ACLs are not supported, ignoring.
Jul 15 23:16:02.429281 systemd-tmpfiles[1185]: ACLs are not supported, ignoring.
Jul 15 23:16:02.433532 kernel: loop0: detected capacity change from 0 to 211168
Jul 15 23:16:02.447640 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 23:16:02.454245 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jul 15 23:16:02.468806 systemd-journald[1157]: Time spent on flushing to /var/log/journal/275fba7bf51443ffbbfe9a8953517fa2 is 54.473ms for 1172 entries.
Jul 15 23:16:02.468806 systemd-journald[1157]: System Journal (/var/log/journal/275fba7bf51443ffbbfe9a8953517fa2) is 8M, max 584.8M, 576.8M free.
Jul 15 23:16:02.544838 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jul 15 23:16:02.544897 systemd-journald[1157]: Received client request to flush runtime journal.
Jul 15 23:16:02.544925 kernel: loop1: detected capacity change from 0 to 138376
Jul 15 23:16:02.504254 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jul 15 23:16:02.547787 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jul 15 23:16:02.574613 kernel: loop2: detected capacity change from 0 to 8
Jul 15 23:16:02.603639 kernel: loop3: detected capacity change from 0 to 107312
Jul 15 23:16:02.610792 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 23:16:02.611991 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jul 15 23:16:02.618163 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 23:16:02.656617 kernel: loop4: detected capacity change from 0 to 211168
Jul 15 23:16:02.667283 systemd-tmpfiles[1230]: ACLs are not supported, ignoring.
Jul 15 23:16:02.667307 systemd-tmpfiles[1230]: ACLs are not supported, ignoring.
Jul 15 23:16:02.677729 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 23:16:02.688633 kernel: loop5: detected capacity change from 0 to 138376
Jul 15 23:16:02.718634 kernel: loop6: detected capacity change from 0 to 8
Jul 15 23:16:02.721627 kernel: loop7: detected capacity change from 0 to 107312
Jul 15 23:16:02.736847 (sd-merge)[1233]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Jul 15 23:16:02.737346 (sd-merge)[1233]: Merged extensions into '/usr'.
Jul 15 23:16:02.744678 systemd[1]: Reload requested from client PID 1190 ('systemd-sysext') (unit systemd-sysext.service)...
Jul 15 23:16:02.744707 systemd[1]: Reloading...
Jul 15 23:16:02.855578 zram_generator::config[1256]: No configuration found.
Jul 15 23:16:03.035502 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 23:16:03.063622 ldconfig[1186]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jul 15 23:16:03.118002 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jul 15 23:16:03.118270 systemd[1]: Reloading finished in 373 ms.
Jul 15 23:16:03.138079 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jul 15 23:16:03.140629 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jul 15 23:16:03.147964 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jul 15 23:16:03.154721 systemd[1]: Starting ensure-sysext.service...
Jul 15 23:16:03.159953 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jul 15 23:16:03.174144 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jul 15 23:16:03.197578 systemd[1]: Reload requested from client PID 1298 ('systemctl') (unit ensure-sysext.service)...
Jul 15 23:16:03.197778 systemd[1]: Reloading...
Jul 15 23:16:03.225321 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jul 15 23:16:03.225361 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jul 15 23:16:03.225583 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jul 15 23:16:03.225812 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jul 15 23:16:03.226464 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jul 15 23:16:03.228957 systemd-tmpfiles[1299]: ACLs are not supported, ignoring.
Jul 15 23:16:03.229083 systemd-tmpfiles[1299]: ACLs are not supported, ignoring.
Jul 15 23:16:03.235514 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 23:16:03.235530 systemd-tmpfiles[1299]: Skipping /boot
Jul 15 23:16:03.250543 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 23:16:03.252656 systemd-tmpfiles[1299]: Skipping /boot
Jul 15 23:16:03.293636 zram_generator::config[1324]: No configuration found.
Jul 15 23:16:03.385216 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 23:16:03.464772 systemd[1]: Reloading finished in 266 ms.
Jul 15 23:16:03.479529 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jul 15 23:16:03.486317 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 23:16:03.496775 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jul 15 23:16:03.499750 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jul 15 23:16:03.503941 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jul 15 23:16:03.507020 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 15 23:16:03.514820 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 23:16:03.519138 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jul 15 23:16:03.531740 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jul 15 23:16:03.537948 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 23:16:03.540504 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 23:16:03.553620 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 23:16:03.570602 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 23:16:03.573043 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 23:16:03.573216 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 23:16:03.578749 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jul 15 23:16:03.581812 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 23:16:03.582066 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 23:16:03.582173 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 23:16:03.585979 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jul 15 23:16:03.591451 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 23:16:03.596137 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 23:16:03.597800 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 23:16:03.598024 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 23:16:03.615018 systemd[1]: Finished ensure-sysext.service.
Jul 15 23:16:03.624543 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Jul 15 23:16:03.628672 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jul 15 23:16:03.644119 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 23:16:03.644424 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 23:16:03.667969 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jul 15 23:16:03.669109 systemd-udevd[1370]: Using default interface naming scheme 'v255'.
Jul 15 23:16:03.669423 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 23:16:03.670242 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 23:16:03.672126 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 23:16:03.672323 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 23:16:03.675645 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jul 15 23:16:03.677281 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 23:16:03.678007 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 23:16:03.683479 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 23:16:03.684990 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 23:16:03.685023 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jul 15 23:16:03.689583 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jul 15 23:16:03.719117 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 23:16:03.722583 augenrules[1412]: No rules
Jul 15 23:16:03.725804 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 23:16:03.730402 systemd[1]: audit-rules.service: Deactivated successfully.
Jul 15 23:16:03.730966 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jul 15 23:16:03.829357 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Jul 15 23:16:03.931532 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jul 15 23:16:03.933018 systemd[1]: Reached target time-set.target - System Time Set.
Jul 15 23:16:03.933678 systemd-resolved[1369]: Positive Trust Anchors:
Jul 15 23:16:03.933701 systemd-resolved[1369]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 15 23:16:03.933733 systemd-resolved[1369]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 15 23:16:03.940426 systemd-resolved[1369]: Using system hostname 'ci-4372-0-1-n-21be50a87e'.
Jul 15 23:16:03.942091 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 15 23:16:03.942806 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 15 23:16:03.943514 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 23:16:03.946020 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jul 15 23:16:03.947557 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jul 15 23:16:03.949204 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jul 15 23:16:03.950806 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jul 15 23:16:03.952699 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jul 15 23:16:03.953416 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jul 15 23:16:03.953455 systemd[1]: Reached target paths.target - Path Units.
Jul 15 23:16:03.954184 systemd[1]: Reached target timers.target - Timer Units.
Jul 15 23:16:03.960081 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jul 15 23:16:03.964472 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jul 15 23:16:03.984420 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Jul 15 23:16:03.987251 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Jul 15 23:16:03.989283 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Jul 15 23:16:04.002919 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jul 15 23:16:04.006899 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Jul 15 23:16:04.011371 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jul 15 23:16:04.015865 systemd[1]: Reached target sockets.target - Socket Units.
Jul 15 23:16:04.017319 systemd[1]: Reached target basic.target - Basic System.
Jul 15 23:16:04.018810 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jul 15 23:16:04.018842 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jul 15 23:16:04.022978 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jul 15 23:16:04.027929 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jul 15 23:16:04.031464 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jul 15 23:16:04.036744 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jul 15 23:16:04.043757 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jul 15 23:16:04.044757 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jul 15 23:16:04.051284 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jul 15 23:16:04.058070 jq[1471]: false
Jul 15 23:16:04.057423 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jul 15 23:16:04.066134 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jul 15 23:16:04.074805 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jul 15 23:16:04.094379 systemd[1]: Starting systemd-logind.service - User Login Management...
Jul 15 23:16:04.096869 extend-filesystems[1472]: Found /dev/sda6
Jul 15 23:16:04.098115 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jul 15 23:16:04.098754 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jul 15 23:16:04.103969 systemd[1]: Starting update-engine.service - Update Engine...
Jul 15 23:16:04.107336 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jul 15 23:16:04.110326 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jul 15 23:16:04.111724 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jul 15 23:16:04.126682 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jul 15 23:16:04.131092 extend-filesystems[1472]: Found /dev/sda9
Jul 15 23:16:04.143099 extend-filesystems[1472]: Checking size of /dev/sda9
Jul 15 23:16:04.151408 jq[1483]: true
Jul 15 23:16:04.152923 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jul 15 23:16:04.153232 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jul 15 23:16:04.205415 extend-filesystems[1472]: Resized partition /dev/sda9
Jul 15 23:16:04.215067 jq[1497]: true
Jul 15 23:16:04.215353 extend-filesystems[1508]: resize2fs 1.47.2 (1-Jan-2025)
Jul 15 23:16:04.222767 tar[1486]: linux-arm64/LICENSE
Jul 15 23:16:04.222767 tar[1486]: linux-arm64/helm
Jul 15 23:16:04.226522 update_engine[1482]: I20250715 23:16:04.225759 1482 main.cc:92] Flatcar Update Engine starting
Jul 15 23:16:04.241962 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Jul 15 23:16:04.256618 coreos-metadata[1468]: Jul 15 23:16:04.255 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Jul 15 23:16:04.257163 dbus-daemon[1469]: [system] SELinux support is enabled
Jul 15 23:16:04.259660 coreos-metadata[1468]: Jul 15 23:16:04.257 INFO Failed to fetch: error sending request for url (http://169.254.169.254/hetzner/v1/metadata)
Jul 15 23:16:04.258197 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Jul 15 23:16:04.262808 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Jul 15 23:16:04.262875 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Jul 15 23:16:04.264216 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Jul 15 23:16:04.264236 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Jul 15 23:16:04.269030 systemd[1]: motdgen.service: Deactivated successfully.
Jul 15 23:16:04.277744 update_engine[1482]: I20250715 23:16:04.277664 1482 update_check_scheduler.cc:74] Next update check in 4m44s
Jul 15 23:16:04.310830 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Jul 15 23:16:04.322108 systemd[1]: Started update-engine.service - Update Engine. Jul 15 23:16:04.329738 kernel: mousedev: PS/2 mouse device common for all mice Jul 15 23:16:04.332383 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 15 23:16:04.344524 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 15 23:16:04.348401 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jul 15 23:16:04.351370 systemd-networkd[1422]: lo: Link UP Jul 15 23:16:04.351379 systemd-networkd[1422]: lo: Gained carrier Jul 15 23:16:04.366582 systemd-networkd[1422]: Enumeration completed Jul 15 23:16:04.366869 systemd[1]: Started systemd-networkd.service - Network Configuration. Jul 15 23:16:04.368961 systemd[1]: Reached target network.target - Network. Jul 15 23:16:04.376108 systemd[1]: Starting containerd.service - containerd container runtime... Jul 15 23:16:04.377779 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 23:16:04.377795 systemd-networkd[1422]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 15 23:16:04.385939 systemd-networkd[1422]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 23:16:04.385950 systemd-networkd[1422]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jul 15 23:16:04.387689 systemd-networkd[1422]: eth0: Link UP Jul 15 23:16:04.388004 systemd-networkd[1422]: eth0: Gained carrier Jul 15 23:16:04.388029 systemd-networkd[1422]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 23:16:04.395325 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... 
Jul 15 23:16:04.400118 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jul 15 23:16:04.427686 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jul 15 23:16:04.450267 systemd-networkd[1422]: eth1: Link UP Jul 15 23:16:04.456001 systemd-networkd[1422]: eth1: Gained carrier Jul 15 23:16:04.456034 systemd-networkd[1422]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Jul 15 23:16:04.465098 bash[1530]: Updated "/home/core/.ssh/authorized_keys" Jul 15 23:16:04.468709 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 15 23:16:04.470313 extend-filesystems[1508]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jul 15 23:16:04.470313 extend-filesystems[1508]: old_desc_blocks = 1, new_desc_blocks = 5 Jul 15 23:16:04.470313 extend-filesystems[1508]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jul 15 23:16:04.472254 extend-filesystems[1472]: Resized filesystem in /dev/sda9 Jul 15 23:16:04.478906 systemd[1]: Starting sshkeys.service... Jul 15 23:16:04.479637 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 15 23:16:04.481057 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 15 23:16:04.496311 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 15 23:16:04.497123 systemd-networkd[1422]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 Jul 15 23:16:04.503685 systemd-networkd[1422]: eth0: DHCPv4 address 91.99.212.32/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jul 15 23:16:04.507784 systemd-timesyncd[1387]: Network configuration changed, trying to establish connection. 
Jul 15 23:16:04.510146 (ntainerd)[1538]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 15 23:16:04.512256 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jul 15 23:16:04.516539 systemd-timesyncd[1387]: Network configuration changed, trying to establish connection. Jul 15 23:16:04.517403 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jul 15 23:16:04.520060 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jul 15 23:16:04.521980 systemd-timesyncd[1387]: Network configuration changed, trying to establish connection. Jul 15 23:16:04.587572 systemd-logind[1480]: New seat seat0. Jul 15 23:16:04.591184 systemd[1]: Started systemd-logind.service - User Login Management. Jul 15 23:16:04.658032 coreos-metadata[1543]: Jul 15 23:16:04.656 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jul 15 23:16:04.659205 coreos-metadata[1543]: Jul 15 23:16:04.658 INFO Fetch successful Jul 15 23:16:04.661581 unknown[1543]: wrote ssh authorized keys file for user: core Jul 15 23:16:04.712227 update-ssh-keys[1555]: Updated "/home/core/.ssh/authorized_keys" Jul 15 23:16:04.715925 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jul 15 23:16:04.718061 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jul 15 23:16:04.724718 systemd[1]: Finished sshkeys.service. Jul 15 23:16:04.731955 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. 
Jul 15 23:16:04.768258 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Jul 15 23:16:04.768362 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jul 15 23:16:04.768378 kernel: [drm] features: -context_init Jul 15 23:16:04.778134 kernel: [drm] number of scanouts: 1 Jul 15 23:16:04.778203 kernel: [drm] number of cap sets: 0 Jul 15 23:16:04.787447 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jul 15 23:16:04.840110 locksmithd[1515]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 15 23:16:04.925069 containerd[1538]: time="2025-07-15T23:16:04Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 15 23:16:04.929612 containerd[1538]: time="2025-07-15T23:16:04.928147840Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Jul 15 23:16:04.941183 containerd[1538]: time="2025-07-15T23:16:04.941119360Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.8µs" Jul 15 23:16:04.941318 containerd[1538]: time="2025-07-15T23:16:04.941300440Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 15 23:16:04.941384 containerd[1538]: time="2025-07-15T23:16:04.941369360Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 15 23:16:04.941723 containerd[1538]: time="2025-07-15T23:16:04.941697160Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 15 23:16:04.941807 containerd[1538]: time="2025-07-15T23:16:04.941788560Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 15 23:16:04.941942 containerd[1538]: 
time="2025-07-15T23:16:04.941926200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 23:16:04.942084 containerd[1538]: time="2025-07-15T23:16:04.942063680Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 23:16:04.942654 containerd[1538]: time="2025-07-15T23:16:04.942627200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 23:16:04.943021 containerd[1538]: time="2025-07-15T23:16:04.942989440Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 23:16:04.943122 containerd[1538]: time="2025-07-15T23:16:04.943102840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 23:16:04.943183 containerd[1538]: time="2025-07-15T23:16:04.943165760Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 23:16:04.943237 containerd[1538]: time="2025-07-15T23:16:04.943223760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 15 23:16:04.943383 containerd[1538]: time="2025-07-15T23:16:04.943362640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 15 23:16:04.943692 containerd[1538]: time="2025-07-15T23:16:04.943669440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 23:16:04.943792 containerd[1538]: time="2025-07-15T23:16:04.943775560Z" level=info msg="skip loading 
plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 23:16:04.943980 containerd[1538]: time="2025-07-15T23:16:04.943830800Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 15 23:16:04.944078 containerd[1538]: time="2025-07-15T23:16:04.944061160Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 15 23:16:04.945555 containerd[1538]: time="2025-07-15T23:16:04.945474480Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 15 23:16:04.946001 containerd[1538]: time="2025-07-15T23:16:04.945977080Z" level=info msg="metadata content store policy set" policy=shared Jul 15 23:16:04.961185 containerd[1538]: time="2025-07-15T23:16:04.961134560Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961412160Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961514200Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961533160Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961559160Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961574600Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 15 23:16:04.962293 containerd[1538]: 
time="2025-07-15T23:16:04.961609040Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961623840Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961636680Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961648000Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961660360Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961674680Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961896280Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.961964240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 15 23:16:04.962293 containerd[1538]: time="2025-07-15T23:16:04.962140880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 15 23:16:04.962735 containerd[1538]: time="2025-07-15T23:16:04.962163680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 15 23:16:04.962735 containerd[1538]: time="2025-07-15T23:16:04.962175200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 15 23:16:04.962735 containerd[1538]: time="2025-07-15T23:16:04.962186840Z" 
level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 15 23:16:04.962735 containerd[1538]: time="2025-07-15T23:16:04.962213520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 15 23:16:04.962735 containerd[1538]: time="2025-07-15T23:16:04.962226160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 15 23:16:04.962735 containerd[1538]: time="2025-07-15T23:16:04.962239320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 15 23:16:04.962735 containerd[1538]: time="2025-07-15T23:16:04.962252280Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 15 23:16:04.962735 containerd[1538]: time="2025-07-15T23:16:04.962265920Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 15 23:16:04.963528 containerd[1538]: time="2025-07-15T23:16:04.963155360Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 15 23:16:04.963528 containerd[1538]: time="2025-07-15T23:16:04.963185880Z" level=info msg="Start snapshots syncer" Jul 15 23:16:04.963528 containerd[1538]: time="2025-07-15T23:16:04.963233000Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 15 23:16:04.964428 containerd[1538]: time="2025-07-15T23:16:04.963967480Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 15 23:16:04.964428 containerd[1538]: time="2025-07-15T23:16:04.964049440Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 15 23:16:04.964604 containerd[1538]: time="2025-07-15T23:16:04.964193200Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.964817360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.964866080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.964896760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.964909960Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.964923000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.964939920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.965003840Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.965092000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.965109400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.965132840Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.965265920Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.965288440Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 23:16:04.965521 containerd[1538]: time="2025-07-15T23:16:04.965310840Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 23:16:04.965934 containerd[1538]: time="2025-07-15T23:16:04.965323280Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 23:16:04.965934 containerd[1538]: time="2025-07-15T23:16:04.965331760Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 15 23:16:04.965934 containerd[1538]: time="2025-07-15T23:16:04.965341640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 15 23:16:04.965934 containerd[1538]: time="2025-07-15T23:16:04.965356240Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 15 23:16:04.965934 containerd[1538]: time="2025-07-15T23:16:04.965445360Z" level=info msg="runtime interface created" Jul 15 23:16:04.965934 containerd[1538]: time="2025-07-15T23:16:04.965452080Z" level=info msg="created NRI interface" Jul 15 23:16:04.965934 containerd[1538]: time="2025-07-15T23:16:04.965461040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 15 23:16:04.965934 containerd[1538]: time="2025-07-15T23:16:04.965477560Z" level=info msg="Connect containerd service" Jul 15 23:16:04.966761 containerd[1538]: time="2025-07-15T23:16:04.966098560Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 15 23:16:04.967497 
containerd[1538]: time="2025-07-15T23:16:04.967464280Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 15 23:16:05.034640 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 23:16:05.137200 systemd-logind[1480]: Watching system buttons on /dev/input/event0 (Power Button) Jul 15 23:16:05.145307 systemd-logind[1480]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jul 15 23:16:05.160415 sshd_keygen[1499]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 15 23:16:05.194563 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 23:16:05.231627 containerd[1538]: time="2025-07-15T23:16:05.231504760Z" level=info msg="Start subscribing containerd event" Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233597680Z" level=info msg="Start recovering state" Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233737080Z" level=info msg="Start event monitor" Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233756680Z" level=info msg="Start cni network conf syncer for default" Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233767800Z" level=info msg="Start streaming server" Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233778640Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233785680Z" level=info msg="runtime interface starting up..." Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233791920Z" level=info msg="starting plugins..." 
Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233806360Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 15 23:16:05.234056 containerd[1538]: time="2025-07-15T23:16:05.233935440Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 15 23:16:05.234475 containerd[1538]: time="2025-07-15T23:16:05.234406480Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 15 23:16:05.236666 containerd[1538]: time="2025-07-15T23:16:05.234562360Z" level=info msg="containerd successfully booted in 0.310062s" Jul 15 23:16:05.234696 systemd[1]: Started containerd.service - containerd container runtime. Jul 15 23:16:05.255294 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 15 23:16:05.259913 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 15 23:16:05.261104 coreos-metadata[1468]: Jul 15 23:16:05.260 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #2 Jul 15 23:16:05.264729 coreos-metadata[1468]: Jul 15 23:16:05.263 INFO Fetch successful Jul 15 23:16:05.264729 coreos-metadata[1468]: Jul 15 23:16:05.264 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jul 15 23:16:05.265216 coreos-metadata[1468]: Jul 15 23:16:05.265 INFO Fetch successful Jul 15 23:16:05.301087 systemd[1]: issuegen.service: Deactivated successfully. Jul 15 23:16:05.301333 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 15 23:16:05.307041 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 15 23:16:05.354270 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 15 23:16:05.356997 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 15 23:16:05.360111 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jul 15 23:16:05.360922 systemd[1]: Reached target getty.target - Login Prompts. 
Jul 15 23:16:05.392625 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jul 15 23:16:05.395207 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 15 23:16:05.544067 tar[1486]: linux-arm64/README.md Jul 15 23:16:05.566772 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jul 15 23:16:05.988168 systemd-networkd[1422]: eth1: Gained IPv6LL Jul 15 23:16:05.989380 systemd-timesyncd[1387]: Network configuration changed, trying to establish connection. Jul 15 23:16:05.992719 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 15 23:16:05.995575 systemd[1]: Reached target network-online.target - Network is Online. Jul 15 23:16:05.999224 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:16:06.003761 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 15 23:16:06.034253 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 15 23:16:06.371798 systemd-networkd[1422]: eth0: Gained IPv6LL Jul 15 23:16:06.372279 systemd-timesyncd[1387]: Network configuration changed, trying to establish connection. Jul 15 23:16:06.915143 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:16:06.920052 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 15 23:16:06.924160 systemd[1]: Startup finished in 2.429s (kernel) + 6.810s (initrd) + 5.552s (userspace) = 14.792s. 
Jul 15 23:16:06.926252 (kubelet)[1644]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:16:07.534763 kubelet[1644]: E0715 23:16:07.534669 1644 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:16:07.539673 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:16:07.539905 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:16:07.540977 systemd[1]: kubelet.service: Consumed 1.003s CPU time, 259.5M memory peak. Jul 15 23:16:17.790451 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jul 15 23:16:17.794120 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:16:17.961810 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:16:17.975772 (kubelet)[1664]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:16:18.027571 kubelet[1664]: E0715 23:16:18.027521 1664 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:16:18.032950 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:16:18.033096 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:16:18.035685 systemd[1]: kubelet.service: Consumed 181ms CPU time, 105.4M memory peak. 
Jul 15 23:16:28.284314 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jul 15 23:16:28.288075 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:16:28.458299 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:16:28.488316 (kubelet)[1679]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:16:28.534190 kubelet[1679]: E0715 23:16:28.534085 1679 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:16:28.537478 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:16:28.537721 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:16:28.538217 systemd[1]: kubelet.service: Consumed 174ms CPU time, 104.3M memory peak. Jul 15 23:16:36.514986 systemd-timesyncd[1387]: Contacted time server 195.201.22.203:123 (2.flatcar.pool.ntp.org). Jul 15 23:16:36.515724 systemd-timesyncd[1387]: Initial clock synchronization to Tue 2025-07-15 23:16:36.392244 UTC. Jul 15 23:16:38.753721 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jul 15 23:16:38.756035 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:16:38.938618 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 15 23:16:38.951662 (kubelet)[1694]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:16:39.014955 kubelet[1694]: E0715 23:16:39.014848 1694 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:16:39.018820 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:16:39.019021 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:16:39.019918 systemd[1]: kubelet.service: Consumed 185ms CPU time, 105.7M memory peak. Jul 15 23:16:42.530964 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jul 15 23:16:42.533343 systemd[1]: Started sshd@0-91.99.212.32:22-139.178.68.195:35202.service - OpenSSH per-connection server daemon (139.178.68.195:35202). Jul 15 23:16:43.553130 sshd[1703]: Accepted publickey for core from 139.178.68.195 port 35202 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:16:43.557250 sshd-session[1703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:16:43.567942 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 15 23:16:43.569487 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 15 23:16:43.579257 systemd-logind[1480]: New session 1 of user core. Jul 15 23:16:43.593140 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 15 23:16:43.597219 systemd[1]: Starting user@500.service - User Manager for UID 500... 
Jul 15 23:16:43.616412 (systemd)[1707]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 15 23:16:43.620037 systemd-logind[1480]: New session c1 of user core. Jul 15 23:16:43.765652 systemd[1707]: Queued start job for default target default.target. Jul 15 23:16:43.790479 systemd[1707]: Created slice app.slice - User Application Slice. Jul 15 23:16:43.790876 systemd[1707]: Reached target paths.target - Paths. Jul 15 23:16:43.791086 systemd[1707]: Reached target timers.target - Timers. Jul 15 23:16:43.793045 systemd[1707]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 15 23:16:43.806370 systemd[1707]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 15 23:16:43.806511 systemd[1707]: Reached target sockets.target - Sockets. Jul 15 23:16:43.806616 systemd[1707]: Reached target basic.target - Basic System. Jul 15 23:16:43.806674 systemd[1707]: Reached target default.target - Main User Target. Jul 15 23:16:43.806715 systemd[1707]: Startup finished in 178ms. Jul 15 23:16:43.807151 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 15 23:16:43.814160 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 15 23:16:44.512455 systemd[1]: Started sshd@1-91.99.212.32:22-139.178.68.195:35204.service - OpenSSH per-connection server daemon (139.178.68.195:35204). Jul 15 23:16:45.516720 sshd[1718]: Accepted publickey for core from 139.178.68.195 port 35204 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:16:45.518613 sshd-session[1718]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:16:45.526762 systemd-logind[1480]: New session 2 of user core. Jul 15 23:16:45.532891 systemd[1]: Started session-2.scope - Session 2 of User core. 
Jul 15 23:16:46.198466 sshd[1720]: Connection closed by 139.178.68.195 port 35204 Jul 15 23:16:46.199531 sshd-session[1718]: pam_unix(sshd:session): session closed for user core Jul 15 23:16:46.206314 systemd[1]: sshd@1-91.99.212.32:22-139.178.68.195:35204.service: Deactivated successfully. Jul 15 23:16:46.208275 systemd[1]: session-2.scope: Deactivated successfully. Jul 15 23:16:46.209671 systemd-logind[1480]: Session 2 logged out. Waiting for processes to exit. Jul 15 23:16:46.211774 systemd-logind[1480]: Removed session 2. Jul 15 23:16:46.392011 systemd[1]: Started sshd@2-91.99.212.32:22-139.178.68.195:35212.service - OpenSSH per-connection server daemon (139.178.68.195:35212). Jul 15 23:16:47.410049 sshd[1726]: Accepted publickey for core from 139.178.68.195 port 35212 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:16:47.414039 sshd-session[1726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:16:47.422973 systemd-logind[1480]: New session 3 of user core. Jul 15 23:16:47.428814 systemd[1]: Started session-3.scope - Session 3 of User core. Jul 15 23:16:48.093808 sshd[1728]: Connection closed by 139.178.68.195 port 35212 Jul 15 23:16:48.096134 sshd-session[1726]: pam_unix(sshd:session): session closed for user core Jul 15 23:16:48.103564 systemd[1]: sshd@2-91.99.212.32:22-139.178.68.195:35212.service: Deactivated successfully. Jul 15 23:16:48.108103 systemd[1]: session-3.scope: Deactivated successfully. Jul 15 23:16:48.111153 systemd-logind[1480]: Session 3 logged out. Waiting for processes to exit. Jul 15 23:16:48.113866 systemd-logind[1480]: Removed session 3. Jul 15 23:16:48.278351 systemd[1]: Started sshd@3-91.99.212.32:22-139.178.68.195:35228.service - OpenSSH per-connection server daemon (139.178.68.195:35228). Jul 15 23:16:49.131865 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. 
Jul 15 23:16:49.133652 update_engine[1482]: I20250715 23:16:49.133419 1482 update_attempter.cc:509] Updating boot flags... Jul 15 23:16:49.135762 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:16:49.303159 sshd[1734]: Accepted publickey for core from 139.178.68.195 port 35228 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:16:49.305022 sshd-session[1734]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:16:49.311582 systemd-logind[1480]: New session 4 of user core. Jul 15 23:16:49.316945 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 15 23:16:49.383826 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:16:49.405174 (kubelet)[1761]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:16:49.452152 kubelet[1761]: E0715 23:16:49.452062 1761 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:16:49.456101 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:16:49.456524 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:16:49.458785 systemd[1]: kubelet.service: Consumed 183ms CPU time, 104.2M memory peak. Jul 15 23:16:49.996816 sshd[1755]: Connection closed by 139.178.68.195 port 35228 Jul 15 23:16:49.997745 sshd-session[1734]: pam_unix(sshd:session): session closed for user core Jul 15 23:16:50.002803 systemd-logind[1480]: Session 4 logged out. Waiting for processes to exit. Jul 15 23:16:50.004299 systemd[1]: sshd@3-91.99.212.32:22-139.178.68.195:35228.service: Deactivated successfully. 
Jul 15 23:16:50.006849 systemd[1]: session-4.scope: Deactivated successfully. Jul 15 23:16:50.010582 systemd-logind[1480]: Removed session 4. Jul 15 23:16:50.176838 systemd[1]: Started sshd@4-91.99.212.32:22-139.178.68.195:35234.service - OpenSSH per-connection server daemon (139.178.68.195:35234). Jul 15 23:16:51.186869 sshd[1773]: Accepted publickey for core from 139.178.68.195 port 35234 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:16:51.188940 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:16:51.196145 systemd-logind[1480]: New session 5 of user core. Jul 15 23:16:51.203077 systemd[1]: Started session-5.scope - Session 5 of User core. Jul 15 23:16:51.744222 sudo[1776]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 15 23:16:51.745300 sudo[1776]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 23:16:51.765470 sudo[1776]: pam_unix(sudo:session): session closed for user root Jul 15 23:16:51.928984 sshd[1775]: Connection closed by 139.178.68.195 port 35234 Jul 15 23:16:51.928791 sshd-session[1773]: pam_unix(sshd:session): session closed for user core Jul 15 23:16:51.934339 systemd[1]: sshd@4-91.99.212.32:22-139.178.68.195:35234.service: Deactivated successfully. Jul 15 23:16:51.936330 systemd[1]: session-5.scope: Deactivated successfully. Jul 15 23:16:51.939930 systemd-logind[1480]: Session 5 logged out. Waiting for processes to exit. Jul 15 23:16:51.941881 systemd-logind[1480]: Removed session 5. Jul 15 23:16:52.103870 systemd[1]: Started sshd@5-91.99.212.32:22-139.178.68.195:34376.service - OpenSSH per-connection server daemon (139.178.68.195:34376). 
Jul 15 23:16:53.114225 sshd[1782]: Accepted publickey for core from 139.178.68.195 port 34376 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:16:53.116048 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:16:53.123440 systemd-logind[1480]: New session 6 of user core. Jul 15 23:16:53.130032 systemd[1]: Started session-6.scope - Session 6 of User core. Jul 15 23:16:53.643151 sudo[1786]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 15 23:16:53.643474 sudo[1786]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 23:16:53.650852 sudo[1786]: pam_unix(sudo:session): session closed for user root Jul 15 23:16:53.658007 sudo[1785]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 15 23:16:53.658419 sudo[1785]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 23:16:53.670223 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 15 23:16:53.729740 augenrules[1808]: No rules Jul 15 23:16:53.731792 systemd[1]: audit-rules.service: Deactivated successfully. Jul 15 23:16:53.732057 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 15 23:16:53.734780 sudo[1785]: pam_unix(sudo:session): session closed for user root Jul 15 23:16:53.896219 sshd[1784]: Connection closed by 139.178.68.195 port 34376 Jul 15 23:16:53.897416 sshd-session[1782]: pam_unix(sshd:session): session closed for user core Jul 15 23:16:53.904248 systemd[1]: sshd@5-91.99.212.32:22-139.178.68.195:34376.service: Deactivated successfully. Jul 15 23:16:53.907125 systemd[1]: session-6.scope: Deactivated successfully. Jul 15 23:16:53.910058 systemd-logind[1480]: Session 6 logged out. Waiting for processes to exit. Jul 15 23:16:53.911855 systemd-logind[1480]: Removed session 6. 
Jul 15 23:16:54.064766 systemd[1]: Started sshd@6-91.99.212.32:22-139.178.68.195:34384.service - OpenSSH per-connection server daemon (139.178.68.195:34384). Jul 15 23:16:55.066669 sshd[1817]: Accepted publickey for core from 139.178.68.195 port 34384 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:16:55.071394 sshd-session[1817]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:16:55.079857 systemd-logind[1480]: New session 7 of user core. Jul 15 23:16:55.087065 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 15 23:16:55.589015 sudo[1820]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 15 23:16:55.589368 sudo[1820]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 23:16:55.987076 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 15 23:16:56.001581 (dockerd)[1838]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 15 23:16:56.312014 dockerd[1838]: time="2025-07-15T23:16:56.311918871Z" level=info msg="Starting up" Jul 15 23:16:56.317248 dockerd[1838]: time="2025-07-15T23:16:56.317174257Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 15 23:16:56.371150 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1124216447-merged.mount: Deactivated successfully. Jul 15 23:16:56.386743 systemd[1]: var-lib-docker-metacopy\x2dcheck2900417661-merged.mount: Deactivated successfully. Jul 15 23:16:56.399279 dockerd[1838]: time="2025-07-15T23:16:56.399105520Z" level=info msg="Loading containers: start." Jul 15 23:16:56.410637 kernel: Initializing XFRM netlink socket Jul 15 23:16:56.720692 systemd-networkd[1422]: docker0: Link UP Jul 15 23:16:56.727136 dockerd[1838]: time="2025-07-15T23:16:56.727057619Z" level=info msg="Loading containers: done." 
Jul 15 23:16:56.753063 dockerd[1838]: time="2025-07-15T23:16:56.752997956Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 15 23:16:56.753307 dockerd[1838]: time="2025-07-15T23:16:56.753110140Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Jul 15 23:16:56.753307 dockerd[1838]: time="2025-07-15T23:16:56.753250809Z" level=info msg="Initializing buildkit" Jul 15 23:16:56.788539 dockerd[1838]: time="2025-07-15T23:16:56.788266018Z" level=info msg="Completed buildkit initialization" Jul 15 23:16:56.799287 dockerd[1838]: time="2025-07-15T23:16:56.799203272Z" level=info msg="Daemon has completed initialization" Jul 15 23:16:56.800643 dockerd[1838]: time="2025-07-15T23:16:56.799477140Z" level=info msg="API listen on /run/docker.sock" Jul 15 23:16:56.801752 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 15 23:16:57.367716 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3577333553-merged.mount: Deactivated successfully. Jul 15 23:16:57.650955 containerd[1538]: time="2025-07-15T23:16:57.649987759Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.3\"" Jul 15 23:16:58.378853 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1018896425.mount: Deactivated successfully. 
Jul 15 23:16:59.462345 containerd[1538]: time="2025-07-15T23:16:59.462237862Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:16:59.465756 containerd[1538]: time="2025-07-15T23:16:59.465690418Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.3: active requests=0, bytes read=27352186" Jul 15 23:16:59.467777 containerd[1538]: time="2025-07-15T23:16:59.467718651Z" level=info msg="ImageCreate event name:\"sha256:c0425f3fe3fbf33c17a14d49c43d4fd0b60b2254511902d5b2c29e53ca684fc9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:16:59.471926 containerd[1538]: time="2025-07-15T23:16:59.471821678Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:125a8b488def5ea24e2de5682ab1abf063163aae4d89ce21811a45f3ecf23816\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:16:59.473132 containerd[1538]: time="2025-07-15T23:16:59.473074580Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.3\" with image id \"sha256:c0425f3fe3fbf33c17a14d49c43d4fd0b60b2254511902d5b2c29e53ca684fc9\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:125a8b488def5ea24e2de5682ab1abf063163aae4d89ce21811a45f3ecf23816\", size \"27348894\" in 1.822713054s" Jul 15 23:16:59.473132 containerd[1538]: time="2025-07-15T23:16:59.473130894Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.3\" returns image reference \"sha256:c0425f3fe3fbf33c17a14d49c43d4fd0b60b2254511902d5b2c29e53ca684fc9\"" Jul 15 23:16:59.475172 containerd[1538]: time="2025-07-15T23:16:59.475132748Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.3\"" Jul 15 23:16:59.505376 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. 
Jul 15 23:16:59.509301 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:16:59.697397 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:16:59.713956 (kubelet)[2103]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:16:59.766151 kubelet[2103]: E0715 23:16:59.766065 2103 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:16:59.769920 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:16:59.770276 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:16:59.771269 systemd[1]: kubelet.service: Consumed 187ms CPU time, 107.1M memory peak. 
Jul 15 23:17:01.118998 containerd[1538]: time="2025-07-15T23:17:01.118743476Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:01.120652 containerd[1538]: time="2025-07-15T23:17:01.120514775Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.3: active requests=0, bytes read=23537866" Jul 15 23:17:01.122265 containerd[1538]: time="2025-07-15T23:17:01.122182058Z" level=info msg="ImageCreate event name:\"sha256:ef439b94d49d41d1b377c316fb053adb88bf6b26ec7e63aaf3deba953b7c766f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:01.127019 containerd[1538]: time="2025-07-15T23:17:01.126917953Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:96091626e37c5d5920ee6c3203b783cc01a08f287ec0713aeb7809bb62ccea90\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:01.129124 containerd[1538]: time="2025-07-15T23:17:01.128173613Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.3\" with image id \"sha256:ef439b94d49d41d1b377c316fb053adb88bf6b26ec7e63aaf3deba953b7c766f\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:96091626e37c5d5920ee6c3203b783cc01a08f287ec0713aeb7809bb62ccea90\", size \"25092764\" in 1.652995099s" Jul 15 23:17:01.129124 containerd[1538]: time="2025-07-15T23:17:01.128238173Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.3\" returns image reference \"sha256:ef439b94d49d41d1b377c316fb053adb88bf6b26ec7e63aaf3deba953b7c766f\"" Jul 15 23:17:01.129124 containerd[1538]: time="2025-07-15T23:17:01.128867221Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.3\"" Jul 15 23:17:02.579682 containerd[1538]: time="2025-07-15T23:17:02.579510313Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:02.581896 containerd[1538]: time="2025-07-15T23:17:02.581568034Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.3: active requests=0, bytes read=18293544" Jul 15 23:17:02.583774 containerd[1538]: time="2025-07-15T23:17:02.583630152Z" level=info msg="ImageCreate event name:\"sha256:c03972dff86ba78247043f2b6171ce436ab9323da7833b18924c3d8e29ea37a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:02.588133 containerd[1538]: time="2025-07-15T23:17:02.587634774Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f3a2ffdd7483168205236f7762e9a1933f17dd733bc0188b52bddab9c0762868\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:02.588952 containerd[1538]: time="2025-07-15T23:17:02.588898686Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.3\" with image id \"sha256:c03972dff86ba78247043f2b6171ce436ab9323da7833b18924c3d8e29ea37a5\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f3a2ffdd7483168205236f7762e9a1933f17dd733bc0188b52bddab9c0762868\", size \"19848460\" in 1.459994886s" Jul 15 23:17:02.588952 containerd[1538]: time="2025-07-15T23:17:02.588948099Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.3\" returns image reference \"sha256:c03972dff86ba78247043f2b6171ce436ab9323da7833b18924c3d8e29ea37a5\"" Jul 15 23:17:02.590613 containerd[1538]: time="2025-07-15T23:17:02.590565179Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.3\"" Jul 15 23:17:03.685871 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3708050638.mount: Deactivated successfully.
Jul 15 23:17:04.092653 containerd[1538]: time="2025-07-15T23:17:04.092544776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:04.094634 containerd[1538]: time="2025-07-15T23:17:04.094485688Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.3: active requests=0, bytes read=28199498" Jul 15 23:17:04.096055 containerd[1538]: time="2025-07-15T23:17:04.095954756Z" level=info msg="ImageCreate event name:\"sha256:738e99dbd7325e2cdd650d83d59a79c7ecb005ab0d5bf029fc15c54ee9359306\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:04.099197 containerd[1538]: time="2025-07-15T23:17:04.099112321Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:c69929cfba9e38305eb1e20ca859aeb90e0d2a7326eab9bb1e8298882fe626cd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:04.099989 containerd[1538]: time="2025-07-15T23:17:04.099736621Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.3\" with image id \"sha256:738e99dbd7325e2cdd650d83d59a79c7ecb005ab0d5bf029fc15c54ee9359306\", repo tag \"registry.k8s.io/kube-proxy:v1.33.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:c69929cfba9e38305eb1e20ca859aeb90e0d2a7326eab9bb1e8298882fe626cd\", size \"28198491\" in 1.508956676s" Jul 15 23:17:04.099989 containerd[1538]: time="2025-07-15T23:17:04.099825744Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.3\" returns image reference \"sha256:738e99dbd7325e2cdd650d83d59a79c7ecb005ab0d5bf029fc15c54ee9359306\"" Jul 15 23:17:04.100372 containerd[1538]: time="2025-07-15T23:17:04.100329054Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jul 15 23:17:04.694878 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount660704071.mount: Deactivated successfully. 
Jul 15 23:17:05.523179 containerd[1538]: time="2025-07-15T23:17:05.523072334Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:05.525980 containerd[1538]: time="2025-07-15T23:17:05.525921576Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" Jul 15 23:17:05.527200 containerd[1538]: time="2025-07-15T23:17:05.527134814Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:05.531929 containerd[1538]: time="2025-07-15T23:17:05.531818627Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:05.534667 containerd[1538]: time="2025-07-15T23:17:05.533584304Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.433204549s" Jul 15 23:17:05.534667 containerd[1538]: time="2025-07-15T23:17:05.533672631Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Jul 15 23:17:05.534825 containerd[1538]: time="2025-07-15T23:17:05.534800900Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 15 23:17:06.097073 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3665293141.mount: Deactivated successfully. 
Jul 15 23:17:06.106620 containerd[1538]: time="2025-07-15T23:17:06.106284361Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 23:17:06.107610 containerd[1538]: time="2025-07-15T23:17:06.107528125Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Jul 15 23:17:06.108561 containerd[1538]: time="2025-07-15T23:17:06.108519449Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 23:17:06.112390 containerd[1538]: time="2025-07-15T23:17:06.111758216Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jul 15 23:17:06.112390 containerd[1538]: time="2025-07-15T23:17:06.112246620Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 577.41497ms" Jul 15 23:17:06.112390 containerd[1538]: time="2025-07-15T23:17:06.112283768Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jul 15 23:17:06.113556 containerd[1538]: time="2025-07-15T23:17:06.113511897Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jul 15 23:17:06.692751 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3041614249.mount: Deactivated successfully. Jul 15 23:17:08.490868 containerd[1538]: time="2025-07-15T23:17:08.490761153Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:08.494644 containerd[1538]: time="2025-07-15T23:17:08.494550988Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69334637" Jul 15 23:17:08.496507 containerd[1538]: time="2025-07-15T23:17:08.496243495Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:08.504222 containerd[1538]: time="2025-07-15T23:17:08.503657725Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:08.505541 containerd[1538]: time="2025-07-15T23:17:08.505417855Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.391851814s" Jul 15 23:17:08.505541 containerd[1538]: time="2025-07-15T23:17:08.505525149Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Jul 15 23:17:10.004878 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Jul 15 23:17:10.009557 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:17:10.173971 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 23:17:10.191287 (kubelet)[2266]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:17:10.237554 kubelet[2266]: E0715 23:17:10.237502 2266 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:17:10.241015 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:17:10.241345 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:17:10.241848 systemd[1]: kubelet.service: Consumed 170ms CPU time, 104.8M memory peak. Jul 15 23:17:14.080174 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:17:14.080484 systemd[1]: kubelet.service: Consumed 170ms CPU time, 104.8M memory peak. Jul 15 23:17:14.083071 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:17:14.115445 systemd[1]: Reload requested from client PID 2280 ('systemctl') (unit session-7.scope)... Jul 15 23:17:14.115828 systemd[1]: Reloading... Jul 15 23:17:14.265613 zram_generator::config[2324]: No configuration found. Jul 15 23:17:14.364571 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 23:17:14.474316 systemd[1]: Reloading finished in 357 ms. Jul 15 23:17:14.534299 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jul 15 23:17:14.534400 systemd[1]: kubelet.service: Failed with result 'signal'. Jul 15 23:17:14.534730 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 15 23:17:14.534781 systemd[1]: kubelet.service: Consumed 116ms CPU time, 95M memory peak. Jul 15 23:17:14.537911 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:17:14.710629 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:17:14.722047 (kubelet)[2372]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 23:17:14.772691 kubelet[2372]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 23:17:14.773085 kubelet[2372]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jul 15 23:17:14.773129 kubelet[2372]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jul 15 23:17:14.773298 kubelet[2372]: I0715 23:17:14.773256 2372 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 23:17:16.019306 kubelet[2372]: I0715 23:17:16.019250 2372 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jul 15 23:17:16.019963 kubelet[2372]: I0715 23:17:16.019940 2372 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 23:17:16.020473 kubelet[2372]: I0715 23:17:16.020454 2372 server.go:956] "Client rotation is on, will bootstrap in background" Jul 15 23:17:16.067455 kubelet[2372]: E0715 23:17:16.067390 2372 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://91.99.212.32:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.99.212.32:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jul 15 23:17:16.067800 kubelet[2372]: I0715 23:17:16.067776 2372 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 23:17:16.080880 kubelet[2372]: I0715 23:17:16.080720 2372 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 23:17:16.084920 kubelet[2372]: I0715 23:17:16.084872 2372 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Jul 15 23:17:16.085570 kubelet[2372]: I0715 23:17:16.085526 2372 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 23:17:16.086036 kubelet[2372]: I0715 23:17:16.085691 2372 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-0-1-n-21be50a87e","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 15 23:17:16.086401 kubelet[2372]: I0715 23:17:16.086373 2372 topology_manager.go:138] "Creating topology manager with none policy" Jul 15 23:17:16.086487 kubelet[2372]: I0715 23:17:16.086475 2372 container_manager_linux.go:303] "Creating device plugin manager" Jul 15 23:17:16.086891 kubelet[2372]: I0715 23:17:16.086865 2372 state_mem.go:36] "Initialized new in-memory state store" Jul 15 23:17:16.092914 kubelet[2372]: I0715 23:17:16.092864 2372 kubelet.go:480] "Attempting to sync node with API server" Jul 15 23:17:16.093283 kubelet[2372]: I0715 23:17:16.093137 2372 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 23:17:16.093283 kubelet[2372]: I0715 23:17:16.093195 2372 kubelet.go:386] "Adding apiserver pod source" Jul 15 23:17:16.094900 kubelet[2372]: I0715 23:17:16.094857 2372 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 23:17:16.097110 kubelet[2372]: I0715 23:17:16.097059 2372 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 15 23:17:16.098170 kubelet[2372]: I0715 23:17:16.098133 2372 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jul 15 23:17:16.098346 kubelet[2372]: W0715 23:17:16.098326 2372 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Jul 15 23:17:16.101530 kubelet[2372]: I0715 23:17:16.101496 2372 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Jul 15 23:17:16.101683 kubelet[2372]: I0715 23:17:16.101556 2372 server.go:1289] "Started kubelet"
Jul 15 23:17:16.101898 kubelet[2372]: E0715 23:17:16.101862 2372 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://91.99.212.32:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-0-1-n-21be50a87e&limit=500&resourceVersion=0\": dial tcp 91.99.212.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Jul 15 23:17:16.114961 kubelet[2372]: E0715 23:17:16.112116 2372 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.212.32:6443/api/v1/namespaces/default/events\": dial tcp 91.99.212.32:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372-0-1-n-21be50a87e.18528fe90c90e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-0-1-n-21be50a87e,UID:ci-4372-0-1-n-21be50a87e,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372-0-1-n-21be50a87e,},FirstTimestamp:2025-07-15 23:17:16.101519808 +0000 UTC m=+1.373301552,LastTimestamp:2025-07-15 23:17:16.101519808 +0000 UTC m=+1.373301552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-0-1-n-21be50a87e,}"
Jul 15 23:17:16.116064 kubelet[2372]: E0715 23:17:16.116011 2372 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://91.99.212.32:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.212.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Jul 15 23:17:16.116322 kubelet[2372]: I0715 23:17:16.116290 2372 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Jul 15 23:17:16.119161 kubelet[2372]: I0715 23:17:16.119068 2372 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jul 15 23:17:16.120194 kubelet[2372]: I0715 23:17:16.120161 2372 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jul 15 23:17:16.122745 kubelet[2372]: I0715 23:17:16.122709 2372 server.go:317] "Adding debug handlers to kubelet server"
Jul 15 23:17:16.127205 kubelet[2372]: I0715 23:17:16.127167 2372 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jul 15 23:17:16.130078 kubelet[2372]: I0715 23:17:16.130020 2372 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jul 15 23:17:16.132358 kubelet[2372]: E0715 23:17:16.127218 2372 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Jul 15 23:17:16.132358 kubelet[2372]: I0715 23:17:16.132365 2372 volume_manager.go:297] "Starting Kubelet Volume Manager"
Jul 15 23:17:16.132520 kubelet[2372]: I0715 23:17:16.132503 2372 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Jul 15 23:17:16.132666 kubelet[2372]: I0715 23:17:16.132574 2372 reconciler.go:26] "Reconciler: start to sync state"
Jul 15 23:17:16.133656 kubelet[2372]: I0715 23:17:16.133623 2372 factory.go:223] Registration of the systemd container factory successfully
Jul 15 23:17:16.133776 kubelet[2372]: I0715 23:17:16.133741 2372 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Jul 15 23:17:16.134662 kubelet[2372]: E0715 23:17:16.134346 2372 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://91.99.212.32:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.212.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Jul 15 23:17:16.135652 kubelet[2372]: I0715 23:17:16.135617 2372 factory.go:223] Registration of the containerd container factory successfully
Jul 15 23:17:16.138860 kubelet[2372]: E0715 23:17:16.138799 2372 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372-0-1-n-21be50a87e\" not found"
Jul 15 23:17:16.155695 kubelet[2372]: I0715 23:17:16.155429 2372 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Jul 15 23:17:16.159637 kubelet[2372]: E0715 23:17:16.159464 2372 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.212.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-0-1-n-21be50a87e?timeout=10s\": dial tcp 91.99.212.32:6443: connect: connection refused" interval="200ms"
Jul 15 23:17:16.161784 kubelet[2372]: I0715 23:17:16.160979 2372 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Jul 15 23:17:16.161784 kubelet[2372]: I0715 23:17:16.161029 2372 status_manager.go:230] "Starting to sync pod status with apiserver"
Jul 15 23:17:16.161784 kubelet[2372]: I0715 23:17:16.161052 2372 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Jul 15 23:17:16.161784 kubelet[2372]: I0715 23:17:16.161061 2372 kubelet.go:2436] "Starting kubelet main sync loop"
Jul 15 23:17:16.161784 kubelet[2372]: E0715 23:17:16.161124 2372 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Jul 15 23:17:16.163639 kubelet[2372]: E0715 23:17:16.163516 2372 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://91.99.212.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.212.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Jul 15 23:17:16.166136 kubelet[2372]: I0715 23:17:16.166094 2372 cpu_manager.go:221] "Starting CPU manager" policy="none"
Jul 15 23:17:16.166136 kubelet[2372]: I0715 23:17:16.166124 2372 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
Jul 15 23:17:16.166352 kubelet[2372]: I0715 23:17:16.166161 2372 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 23:17:16.169350 kubelet[2372]: I0715 23:17:16.169286 2372 policy_none.go:49] "None policy: Start"
Jul 15 23:17:16.169350 kubelet[2372]: I0715 23:17:16.169326 2372 memory_manager.go:186] "Starting memorymanager" policy="None"
Jul 15 23:17:16.169350 kubelet[2372]: I0715 23:17:16.169343 2372 state_mem.go:35] "Initializing new in-memory state store"
Jul 15 23:17:16.179310 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
Jul 15 23:17:16.194617 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
Jul 15 23:17:16.200919 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
Jul 15 23:17:16.215736 kubelet[2372]: E0715 23:17:16.215639 2372 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Jul 15 23:17:16.216750 kubelet[2372]: I0715 23:17:16.216648 2372 eviction_manager.go:189] "Eviction manager: starting control loop"
Jul 15 23:17:16.216890 kubelet[2372]: I0715 23:17:16.216697 2372 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Jul 15 23:17:16.217247 kubelet[2372]: I0715 23:17:16.217205 2372 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Jul 15 23:17:16.221860 kubelet[2372]: E0715 23:17:16.221739 2372 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
Jul 15 23:17:16.222569 kubelet[2372]: E0715 23:17:16.222523 2372 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372-0-1-n-21be50a87e\" not found"
Jul 15 23:17:16.284416 systemd[1]: Created slice kubepods-burstable-pod2a90ba5db8158141543146688d6c9d0a.slice - libcontainer container kubepods-burstable-pod2a90ba5db8158141543146688d6c9d0a.slice.
Jul 15 23:17:16.314172 kubelet[2372]: E0715 23:17:16.314099 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.320109 systemd[1]: Created slice kubepods-burstable-pod7605e19980c32d490e2d370fb5437e36.slice - libcontainer container kubepods-burstable-pod7605e19980c32d490e2d370fb5437e36.slice.
Jul 15 23:17:16.323784 kubelet[2372]: I0715 23:17:16.323742 2372 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.324613 kubelet[2372]: E0715 23:17:16.324462 2372 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.212.32:6443/api/v1/nodes\": dial tcp 91.99.212.32:6443: connect: connection refused" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.325553 kubelet[2372]: E0715 23:17:16.325510 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.338080 systemd[1]: Created slice kubepods-burstable-pode45fcd8c7d59ab282165af34ac665d29.slice - libcontainer container kubepods-burstable-pode45fcd8c7d59ab282165af34ac665d29.slice.
Jul 15 23:17:16.341647 kubelet[2372]: E0715 23:17:16.341427 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.361144 kubelet[2372]: E0715 23:17:16.361081 2372 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.212.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-0-1-n-21be50a87e?timeout=10s\": dial tcp 91.99.212.32:6443: connect: connection refused" interval="400ms"
Jul 15 23:17:16.408722 kubelet[2372]: E0715 23:17:16.406184 2372 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.212.32:6443/api/v1/namespaces/default/events\": dial tcp 91.99.212.32:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372-0-1-n-21be50a87e.18528fe90c90e9c0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-0-1-n-21be50a87e,UID:ci-4372-0-1-n-21be50a87e,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372-0-1-n-21be50a87e,},FirstTimestamp:2025-07-15 23:17:16.101519808 +0000 UTC m=+1.373301552,LastTimestamp:2025-07-15 23:17:16.101519808 +0000 UTC m=+1.373301552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-0-1-n-21be50a87e,}"
Jul 15 23:17:16.434286 kubelet[2372]: I0715 23:17:16.433971 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2a90ba5db8158141543146688d6c9d0a-k8s-certs\") pod \"kube-apiserver-ci-4372-0-1-n-21be50a87e\" (UID: \"2a90ba5db8158141543146688d6c9d0a\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.434286 kubelet[2372]: I0715 23:17:16.434087 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2a90ba5db8158141543146688d6c9d0a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-0-1-n-21be50a87e\" (UID: \"2a90ba5db8158141543146688d6c9d0a\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.434286 kubelet[2372]: I0715 23:17:16.434175 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-ca-certs\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.434286 kubelet[2372]: I0715 23:17:16.434221 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.434286 kubelet[2372]: I0715 23:17:16.434279 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-k8s-certs\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.434725 kubelet[2372]: I0715 23:17:16.434324 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-kubeconfig\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.434725 kubelet[2372]: I0715 23:17:16.434365 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.434725 kubelet[2372]: I0715 23:17:16.434407 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e45fcd8c7d59ab282165af34ac665d29-kubeconfig\") pod \"kube-scheduler-ci-4372-0-1-n-21be50a87e\" (UID: \"e45fcd8c7d59ab282165af34ac665d29\") " pod="kube-system/kube-scheduler-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.434725 kubelet[2372]: I0715 23:17:16.434446 2372 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2a90ba5db8158141543146688d6c9d0a-ca-certs\") pod \"kube-apiserver-ci-4372-0-1-n-21be50a87e\" (UID: \"2a90ba5db8158141543146688d6c9d0a\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.528353 kubelet[2372]: I0715 23:17:16.528239 2372 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.528861 kubelet[2372]: E0715 23:17:16.528811 2372 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.212.32:6443/api/v1/nodes\": dial tcp 91.99.212.32:6443: connect: connection refused" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.617288 containerd[1538]: time="2025-07-15T23:17:16.616015484Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-0-1-n-21be50a87e,Uid:2a90ba5db8158141543146688d6c9d0a,Namespace:kube-system,Attempt:0,}"
Jul 15 23:17:16.627235 containerd[1538]: time="2025-07-15T23:17:16.627158579Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-0-1-n-21be50a87e,Uid:7605e19980c32d490e2d370fb5437e36,Namespace:kube-system,Attempt:0,}"
Jul 15 23:17:16.643997 containerd[1538]: time="2025-07-15T23:17:16.643639752Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-0-1-n-21be50a87e,Uid:e45fcd8c7d59ab282165af34ac665d29,Namespace:kube-system,Attempt:0,}"
Jul 15 23:17:16.671098 containerd[1538]: time="2025-07-15T23:17:16.670966076Z" level=info msg="connecting to shim 27a54fdba7044c2e8ff29c5eba32d586b9d961b4377db004ba8e16dd2be3a9b7" address="unix:///run/containerd/s/bd11685eb3c0f4a44d2fecf33b5c83bd16a923c0b14d44d016a50f1fdd5cd3d2" namespace=k8s.io protocol=ttrpc version=3
Jul 15 23:17:16.717037 systemd[1]: Started cri-containerd-27a54fdba7044c2e8ff29c5eba32d586b9d961b4377db004ba8e16dd2be3a9b7.scope - libcontainer container 27a54fdba7044c2e8ff29c5eba32d586b9d961b4377db004ba8e16dd2be3a9b7.
Jul 15 23:17:16.718252 containerd[1538]: time="2025-07-15T23:17:16.718132277Z" level=info msg="connecting to shim 035bd7a13ce1829bfd5e5a640c54dd1d2989714f8b94aad3548e065a4bf5d268" address="unix:///run/containerd/s/9eb7abb380bea2817c8885cd72e005de429ae51793fc3bb9fe6997b9732ea444" namespace=k8s.io protocol=ttrpc version=3
Jul 15 23:17:16.763467 kubelet[2372]: E0715 23:17:16.762683 2372 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.212.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-0-1-n-21be50a87e?timeout=10s\": dial tcp 91.99.212.32:6443: connect: connection refused" interval="800ms"
Jul 15 23:17:16.780114 systemd[1]: Started cri-containerd-035bd7a13ce1829bfd5e5a640c54dd1d2989714f8b94aad3548e065a4bf5d268.scope - libcontainer container 035bd7a13ce1829bfd5e5a640c54dd1d2989714f8b94aad3548e065a4bf5d268.
Jul 15 23:17:16.787376 containerd[1538]: time="2025-07-15T23:17:16.787285922Z" level=info msg="connecting to shim db057986d6b140e72e4b16f247f39621137ef9f3727458d3505b0fcbec6704be" address="unix:///run/containerd/s/8fdc545fbcf759094beee139d8932843e4506103278934e68468fa4ef43e9194" namespace=k8s.io protocol=ttrpc version=3
Jul 15 23:17:16.812075 systemd[1]: Started cri-containerd-db057986d6b140e72e4b16f247f39621137ef9f3727458d3505b0fcbec6704be.scope - libcontainer container db057986d6b140e72e4b16f247f39621137ef9f3727458d3505b0fcbec6704be.
Jul 15 23:17:16.857262 containerd[1538]: time="2025-07-15T23:17:16.856956700Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-0-1-n-21be50a87e,Uid:2a90ba5db8158141543146688d6c9d0a,Namespace:kube-system,Attempt:0,} returns sandbox id \"27a54fdba7044c2e8ff29c5eba32d586b9d961b4377db004ba8e16dd2be3a9b7\""
Jul 15 23:17:16.871950 containerd[1538]: time="2025-07-15T23:17:16.871396541Z" level=info msg="CreateContainer within sandbox \"27a54fdba7044c2e8ff29c5eba32d586b9d961b4377db004ba8e16dd2be3a9b7\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
Jul 15 23:17:16.876895 containerd[1538]: time="2025-07-15T23:17:16.876798697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-0-1-n-21be50a87e,Uid:7605e19980c32d490e2d370fb5437e36,Namespace:kube-system,Attempt:0,} returns sandbox id \"035bd7a13ce1829bfd5e5a640c54dd1d2989714f8b94aad3548e065a4bf5d268\""
Jul 15 23:17:16.884091 containerd[1538]: time="2025-07-15T23:17:16.883989959Z" level=info msg="CreateContainer within sandbox \"035bd7a13ce1829bfd5e5a640c54dd1d2989714f8b94aad3548e065a4bf5d268\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
Jul 15 23:17:16.889300 containerd[1538]: time="2025-07-15T23:17:16.888947378Z" level=info msg="Container be5359dce80b579ed3eb0976936a70521e7d5a23165b5d52e4328933fb525f68: CDI devices from CRI Config.CDIDevices: []"
Jul 15 23:17:16.890295 containerd[1538]: time="2025-07-15T23:17:16.890230151Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-0-1-n-21be50a87e,Uid:e45fcd8c7d59ab282165af34ac665d29,Namespace:kube-system,Attempt:0,} returns sandbox id \"db057986d6b140e72e4b16f247f39621137ef9f3727458d3505b0fcbec6704be\""
Jul 15 23:17:16.899245 containerd[1538]: time="2025-07-15T23:17:16.899148842Z" level=info msg="CreateContainer within sandbox \"db057986d6b140e72e4b16f247f39621137ef9f3727458d3505b0fcbec6704be\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
Jul 15 23:17:16.906918 containerd[1538]: time="2025-07-15T23:17:16.906831638Z" level=info msg="CreateContainer within sandbox \"27a54fdba7044c2e8ff29c5eba32d586b9d961b4377db004ba8e16dd2be3a9b7\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"be5359dce80b579ed3eb0976936a70521e7d5a23165b5d52e4328933fb525f68\""
Jul 15 23:17:16.908289 containerd[1538]: time="2025-07-15T23:17:16.908237004Z" level=info msg="Container a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b: CDI devices from CRI Config.CDIDevices: []"
Jul 15 23:17:16.909339 containerd[1538]: time="2025-07-15T23:17:16.909281669Z" level=info msg="StartContainer for \"be5359dce80b579ed3eb0976936a70521e7d5a23165b5d52e4328933fb525f68\""
Jul 15 23:17:16.912994 containerd[1538]: time="2025-07-15T23:17:16.912828243Z" level=info msg="connecting to shim be5359dce80b579ed3eb0976936a70521e7d5a23165b5d52e4328933fb525f68" address="unix:///run/containerd/s/bd11685eb3c0f4a44d2fecf33b5c83bd16a923c0b14d44d016a50f1fdd5cd3d2" protocol=ttrpc version=3
Jul 15 23:17:16.924274 containerd[1538]: time="2025-07-15T23:17:16.922480656Z" level=info msg="CreateContainer within sandbox \"035bd7a13ce1829bfd5e5a640c54dd1d2989714f8b94aad3548e065a4bf5d268\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b\""
Jul 15 23:17:16.924867 containerd[1538]: time="2025-07-15T23:17:16.924166927Z" level=info msg="Container fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3: CDI devices from CRI Config.CDIDevices: []"
Jul 15 23:17:16.925340 containerd[1538]: time="2025-07-15T23:17:16.925230071Z" level=info msg="StartContainer for \"a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b\""
Jul 15 23:17:16.936603 containerd[1538]: time="2025-07-15T23:17:16.936541557Z" level=info msg="connecting to shim a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b" address="unix:///run/containerd/s/9eb7abb380bea2817c8885cd72e005de429ae51793fc3bb9fe6997b9732ea444" protocol=ttrpc version=3
Jul 15 23:17:16.938551 kubelet[2372]: I0715 23:17:16.937791 2372 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.938967 kubelet[2372]: E0715 23:17:16.938922 2372 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.212.32:6443/api/v1/nodes\": dial tcp 91.99.212.32:6443: connect: connection refused" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:16.953447 containerd[1538]: time="2025-07-15T23:17:16.953377072Z" level=info msg="CreateContainer within sandbox \"db057986d6b140e72e4b16f247f39621137ef9f3727458d3505b0fcbec6704be\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3\""
Jul 15 23:17:16.955852 systemd[1]: Started cri-containerd-be5359dce80b579ed3eb0976936a70521e7d5a23165b5d52e4328933fb525f68.scope - libcontainer container be5359dce80b579ed3eb0976936a70521e7d5a23165b5d52e4328933fb525f68.
Jul 15 23:17:16.957471 containerd[1538]: time="2025-07-15T23:17:16.957410660Z" level=info msg="StartContainer for \"fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3\""
Jul 15 23:17:16.961877 containerd[1538]: time="2025-07-15T23:17:16.961719073Z" level=info msg="connecting to shim fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3" address="unix:///run/containerd/s/8fdc545fbcf759094beee139d8932843e4506103278934e68468fa4ef43e9194" protocol=ttrpc version=3
Jul 15 23:17:16.981984 systemd[1]: Started cri-containerd-a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b.scope - libcontainer container a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b.
Jul 15 23:17:17.003116 systemd[1]: Started cri-containerd-fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3.scope - libcontainer container fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3.
Jul 15 23:17:17.022160 kubelet[2372]: E0715 23:17:17.022039 2372 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://91.99.212.32:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.212.32:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
Jul 15 23:17:17.055956 containerd[1538]: time="2025-07-15T23:17:17.055307784Z" level=info msg="StartContainer for \"be5359dce80b579ed3eb0976936a70521e7d5a23165b5d52e4328933fb525f68\" returns successfully"
Jul 15 23:17:17.089377 containerd[1538]: time="2025-07-15T23:17:17.089259453Z" level=info msg="StartContainer for \"a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b\" returns successfully"
Jul 15 23:17:17.110956 containerd[1538]: time="2025-07-15T23:17:17.110912574Z" level=info msg="StartContainer for \"fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3\" returns successfully"
Jul 15 23:17:17.174209 kubelet[2372]: E0715 23:17:17.174081 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:17.180340 kubelet[2372]: E0715 23:17:17.180284 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:17.184322 kubelet[2372]: E0715 23:17:17.184019 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:17.741514 kubelet[2372]: I0715 23:17:17.741482 2372 kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:18.187072 kubelet[2372]: E0715 23:17:18.187030 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:18.187472 kubelet[2372]: E0715 23:17:18.187403 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:18.431460 kubelet[2372]: E0715 23:17:18.431417 2372 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:19.587244 kubelet[2372]: E0715 23:17:19.587180 2372 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372-0-1-n-21be50a87e\" not found" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:19.590733 kubelet[2372]: I0715 23:17:19.590660 2372 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:19.590733 kubelet[2372]: E0715 23:17:19.590724 2372 kubelet_node_status.go:548] "Error updating node status, will retry" err="error getting node \"ci-4372-0-1-n-21be50a87e\": node \"ci-4372-0-1-n-21be50a87e\" not found"
Jul 15 23:17:19.662474 kubelet[2372]: E0715 23:17:19.662402 2372 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372-0-1-n-21be50a87e\" not found"
Jul 15 23:17:19.740357 kubelet[2372]: I0715 23:17:19.740270 2372 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:19.759862 kubelet[2372]: E0715 23:17:19.759802 2372 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372-0-1-n-21be50a87e\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:19.759862 kubelet[2372]: I0715 23:17:19.759843 2372 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:19.767505 kubelet[2372]: E0715 23:17:19.767452 2372 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372-0-1-n-21be50a87e\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:19.767505 kubelet[2372]: I0715 23:17:19.767491 2372 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:19.770680 kubelet[2372]: E0715 23:17:19.770575 2372 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e"
Jul 15 23:17:20.112728 kubelet[2372]: I0715 23:17:20.112648 2372 apiserver.go:52] "Watching apiserver"
Jul 15 23:17:20.133518 kubelet[2372]: I0715 23:17:20.133339 2372 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Jul 15 23:17:22.190039 systemd[1]: Reload requested from client PID 2653 ('systemctl') (unit session-7.scope)...
Jul 15 23:17:22.190424 systemd[1]: Reloading...
Jul 15 23:17:22.310630 zram_generator::config[2697]: No configuration found.
Jul 15 23:17:22.412527 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 23:17:22.539490 systemd[1]: Reloading finished in 348 ms.
Jul 15 23:17:22.566327 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 23:17:22.582242 systemd[1]: kubelet.service: Deactivated successfully.
Jul 15 23:17:22.583786 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 23:17:22.583964 systemd[1]: kubelet.service: Consumed 1.885s CPU time, 129M memory peak.
Jul 15 23:17:22.588419 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 23:17:22.767781 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 23:17:22.782922 (kubelet)[2742]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Jul 15 23:17:22.877989 kubelet[2742]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 15 23:17:22.877989 kubelet[2742]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Jul 15 23:17:22.877989 kubelet[2742]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Jul 15 23:17:22.877989 kubelet[2742]: I0715 23:17:22.877938 2742 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 23:17:22.890245 kubelet[2742]: I0715 23:17:22.890164 2742 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jul 15 23:17:22.890245 kubelet[2742]: I0715 23:17:22.890230 2742 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 23:17:22.890845 kubelet[2742]: I0715 23:17:22.890571 2742 server.go:956] "Client rotation is on, will bootstrap in background" Jul 15 23:17:22.892961 kubelet[2742]: I0715 23:17:22.892929 2742 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jul 15 23:17:22.902222 kubelet[2742]: I0715 23:17:22.901820 2742 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 23:17:22.909243 kubelet[2742]: I0715 23:17:22.909213 2742 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 23:17:22.913232 kubelet[2742]: I0715 23:17:22.913197 2742 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 15 23:17:22.913512 kubelet[2742]: I0715 23:17:22.913413 2742 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 23:17:22.913752 kubelet[2742]: I0715 23:17:22.913530 2742 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-0-1-n-21be50a87e","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jul 15 23:17:22.913856 kubelet[2742]: I0715 23:17:22.913753 2742 topology_manager.go:138] "Creating topology manager with none policy" Jul 15 
23:17:22.913856 kubelet[2742]: I0715 23:17:22.913763 2742 container_manager_linux.go:303] "Creating device plugin manager" Jul 15 23:17:22.913856 kubelet[2742]: I0715 23:17:22.913811 2742 state_mem.go:36] "Initialized new in-memory state store" Jul 15 23:17:22.913996 kubelet[2742]: I0715 23:17:22.913979 2742 kubelet.go:480] "Attempting to sync node with API server" Jul 15 23:17:22.913996 kubelet[2742]: I0715 23:17:22.913994 2742 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 23:17:22.914059 kubelet[2742]: I0715 23:17:22.914019 2742 kubelet.go:386] "Adding apiserver pod source" Jul 15 23:17:22.914059 kubelet[2742]: I0715 23:17:22.914032 2742 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 23:17:22.920329 kubelet[2742]: I0715 23:17:22.919895 2742 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 15 23:17:22.922615 kubelet[2742]: I0715 23:17:22.921247 2742 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jul 15 23:17:22.926065 kubelet[2742]: I0715 23:17:22.926012 2742 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jul 15 23:17:22.926182 kubelet[2742]: I0715 23:17:22.926088 2742 server.go:1289] "Started kubelet" Jul 15 23:17:22.926634 kubelet[2742]: I0715 23:17:22.926533 2742 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jul 15 23:17:22.928600 kubelet[2742]: I0715 23:17:22.926780 2742 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 15 23:17:22.928600 kubelet[2742]: I0715 23:17:22.927096 2742 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 15 23:17:22.929528 kubelet[2742]: I0715 23:17:22.929500 2742 server.go:317] "Adding debug handlers to kubelet server" Jul 15 
23:17:22.938224 kubelet[2742]: I0715 23:17:22.938175 2742 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 15 23:17:22.945029 kubelet[2742]: I0715 23:17:22.944968 2742 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 23:17:22.948996 kubelet[2742]: I0715 23:17:22.948901 2742 volume_manager.go:297] "Starting Kubelet Volume Manager" Jul 15 23:17:22.949744 kubelet[2742]: E0715 23:17:22.949647 2742 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4372-0-1-n-21be50a87e\" not found" Jul 15 23:17:22.950952 kubelet[2742]: I0715 23:17:22.950781 2742 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jul 15 23:17:22.951395 kubelet[2742]: I0715 23:17:22.951216 2742 reconciler.go:26] "Reconciler: start to sync state" Jul 15 23:17:22.968507 kubelet[2742]: E0715 23:17:22.968248 2742 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 15 23:17:22.973435 kubelet[2742]: I0715 23:17:22.973394 2742 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jul 15 23:17:22.974638 kubelet[2742]: I0715 23:17:22.974571 2742 factory.go:223] Registration of the systemd container factory successfully Jul 15 23:17:22.974741 kubelet[2742]: I0715 23:17:22.974703 2742 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 15 23:17:22.977632 kubelet[2742]: I0715 23:17:22.975632 2742 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Jul 15 23:17:22.977632 kubelet[2742]: I0715 23:17:22.975664 2742 status_manager.go:230] "Starting to sync pod status with apiserver" Jul 15 23:17:22.977632 kubelet[2742]: I0715 23:17:22.975690 2742 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jul 15 23:17:22.977632 kubelet[2742]: I0715 23:17:22.975698 2742 kubelet.go:2436] "Starting kubelet main sync loop" Jul 15 23:17:22.977632 kubelet[2742]: E0715 23:17:22.975742 2742 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 23:17:22.994335 kubelet[2742]: I0715 23:17:22.993769 2742 factory.go:223] Registration of the containerd container factory successfully Jul 15 23:17:23.063683 kubelet[2742]: I0715 23:17:23.063623 2742 cpu_manager.go:221] "Starting CPU manager" policy="none" Jul 15 23:17:23.063949 kubelet[2742]: I0715 23:17:23.063931 2742 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jul 15 23:17:23.064124 kubelet[2742]: I0715 23:17:23.064024 2742 state_mem.go:36] "Initialized new in-memory state store" Jul 15 23:17:23.064823 kubelet[2742]: I0715 23:17:23.064796 2742 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 15 23:17:23.065614 kubelet[2742]: I0715 23:17:23.064921 2742 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 15 23:17:23.065614 kubelet[2742]: I0715 23:17:23.064956 2742 policy_none.go:49] "None policy: Start" Jul 15 23:17:23.065614 kubelet[2742]: I0715 23:17:23.064967 2742 memory_manager.go:186] "Starting memorymanager" policy="None" Jul 15 23:17:23.065614 kubelet[2742]: I0715 23:17:23.064982 2742 state_mem.go:35] "Initializing new in-memory state store" Jul 15 23:17:23.065614 kubelet[2742]: I0715 23:17:23.065109 2742 state_mem.go:75] "Updated machine memory state" Jul 15 23:17:23.075010 kubelet[2742]: E0715 23:17:23.074969 2742 manager.go:517] "Failed to 
read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jul 15 23:17:23.075219 kubelet[2742]: I0715 23:17:23.075196 2742 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 23:17:23.075264 kubelet[2742]: I0715 23:17:23.075218 2742 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 23:17:23.077156 kubelet[2742]: I0715 23:17:23.077102 2742 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 23:17:23.078299 kubelet[2742]: I0715 23:17:23.078249 2742 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.079258 kubelet[2742]: I0715 23:17:23.079239 2742 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.080014 kubelet[2742]: I0715 23:17:23.079985 2742 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.080865 kubelet[2742]: E0715 23:17:23.080845 2742 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jul 15 23:17:23.153660 kubelet[2742]: I0715 23:17:23.153498 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-kubeconfig\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.154812 kubelet[2742]: I0715 23:17:23.154064 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.155304 kubelet[2742]: I0715 23:17:23.155078 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2a90ba5db8158141543146688d6c9d0a-ca-certs\") pod \"kube-apiserver-ci-4372-0-1-n-21be50a87e\" (UID: \"2a90ba5db8158141543146688d6c9d0a\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.155304 kubelet[2742]: I0715 23:17:23.155117 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2a90ba5db8158141543146688d6c9d0a-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-0-1-n-21be50a87e\" (UID: \"2a90ba5db8158141543146688d6c9d0a\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.155304 kubelet[2742]: I0715 23:17:23.155163 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.155304 kubelet[2742]: I0715 23:17:23.155184 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-k8s-certs\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.155304 kubelet[2742]: I0715 23:17:23.155205 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/e45fcd8c7d59ab282165af34ac665d29-kubeconfig\") pod \"kube-scheduler-ci-4372-0-1-n-21be50a87e\" (UID: \"e45fcd8c7d59ab282165af34ac665d29\") " pod="kube-system/kube-scheduler-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.155620 kubelet[2742]: I0715 23:17:23.155226 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2a90ba5db8158141543146688d6c9d0a-k8s-certs\") pod \"kube-apiserver-ci-4372-0-1-n-21be50a87e\" (UID: \"2a90ba5db8158141543146688d6c9d0a\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.155620 kubelet[2742]: I0715 23:17:23.155278 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7605e19980c32d490e2d370fb5437e36-ca-certs\") pod \"kube-controller-manager-ci-4372-0-1-n-21be50a87e\" (UID: \"7605e19980c32d490e2d370fb5437e36\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.195325 kubelet[2742]: I0715 23:17:23.194888 2742 
kubelet_node_status.go:75] "Attempting to register node" node="ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.213655 kubelet[2742]: I0715 23:17:23.213548 2742 kubelet_node_status.go:124] "Node was previously registered" node="ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.213942 kubelet[2742]: I0715 23:17:23.213696 2742 kubelet_node_status.go:78] "Successfully registered node" node="ci-4372-0-1-n-21be50a87e" Jul 15 23:17:23.915997 kubelet[2742]: I0715 23:17:23.915933 2742 apiserver.go:52] "Watching apiserver" Jul 15 23:17:23.951303 kubelet[2742]: I0715 23:17:23.951240 2742 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jul 15 23:17:24.035548 kubelet[2742]: I0715 23:17:24.035391 2742 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:24.036787 kubelet[2742]: I0715 23:17:24.036669 2742 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:24.058652 kubelet[2742]: E0715 23:17:24.058565 2742 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4372-0-1-n-21be50a87e\" already exists" pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:24.064914 kubelet[2742]: E0715 23:17:24.064326 2742 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4372-0-1-n-21be50a87e\" already exists" pod="kube-system/kube-scheduler-ci-4372-0-1-n-21be50a87e" Jul 15 23:17:24.078244 kubelet[2742]: I0715 23:17:24.077721 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e" podStartSLOduration=1.077685674 podStartE2EDuration="1.077685674s" podCreationTimestamp="2025-07-15 23:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:17:24.076666949 +0000 UTC m=+1.283432623" 
watchObservedRunningTime="2025-07-15 23:17:24.077685674 +0000 UTC m=+1.284451348" Jul 15 23:17:24.122008 kubelet[2742]: I0715 23:17:24.121904 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372-0-1-n-21be50a87e" podStartSLOduration=1.121885494 podStartE2EDuration="1.121885494s" podCreationTimestamp="2025-07-15 23:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:17:24.094830117 +0000 UTC m=+1.301595791" watchObservedRunningTime="2025-07-15 23:17:24.121885494 +0000 UTC m=+1.328651168" Jul 15 23:17:24.139507 kubelet[2742]: I0715 23:17:24.139333 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372-0-1-n-21be50a87e" podStartSLOduration=1.139314247 podStartE2EDuration="1.139314247s" podCreationTimestamp="2025-07-15 23:17:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:17:24.123122971 +0000 UTC m=+1.329888645" watchObservedRunningTime="2025-07-15 23:17:24.139314247 +0000 UTC m=+1.346079921" Jul 15 23:17:27.650576 kubelet[2742]: I0715 23:17:27.650490 2742 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 15 23:17:27.651780 containerd[1538]: time="2025-07-15T23:17:27.651710558Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 15 23:17:27.654137 kubelet[2742]: I0715 23:17:27.652186 2742 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 15 23:17:28.312816 systemd[1]: Created slice kubepods-besteffort-pod3a812599_651a_4ad1_9f33_c784288a2a74.slice - libcontainer container kubepods-besteffort-pod3a812599_651a_4ad1_9f33_c784288a2a74.slice. 
Jul 15 23:17:28.392127 kubelet[2742]: I0715 23:17:28.391980 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3a812599-651a-4ad1-9f33-c784288a2a74-xtables-lock\") pod \"kube-proxy-9mj6l\" (UID: \"3a812599-651a-4ad1-9f33-c784288a2a74\") " pod="kube-system/kube-proxy-9mj6l" Jul 15 23:17:28.392127 kubelet[2742]: I0715 23:17:28.392038 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3a812599-651a-4ad1-9f33-c784288a2a74-lib-modules\") pod \"kube-proxy-9mj6l\" (UID: \"3a812599-651a-4ad1-9f33-c784288a2a74\") " pod="kube-system/kube-proxy-9mj6l" Jul 15 23:17:28.392127 kubelet[2742]: I0715 23:17:28.392083 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgdkh\" (UniqueName: \"kubernetes.io/projected/3a812599-651a-4ad1-9f33-c784288a2a74-kube-api-access-tgdkh\") pod \"kube-proxy-9mj6l\" (UID: \"3a812599-651a-4ad1-9f33-c784288a2a74\") " pod="kube-system/kube-proxy-9mj6l" Jul 15 23:17:28.392476 kubelet[2742]: I0715 23:17:28.392184 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/3a812599-651a-4ad1-9f33-c784288a2a74-kube-proxy\") pod \"kube-proxy-9mj6l\" (UID: \"3a812599-651a-4ad1-9f33-c784288a2a74\") " pod="kube-system/kube-proxy-9mj6l" Jul 15 23:17:28.624484 containerd[1538]: time="2025-07-15T23:17:28.624104245Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9mj6l,Uid:3a812599-651a-4ad1-9f33-c784288a2a74,Namespace:kube-system,Attempt:0,}" Jul 15 23:17:28.654673 containerd[1538]: time="2025-07-15T23:17:28.654551566Z" level=info msg="connecting to shim fc111a3dedb7fc47c3606b074b7581c3ffb1349f3e8e712135d87900b277676c" 
address="unix:///run/containerd/s/f9a3b29f4f8d11e4ceb6f7883454ddf0afd7ad1c2f8ba76c038c02b0edfac2cd" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:17:28.696066 systemd[1]: Started cri-containerd-fc111a3dedb7fc47c3606b074b7581c3ffb1349f3e8e712135d87900b277676c.scope - libcontainer container fc111a3dedb7fc47c3606b074b7581c3ffb1349f3e8e712135d87900b277676c. Jul 15 23:17:28.754513 containerd[1538]: time="2025-07-15T23:17:28.754392203Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9mj6l,Uid:3a812599-651a-4ad1-9f33-c784288a2a74,Namespace:kube-system,Attempt:0,} returns sandbox id \"fc111a3dedb7fc47c3606b074b7581c3ffb1349f3e8e712135d87900b277676c\"" Jul 15 23:17:28.767873 containerd[1538]: time="2025-07-15T23:17:28.767761296Z" level=info msg="CreateContainer within sandbox \"fc111a3dedb7fc47c3606b074b7581c3ffb1349f3e8e712135d87900b277676c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 15 23:17:28.783638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3392575798.mount: Deactivated successfully. 
Jul 15 23:17:28.786278 containerd[1538]: time="2025-07-15T23:17:28.786235643Z" level=info msg="Container e114ea97287365805b7f62e770b11a8ca6e80e6a77322465d0d7a295b56bafdd: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:17:28.799095 containerd[1538]: time="2025-07-15T23:17:28.799032713Z" level=info msg="CreateContainer within sandbox \"fc111a3dedb7fc47c3606b074b7581c3ffb1349f3e8e712135d87900b277676c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"e114ea97287365805b7f62e770b11a8ca6e80e6a77322465d0d7a295b56bafdd\"" Jul 15 23:17:28.800567 containerd[1538]: time="2025-07-15T23:17:28.800459832Z" level=info msg="StartContainer for \"e114ea97287365805b7f62e770b11a8ca6e80e6a77322465d0d7a295b56bafdd\"" Jul 15 23:17:28.806518 containerd[1538]: time="2025-07-15T23:17:28.806452539Z" level=info msg="connecting to shim e114ea97287365805b7f62e770b11a8ca6e80e6a77322465d0d7a295b56bafdd" address="unix:///run/containerd/s/f9a3b29f4f8d11e4ceb6f7883454ddf0afd7ad1c2f8ba76c038c02b0edfac2cd" protocol=ttrpc version=3 Jul 15 23:17:28.837880 systemd[1]: Started cri-containerd-e114ea97287365805b7f62e770b11a8ca6e80e6a77322465d0d7a295b56bafdd.scope - libcontainer container e114ea97287365805b7f62e770b11a8ca6e80e6a77322465d0d7a295b56bafdd. Jul 15 23:17:28.934983 systemd[1]: Created slice kubepods-besteffort-pod34099849_9cf7_4598_984c_1c53ad94a99c.slice - libcontainer container kubepods-besteffort-pod34099849_9cf7_4598_984c_1c53ad94a99c.slice. 
Jul 15 23:17:28.950692 containerd[1538]: time="2025-07-15T23:17:28.950644575Z" level=info msg="StartContainer for \"e114ea97287365805b7f62e770b11a8ca6e80e6a77322465d0d7a295b56bafdd\" returns successfully" Jul 15 23:17:28.996411 kubelet[2742]: I0715 23:17:28.995991 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqd6z\" (UniqueName: \"kubernetes.io/projected/34099849-9cf7-4598-984c-1c53ad94a99c-kube-api-access-hqd6z\") pod \"tigera-operator-747864d56d-q7g7s\" (UID: \"34099849-9cf7-4598-984c-1c53ad94a99c\") " pod="tigera-operator/tigera-operator-747864d56d-q7g7s" Jul 15 23:17:28.996411 kubelet[2742]: I0715 23:17:28.996138 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/34099849-9cf7-4598-984c-1c53ad94a99c-var-lib-calico\") pod \"tigera-operator-747864d56d-q7g7s\" (UID: \"34099849-9cf7-4598-984c-1c53ad94a99c\") " pod="tigera-operator/tigera-operator-747864d56d-q7g7s" Jul 15 23:17:29.242416 containerd[1538]: time="2025-07-15T23:17:29.242272699Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-q7g7s,Uid:34099849-9cf7-4598-984c-1c53ad94a99c,Namespace:tigera-operator,Attempt:0,}" Jul 15 23:17:29.276380 containerd[1538]: time="2025-07-15T23:17:29.276303319Z" level=info msg="connecting to shim f94139efeb3dd63b0ce76faf66299fc34c890dcb5db1f6bd592a8be24efc8631" address="unix:///run/containerd/s/7def99427fc5bd2a3803f591d51a229ea917aa23c7de48a75244d6342997f69f" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:17:29.311318 systemd[1]: Started cri-containerd-f94139efeb3dd63b0ce76faf66299fc34c890dcb5db1f6bd592a8be24efc8631.scope - libcontainer container f94139efeb3dd63b0ce76faf66299fc34c890dcb5db1f6bd592a8be24efc8631. 
Jul 15 23:17:29.379211 containerd[1538]: time="2025-07-15T23:17:29.379091001Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-q7g7s,Uid:34099849-9cf7-4598-984c-1c53ad94a99c,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"f94139efeb3dd63b0ce76faf66299fc34c890dcb5db1f6bd592a8be24efc8631\"" Jul 15 23:17:29.382418 containerd[1538]: time="2025-07-15T23:17:29.382351831Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 15 23:17:29.606439 kubelet[2742]: I0715 23:17:29.606138 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-9mj6l" podStartSLOduration=1.6060681730000002 podStartE2EDuration="1.606068173s" podCreationTimestamp="2025-07-15 23:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:17:29.082032164 +0000 UTC m=+6.288797838" watchObservedRunningTime="2025-07-15 23:17:29.606068173 +0000 UTC m=+6.812833927" Jul 15 23:17:30.985069 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount514316382.mount: Deactivated successfully. 
Jul 15 23:17:31.801774 containerd[1538]: time="2025-07-15T23:17:31.801518576Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:31.803948 containerd[1538]: time="2025-07-15T23:17:31.803900236Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610" Jul 15 23:17:31.805237 containerd[1538]: time="2025-07-15T23:17:31.805153444Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:31.808974 containerd[1538]: time="2025-07-15T23:17:31.808872870Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:31.810329 containerd[1538]: time="2025-07-15T23:17:31.809965922Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 2.427570133s" Jul 15 23:17:31.810329 containerd[1538]: time="2025-07-15T23:17:31.810024401Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\"" Jul 15 23:17:31.821716 containerd[1538]: time="2025-07-15T23:17:31.821641147Z" level=info msg="CreateContainer within sandbox \"f94139efeb3dd63b0ce76faf66299fc34c890dcb5db1f6bd592a8be24efc8631\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 15 23:17:31.839348 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount243216228.mount: Deactivated successfully. 
Jul 15 23:17:31.841839 containerd[1538]: time="2025-07-15T23:17:31.841775317Z" level=info msg="Container 80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:17:31.845908 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount833029046.mount: Deactivated successfully. Jul 15 23:17:31.853118 containerd[1538]: time="2025-07-15T23:17:31.853042072Z" level=info msg="CreateContainer within sandbox \"f94139efeb3dd63b0ce76faf66299fc34c890dcb5db1f6bd592a8be24efc8631\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\"" Jul 15 23:17:31.853775 containerd[1538]: time="2025-07-15T23:17:31.853745774Z" level=info msg="StartContainer for \"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\"" Jul 15 23:17:31.855233 containerd[1538]: time="2025-07-15T23:17:31.855155578Z" level=info msg="connecting to shim 80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9" address="unix:///run/containerd/s/7def99427fc5bd2a3803f591d51a229ea917aa23c7de48a75244d6342997f69f" protocol=ttrpc version=3 Jul 15 23:17:31.886953 systemd[1]: Started cri-containerd-80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9.scope - libcontainer container 80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9. Jul 15 23:17:31.928014 containerd[1538]: time="2025-07-15T23:17:31.927952096Z" level=info msg="StartContainer for \"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\" returns successfully" Jul 15 23:17:38.527745 sudo[1820]: pam_unix(sudo:session): session closed for user root Jul 15 23:17:38.688371 sshd[1819]: Connection closed by 139.178.68.195 port 34384 Jul 15 23:17:38.689833 sshd-session[1817]: pam_unix(sshd:session): session closed for user core Jul 15 23:17:38.698151 systemd[1]: sshd@6-91.99.212.32:22-139.178.68.195:34384.service: Deactivated successfully. 
Jul 15 23:17:38.706634 systemd[1]: session-7.scope: Deactivated successfully. Jul 15 23:17:38.706950 systemd[1]: session-7.scope: Consumed 7.955s CPU time, 231.7M memory peak. Jul 15 23:17:38.709810 systemd-logind[1480]: Session 7 logged out. Waiting for processes to exit. Jul 15 23:17:38.715048 systemd-logind[1480]: Removed session 7. Jul 15 23:17:46.025346 kubelet[2742]: I0715 23:17:46.025249 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-q7g7s" podStartSLOduration=15.594475464 podStartE2EDuration="18.025209996s" podCreationTimestamp="2025-07-15 23:17:28 +0000 UTC" firstStartedPulling="2025-07-15 23:17:29.381742927 +0000 UTC m=+6.588508601" lastFinishedPulling="2025-07-15 23:17:31.812477459 +0000 UTC m=+9.019243133" observedRunningTime="2025-07-15 23:17:32.086052063 +0000 UTC m=+9.292817776" watchObservedRunningTime="2025-07-15 23:17:46.025209996 +0000 UTC m=+23.231975630" Jul 15 23:17:46.041660 systemd[1]: Created slice kubepods-besteffort-podb2dca926_5cb6_4374_8d73_7f2f1d386dcc.slice - libcontainer container kubepods-besteffort-podb2dca926_5cb6_4374_8d73_7f2f1d386dcc.slice. 
Jul 15 23:17:46.111833 kubelet[2742]: I0715 23:17:46.111719 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b2dca926-5cb6-4374-8d73-7f2f1d386dcc-tigera-ca-bundle\") pod \"calico-typha-846dbb8899-l5pbt\" (UID: \"b2dca926-5cb6-4374-8d73-7f2f1d386dcc\") " pod="calico-system/calico-typha-846dbb8899-l5pbt" Jul 15 23:17:46.111833 kubelet[2742]: I0715 23:17:46.111778 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/b2dca926-5cb6-4374-8d73-7f2f1d386dcc-typha-certs\") pod \"calico-typha-846dbb8899-l5pbt\" (UID: \"b2dca926-5cb6-4374-8d73-7f2f1d386dcc\") " pod="calico-system/calico-typha-846dbb8899-l5pbt" Jul 15 23:17:46.111833 kubelet[2742]: I0715 23:17:46.111801 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6tz\" (UniqueName: \"kubernetes.io/projected/b2dca926-5cb6-4374-8d73-7f2f1d386dcc-kube-api-access-mz6tz\") pod \"calico-typha-846dbb8899-l5pbt\" (UID: \"b2dca926-5cb6-4374-8d73-7f2f1d386dcc\") " pod="calico-system/calico-typha-846dbb8899-l5pbt" Jul 15 23:17:46.347715 containerd[1538]: time="2025-07-15T23:17:46.347490492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-846dbb8899-l5pbt,Uid:b2dca926-5cb6-4374-8d73-7f2f1d386dcc,Namespace:calico-system,Attempt:0,}" Jul 15 23:17:46.373819 systemd[1]: Created slice kubepods-besteffort-pod6511db14_8008_4975_8160_497ddc5186fc.slice - libcontainer container kubepods-besteffort-pod6511db14_8008_4975_8160_497ddc5186fc.slice. 
Jul 15 23:17:46.406012 containerd[1538]: time="2025-07-15T23:17:46.405900138Z" level=info msg="connecting to shim 9bdc28ece6f6d94597ac3c6930ef720d5a9ecac95246931fc679ab609d53a124" address="unix:///run/containerd/s/36686407112e168e4a75c0fb68d4ed4989de5694351059262b9f239c1c9425e3" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:17:46.414843 kubelet[2742]: I0715 23:17:46.414799 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-cni-log-dir\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.414843 kubelet[2742]: I0715 23:17:46.414846 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-xtables-lock\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415078 kubelet[2742]: I0715 23:17:46.414866 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-cni-net-dir\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415078 kubelet[2742]: I0715 23:17:46.414882 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-flexvol-driver-host\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415078 kubelet[2742]: I0715 23:17:46.414907 2742 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-var-lib-calico\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415078 kubelet[2742]: I0715 23:17:46.414923 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2stf\" (UniqueName: \"kubernetes.io/projected/6511db14-8008-4975-8160-497ddc5186fc-kube-api-access-r2stf\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415078 kubelet[2742]: I0715 23:17:46.414944 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6511db14-8008-4975-8160-497ddc5186fc-tigera-ca-bundle\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415193 kubelet[2742]: I0715 23:17:46.414983 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-var-run-calico\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415193 kubelet[2742]: I0715 23:17:46.414999 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-lib-modules\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415193 kubelet[2742]: I0715 23:17:46.415016 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-policysync\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415193 kubelet[2742]: I0715 23:17:46.415035 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6511db14-8008-4975-8160-497ddc5186fc-node-certs\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.415193 kubelet[2742]: I0715 23:17:46.415054 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6511db14-8008-4975-8160-497ddc5186fc-cni-bin-dir\") pod \"calico-node-57f5j\" (UID: \"6511db14-8008-4975-8160-497ddc5186fc\") " pod="calico-system/calico-node-57f5j" Jul 15 23:17:46.458295 systemd[1]: Started cri-containerd-9bdc28ece6f6d94597ac3c6930ef720d5a9ecac95246931fc679ab609d53a124.scope - libcontainer container 9bdc28ece6f6d94597ac3c6930ef720d5a9ecac95246931fc679ab609d53a124. Jul 15 23:17:46.521688 kubelet[2742]: E0715 23:17:46.521550 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.522121 kubelet[2742]: W0715 23:17:46.521845 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.522121 kubelet[2742]: E0715 23:17:46.521885 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.526041 kubelet[2742]: E0715 23:17:46.525976 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.526041 kubelet[2742]: E0715 23:17:46.525988 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-trqcc" podUID="7821b0da-67bb-46bb-abaf-a7c6ac82a39c" Jul 15 23:17:46.526041 kubelet[2742]: W0715 23:17:46.526001 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.526041 kubelet[2742]: E0715 23:17:46.526047 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.529003 kubelet[2742]: E0715 23:17:46.526533 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.529003 kubelet[2742]: W0715 23:17:46.526554 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.529003 kubelet[2742]: E0715 23:17:46.526570 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.531535 kubelet[2742]: E0715 23:17:46.531492 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.531535 kubelet[2742]: W0715 23:17:46.531522 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.531535 kubelet[2742]: E0715 23:17:46.531548 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.531896 kubelet[2742]: E0715 23:17:46.531853 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.531896 kubelet[2742]: W0715 23:17:46.531863 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.531896 kubelet[2742]: E0715 23:17:46.531874 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.532206 kubelet[2742]: E0715 23:17:46.532001 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.532206 kubelet[2742]: W0715 23:17:46.532016 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.532206 kubelet[2742]: E0715 23:17:46.532024 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.532206 kubelet[2742]: E0715 23:17:46.532173 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.532206 kubelet[2742]: W0715 23:17:46.532181 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.532206 kubelet[2742]: E0715 23:17:46.532189 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.532912 kubelet[2742]: E0715 23:17:46.532577 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.532912 kubelet[2742]: W0715 23:17:46.532610 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.532912 kubelet[2742]: E0715 23:17:46.532623 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.532912 kubelet[2742]: E0715 23:17:46.532814 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.532912 kubelet[2742]: W0715 23:17:46.532824 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.532912 kubelet[2742]: E0715 23:17:46.532835 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.533595 kubelet[2742]: E0715 23:17:46.533559 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.533595 kubelet[2742]: W0715 23:17:46.533578 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.534149 kubelet[2742]: E0715 23:17:46.533610 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.534149 kubelet[2742]: E0715 23:17:46.533873 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.534149 kubelet[2742]: W0715 23:17:46.533882 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.534149 kubelet[2742]: E0715 23:17:46.533893 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.534149 kubelet[2742]: E0715 23:17:46.534031 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.534149 kubelet[2742]: W0715 23:17:46.534038 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.534149 kubelet[2742]: E0715 23:17:46.534045 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.534149 kubelet[2742]: E0715 23:17:46.534150 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.534881 kubelet[2742]: W0715 23:17:46.534159 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.534881 kubelet[2742]: E0715 23:17:46.534167 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.534881 kubelet[2742]: E0715 23:17:46.534513 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.534881 kubelet[2742]: W0715 23:17:46.534525 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.534881 kubelet[2742]: E0715 23:17:46.534536 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.538929 kubelet[2742]: E0715 23:17:46.538854 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.538929 kubelet[2742]: W0715 23:17:46.538891 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.539241 kubelet[2742]: E0715 23:17:46.539017 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.580318 kubelet[2742]: E0715 23:17:46.580165 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.580318 kubelet[2742]: W0715 23:17:46.580198 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.580318 kubelet[2742]: E0715 23:17:46.580243 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.593666 containerd[1538]: time="2025-07-15T23:17:46.593565090Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-846dbb8899-l5pbt,Uid:b2dca926-5cb6-4374-8d73-7f2f1d386dcc,Namespace:calico-system,Attempt:0,} returns sandbox id \"9bdc28ece6f6d94597ac3c6930ef720d5a9ecac95246931fc679ab609d53a124\"" Jul 15 23:17:46.597390 containerd[1538]: time="2025-07-15T23:17:46.597225115Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 15 23:17:46.600182 kubelet[2742]: E0715 23:17:46.599846 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.600182 kubelet[2742]: W0715 23:17:46.599877 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.600182 kubelet[2742]: E0715 23:17:46.599908 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.602068 kubelet[2742]: E0715 23:17:46.601767 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.602068 kubelet[2742]: W0715 23:17:46.601814 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.602963 kubelet[2742]: E0715 23:17:46.602920 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.603724 kubelet[2742]: E0715 23:17:46.603446 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.604276 kubelet[2742]: W0715 23:17:46.603664 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.604347 kubelet[2742]: E0715 23:17:46.604294 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.604742 kubelet[2742]: E0715 23:17:46.604714 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.604742 kubelet[2742]: W0715 23:17:46.604738 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.604960 kubelet[2742]: E0715 23:17:46.604755 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.605856 kubelet[2742]: E0715 23:17:46.605824 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.605856 kubelet[2742]: W0715 23:17:46.605848 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.606867 kubelet[2742]: E0715 23:17:46.605869 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.606867 kubelet[2742]: E0715 23:17:46.606034 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.606867 kubelet[2742]: W0715 23:17:46.606041 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.606867 kubelet[2742]: E0715 23:17:46.606052 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.607306 kubelet[2742]: E0715 23:17:46.607278 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.607306 kubelet[2742]: W0715 23:17:46.607302 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.607482 kubelet[2742]: E0715 23:17:46.607324 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.608907 kubelet[2742]: E0715 23:17:46.608870 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.608907 kubelet[2742]: W0715 23:17:46.608898 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.609043 kubelet[2742]: E0715 23:17:46.608918 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.612856 kubelet[2742]: E0715 23:17:46.612802 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.612856 kubelet[2742]: W0715 23:17:46.612846 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.613050 kubelet[2742]: E0715 23:17:46.612872 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.616278 kubelet[2742]: E0715 23:17:46.615737 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.616278 kubelet[2742]: W0715 23:17:46.615769 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.616278 kubelet[2742]: E0715 23:17:46.615796 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.617262 kubelet[2742]: E0715 23:17:46.617218 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.617815 kubelet[2742]: W0715 23:17:46.617248 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.617881 kubelet[2742]: E0715 23:17:46.617828 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.618864 kubelet[2742]: E0715 23:17:46.618820 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.618992 kubelet[2742]: W0715 23:17:46.618959 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.619025 kubelet[2742]: E0715 23:17:46.619011 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.620057 kubelet[2742]: E0715 23:17:46.620024 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.620057 kubelet[2742]: W0715 23:17:46.620046 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.620191 kubelet[2742]: E0715 23:17:46.620070 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.620921 kubelet[2742]: E0715 23:17:46.620866 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.620921 kubelet[2742]: W0715 23:17:46.620893 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.623855 kubelet[2742]: E0715 23:17:46.622832 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.623855 kubelet[2742]: E0715 23:17:46.623382 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.623855 kubelet[2742]: W0715 23:17:46.623402 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.623855 kubelet[2742]: E0715 23:17:46.623422 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.623855 kubelet[2742]: E0715 23:17:46.623747 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.623855 kubelet[2742]: W0715 23:17:46.623760 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.623855 kubelet[2742]: E0715 23:17:46.623772 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.624729 kubelet[2742]: E0715 23:17:46.624635 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.624729 kubelet[2742]: W0715 23:17:46.624658 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.624729 kubelet[2742]: E0715 23:17:46.624683 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.625209 kubelet[2742]: E0715 23:17:46.624959 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.625209 kubelet[2742]: W0715 23:17:46.624970 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.625209 kubelet[2742]: E0715 23:17:46.624980 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.626629 kubelet[2742]: E0715 23:17:46.625415 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.626629 kubelet[2742]: W0715 23:17:46.625441 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.626629 kubelet[2742]: E0715 23:17:46.625455 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.628141 kubelet[2742]: E0715 23:17:46.628107 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.628141 kubelet[2742]: W0715 23:17:46.628136 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.628236 kubelet[2742]: E0715 23:17:46.628162 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.628810 kubelet[2742]: E0715 23:17:46.628774 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.628810 kubelet[2742]: W0715 23:17:46.628800 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.628923 kubelet[2742]: E0715 23:17:46.628818 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.628923 kubelet[2742]: I0715 23:17:46.628851 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7821b0da-67bb-46bb-abaf-a7c6ac82a39c-kubelet-dir\") pod \"csi-node-driver-trqcc\" (UID: \"7821b0da-67bb-46bb-abaf-a7c6ac82a39c\") " pod="calico-system/csi-node-driver-trqcc" Jul 15 23:17:46.629505 kubelet[2742]: E0715 23:17:46.629288 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.629505 kubelet[2742]: W0715 23:17:46.629308 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.629505 kubelet[2742]: E0715 23:17:46.629321 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.629505 kubelet[2742]: I0715 23:17:46.629343 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7821b0da-67bb-46bb-abaf-a7c6ac82a39c-varrun\") pod \"csi-node-driver-trqcc\" (UID: \"7821b0da-67bb-46bb-abaf-a7c6ac82a39c\") " pod="calico-system/csi-node-driver-trqcc" Jul 15 23:17:46.629909 kubelet[2742]: E0715 23:17:46.629886 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.629909 kubelet[2742]: W0715 23:17:46.629906 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.629972 kubelet[2742]: E0715 23:17:46.629920 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.631621 kubelet[2742]: I0715 23:17:46.630580 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjtnq\" (UniqueName: \"kubernetes.io/projected/7821b0da-67bb-46bb-abaf-a7c6ac82a39c-kube-api-access-qjtnq\") pod \"csi-node-driver-trqcc\" (UID: \"7821b0da-67bb-46bb-abaf-a7c6ac82a39c\") " pod="calico-system/csi-node-driver-trqcc" Jul 15 23:17:46.631978 kubelet[2742]: E0715 23:17:46.631952 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.631978 kubelet[2742]: W0715 23:17:46.631977 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.632045 kubelet[2742]: E0715 23:17:46.631999 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.632306 kubelet[2742]: E0715 23:17:46.632279 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.632367 kubelet[2742]: W0715 23:17:46.632319 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.632367 kubelet[2742]: E0715 23:17:46.632332 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.632529 kubelet[2742]: E0715 23:17:46.632513 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.632571 kubelet[2742]: W0715 23:17:46.632525 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.632571 kubelet[2742]: E0715 23:17:46.632557 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.633412 kubelet[2742]: E0715 23:17:46.633382 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.633491 kubelet[2742]: W0715 23:17:46.633423 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.633491 kubelet[2742]: E0715 23:17:46.633440 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.633574 kubelet[2742]: I0715 23:17:46.633548 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7821b0da-67bb-46bb-abaf-a7c6ac82a39c-registration-dir\") pod \"csi-node-driver-trqcc\" (UID: \"7821b0da-67bb-46bb-abaf-a7c6ac82a39c\") " pod="calico-system/csi-node-driver-trqcc" Jul 15 23:17:46.634261 kubelet[2742]: E0715 23:17:46.634229 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.634261 kubelet[2742]: W0715 23:17:46.634250 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.634383 kubelet[2742]: E0715 23:17:46.634268 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.635257 kubelet[2742]: E0715 23:17:46.635220 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.635257 kubelet[2742]: W0715 23:17:46.635245 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.635257 kubelet[2742]: E0715 23:17:46.635263 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.635982 kubelet[2742]: E0715 23:17:46.635566 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.635982 kubelet[2742]: W0715 23:17:46.635583 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.635982 kubelet[2742]: E0715 23:17:46.635627 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.635982 kubelet[2742]: I0715 23:17:46.635880 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7821b0da-67bb-46bb-abaf-a7c6ac82a39c-socket-dir\") pod \"csi-node-driver-trqcc\" (UID: \"7821b0da-67bb-46bb-abaf-a7c6ac82a39c\") " pod="calico-system/csi-node-driver-trqcc" Jul 15 23:17:46.636788 kubelet[2742]: E0715 23:17:46.636761 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.636788 kubelet[2742]: W0715 23:17:46.636783 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.636899 kubelet[2742]: E0715 23:17:46.636802 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.637271 kubelet[2742]: E0715 23:17:46.637126 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.637271 kubelet[2742]: W0715 23:17:46.637144 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.637271 kubelet[2742]: E0715 23:17:46.637157 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.639119 kubelet[2742]: E0715 23:17:46.639065 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.639119 kubelet[2742]: W0715 23:17:46.639103 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.639474 kubelet[2742]: E0715 23:17:46.639136 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.639474 kubelet[2742]: E0715 23:17:46.639431 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.639474 kubelet[2742]: W0715 23:17:46.639443 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.639474 kubelet[2742]: E0715 23:17:46.639455 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.641069 kubelet[2742]: E0715 23:17:46.641028 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.641069 kubelet[2742]: W0715 23:17:46.641056 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.641069 kubelet[2742]: E0715 23:17:46.641078 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.679623 containerd[1538]: time="2025-07-15T23:17:46.678986811Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-57f5j,Uid:6511db14-8008-4975-8160-497ddc5186fc,Namespace:calico-system,Attempt:0,}" Jul 15 23:17:46.711298 containerd[1538]: time="2025-07-15T23:17:46.711238849Z" level=info msg="connecting to shim e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc" address="unix:///run/containerd/s/0f8bdf6194e870e0c8dc786758ab5d09d633a8ad9d071ca2f370f61c9b417358" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:17:46.739269 kubelet[2742]: E0715 23:17:46.739167 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.739269 kubelet[2742]: W0715 23:17:46.739202 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.739269 kubelet[2742]: E0715 23:17:46.739228 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.739456 kubelet[2742]: E0715 23:17:46.739430 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.739456 kubelet[2742]: W0715 23:17:46.739439 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.739456 kubelet[2742]: E0715 23:17:46.739449 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.740195 kubelet[2742]: E0715 23:17:46.740168 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.740195 kubelet[2742]: W0715 23:17:46.740188 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.740336 kubelet[2742]: E0715 23:17:46.740205 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.740451 kubelet[2742]: E0715 23:17:46.740432 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.740451 kubelet[2742]: W0715 23:17:46.740444 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.740510 kubelet[2742]: E0715 23:17:46.740455 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.740649 kubelet[2742]: E0715 23:17:46.740624 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.740649 kubelet[2742]: W0715 23:17:46.740638 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.740649 kubelet[2742]: E0715 23:17:46.740648 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.741292 kubelet[2742]: E0715 23:17:46.741099 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.741292 kubelet[2742]: W0715 23:17:46.741117 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.741292 kubelet[2742]: E0715 23:17:46.741131 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.741634 kubelet[2742]: E0715 23:17:46.741616 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.741634 kubelet[2742]: W0715 23:17:46.741634 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.741749 kubelet[2742]: E0715 23:17:46.741647 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.743635 kubelet[2742]: E0715 23:17:46.743471 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.743635 kubelet[2742]: W0715 23:17:46.743502 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.743635 kubelet[2742]: E0715 23:17:46.743525 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.744028 kubelet[2742]: E0715 23:17:46.743791 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.744028 kubelet[2742]: W0715 23:17:46.743801 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.744028 kubelet[2742]: E0715 23:17:46.743812 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.744028 kubelet[2742]: E0715 23:17:46.743966 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.744028 kubelet[2742]: W0715 23:17:46.743974 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.744028 kubelet[2742]: E0715 23:17:46.743985 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.744735 kubelet[2742]: E0715 23:17:46.744704 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.744735 kubelet[2742]: W0715 23:17:46.744723 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.744735 kubelet[2742]: E0715 23:17:46.744738 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.745357 kubelet[2742]: E0715 23:17:46.745329 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.745357 kubelet[2742]: W0715 23:17:46.745352 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.745482 kubelet[2742]: E0715 23:17:46.745367 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.746665 kubelet[2742]: E0715 23:17:46.746621 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.746665 kubelet[2742]: W0715 23:17:46.746648 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.746861 kubelet[2742]: E0715 23:17:46.746666 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.747124 kubelet[2742]: E0715 23:17:46.747100 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.747187 kubelet[2742]: W0715 23:17:46.747119 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.747187 kubelet[2742]: E0715 23:17:46.747151 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.747352 kubelet[2742]: E0715 23:17:46.747337 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.747352 kubelet[2742]: W0715 23:17:46.747351 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.747472 kubelet[2742]: E0715 23:17:46.747363 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.747813 kubelet[2742]: E0715 23:17:46.747790 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.747813 kubelet[2742]: W0715 23:17:46.747809 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.747915 kubelet[2742]: E0715 23:17:46.747822 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.749068 kubelet[2742]: E0715 23:17:46.749041 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.749068 kubelet[2742]: W0715 23:17:46.749063 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.749194 kubelet[2742]: E0715 23:17:46.749080 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.749359 kubelet[2742]: E0715 23:17:46.749342 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.749395 kubelet[2742]: W0715 23:17:46.749363 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.749395 kubelet[2742]: E0715 23:17:46.749376 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.749843 kubelet[2742]: E0715 23:17:46.749820 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.749843 kubelet[2742]: W0715 23:17:46.749838 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.750002 kubelet[2742]: E0715 23:17:46.749854 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.751813 kubelet[2742]: E0715 23:17:46.751208 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.751813 kubelet[2742]: W0715 23:17:46.751233 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.751813 kubelet[2742]: E0715 23:17:46.751251 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.751813 kubelet[2742]: E0715 23:17:46.751650 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.751813 kubelet[2742]: W0715 23:17:46.751664 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.751813 kubelet[2742]: E0715 23:17:46.751715 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.752235 kubelet[2742]: E0715 23:17:46.752173 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.752235 kubelet[2742]: W0715 23:17:46.752192 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.752346 kubelet[2742]: E0715 23:17:46.752316 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.752874 kubelet[2742]: E0715 23:17:46.752843 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.752874 kubelet[2742]: W0715 23:17:46.752864 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.752968 kubelet[2742]: E0715 23:17:46.752880 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.753907 kubelet[2742]: E0715 23:17:46.753879 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.753907 kubelet[2742]: W0715 23:17:46.753904 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.754025 kubelet[2742]: E0715 23:17:46.753926 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.755068 kubelet[2742]: E0715 23:17:46.754179 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.755068 kubelet[2742]: W0715 23:17:46.754196 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.755068 kubelet[2742]: E0715 23:17:46.754208 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:46.776046 systemd[1]: Started cri-containerd-e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc.scope - libcontainer container e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc. Jul 15 23:17:46.789921 kubelet[2742]: E0715 23:17:46.789884 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:46.789921 kubelet[2742]: W0715 23:17:46.789912 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:46.789921 kubelet[2742]: E0715 23:17:46.789938 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:46.954284 containerd[1538]: time="2025-07-15T23:17:46.953750779Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-57f5j,Uid:6511db14-8008-4975-8160-497ddc5186fc,Namespace:calico-system,Attempt:0,} returns sandbox id \"e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc\"" Jul 15 23:17:47.976334 kubelet[2742]: E0715 23:17:47.976283 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-trqcc" podUID="7821b0da-67bb-46bb-abaf-a7c6ac82a39c" Jul 15 23:17:48.034754 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4262756747.mount: Deactivated successfully. Jul 15 23:17:49.193611 containerd[1538]: time="2025-07-15T23:17:49.193016205Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:49.197182 containerd[1538]: time="2025-07-15T23:17:49.196974870Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207" Jul 15 23:17:49.199497 containerd[1538]: time="2025-07-15T23:17:49.199355277Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:49.202309 containerd[1538]: time="2025-07-15T23:17:49.202248157Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:49.203122 containerd[1538]: time="2025-07-15T23:17:49.203084985Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id 
\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 2.605380077s" Jul 15 23:17:49.203263 containerd[1538]: time="2025-07-15T23:17:49.203247023Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\"" Jul 15 23:17:49.204788 containerd[1538]: time="2025-07-15T23:17:49.204743282Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 15 23:17:49.224455 containerd[1538]: time="2025-07-15T23:17:49.224403250Z" level=info msg="CreateContainer within sandbox \"9bdc28ece6f6d94597ac3c6930ef720d5a9ecac95246931fc679ab609d53a124\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 15 23:17:49.238673 containerd[1538]: time="2025-07-15T23:17:49.236961956Z" level=info msg="Container 11c386602575974c2213fb2ca54aaba94b252377f581e56075597c62ad62471f: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:17:49.242943 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1744567318.mount: Deactivated successfully. 
Jul 15 23:17:49.252723 containerd[1538]: time="2025-07-15T23:17:49.252659699Z" level=info msg="CreateContainer within sandbox \"9bdc28ece6f6d94597ac3c6930ef720d5a9ecac95246931fc679ab609d53a124\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"11c386602575974c2213fb2ca54aaba94b252377f581e56075597c62ad62471f\"" Jul 15 23:17:49.253877 containerd[1538]: time="2025-07-15T23:17:49.253836762Z" level=info msg="StartContainer for \"11c386602575974c2213fb2ca54aaba94b252377f581e56075597c62ad62471f\"" Jul 15 23:17:49.255527 containerd[1538]: time="2025-07-15T23:17:49.255480619Z" level=info msg="connecting to shim 11c386602575974c2213fb2ca54aaba94b252377f581e56075597c62ad62471f" address="unix:///run/containerd/s/36686407112e168e4a75c0fb68d4ed4989de5694351059262b9f239c1c9425e3" protocol=ttrpc version=3 Jul 15 23:17:49.278060 systemd[1]: Started cri-containerd-11c386602575974c2213fb2ca54aaba94b252377f581e56075597c62ad62471f.scope - libcontainer container 11c386602575974c2213fb2ca54aaba94b252377f581e56075597c62ad62471f. 
Jul 15 23:17:49.362140 containerd[1538]: time="2025-07-15T23:17:49.362052943Z" level=info msg="StartContainer for \"11c386602575974c2213fb2ca54aaba94b252377f581e56075597c62ad62471f\" returns successfully" Jul 15 23:17:49.977163 kubelet[2742]: E0715 23:17:49.976786 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-trqcc" podUID="7821b0da-67bb-46bb-abaf-a7c6ac82a39c" Jul 15 23:17:50.157848 kubelet[2742]: E0715 23:17:50.157719 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.157848 kubelet[2742]: W0715 23:17:50.157779 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.157848 kubelet[2742]: E0715 23:17:50.157812 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.158623 kubelet[2742]: E0715 23:17:50.158490 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.158832 kubelet[2742]: W0715 23:17:50.158531 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.158832 kubelet[2742]: E0715 23:17:50.158790 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.161003 kubelet[2742]: E0715 23:17:50.159333 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.161003 kubelet[2742]: W0715 23:17:50.159353 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.161003 kubelet[2742]: E0715 23:17:50.159372 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.161439 kubelet[2742]: E0715 23:17:50.161371 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.161439 kubelet[2742]: W0715 23:17:50.161397 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.161663 kubelet[2742]: E0715 23:17:50.161420 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.162026 kubelet[2742]: I0715 23:17:50.161688 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-846dbb8899-l5pbt" podStartSLOduration=2.553811159 podStartE2EDuration="5.161671161s" podCreationTimestamp="2025-07-15 23:17:45 +0000 UTC" firstStartedPulling="2025-07-15 23:17:46.596807921 +0000 UTC m=+23.803573595" lastFinishedPulling="2025-07-15 23:17:49.204667923 +0000 UTC m=+26.411433597" observedRunningTime="2025-07-15 23:17:50.161121248 +0000 UTC m=+27.367886962" watchObservedRunningTime="2025-07-15 23:17:50.161671161 +0000 UTC m=+27.368436875" Jul 15 23:17:50.163578 kubelet[2742]: E0715 23:17:50.163551 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.163578 kubelet[2742]: W0715 23:17:50.163660 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.163578 kubelet[2742]: E0715 23:17:50.163688 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.164528 kubelet[2742]: E0715 23:17:50.164431 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.165250 kubelet[2742]: W0715 23:17:50.164903 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.165250 kubelet[2742]: E0715 23:17:50.164934 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.166136 kubelet[2742]: E0715 23:17:50.166066 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.166136 kubelet[2742]: W0715 23:17:50.166086 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.166136 kubelet[2742]: E0715 23:17:50.166106 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.166707 kubelet[2742]: E0715 23:17:50.166487 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.166707 kubelet[2742]: W0715 23:17:50.166500 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.166707 kubelet[2742]: E0715 23:17:50.166512 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.167091 kubelet[2742]: E0715 23:17:50.167076 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.167267 kubelet[2742]: W0715 23:17:50.167155 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.167267 kubelet[2742]: E0715 23:17:50.167174 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.167392 kubelet[2742]: E0715 23:17:50.167381 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.167454 kubelet[2742]: W0715 23:17:50.167443 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.167513 kubelet[2742]: E0715 23:17:50.167502 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.167786 kubelet[2742]: E0715 23:17:50.167771 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.167961 kubelet[2742]: W0715 23:17:50.167858 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.167961 kubelet[2742]: E0715 23:17:50.167875 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.168094 kubelet[2742]: E0715 23:17:50.168083 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.168153 kubelet[2742]: W0715 23:17:50.168142 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.168210 kubelet[2742]: E0715 23:17:50.168199 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.168629 kubelet[2742]: E0715 23:17:50.168471 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.168629 kubelet[2742]: W0715 23:17:50.168483 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.168629 kubelet[2742]: E0715 23:17:50.168494 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.168807 kubelet[2742]: E0715 23:17:50.168795 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.168878 kubelet[2742]: W0715 23:17:50.168865 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.169034 kubelet[2742]: E0715 23:17:50.168932 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.169135 kubelet[2742]: E0715 23:17:50.169124 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.169186 kubelet[2742]: W0715 23:17:50.169177 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.169239 kubelet[2742]: E0715 23:17:50.169230 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.169715 kubelet[2742]: E0715 23:17:50.169534 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.169715 kubelet[2742]: W0715 23:17:50.169547 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.169715 kubelet[2742]: E0715 23:17:50.169557 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.169989 kubelet[2742]: E0715 23:17:50.169974 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.170057 kubelet[2742]: W0715 23:17:50.170046 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.170118 kubelet[2742]: E0715 23:17:50.170107 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.170399 kubelet[2742]: E0715 23:17:50.170369 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.170399 kubelet[2742]: W0715 23:17:50.170392 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.170489 kubelet[2742]: E0715 23:17:50.170407 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.170640 kubelet[2742]: E0715 23:17:50.170570 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.170640 kubelet[2742]: W0715 23:17:50.170585 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.170640 kubelet[2742]: E0715 23:17:50.170638 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.170887 kubelet[2742]: E0715 23:17:50.170865 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.170887 kubelet[2742]: W0715 23:17:50.170883 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.170951 kubelet[2742]: E0715 23:17:50.170897 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.171166 kubelet[2742]: E0715 23:17:50.171143 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.171166 kubelet[2742]: W0715 23:17:50.171162 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.171243 kubelet[2742]: E0715 23:17:50.171178 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.171468 kubelet[2742]: E0715 23:17:50.171442 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.171501 kubelet[2742]: W0715 23:17:50.171462 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.171501 kubelet[2742]: E0715 23:17:50.171486 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.172042 kubelet[2742]: E0715 23:17:50.172009 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.172042 kubelet[2742]: W0715 23:17:50.172035 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.172122 kubelet[2742]: E0715 23:17:50.172053 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.172292 kubelet[2742]: E0715 23:17:50.172271 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.172292 kubelet[2742]: W0715 23:17:50.172289 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.172365 kubelet[2742]: E0715 23:17:50.172301 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.172576 kubelet[2742]: E0715 23:17:50.172546 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.172576 kubelet[2742]: W0715 23:17:50.172569 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.172719 kubelet[2742]: E0715 23:17:50.172584 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.172883 kubelet[2742]: E0715 23:17:50.172857 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.172883 kubelet[2742]: W0715 23:17:50.172880 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.172965 kubelet[2742]: E0715 23:17:50.172892 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.173584 kubelet[2742]: E0715 23:17:50.173554 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.173584 kubelet[2742]: W0715 23:17:50.173577 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.173584 kubelet[2742]: E0715 23:17:50.173625 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.173843 kubelet[2742]: E0715 23:17:50.173826 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.173843 kubelet[2742]: W0715 23:17:50.173839 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.173922 kubelet[2742]: E0715 23:17:50.173877 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.174138 kubelet[2742]: E0715 23:17:50.174112 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.174138 kubelet[2742]: W0715 23:17:50.174128 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.174219 kubelet[2742]: E0715 23:17:50.174142 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.174367 kubelet[2742]: E0715 23:17:50.174345 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.174367 kubelet[2742]: W0715 23:17:50.174359 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.174439 kubelet[2742]: E0715 23:17:50.174370 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.174734 kubelet[2742]: E0715 23:17:50.174710 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.174734 kubelet[2742]: W0715 23:17:50.174728 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.174833 kubelet[2742]: E0715 23:17:50.174743 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.174961 kubelet[2742]: E0715 23:17:50.174948 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.175000 kubelet[2742]: W0715 23:17:50.174963 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.175000 kubelet[2742]: E0715 23:17:50.174972 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:17:50.175401 kubelet[2742]: E0715 23:17:50.175380 2742 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:17:50.175401 kubelet[2742]: W0715 23:17:50.175394 2742 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:17:50.175496 kubelet[2742]: E0715 23:17:50.175405 2742 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:17:50.533303 containerd[1538]: time="2025-07-15T23:17:50.533248616Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:50.535813 containerd[1538]: time="2025-07-15T23:17:50.535760062Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Jul 15 23:17:50.537260 containerd[1538]: time="2025-07-15T23:17:50.537189003Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:50.541760 containerd[1538]: time="2025-07-15T23:17:50.541686342Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:50.544146 containerd[1538]: time="2025-07-15T23:17:50.544083789Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.339290188s" Jul 15 23:17:50.544621 containerd[1538]: time="2025-07-15T23:17:50.544434065Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Jul 15 23:17:50.553111 containerd[1538]: time="2025-07-15T23:17:50.552111921Z" level=info msg="CreateContainer within sandbox \"e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 15 23:17:50.572865 containerd[1538]: time="2025-07-15T23:17:50.572819161Z" level=info msg="Container 257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:17:50.587938 containerd[1538]: time="2025-07-15T23:17:50.587873437Z" level=info msg="CreateContainer within sandbox \"e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472\"" Jul 15 23:17:50.590564 containerd[1538]: time="2025-07-15T23:17:50.590476522Z" level=info msg="StartContainer for \"257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472\"" Jul 15 23:17:50.593652 containerd[1538]: time="2025-07-15T23:17:50.593537001Z" level=info msg="connecting to shim 257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472" address="unix:///run/containerd/s/0f8bdf6194e870e0c8dc786758ab5d09d633a8ad9d071ca2f370f61c9b417358" protocol=ttrpc version=3 Jul 15 23:17:50.625881 systemd[1]: Started cri-containerd-257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472.scope - libcontainer container 
257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472. Jul 15 23:17:50.684019 containerd[1538]: time="2025-07-15T23:17:50.683578623Z" level=info msg="StartContainer for \"257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472\" returns successfully" Jul 15 23:17:50.704297 systemd[1]: cri-containerd-257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472.scope: Deactivated successfully. Jul 15 23:17:50.711743 containerd[1538]: time="2025-07-15T23:17:50.711692883Z" level=info msg="received exit event container_id:\"257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472\" id:\"257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472\" pid:3420 exited_at:{seconds:1752621470 nanos:711196609}" Jul 15 23:17:50.712209 containerd[1538]: time="2025-07-15T23:17:50.712053838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472\" id:\"257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472\" pid:3420 exited_at:{seconds:1752621470 nanos:711196609}" Jul 15 23:17:50.750029 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-257d38f12bd923ca0935f60462e12b9dcbcbc07a71756d0af2a5e8bf0060d472-rootfs.mount: Deactivated successfully. 
Jul 15 23:17:51.138930 kubelet[2742]: I0715 23:17:51.138877 2742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:17:51.142559 containerd[1538]: time="2025-07-15T23:17:51.142164624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 15 23:17:51.976716 kubelet[2742]: E0715 23:17:51.976578 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-trqcc" podUID="7821b0da-67bb-46bb-abaf-a7c6ac82a39c" Jul 15 23:17:53.818385 containerd[1538]: time="2025-07-15T23:17:53.817473603Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:53.820102 containerd[1538]: time="2025-07-15T23:17:53.820051690Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Jul 15 23:17:53.822798 containerd[1538]: time="2025-07-15T23:17:53.822736776Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:53.825726 containerd[1538]: time="2025-07-15T23:17:53.825668819Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:53.826378 containerd[1538]: time="2025-07-15T23:17:53.826330651Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest 
\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 2.683549115s" Jul 15 23:17:53.826378 containerd[1538]: time="2025-07-15T23:17:53.826373050Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Jul 15 23:17:53.832743 containerd[1538]: time="2025-07-15T23:17:53.832679531Z" level=info msg="CreateContainer within sandbox \"e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 15 23:17:53.847481 containerd[1538]: time="2025-07-15T23:17:53.845863244Z" level=info msg="Container baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:17:53.861055 containerd[1538]: time="2025-07-15T23:17:53.860953333Z" level=info msg="CreateContainer within sandbox \"e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc\"" Jul 15 23:17:53.862090 containerd[1538]: time="2025-07-15T23:17:53.861911320Z" level=info msg="StartContainer for \"baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc\"" Jul 15 23:17:53.866856 containerd[1538]: time="2025-07-15T23:17:53.866712860Z" level=info msg="connecting to shim baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc" address="unix:///run/containerd/s/0f8bdf6194e870e0c8dc786758ab5d09d633a8ad9d071ca2f370f61c9b417358" protocol=ttrpc version=3 Jul 15 23:17:53.912930 systemd[1]: Started cri-containerd-baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc.scope - libcontainer container baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc. 
Jul 15 23:17:53.967657 containerd[1538]: time="2025-07-15T23:17:53.967560823Z" level=info msg="StartContainer for \"baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc\" returns successfully"
Jul 15 23:17:53.976895 kubelet[2742]: E0715 23:17:53.976800 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-trqcc" podUID="7821b0da-67bb-46bb-abaf-a7c6ac82a39c"
Jul 15 23:17:54.559363 containerd[1538]: time="2025-07-15T23:17:54.559235032Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Jul 15 23:17:54.566619 systemd[1]: cri-containerd-baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc.scope: Deactivated successfully.
Jul 15 23:17:54.566964 systemd[1]: cri-containerd-baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc.scope: Consumed 555ms CPU time, 185.3M memory peak, 165.8M written to disk.
Jul 15 23:17:54.573136 containerd[1538]: time="2025-07-15T23:17:54.573079740Z" level=info msg="TaskExit event in podsandbox handler container_id:\"baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc\" id:\"baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc\" pid:3477 exited_at:{seconds:1752621474 nanos:571278202}"
Jul 15 23:17:54.573609 containerd[1538]: time="2025-07-15T23:17:54.573431175Z" level=info msg="received exit event container_id:\"baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc\" id:\"baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc\" pid:3477 exited_at:{seconds:1752621474 nanos:571278202}"
Jul 15 23:17:54.600288 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-baca1ed01bb8e179580f96e995d066bcab94af01ec686d12aaa9cd40cdbf19bc-rootfs.mount: Deactivated successfully.
Jul 15 23:17:54.614415 kubelet[2742]: I0715 23:17:54.614376 2742 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Jul 15 23:17:54.708817 kubelet[2742]: I0715 23:17:54.708331 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa-calico-apiserver-certs\") pod \"calico-apiserver-799c9d5b45-brbbx\" (UID: \"503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa\") " pod="calico-apiserver/calico-apiserver-799c9d5b45-brbbx"
Jul 15 23:17:54.709686 kubelet[2742]: I0715 23:17:54.709654 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzfz4\" (UniqueName: \"kubernetes.io/projected/503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa-kube-api-access-qzfz4\") pod \"calico-apiserver-799c9d5b45-brbbx\" (UID: \"503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa\") " pod="calico-apiserver/calico-apiserver-799c9d5b45-brbbx"
Jul 15 23:17:54.711111 kubelet[2742]: I0715 23:17:54.710671 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11bfca13-a765-46f5-b84c-ea9b7e562a08-tigera-ca-bundle\") pod \"calico-kube-controllers-68db8888c-tzmd7\" (UID: \"11bfca13-a765-46f5-b84c-ea9b7e562a08\") " pod="calico-system/calico-kube-controllers-68db8888c-tzmd7"
Jul 15 23:17:54.711937 kubelet[2742]: I0715 23:17:54.711280 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csg4f\" (UniqueName: \"kubernetes.io/projected/11bfca13-a765-46f5-b84c-ea9b7e562a08-kube-api-access-csg4f\") pod \"calico-kube-controllers-68db8888c-tzmd7\" (UID: \"11bfca13-a765-46f5-b84c-ea9b7e562a08\") " pod="calico-system/calico-kube-controllers-68db8888c-tzmd7"
Jul 15 23:17:54.711937 kubelet[2742]: I0715 23:17:54.711326 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f6910214-8f68-4725-9879-4f5b5702c095-whisker-backend-key-pair\") pod \"whisker-8854798b-m85nt\" (UID: \"f6910214-8f68-4725-9879-4f5b5702c095\") " pod="calico-system/whisker-8854798b-m85nt"
Jul 15 23:17:54.711937 kubelet[2742]: I0715 23:17:54.711352 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq6s7\" (UniqueName: \"kubernetes.io/projected/f6910214-8f68-4725-9879-4f5b5702c095-kube-api-access-vq6s7\") pod \"whisker-8854798b-m85nt\" (UID: \"f6910214-8f68-4725-9879-4f5b5702c095\") " pod="calico-system/whisker-8854798b-m85nt"
Jul 15 23:17:54.711937 kubelet[2742]: I0715 23:17:54.711373 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6910214-8f68-4725-9879-4f5b5702c095-whisker-ca-bundle\") pod \"whisker-8854798b-m85nt\" (UID: \"f6910214-8f68-4725-9879-4f5b5702c095\") " pod="calico-system/whisker-8854798b-m85nt"
Jul 15 23:17:54.711474 systemd[1]: Created slice kubepods-besteffort-podf6910214_8f68_4725_9879_4f5b5702c095.slice - libcontainer container kubepods-besteffort-podf6910214_8f68_4725_9879_4f5b5702c095.slice.
Jul 15 23:17:54.730279 systemd[1]: Created slice kubepods-besteffort-pod503c7885_3dbb_4fe6_9d3c_7fcfa8b42faa.slice - libcontainer container kubepods-besteffort-pod503c7885_3dbb_4fe6_9d3c_7fcfa8b42faa.slice.
Jul 15 23:17:54.750154 systemd[1]: Created slice kubepods-besteffort-podabc468d0_2605_48f4_bc68_9c53f0bb8be8.slice - libcontainer container kubepods-besteffort-podabc468d0_2605_48f4_bc68_9c53f0bb8be8.slice.
Jul 15 23:17:54.760050 systemd[1]: Created slice kubepods-burstable-podad6dda87_ca8a_4489_9124_9f24cca00875.slice - libcontainer container kubepods-burstable-podad6dda87_ca8a_4489_9124_9f24cca00875.slice.
Jul 15 23:17:54.774020 systemd[1]: Created slice kubepods-besteffort-pod11bfca13_a765_46f5_b84c_ea9b7e562a08.slice - libcontainer container kubepods-besteffort-pod11bfca13_a765_46f5_b84c_ea9b7e562a08.slice.
Jul 15 23:17:54.783960 systemd[1]: Created slice kubepods-burstable-podd247f82a_56a6_47df_ba27_76b7bfeb6863.slice - libcontainer container kubepods-burstable-podd247f82a_56a6_47df_ba27_76b7bfeb6863.slice.
Jul 15 23:17:54.797038 systemd[1]: Created slice kubepods-besteffort-pod420d8d7f_3f4e_4b40_a9e9_124564c6d541.slice - libcontainer container kubepods-besteffort-pod420d8d7f_3f4e_4b40_a9e9_124564c6d541.slice.
Jul 15 23:17:54.813196 kubelet[2742]: I0715 23:17:54.812288 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/abc468d0-2605-48f4-bc68-9c53f0bb8be8-calico-apiserver-certs\") pod \"calico-apiserver-799c9d5b45-zftqn\" (UID: \"abc468d0-2605-48f4-bc68-9c53f0bb8be8\") " pod="calico-apiserver/calico-apiserver-799c9d5b45-zftqn"
Jul 15 23:17:54.813196 kubelet[2742]: I0715 23:17:54.813157 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad6dda87-ca8a-4489-9124-9f24cca00875-config-volume\") pod \"coredns-674b8bbfcf-dmcjn\" (UID: \"ad6dda87-ca8a-4489-9124-9f24cca00875\") " pod="kube-system/coredns-674b8bbfcf-dmcjn"
Jul 15 23:17:54.815060 kubelet[2742]: I0715 23:17:54.813668 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/420d8d7f-3f4e-4b40-a9e9-124564c6d541-config\") pod \"goldmane-768f4c5c69-fn5px\" (UID: \"420d8d7f-3f4e-4b40-a9e9-124564c6d541\") " pod="calico-system/goldmane-768f4c5c69-fn5px"
Jul 15 23:17:54.815060 kubelet[2742]: I0715 23:17:54.813819 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjzd5\" (UniqueName: \"kubernetes.io/projected/abc468d0-2605-48f4-bc68-9c53f0bb8be8-kube-api-access-bjzd5\") pod \"calico-apiserver-799c9d5b45-zftqn\" (UID: \"abc468d0-2605-48f4-bc68-9c53f0bb8be8\") " pod="calico-apiserver/calico-apiserver-799c9d5b45-zftqn"
Jul 15 23:17:54.815060 kubelet[2742]: I0715 23:17:54.813839 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/420d8d7f-3f4e-4b40-a9e9-124564c6d541-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-fn5px\" (UID: \"420d8d7f-3f4e-4b40-a9e9-124564c6d541\") " pod="calico-system/goldmane-768f4c5c69-fn5px"
Jul 15 23:17:54.815060 kubelet[2742]: I0715 23:17:54.813859 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/420d8d7f-3f4e-4b40-a9e9-124564c6d541-goldmane-key-pair\") pod \"goldmane-768f4c5c69-fn5px\" (UID: \"420d8d7f-3f4e-4b40-a9e9-124564c6d541\") " pod="calico-system/goldmane-768f4c5c69-fn5px"
Jul 15 23:17:54.815060 kubelet[2742]: I0715 23:17:54.813892 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw86v\" (UniqueName: \"kubernetes.io/projected/ad6dda87-ca8a-4489-9124-9f24cca00875-kube-api-access-jw86v\") pod \"coredns-674b8bbfcf-dmcjn\" (UID: \"ad6dda87-ca8a-4489-9124-9f24cca00875\") " pod="kube-system/coredns-674b8bbfcf-dmcjn"
Jul 15 23:17:54.815314 kubelet[2742]: I0715 23:17:54.813912 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d247f82a-56a6-47df-ba27-76b7bfeb6863-config-volume\") pod \"coredns-674b8bbfcf-98fhj\" (UID: \"d247f82a-56a6-47df-ba27-76b7bfeb6863\") " pod="kube-system/coredns-674b8bbfcf-98fhj"
Jul 15 23:17:54.815314 kubelet[2742]: I0715 23:17:54.813939 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qqfh\" (UniqueName: \"kubernetes.io/projected/d247f82a-56a6-47df-ba27-76b7bfeb6863-kube-api-access-2qqfh\") pod \"coredns-674b8bbfcf-98fhj\" (UID: \"d247f82a-56a6-47df-ba27-76b7bfeb6863\") " pod="kube-system/coredns-674b8bbfcf-98fhj"
Jul 15 23:17:54.815314 kubelet[2742]: I0715 23:17:54.813968 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qdh\" (UniqueName: \"kubernetes.io/projected/420d8d7f-3f4e-4b40-a9e9-124564c6d541-kube-api-access-l9qdh\") pod \"goldmane-768f4c5c69-fn5px\" (UID: \"420d8d7f-3f4e-4b40-a9e9-124564c6d541\") " pod="calico-system/goldmane-768f4c5c69-fn5px"
Jul 15 23:17:55.024208 containerd[1538]: time="2025-07-15T23:17:55.024133626Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8854798b-m85nt,Uid:f6910214-8f68-4725-9879-4f5b5702c095,Namespace:calico-system,Attempt:0,}"
Jul 15 23:17:55.046579 containerd[1538]: time="2025-07-15T23:17:55.046177318Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-799c9d5b45-brbbx,Uid:503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa,Namespace:calico-apiserver,Attempt:0,}"
Jul 15 23:17:55.059449 containerd[1538]: time="2025-07-15T23:17:55.059410597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-799c9d5b45-zftqn,Uid:abc468d0-2605-48f4-bc68-9c53f0bb8be8,Namespace:calico-apiserver,Attempt:0,}"
Jul 15 23:17:55.069997 containerd[1538]: time="2025-07-15T23:17:55.069875389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dmcjn,Uid:ad6dda87-ca8a-4489-9124-9f24cca00875,Namespace:kube-system,Attempt:0,}"
Jul 15 23:17:55.081846 containerd[1538]: time="2025-07-15T23:17:55.081170252Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68db8888c-tzmd7,Uid:11bfca13-a765-46f5-b84c-ea9b7e562a08,Namespace:calico-system,Attempt:0,}"
Jul 15 23:17:55.097894 containerd[1538]: time="2025-07-15T23:17:55.097671931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-98fhj,Uid:d247f82a-56a6-47df-ba27-76b7bfeb6863,Namespace:kube-system,Attempt:0,}"
Jul 15 23:17:55.108382 containerd[1538]: time="2025-07-15T23:17:55.108237642Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-fn5px,Uid:420d8d7f-3f4e-4b40-a9e9-124564c6d541,Namespace:calico-system,Attempt:0,}"
Jul 15 23:17:55.181811 containerd[1538]: time="2025-07-15T23:17:55.181768107Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\""
Jul 15 23:17:55.310785 containerd[1538]: time="2025-07-15T23:17:55.310733337Z" level=error msg="Failed to destroy network for sandbox \"2bea3e59034426901fff4752a231d8e3c68c42ad1b7d7863f1164bb637ef9f2b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.318200 containerd[1538]: time="2025-07-15T23:17:55.318136287Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-8854798b-m85nt,Uid:f6910214-8f68-4725-9879-4f5b5702c095,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2bea3e59034426901fff4752a231d8e3c68c42ad1b7d7863f1164bb637ef9f2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.318741 kubelet[2742]: E0715 23:17:55.318699 2742 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2bea3e59034426901fff4752a231d8e3c68c42ad1b7d7863f1164bb637ef9f2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.319869 kubelet[2742]: E0715 23:17:55.319168 2742 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2bea3e59034426901fff4752a231d8e3c68c42ad1b7d7863f1164bb637ef9f2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-8854798b-m85nt"
Jul 15 23:17:55.319869 kubelet[2742]: E0715 23:17:55.319202 2742 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2bea3e59034426901fff4752a231d8e3c68c42ad1b7d7863f1164bb637ef9f2b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-8854798b-m85nt"
Jul 15 23:17:55.319869 kubelet[2742]: E0715 23:17:55.319285 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-8854798b-m85nt_calico-system(f6910214-8f68-4725-9879-4f5b5702c095)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-8854798b-m85nt_calico-system(f6910214-8f68-4725-9879-4f5b5702c095)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2bea3e59034426901fff4752a231d8e3c68c42ad1b7d7863f1164bb637ef9f2b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-8854798b-m85nt" podUID="f6910214-8f68-4725-9879-4f5b5702c095"
Jul 15 23:17:55.320182 containerd[1538]: time="2025-07-15T23:17:55.319233554Z" level=error msg="Failed to destroy network for sandbox \"2e5293f1cfacc5fb3635ef862276eda6087f58ea9cabeccc8a565c69cffe45c7\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.325455 containerd[1538]: time="2025-07-15T23:17:55.325366519Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-799c9d5b45-brbbx,Uid:503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e5293f1cfacc5fb3635ef862276eda6087f58ea9cabeccc8a565c69cffe45c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.328773 kubelet[2742]: E0715 23:17:55.328711 2742 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e5293f1cfacc5fb3635ef862276eda6087f58ea9cabeccc8a565c69cffe45c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.329149 kubelet[2742]: E0715 23:17:55.328780 2742 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e5293f1cfacc5fb3635ef862276eda6087f58ea9cabeccc8a565c69cffe45c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-799c9d5b45-brbbx"
Jul 15 23:17:55.329149 kubelet[2742]: E0715 23:17:55.328803 2742 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2e5293f1cfacc5fb3635ef862276eda6087f58ea9cabeccc8a565c69cffe45c7\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-799c9d5b45-brbbx"
Jul 15 23:17:55.329149 kubelet[2742]: E0715 23:17:55.328861 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-799c9d5b45-brbbx_calico-apiserver(503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-799c9d5b45-brbbx_calico-apiserver(503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2e5293f1cfacc5fb3635ef862276eda6087f58ea9cabeccc8a565c69cffe45c7\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-799c9d5b45-brbbx" podUID="503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa"
Jul 15 23:17:55.344301 containerd[1538]: time="2025-07-15T23:17:55.344245529Z" level=error msg="Failed to destroy network for sandbox \"9efb6f5e88335a45e46aaadd28184a1851bac68062cf4125c12b002456ac832d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.347829 containerd[1538]: time="2025-07-15T23:17:55.347771926Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-799c9d5b45-zftqn,Uid:abc468d0-2605-48f4-bc68-9c53f0bb8be8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9efb6f5e88335a45e46aaadd28184a1851bac68062cf4125c12b002456ac832d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.348372 kubelet[2742]: E0715 23:17:55.348325 2742 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9efb6f5e88335a45e46aaadd28184a1851bac68062cf4125c12b002456ac832d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.348789 kubelet[2742]: E0715 23:17:55.348692 2742 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9efb6f5e88335a45e46aaadd28184a1851bac68062cf4125c12b002456ac832d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-799c9d5b45-zftqn"
Jul 15 23:17:55.348789 kubelet[2742]: E0715 23:17:55.348742 2742 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9efb6f5e88335a45e46aaadd28184a1851bac68062cf4125c12b002456ac832d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-799c9d5b45-zftqn"
Jul 15 23:17:55.351036 kubelet[2742]: E0715 23:17:55.348947 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-799c9d5b45-zftqn_calico-apiserver(abc468d0-2605-48f4-bc68-9c53f0bb8be8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-799c9d5b45-zftqn_calico-apiserver(abc468d0-2605-48f4-bc68-9c53f0bb8be8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9efb6f5e88335a45e46aaadd28184a1851bac68062cf4125c12b002456ac832d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-799c9d5b45-zftqn" podUID="abc468d0-2605-48f4-bc68-9c53f0bb8be8"
Jul 15 23:17:55.355739 containerd[1538]: time="2025-07-15T23:17:55.355686630Z" level=error msg="Failed to destroy network for sandbox \"484bc5041bce2a4868d0c19145d15dd8ccb7dee8dd86dfe7f82796b7b2704570\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.357972 containerd[1538]: time="2025-07-15T23:17:55.357906403Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68db8888c-tzmd7,Uid:11bfca13-a765-46f5-b84c-ea9b7e562a08,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"484bc5041bce2a4868d0c19145d15dd8ccb7dee8dd86dfe7f82796b7b2704570\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.358526 kubelet[2742]: E0715 23:17:55.358459 2742 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"484bc5041bce2a4868d0c19145d15dd8ccb7dee8dd86dfe7f82796b7b2704570\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.358845 kubelet[2742]: E0715 23:17:55.358810 2742 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"484bc5041bce2a4868d0c19145d15dd8ccb7dee8dd86dfe7f82796b7b2704570\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68db8888c-tzmd7"
Jul 15 23:17:55.358929 kubelet[2742]: E0715 23:17:55.358912 2742 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"484bc5041bce2a4868d0c19145d15dd8ccb7dee8dd86dfe7f82796b7b2704570\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-68db8888c-tzmd7"
Jul 15 23:17:55.359449 kubelet[2742]: E0715 23:17:55.359065 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-68db8888c-tzmd7_calico-system(11bfca13-a765-46f5-b84c-ea9b7e562a08)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-68db8888c-tzmd7_calico-system(11bfca13-a765-46f5-b84c-ea9b7e562a08)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"484bc5041bce2a4868d0c19145d15dd8ccb7dee8dd86dfe7f82796b7b2704570\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-68db8888c-tzmd7" podUID="11bfca13-a765-46f5-b84c-ea9b7e562a08"
Jul 15 23:17:55.364672 containerd[1538]: time="2025-07-15T23:17:55.364618641Z" level=error msg="Failed to destroy network for sandbox \"d7057f9f1af6f623c9d85b8398180a93de915c7dcb123a34aa05a0ffcdc2c3c2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.369737 containerd[1538]: time="2025-07-15T23:17:55.369652820Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dmcjn,Uid:ad6dda87-ca8a-4489-9124-9f24cca00875,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7057f9f1af6f623c9d85b8398180a93de915c7dcb123a34aa05a0ffcdc2c3c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.371242 kubelet[2742]: E0715 23:17:55.371108 2742 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7057f9f1af6f623c9d85b8398180a93de915c7dcb123a34aa05a0ffcdc2c3c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.371242 kubelet[2742]: E0715 23:17:55.371179 2742 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7057f9f1af6f623c9d85b8398180a93de915c7dcb123a34aa05a0ffcdc2c3c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dmcjn"
Jul 15 23:17:55.371242 kubelet[2742]: E0715 23:17:55.371200 2742 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7057f9f1af6f623c9d85b8398180a93de915c7dcb123a34aa05a0ffcdc2c3c2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dmcjn"
Jul 15 23:17:55.373712 kubelet[2742]: E0715 23:17:55.372653 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dmcjn_kube-system(ad6dda87-ca8a-4489-9124-9f24cca00875)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dmcjn_kube-system(ad6dda87-ca8a-4489-9124-9f24cca00875)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d7057f9f1af6f623c9d85b8398180a93de915c7dcb123a34aa05a0ffcdc2c3c2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dmcjn" podUID="ad6dda87-ca8a-4489-9124-9f24cca00875"
Jul 15 23:17:55.374249 containerd[1538]: time="2025-07-15T23:17:55.374178565Z" level=error msg="Failed to destroy network for sandbox \"d40219c41de0b3a4a716080d1671da126295861004a63dd7cb2f75fbe889b48d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.378630 containerd[1538]: time="2025-07-15T23:17:55.377997838Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-98fhj,Uid:d247f82a-56a6-47df-ba27-76b7bfeb6863,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d40219c41de0b3a4a716080d1671da126295861004a63dd7cb2f75fbe889b48d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.379713 kubelet[2742]: E0715 23:17:55.379506 2742 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d40219c41de0b3a4a716080d1671da126295861004a63dd7cb2f75fbe889b48d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.379713 kubelet[2742]: E0715 23:17:55.379575 2742 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d40219c41de0b3a4a716080d1671da126295861004a63dd7cb2f75fbe889b48d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-98fhj"
Jul 15 23:17:55.379713 kubelet[2742]: E0715 23:17:55.379605 2742 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d40219c41de0b3a4a716080d1671da126295861004a63dd7cb2f75fbe889b48d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-98fhj"
Jul 15 23:17:55.379868 kubelet[2742]: E0715 23:17:55.379663 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-98fhj_kube-system(d247f82a-56a6-47df-ba27-76b7bfeb6863)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-98fhj_kube-system(d247f82a-56a6-47df-ba27-76b7bfeb6863)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d40219c41de0b3a4a716080d1671da126295861004a63dd7cb2f75fbe889b48d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-98fhj" podUID="d247f82a-56a6-47df-ba27-76b7bfeb6863"
Jul 15 23:17:55.387842 containerd[1538]: time="2025-07-15T23:17:55.387766200Z" level=error msg="Failed to destroy network for sandbox \"1c41df36d94c782c1e22bab12997092dc2a8c048b3c202d2617bfe632e141c73\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.390617 containerd[1538]: time="2025-07-15T23:17:55.390006932Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-fn5px,Uid:420d8d7f-3f4e-4b40-a9e9-124564c6d541,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c41df36d94c782c1e22bab12997092dc2a8c048b3c202d2617bfe632e141c73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.390840 kubelet[2742]: E0715 23:17:55.390316 2742 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c41df36d94c782c1e22bab12997092dc2a8c048b3c202d2617bfe632e141c73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:55.390840 kubelet[2742]: E0715 23:17:55.390387 2742 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c41df36d94c782c1e22bab12997092dc2a8c048b3c202d2617bfe632e141c73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-fn5px"
Jul 15 23:17:55.390840 kubelet[2742]: E0715 23:17:55.390416 2742 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1c41df36d94c782c1e22bab12997092dc2a8c048b3c202d2617bfe632e141c73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-fn5px"
Jul 15 23:17:55.390984 kubelet[2742]: E0715 23:17:55.390499 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-fn5px_calico-system(420d8d7f-3f4e-4b40-a9e9-124564c6d541)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-fn5px_calico-system(420d8d7f-3f4e-4b40-a9e9-124564c6d541)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1c41df36d94c782c1e22bab12997092dc2a8c048b3c202d2617bfe632e141c73\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-fn5px" podUID="420d8d7f-3f4e-4b40-a9e9-124564c6d541"
Jul 15 23:17:55.986240 systemd[1]: Created slice kubepods-besteffort-pod7821b0da_67bb_46bb_abaf_a7c6ac82a39c.slice - libcontainer container kubepods-besteffort-pod7821b0da_67bb_46bb_abaf_a7c6ac82a39c.slice.
Jul 15 23:17:55.997515 containerd[1538]: time="2025-07-15T23:17:55.995137366Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-trqcc,Uid:7821b0da-67bb-46bb-abaf-a7c6ac82a39c,Namespace:calico-system,Attempt:0,}"
Jul 15 23:17:56.097957 containerd[1538]: time="2025-07-15T23:17:56.097689459Z" level=error msg="Failed to destroy network for sandbox \"f6b535e5576ad7fc746537ba58b3aacf82c8975c78b719d605eb465dc7b220d3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Jul 15 23:17:56.101107 systemd[1]: run-netns-cni\x2d14b26a36\x2de3f4\x2df7b6\x2d6d5d\x2d49bdd601284e.mount: Deactivated successfully.
Jul 15 23:17:56.102381 containerd[1538]: time="2025-07-15T23:17:56.101975808Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-trqcc,Uid:7821b0da-67bb-46bb-abaf-a7c6ac82a39c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6b535e5576ad7fc746537ba58b3aacf82c8975c78b719d605eb465dc7b220d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:17:56.105041 kubelet[2742]: E0715 23:17:56.103776 2742 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6b535e5576ad7fc746537ba58b3aacf82c8975c78b719d605eb465dc7b220d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:17:56.105041 kubelet[2742]: E0715 23:17:56.103841 2742 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6b535e5576ad7fc746537ba58b3aacf82c8975c78b719d605eb465dc7b220d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-trqcc" Jul 15 23:17:56.105041 kubelet[2742]: E0715 23:17:56.103863 2742 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f6b535e5576ad7fc746537ba58b3aacf82c8975c78b719d605eb465dc7b220d3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-trqcc" 
Jul 15 23:17:56.105250 kubelet[2742]: E0715 23:17:56.103914 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-trqcc_calico-system(7821b0da-67bb-46bb-abaf-a7c6ac82a39c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-trqcc_calico-system(7821b0da-67bb-46bb-abaf-a7c6ac82a39c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f6b535e5576ad7fc746537ba58b3aacf82c8975c78b719d605eb465dc7b220d3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-trqcc" podUID="7821b0da-67bb-46bb-abaf-a7c6ac82a39c" Jul 15 23:17:59.800325 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4214488323.mount: Deactivated successfully. Jul 15 23:17:59.837688 containerd[1538]: time="2025-07-15T23:17:59.837624700Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Jul 15 23:17:59.839136 containerd[1538]: time="2025-07-15T23:17:59.839042883Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:59.841633 containerd[1538]: time="2025-07-15T23:17:59.841362977Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:59.843779 containerd[1538]: time="2025-07-15T23:17:59.842844280Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:17:59.844619 containerd[1538]: time="2025-07-15T23:17:59.843711150Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 4.660803897s" Jul 15 23:17:59.844619 containerd[1538]: time="2025-07-15T23:17:59.844377463Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Jul 15 23:17:59.875716 containerd[1538]: time="2025-07-15T23:17:59.875655707Z" level=info msg="CreateContainer within sandbox \"e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 15 23:17:59.889950 containerd[1538]: time="2025-07-15T23:17:59.889848426Z" level=info msg="Container d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:17:59.919615 containerd[1538]: time="2025-07-15T23:17:59.919365210Z" level=info msg="CreateContainer within sandbox \"e6e6647f69fd0e69c938112d122cdff499ccb185e2cf49cebe8ae17c422298fc\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\"" Jul 15 23:17:59.922676 containerd[1538]: time="2025-07-15T23:17:59.920637676Z" level=info msg="StartContainer for \"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\"" Jul 15 23:17:59.923404 containerd[1538]: time="2025-07-15T23:17:59.923338165Z" level=info msg="connecting to shim d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f" address="unix:///run/containerd/s/0f8bdf6194e870e0c8dc786758ab5d09d633a8ad9d071ca2f370f61c9b417358" protocol=ttrpc version=3 Jul 15 23:17:59.984927 systemd[1]: Started 
cri-containerd-d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f.scope - libcontainer container d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f. Jul 15 23:18:00.048183 containerd[1538]: time="2025-07-15T23:18:00.048074916Z" level=info msg="StartContainer for \"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\" returns successfully" Jul 15 23:18:00.236271 kubelet[2742]: I0715 23:18:00.236079 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-57f5j" podStartSLOduration=1.352638574 podStartE2EDuration="14.236056011s" podCreationTimestamp="2025-07-15 23:17:46 +0000 UTC" firstStartedPulling="2025-07-15 23:17:46.962542448 +0000 UTC m=+24.169308122" lastFinishedPulling="2025-07-15 23:17:59.845959925 +0000 UTC m=+37.052725559" observedRunningTime="2025-07-15 23:18:00.227921062 +0000 UTC m=+37.434686736" watchObservedRunningTime="2025-07-15 23:18:00.236056011 +0000 UTC m=+37.442821765" Jul 15 23:18:00.239720 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 15 23:18:00.239844 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jul 15 23:18:00.567940 kubelet[2742]: I0715 23:18:00.567893 2742 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f6910214-8f68-4725-9879-4f5b5702c095-whisker-backend-key-pair\") pod \"f6910214-8f68-4725-9879-4f5b5702c095\" (UID: \"f6910214-8f68-4725-9879-4f5b5702c095\") " Jul 15 23:18:00.568262 kubelet[2742]: I0715 23:18:00.568143 2742 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6910214-8f68-4725-9879-4f5b5702c095-whisker-ca-bundle\") pod \"f6910214-8f68-4725-9879-4f5b5702c095\" (UID: \"f6910214-8f68-4725-9879-4f5b5702c095\") " Jul 15 23:18:00.568262 kubelet[2742]: I0715 23:18:00.568217 2742 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq6s7\" (UniqueName: \"kubernetes.io/projected/f6910214-8f68-4725-9879-4f5b5702c095-kube-api-access-vq6s7\") pod \"f6910214-8f68-4725-9879-4f5b5702c095\" (UID: \"f6910214-8f68-4725-9879-4f5b5702c095\") " Jul 15 23:18:00.571941 kubelet[2742]: I0715 23:18:00.571828 2742 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6910214-8f68-4725-9879-4f5b5702c095-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "f6910214-8f68-4725-9879-4f5b5702c095" (UID: "f6910214-8f68-4725-9879-4f5b5702c095"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jul 15 23:18:00.575105 kubelet[2742]: I0715 23:18:00.574961 2742 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6910214-8f68-4725-9879-4f5b5702c095-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "f6910214-8f68-4725-9879-4f5b5702c095" (UID: "f6910214-8f68-4725-9879-4f5b5702c095"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jul 15 23:18:00.575487 kubelet[2742]: I0715 23:18:00.575410 2742 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6910214-8f68-4725-9879-4f5b5702c095-kube-api-access-vq6s7" (OuterVolumeSpecName: "kube-api-access-vq6s7") pod "f6910214-8f68-4725-9879-4f5b5702c095" (UID: "f6910214-8f68-4725-9879-4f5b5702c095"). InnerVolumeSpecName "kube-api-access-vq6s7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jul 15 23:18:00.670283 kubelet[2742]: I0715 23:18:00.670050 2742 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vq6s7\" (UniqueName: \"kubernetes.io/projected/f6910214-8f68-4725-9879-4f5b5702c095-kube-api-access-vq6s7\") on node \"ci-4372-0-1-n-21be50a87e\" DevicePath \"\"" Jul 15 23:18:00.670283 kubelet[2742]: I0715 23:18:00.670090 2742 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f6910214-8f68-4725-9879-4f5b5702c095-whisker-backend-key-pair\") on node \"ci-4372-0-1-n-21be50a87e\" DevicePath \"\"" Jul 15 23:18:00.670283 kubelet[2742]: I0715 23:18:00.670106 2742 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6910214-8f68-4725-9879-4f5b5702c095-whisker-ca-bundle\") on node \"ci-4372-0-1-n-21be50a87e\" DevicePath \"\"" Jul 15 23:18:00.801747 systemd[1]: var-lib-kubelet-pods-f6910214\x2d8f68\x2d4725\x2d9879\x2d4f5b5702c095-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dvq6s7.mount: Deactivated successfully. Jul 15 23:18:00.803519 systemd[1]: var-lib-kubelet-pods-f6910214\x2d8f68\x2d4725\x2d9879\x2d4f5b5702c095-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jul 15 23:18:00.997676 systemd[1]: Removed slice kubepods-besteffort-podf6910214_8f68_4725_9879_4f5b5702c095.slice - libcontainer container kubepods-besteffort-podf6910214_8f68_4725_9879_4f5b5702c095.slice. Jul 15 23:18:01.200192 kubelet[2742]: I0715 23:18:01.200123 2742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:18:01.317153 systemd[1]: Created slice kubepods-besteffort-pod0f2fde19_eb99_4f60_97cb_6223a70f2c1f.slice - libcontainer container kubepods-besteffort-pod0f2fde19_eb99_4f60_97cb_6223a70f2c1f.slice. Jul 15 23:18:01.376048 kubelet[2742]: I0715 23:18:01.375965 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/0f2fde19-eb99-4f60-97cb-6223a70f2c1f-whisker-backend-key-pair\") pod \"whisker-6d9dc44786-mg9zx\" (UID: \"0f2fde19-eb99-4f60-97cb-6223a70f2c1f\") " pod="calico-system/whisker-6d9dc44786-mg9zx" Jul 15 23:18:01.376930 kubelet[2742]: I0715 23:18:01.376885 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f2fde19-eb99-4f60-97cb-6223a70f2c1f-whisker-ca-bundle\") pod \"whisker-6d9dc44786-mg9zx\" (UID: \"0f2fde19-eb99-4f60-97cb-6223a70f2c1f\") " pod="calico-system/whisker-6d9dc44786-mg9zx" Jul 15 23:18:01.377010 kubelet[2742]: I0715 23:18:01.376954 2742 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9c4z\" (UniqueName: \"kubernetes.io/projected/0f2fde19-eb99-4f60-97cb-6223a70f2c1f-kube-api-access-w9c4z\") pod \"whisker-6d9dc44786-mg9zx\" (UID: \"0f2fde19-eb99-4f60-97cb-6223a70f2c1f\") " pod="calico-system/whisker-6d9dc44786-mg9zx" Jul 15 23:18:01.623298 containerd[1538]: time="2025-07-15T23:18:01.623157822Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-6d9dc44786-mg9zx,Uid:0f2fde19-eb99-4f60-97cb-6223a70f2c1f,Namespace:calico-system,Attempt:0,}" Jul 15 23:18:01.893801 systemd-networkd[1422]: cali76e18e670ea: Link UP Jul 15 23:18:01.897111 systemd-networkd[1422]: cali76e18e670ea: Gained carrier Jul 15 23:18:01.946890 containerd[1538]: 2025-07-15 23:18:01.655 [INFO][3797] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:18:01.946890 containerd[1538]: 2025-07-15 23:18:01.717 [INFO][3797] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0 whisker-6d9dc44786- calico-system 0f2fde19-eb99-4f60-97cb-6223a70f2c1f 875 0 2025-07-15 23:18:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6d9dc44786 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372-0-1-n-21be50a87e whisker-6d9dc44786-mg9zx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali76e18e670ea [] [] }} ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Namespace="calico-system" Pod="whisker-6d9dc44786-mg9zx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-" Jul 15 23:18:01.946890 containerd[1538]: 2025-07-15 23:18:01.717 [INFO][3797] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Namespace="calico-system" Pod="whisker-6d9dc44786-mg9zx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" Jul 15 23:18:01.946890 containerd[1538]: 2025-07-15 23:18:01.776 [INFO][3809] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" 
HandleID="k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Workload="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.776 [INFO][3809] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" HandleID="k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Workload="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000329aa0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-0-1-n-21be50a87e", "pod":"whisker-6d9dc44786-mg9zx", "timestamp":"2025-07-15 23:18:01.776046254 +0000 UTC"}, Hostname:"ci-4372-0-1-n-21be50a87e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.776 [INFO][3809] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.776 [INFO][3809] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.776 [INFO][3809] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-21be50a87e' Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.795 [INFO][3809] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.807 [INFO][3809] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.817 [INFO][3809] ipam/ipam.go 511: Trying affinity for 192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.821 [INFO][3809] ipam/ipam.go 158: Attempting to load block cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947210 containerd[1538]: 2025-07-15 23:18:01.825 [INFO][3809] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947423 containerd[1538]: 2025-07-15 23:18:01.825 [INFO][3809] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947423 containerd[1538]: 2025-07-15 23:18:01.830 [INFO][3809] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a Jul 15 23:18:01.947423 containerd[1538]: 2025-07-15 23:18:01.841 [INFO][3809] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947423 containerd[1538]: 2025-07-15 23:18:01.865 [INFO][3809] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.110.193/26] block=192.168.110.192/26 handle="k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947423 containerd[1538]: 2025-07-15 23:18:01.865 [INFO][3809] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.110.193/26] handle="k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:01.947423 containerd[1538]: 2025-07-15 23:18:01.865 [INFO][3809] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:18:01.947423 containerd[1538]: 2025-07-15 23:18:01.865 [INFO][3809] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.193/26] IPv6=[] ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" HandleID="k8s-pod-network.48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Workload="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" Jul 15 23:18:01.947557 containerd[1538]: 2025-07-15 23:18:01.873 [INFO][3797] cni-plugin/k8s.go 418: Populated endpoint ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Namespace="calico-system" Pod="whisker-6d9dc44786-mg9zx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0", GenerateName:"whisker-6d9dc44786-", Namespace:"calico-system", SelfLink:"", UID:"0f2fde19-eb99-4f60-97cb-6223a70f2c1f", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 18, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6d9dc44786", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"", Pod:"whisker-6d9dc44786-mg9zx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.110.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali76e18e670ea", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:01.947557 containerd[1538]: 2025-07-15 23:18:01.873 [INFO][3797] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.110.193/32] ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Namespace="calico-system" Pod="whisker-6d9dc44786-mg9zx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" Jul 15 23:18:01.947668 containerd[1538]: 2025-07-15 23:18:01.874 [INFO][3797] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali76e18e670ea ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Namespace="calico-system" Pod="whisker-6d9dc44786-mg9zx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" Jul 15 23:18:01.947668 containerd[1538]: 2025-07-15 23:18:01.896 [INFO][3797] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Namespace="calico-system" Pod="whisker-6d9dc44786-mg9zx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" Jul 15 23:18:01.947716 containerd[1538]: 2025-07-15 23:18:01.898 [INFO][3797] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Namespace="calico-system" Pod="whisker-6d9dc44786-mg9zx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0", GenerateName:"whisker-6d9dc44786-", Namespace:"calico-system", SelfLink:"", UID:"0f2fde19-eb99-4f60-97cb-6223a70f2c1f", ResourceVersion:"875", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 18, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6d9dc44786", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a", Pod:"whisker-6d9dc44786-mg9zx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.110.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali76e18e670ea", MAC:"52:9a:46:4d:93:b2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:01.947764 containerd[1538]: 2025-07-15 23:18:01.943 [INFO][3797] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" Namespace="calico-system" Pod="whisker-6d9dc44786-mg9zx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-whisker--6d9dc44786--mg9zx-eth0" Jul 15 23:18:02.033141 containerd[1538]: time="2025-07-15T23:18:02.032922745Z" level=info msg="connecting to shim 48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a" address="unix:///run/containerd/s/d0844d4194f103c0c2dc329926135c3b0203b0405b382405c0fc0fe48feb2f0f" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:18:02.110092 systemd[1]: Started cri-containerd-48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a.scope - libcontainer container 48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a. Jul 15 23:18:02.178603 containerd[1538]: time="2025-07-15T23:18:02.177624130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6d9dc44786-mg9zx,Uid:0f2fde19-eb99-4f60-97cb-6223a70f2c1f,Namespace:calico-system,Attempt:0,} returns sandbox id \"48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a\"" Jul 15 23:18:02.185729 containerd[1538]: time="2025-07-15T23:18:02.185036849Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 15 23:18:02.982536 kubelet[2742]: I0715 23:18:02.982456 2742 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6910214-8f68-4725-9879-4f5b5702c095" path="/var/lib/kubelet/pods/f6910214-8f68-4725-9879-4f5b5702c095/volumes" Jul 15 23:18:03.299885 systemd-networkd[1422]: cali76e18e670ea: Gained IPv6LL Jul 15 23:18:03.760670 containerd[1538]: time="2025-07-15T23:18:03.759574978Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:03.761787 containerd[1538]: time="2025-07-15T23:18:03.761654036Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Jul 15 
23:18:03.762696 containerd[1538]: time="2025-07-15T23:18:03.762648385Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:03.765541 containerd[1538]: time="2025-07-15T23:18:03.765487714Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:03.766361 containerd[1538]: time="2025-07-15T23:18:03.766303586Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 1.581214617s" Jul 15 23:18:03.766361 containerd[1538]: time="2025-07-15T23:18:03.766363425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Jul 15 23:18:03.774435 containerd[1538]: time="2025-07-15T23:18:03.774372579Z" level=info msg="CreateContainer within sandbox \"48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 15 23:18:03.784634 containerd[1538]: time="2025-07-15T23:18:03.783395202Z" level=info msg="Container bf541557d97d69b413769c3354715def089fd4491f0f11763858fef2b73b7031: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:03.797628 kubelet[2742]: I0715 23:18:03.796941 2742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:18:03.801628 containerd[1538]: time="2025-07-15T23:18:03.801540087Z" level=info msg="CreateContainer within sandbox 
\"48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"bf541557d97d69b413769c3354715def089fd4491f0f11763858fef2b73b7031\"" Jul 15 23:18:03.805123 containerd[1538]: time="2025-07-15T23:18:03.805055849Z" level=info msg="StartContainer for \"bf541557d97d69b413769c3354715def089fd4491f0f11763858fef2b73b7031\"" Jul 15 23:18:03.809767 containerd[1538]: time="2025-07-15T23:18:03.809485402Z" level=info msg="connecting to shim bf541557d97d69b413769c3354715def089fd4491f0f11763858fef2b73b7031" address="unix:///run/containerd/s/d0844d4194f103c0c2dc329926135c3b0203b0405b382405c0fc0fe48feb2f0f" protocol=ttrpc version=3 Jul 15 23:18:03.844888 systemd[1]: Started cri-containerd-bf541557d97d69b413769c3354715def089fd4491f0f11763858fef2b73b7031.scope - libcontainer container bf541557d97d69b413769c3354715def089fd4491f0f11763858fef2b73b7031. Jul 15 23:18:03.929020 containerd[1538]: time="2025-07-15T23:18:03.928974318Z" level=info msg="StartContainer for \"bf541557d97d69b413769c3354715def089fd4491f0f11763858fef2b73b7031\" returns successfully" Jul 15 23:18:03.934040 containerd[1538]: time="2025-07-15T23:18:03.933895425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 15 23:18:04.016146 containerd[1538]: time="2025-07-15T23:18:04.015975465Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\" id:\"0e7873225c6aca52615add1d6f6738b27b1bbe32add74d6c97bd9911b47e47eb\" pid:4016 exit_status:1 exited_at:{seconds:1752621484 nanos:15544990}" Jul 15 23:18:04.131161 containerd[1538]: time="2025-07-15T23:18:04.131122523Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\" id:\"ea4cf9c53568aa1d05b6d7722ec1feb58b7538a064736b635e97ed1b8b106c88\" pid:4051 exit_status:1 exited_at:{seconds:1752621484 nanos:130753927}" Jul 15 
23:18:05.977831 containerd[1538]: time="2025-07-15T23:18:05.977465853Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-799c9d5b45-brbbx,Uid:503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:18:06.269738 systemd-networkd[1422]: cali975717054af: Link UP Jul 15 23:18:06.273165 systemd-networkd[1422]: cali975717054af: Gained carrier Jul 15 23:18:06.312564 containerd[1538]: 2025-07-15 23:18:06.056 [INFO][4107] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:18:06.312564 containerd[1538]: 2025-07-15 23:18:06.092 [INFO][4107] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0 calico-apiserver-799c9d5b45- calico-apiserver 503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa 815 0 2025-07-15 23:17:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:799c9d5b45 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-0-1-n-21be50a87e calico-apiserver-799c9d5b45-brbbx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali975717054af [] [] }} ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-brbbx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-" Jul 15 23:18:06.312564 containerd[1538]: 2025-07-15 23:18:06.092 [INFO][4107] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-brbbx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" Jul 15 23:18:06.312564 
containerd[1538]: 2025-07-15 23:18:06.160 [INFO][4123] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" HandleID="k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.160 [INFO][4123] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" HandleID="k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cbef0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-0-1-n-21be50a87e", "pod":"calico-apiserver-799c9d5b45-brbbx", "timestamp":"2025-07-15 23:18:06.160183875 +0000 UTC"}, Hostname:"ci-4372-0-1-n-21be50a87e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.160 [INFO][4123] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.160 [INFO][4123] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.160 [INFO][4123] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-21be50a87e' Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.176 [INFO][4123] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.193 [INFO][4123] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.205 [INFO][4123] ipam/ipam.go 511: Trying affinity for 192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.210 [INFO][4123] ipam/ipam.go 158: Attempting to load block cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313080 containerd[1538]: 2025-07-15 23:18:06.215 [INFO][4123] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313298 containerd[1538]: 2025-07-15 23:18:06.215 [INFO][4123] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313298 containerd[1538]: 2025-07-15 23:18:06.220 [INFO][4123] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578 Jul 15 23:18:06.313298 containerd[1538]: 2025-07-15 23:18:06.233 [INFO][4123] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313298 containerd[1538]: 2025-07-15 23:18:06.255 [INFO][4123] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.110.194/26] block=192.168.110.192/26 handle="k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313298 containerd[1538]: 2025-07-15 23:18:06.256 [INFO][4123] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.110.194/26] handle="k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:06.313298 containerd[1538]: 2025-07-15 23:18:06.256 [INFO][4123] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:18:06.313298 containerd[1538]: 2025-07-15 23:18:06.256 [INFO][4123] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.194/26] IPv6=[] ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" HandleID="k8s-pod-network.b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" Jul 15 23:18:06.313457 containerd[1538]: 2025-07-15 23:18:06.260 [INFO][4107] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-brbbx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0", GenerateName:"calico-apiserver-799c9d5b45-", Namespace:"calico-apiserver", SelfLink:"", UID:"503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"799c9d5b45", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"", Pod:"calico-apiserver-799c9d5b45-brbbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.110.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali975717054af", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:06.313521 containerd[1538]: 2025-07-15 23:18:06.261 [INFO][4107] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.110.194/32] ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-brbbx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" Jul 15 23:18:06.313521 containerd[1538]: 2025-07-15 23:18:06.261 [INFO][4107] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali975717054af ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-brbbx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" Jul 15 23:18:06.313521 containerd[1538]: 2025-07-15 23:18:06.278 [INFO][4107] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Namespace="calico-apiserver" 
Pod="calico-apiserver-799c9d5b45-brbbx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" Jul 15 23:18:06.313600 containerd[1538]: 2025-07-15 23:18:06.279 [INFO][4107] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-brbbx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0", GenerateName:"calico-apiserver-799c9d5b45-", Namespace:"calico-apiserver", SelfLink:"", UID:"503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa", ResourceVersion:"815", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"799c9d5b45", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578", Pod:"calico-apiserver-799c9d5b45-brbbx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.110.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali975717054af", MAC:"46:b7:92:b1:91:45", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:06.313659 containerd[1538]: 2025-07-15 23:18:06.305 [INFO][4107] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-brbbx" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--brbbx-eth0" Jul 15 23:18:06.380649 containerd[1538]: time="2025-07-15T23:18:06.380543629Z" level=info msg="connecting to shim b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578" address="unix:///run/containerd/s/c39a105c714d17ce29068cbd6070fc0cb6b87880023a6666446bd87c980346fc" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:18:06.414034 systemd[1]: Started cri-containerd-b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578.scope - libcontainer container b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578. Jul 15 23:18:06.541536 containerd[1538]: time="2025-07-15T23:18:06.541190323Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-799c9d5b45-brbbx,Uid:503c7885-3dbb-4fe6-9d3c-7fcfa8b42faa,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578\"" Jul 15 23:18:06.776038 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount444945993.mount: Deactivated successfully. 
Jul 15 23:18:06.814283 containerd[1538]: time="2025-07-15T23:18:06.813929655Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:06.819008 containerd[1538]: time="2025-07-15T23:18:06.818940563Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Jul 15 23:18:06.820861 containerd[1538]: time="2025-07-15T23:18:06.820716344Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:06.832067 containerd[1538]: time="2025-07-15T23:18:06.831996027Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:06.835660 containerd[1538]: time="2025-07-15T23:18:06.835243514Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 2.900840894s" Jul 15 23:18:06.838565 containerd[1538]: time="2025-07-15T23:18:06.836380342Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Jul 15 23:18:06.841280 containerd[1538]: time="2025-07-15T23:18:06.841005614Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 23:18:06.849680 containerd[1538]: time="2025-07-15T23:18:06.849624485Z" level=info msg="CreateContainer within sandbox 
\"48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 15 23:18:06.871114 containerd[1538]: time="2025-07-15T23:18:06.870979023Z" level=info msg="Container 5825654b203fe723b30d2e63b3402a6307dd25ac541db74007d2708c73795af9: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:06.884601 containerd[1538]: time="2025-07-15T23:18:06.884524523Z" level=info msg="CreateContainer within sandbox \"48d06f28eac91fa598158ce2079f49efdaadc642f56152c077946651e3ecd24a\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"5825654b203fe723b30d2e63b3402a6307dd25ac541db74007d2708c73795af9\"" Jul 15 23:18:06.886573 containerd[1538]: time="2025-07-15T23:18:06.886525102Z" level=info msg="StartContainer for \"5825654b203fe723b30d2e63b3402a6307dd25ac541db74007d2708c73795af9\"" Jul 15 23:18:06.890301 containerd[1538]: time="2025-07-15T23:18:06.890154504Z" level=info msg="connecting to shim 5825654b203fe723b30d2e63b3402a6307dd25ac541db74007d2708c73795af9" address="unix:///run/containerd/s/d0844d4194f103c0c2dc329926135c3b0203b0405b382405c0fc0fe48feb2f0f" protocol=ttrpc version=3 Jul 15 23:18:06.918074 systemd[1]: Started cri-containerd-5825654b203fe723b30d2e63b3402a6307dd25ac541db74007d2708c73795af9.scope - libcontainer container 5825654b203fe723b30d2e63b3402a6307dd25ac541db74007d2708c73795af9. 
Jul 15 23:18:07.010138 containerd[1538]: time="2025-07-15T23:18:07.010089181Z" level=info msg="StartContainer for \"5825654b203fe723b30d2e63b3402a6307dd25ac541db74007d2708c73795af9\" returns successfully" Jul 15 23:18:07.262735 kubelet[2742]: I0715 23:18:07.262642 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6d9dc44786-mg9zx" podStartSLOduration=1.605247801 podStartE2EDuration="6.26262251s" podCreationTimestamp="2025-07-15 23:18:01 +0000 UTC" firstStartedPulling="2025-07-15 23:18:02.182435997 +0000 UTC m=+39.389201671" lastFinishedPulling="2025-07-15 23:18:06.839810706 +0000 UTC m=+44.046576380" observedRunningTime="2025-07-15 23:18:07.261748679 +0000 UTC m=+44.468514393" watchObservedRunningTime="2025-07-15 23:18:07.26262251 +0000 UTC m=+44.469388224" Jul 15 23:18:07.908041 systemd-networkd[1422]: cali975717054af: Gained IPv6LL Jul 15 23:18:07.978983 containerd[1538]: time="2025-07-15T23:18:07.978894119Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-fn5px,Uid:420d8d7f-3f4e-4b40-a9e9-124564c6d541,Namespace:calico-system,Attempt:0,}" Jul 15 23:18:07.979142 containerd[1538]: time="2025-07-15T23:18:07.979069598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-799c9d5b45-zftqn,Uid:abc468d0-2605-48f4-bc68-9c53f0bb8be8,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:18:07.979184 containerd[1538]: time="2025-07-15T23:18:07.979155557Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dmcjn,Uid:ad6dda87-ca8a-4489-9124-9f24cca00875,Namespace:kube-system,Attempt:0,}" Jul 15 23:18:07.980107 containerd[1538]: time="2025-07-15T23:18:07.980028748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68db8888c-tzmd7,Uid:11bfca13-a765-46f5-b84c-ea9b7e562a08,Namespace:calico-system,Attempt:0,}" Jul 15 23:18:08.355412 systemd-networkd[1422]: cali07db3fd375b: Link UP Jul 15 23:18:08.357205 
systemd-networkd[1422]: cali07db3fd375b: Gained carrier Jul 15 23:18:08.397012 containerd[1538]: 2025-07-15 23:18:08.086 [INFO][4279] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:18:08.397012 containerd[1538]: 2025-07-15 23:18:08.125 [INFO][4279] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0 calico-kube-controllers-68db8888c- calico-system 11bfca13-a765-46f5-b84c-ea9b7e562a08 810 0 2025-07-15 23:17:46 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:68db8888c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372-0-1-n-21be50a87e calico-kube-controllers-68db8888c-tzmd7 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali07db3fd375b [] [] }} ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Namespace="calico-system" Pod="calico-kube-controllers-68db8888c-tzmd7" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-" Jul 15 23:18:08.397012 containerd[1538]: 2025-07-15 23:18:08.126 [INFO][4279] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Namespace="calico-system" Pod="calico-kube-controllers-68db8888c-tzmd7" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" Jul 15 23:18:08.397012 containerd[1538]: 2025-07-15 23:18:08.214 [INFO][4310] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" HandleID="k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" 
Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.215 [INFO][4310] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" HandleID="k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3af0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-0-1-n-21be50a87e", "pod":"calico-kube-controllers-68db8888c-tzmd7", "timestamp":"2025-07-15 23:18:08.214717881 +0000 UTC"}, Hostname:"ci-4372-0-1-n-21be50a87e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.215 [INFO][4310] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.215 [INFO][4310] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.215 [INFO][4310] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-21be50a87e' Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.242 [INFO][4310] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.255 [INFO][4310] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.274 [INFO][4310] ipam/ipam.go 511: Trying affinity for 192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.280 [INFO][4310] ipam/ipam.go 158: Attempting to load block cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.399293 containerd[1538]: 2025-07-15 23:18:08.287 [INFO][4310] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.402208 containerd[1538]: 2025-07-15 23:18:08.287 [INFO][4310] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.402208 containerd[1538]: 2025-07-15 23:18:08.293 [INFO][4310] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6 Jul 15 23:18:08.402208 containerd[1538]: 2025-07-15 23:18:08.302 [INFO][4310] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.402208 containerd[1538]: 2025-07-15 23:18:08.321 [INFO][4310] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.110.195/26] block=192.168.110.192/26 handle="k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.402208 containerd[1538]: 2025-07-15 23:18:08.321 [INFO][4310] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.110.195/26] handle="k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.402208 containerd[1538]: 2025-07-15 23:18:08.321 [INFO][4310] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:18:08.402208 containerd[1538]: 2025-07-15 23:18:08.321 [INFO][4310] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.195/26] IPv6=[] ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" HandleID="k8s-pod-network.6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" Jul 15 23:18:08.403824 containerd[1538]: 2025-07-15 23:18:08.326 [INFO][4279] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Namespace="calico-system" Pod="calico-kube-controllers-68db8888c-tzmd7" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0", GenerateName:"calico-kube-controllers-68db8888c-", Namespace:"calico-system", SelfLink:"", UID:"11bfca13-a765-46f5-b84c-ea9b7e562a08", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68db8888c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"", Pod:"calico-kube-controllers-68db8888c-tzmd7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.110.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali07db3fd375b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:08.404139 containerd[1538]: 2025-07-15 23:18:08.326 [INFO][4279] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.110.195/32] ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Namespace="calico-system" Pod="calico-kube-controllers-68db8888c-tzmd7" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" Jul 15 23:18:08.404139 containerd[1538]: 2025-07-15 23:18:08.326 [INFO][4279] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali07db3fd375b ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Namespace="calico-system" Pod="calico-kube-controllers-68db8888c-tzmd7" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" Jul 15 23:18:08.404139 containerd[1538]: 2025-07-15 23:18:08.358 [INFO][4279] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Namespace="calico-system" Pod="calico-kube-controllers-68db8888c-tzmd7" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" Jul 15 23:18:08.404340 containerd[1538]: 2025-07-15 23:18:08.359 [INFO][4279] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Namespace="calico-system" Pod="calico-kube-controllers-68db8888c-tzmd7" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0", GenerateName:"calico-kube-controllers-68db8888c-", Namespace:"calico-system", SelfLink:"", UID:"11bfca13-a765-46f5-b84c-ea9b7e562a08", ResourceVersion:"810", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"68db8888c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6", Pod:"calico-kube-controllers-68db8888c-tzmd7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.110.195/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali07db3fd375b", MAC:"ee:6a:16:d8:71:ba", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:08.404529 containerd[1538]: 2025-07-15 23:18:08.386 [INFO][4279] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" Namespace="calico-system" Pod="calico-kube-controllers-68db8888c-tzmd7" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--kube--controllers--68db8888c--tzmd7-eth0" Jul 15 23:18:08.461860 containerd[1538]: time="2025-07-15T23:18:08.461736332Z" level=info msg="connecting to shim 6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6" address="unix:///run/containerd/s/c10a24f8361a25e892040bde13bcadb9ddbaf4abf2260df897d3ab74a086a059" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:18:08.491257 systemd-networkd[1422]: calibd46cc538fe: Link UP Jul 15 23:18:08.492497 systemd-networkd[1422]: calibd46cc538fe: Gained carrier Jul 15 23:18:08.517923 systemd[1]: Started cri-containerd-6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6.scope - libcontainer container 6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6. 
Jul 15 23:18:08.539024 containerd[1538]: 2025-07-15 23:18:08.074 [INFO][4258] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:18:08.539024 containerd[1538]: 2025-07-15 23:18:08.131 [INFO][4258] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0 goldmane-768f4c5c69- calico-system 420d8d7f-3f4e-4b40-a9e9-124564c6d541 814 0 2025-07-15 23:17:46 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372-0-1-n-21be50a87e goldmane-768f4c5c69-fn5px eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calibd46cc538fe [] [] }} ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Namespace="calico-system" Pod="goldmane-768f4c5c69-fn5px" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-" Jul 15 23:18:08.539024 containerd[1538]: 2025-07-15 23:18:08.132 [INFO][4258] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Namespace="calico-system" Pod="goldmane-768f4c5c69-fn5px" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" Jul 15 23:18:08.539024 containerd[1538]: 2025-07-15 23:18:08.236 [INFO][4316] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" HandleID="k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Workload="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.238 [INFO][4316] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" HandleID="k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Workload="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000367c00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-0-1-n-21be50a87e", "pod":"goldmane-768f4c5c69-fn5px", "timestamp":"2025-07-15 23:18:08.236056984 +0000 UTC"}, Hostname:"ci-4372-0-1-n-21be50a87e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.239 [INFO][4316] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.321 [INFO][4316] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.321 [INFO][4316] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-21be50a87e' Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.369 [INFO][4316] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.405 [INFO][4316] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.422 [INFO][4316] ipam/ipam.go 511: Trying affinity for 192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.428 [INFO][4316] ipam/ipam.go 158: Attempting to load block cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539339 containerd[1538]: 2025-07-15 23:18:08.434 [INFO][4316] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539531 containerd[1538]: 2025-07-15 23:18:08.434 [INFO][4316] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539531 containerd[1538]: 2025-07-15 23:18:08.438 [INFO][4316] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519 Jul 15 23:18:08.539531 containerd[1538]: 2025-07-15 23:18:08.454 [INFO][4316] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539531 containerd[1538]: 2025-07-15 23:18:08.471 [INFO][4316] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.110.196/26] block=192.168.110.192/26 handle="k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539531 containerd[1538]: 2025-07-15 23:18:08.471 [INFO][4316] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.110.196/26] handle="k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.539531 containerd[1538]: 2025-07-15 23:18:08.471 [INFO][4316] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:18:08.539531 containerd[1538]: 2025-07-15 23:18:08.471 [INFO][4316] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.196/26] IPv6=[] ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" HandleID="k8s-pod-network.a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Workload="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" Jul 15 23:18:08.539716 containerd[1538]: 2025-07-15 23:18:08.487 [INFO][4258] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Namespace="calico-system" Pod="goldmane-768f4c5c69-fn5px" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"420d8d7f-3f4e-4b40-a9e9-124564c6d541", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"", Pod:"goldmane-768f4c5c69-fn5px", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.110.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calibd46cc538fe", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:08.539775 containerd[1538]: 2025-07-15 23:18:08.487 [INFO][4258] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.110.196/32] ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Namespace="calico-system" Pod="goldmane-768f4c5c69-fn5px" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" Jul 15 23:18:08.539775 containerd[1538]: 2025-07-15 23:18:08.487 [INFO][4258] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibd46cc538fe ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Namespace="calico-system" Pod="goldmane-768f4c5c69-fn5px" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" Jul 15 23:18:08.539775 containerd[1538]: 2025-07-15 23:18:08.493 [INFO][4258] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Namespace="calico-system" Pod="goldmane-768f4c5c69-fn5px" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" Jul 15 23:18:08.539838 containerd[1538]: 2025-07-15 23:18:08.496 [INFO][4258] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Namespace="calico-system" Pod="goldmane-768f4c5c69-fn5px" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"420d8d7f-3f4e-4b40-a9e9-124564c6d541", ResourceVersion:"814", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519", Pod:"goldmane-768f4c5c69-fn5px", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.110.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calibd46cc538fe", MAC:"8a:b1:0a:d5:b1:f8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:08.539886 containerd[1538]: 2025-07-15 23:18:08.527 [INFO][4258] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" Namespace="calico-system" Pod="goldmane-768f4c5c69-fn5px" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-goldmane--768f4c5c69--fn5px-eth0" Jul 15 23:18:08.609358 containerd[1538]: time="2025-07-15T23:18:08.609001915Z" level=info msg="connecting to shim a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519" address="unix:///run/containerd/s/f242f1f9c1d0b7329081d447b8dcb430198c1c8f4f4267a7d0314a421051c46f" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:18:08.678360 systemd-networkd[1422]: cali682edd34373: Link UP Jul 15 23:18:08.691123 systemd-networkd[1422]: cali682edd34373: Gained carrier Jul 15 23:18:08.697860 systemd[1]: Started cri-containerd-a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519.scope - libcontainer container a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519. Jul 15 23:18:08.724446 containerd[1538]: time="2025-07-15T23:18:08.724337263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-68db8888c-tzmd7,Uid:11bfca13-a765-46f5-b84c-ea9b7e562a08,Namespace:calico-system,Attempt:0,} returns sandbox id \"6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6\"" Jul 15 23:18:08.733130 containerd[1538]: 2025-07-15 23:18:08.131 [INFO][4298] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:18:08.733130 containerd[1538]: 2025-07-15 23:18:08.167 [INFO][4298] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0 coredns-674b8bbfcf- kube-system ad6dda87-ca8a-4489-9124-9f24cca00875 813 0 2025-07-15 23:17:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-0-1-n-21be50a87e coredns-674b8bbfcf-dmcjn eth0 
coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali682edd34373 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dmcjn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-" Jul 15 23:18:08.733130 containerd[1538]: 2025-07-15 23:18:08.168 [INFO][4298] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dmcjn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" Jul 15 23:18:08.733130 containerd[1538]: 2025-07-15 23:18:08.285 [INFO][4324] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" HandleID="k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Workload="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" Jul 15 23:18:08.733921 containerd[1538]: 2025-07-15 23:18:08.285 [INFO][4324] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" HandleID="k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Workload="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000103e10), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-0-1-n-21be50a87e", "pod":"coredns-674b8bbfcf-dmcjn", "timestamp":"2025-07-15 23:18:08.285057447 +0000 UTC"}, Hostname:"ci-4372-0-1-n-21be50a87e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:18:08.733921 
containerd[1538]: 2025-07-15 23:18:08.285 [INFO][4324] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:18:08.733921 containerd[1538]: 2025-07-15 23:18:08.478 [INFO][4324] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:18:08.733921 containerd[1538]: 2025-07-15 23:18:08.478 [INFO][4324] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-21be50a87e' Jul 15 23:18:08.733921 containerd[1538]: 2025-07-15 23:18:08.524 [INFO][4324] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.733921 containerd[1538]: 2025-07-15 23:18:08.552 [INFO][4324] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.733921 containerd[1538]: 2025-07-15 23:18:08.577 [INFO][4324] ipam/ipam.go 511: Trying affinity for 192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.733921 containerd[1538]: 2025-07-15 23:18:08.585 [INFO][4324] ipam/ipam.go 158: Attempting to load block cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.733921 containerd[1538]: 2025-07-15 23:18:08.593 [INFO][4324] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.734151 containerd[1538]: 2025-07-15 23:18:08.593 [INFO][4324] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.734151 containerd[1538]: 2025-07-15 23:18:08.600 [INFO][4324] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6 Jul 15 23:18:08.734151 containerd[1538]: 2025-07-15 23:18:08.611 [INFO][4324] ipam/ipam.go 1243: Writing block in order to 
claim IPs block=192.168.110.192/26 handle="k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.734151 containerd[1538]: 2025-07-15 23:18:08.633 [INFO][4324] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.110.197/26] block=192.168.110.192/26 handle="k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.734151 containerd[1538]: 2025-07-15 23:18:08.633 [INFO][4324] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.110.197/26] handle="k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.734151 containerd[1538]: 2025-07-15 23:18:08.636 [INFO][4324] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:18:08.734151 containerd[1538]: 2025-07-15 23:18:08.637 [INFO][4324] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.197/26] IPv6=[] ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" HandleID="k8s-pod-network.05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Workload="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" Jul 15 23:18:08.734310 containerd[1538]: 2025-07-15 23:18:08.651 [INFO][4298] cni-plugin/k8s.go 418: Populated endpoint ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dmcjn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ad6dda87-ca8a-4489-9124-9f24cca00875", ResourceVersion:"813", Generation:0, 
CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"", Pod:"coredns-674b8bbfcf-dmcjn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.110.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali682edd34373", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:08.734310 containerd[1538]: 2025-07-15 23:18:08.656 [INFO][4298] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.110.197/32] ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dmcjn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" Jul 15 23:18:08.734310 containerd[1538]: 2025-07-15 23:18:08.660 [INFO][4298] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali682edd34373 
ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dmcjn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" Jul 15 23:18:08.734310 containerd[1538]: 2025-07-15 23:18:08.690 [INFO][4298] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dmcjn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" Jul 15 23:18:08.734310 containerd[1538]: 2025-07-15 23:18:08.699 [INFO][4298] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dmcjn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"ad6dda87-ca8a-4489-9124-9f24cca00875", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", 
ContainerID:"05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6", Pod:"coredns-674b8bbfcf-dmcjn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.110.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali682edd34373", MAC:"ce:51:0a:d7:ae:20", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:08.734310 containerd[1538]: 2025-07-15 23:18:08.726 [INFO][4298] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" Namespace="kube-system" Pod="coredns-674b8bbfcf-dmcjn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--dmcjn-eth0" Jul 15 23:18:08.805621 containerd[1538]: time="2025-07-15T23:18:08.803788096Z" level=info msg="connecting to shim 05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6" address="unix:///run/containerd/s/5ad4a595498885f02b1b719a92bec5324906c390d8aa884132fbbecca5c25c3e" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:18:08.806142 systemd-networkd[1422]: caliba501fde9fb: Link UP Jul 15 23:18:08.809416 systemd-networkd[1422]: caliba501fde9fb: Gained carrier Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.149 [INFO][4262] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.187 [INFO][4262] cni-plugin/plugin.go 340: Calico 
CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0 calico-apiserver-799c9d5b45- calico-apiserver abc468d0-2605-48f4-bc68-9c53f0bb8be8 808 0 2025-07-15 23:17:40 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:799c9d5b45 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-0-1-n-21be50a87e calico-apiserver-799c9d5b45-zftqn eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliba501fde9fb [] [] }} ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-zftqn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.188 [INFO][4262] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-zftqn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.321 [INFO][4332] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" HandleID="k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.323 [INFO][4332] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" 
HandleID="k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003359d0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-0-1-n-21be50a87e", "pod":"calico-apiserver-799c9d5b45-zftqn", "timestamp":"2025-07-15 23:18:08.321811793 +0000 UTC"}, Hostname:"ci-4372-0-1-n-21be50a87e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.324 [INFO][4332] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.634 [INFO][4332] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.634 [INFO][4332] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-21be50a87e' Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.682 [INFO][4332] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.696 [INFO][4332] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.715 [INFO][4332] ipam/ipam.go 511: Trying affinity for 192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.725 [INFO][4332] ipam/ipam.go 158: Attempting to load block cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.737 [INFO][4332] ipam/ipam.go 235: 
Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.738 [INFO][4332] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.742 [INFO][4332] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.760 [INFO][4332] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.778 [INFO][4332] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.110.198/26] block=192.168.110.192/26 handle="k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.778 [INFO][4332] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.110.198/26] handle="k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.778 [INFO][4332] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 23:18:08.850469 containerd[1538]: 2025-07-15 23:18:08.778 [INFO][4332] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.198/26] IPv6=[] ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" HandleID="k8s-pod-network.0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Workload="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" Jul 15 23:18:08.851318 containerd[1538]: 2025-07-15 23:18:08.784 [INFO][4262] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-zftqn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0", GenerateName:"calico-apiserver-799c9d5b45-", Namespace:"calico-apiserver", SelfLink:"", UID:"abc468d0-2605-48f4-bc68-9c53f0bb8be8", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"799c9d5b45", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"", Pod:"calico-apiserver-799c9d5b45-zftqn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.110.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliba501fde9fb", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:08.851318 containerd[1538]: 2025-07-15 23:18:08.785 [INFO][4262] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.110.198/32] ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-zftqn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" Jul 15 23:18:08.851318 containerd[1538]: 2025-07-15 23:18:08.785 [INFO][4262] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliba501fde9fb ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-zftqn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" Jul 15 23:18:08.851318 containerd[1538]: 2025-07-15 23:18:08.811 [INFO][4262] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-zftqn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" Jul 15 23:18:08.851318 containerd[1538]: 2025-07-15 23:18:08.812 [INFO][4262] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-zftqn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0", GenerateName:"calico-apiserver-799c9d5b45-", Namespace:"calico-apiserver", SelfLink:"", UID:"abc468d0-2605-48f4-bc68-9c53f0bb8be8", ResourceVersion:"808", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"799c9d5b45", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a", Pod:"calico-apiserver-799c9d5b45-zftqn", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.110.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliba501fde9fb", MAC:"22:f7:76:42:82:1f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:08.851318 containerd[1538]: 2025-07-15 23:18:08.838 [INFO][4262] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" Namespace="calico-apiserver" Pod="calico-apiserver-799c9d5b45-zftqn" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-calico--apiserver--799c9d5b45--zftqn-eth0" Jul 15 23:18:08.865185 containerd[1538]: time="2025-07-15T23:18:08.864881556Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-fn5px,Uid:420d8d7f-3f4e-4b40-a9e9-124564c6d541,Namespace:calico-system,Attempt:0,} returns sandbox id \"a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519\"" Jul 15 23:18:08.867825 systemd[1]: Started cri-containerd-05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6.scope - libcontainer container 05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6. Jul 15 23:18:08.904632 containerd[1538]: time="2025-07-15T23:18:08.904136837Z" level=info msg="connecting to shim 0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a" address="unix:///run/containerd/s/4d1f928b972aa73987083bba97618719e5caa5f09cbcea1e7ef564f2e28a2d67" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:18:08.965948 systemd[1]: Started cri-containerd-0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a.scope - libcontainer container 0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a. 
Jul 15 23:18:08.971749 containerd[1538]: time="2025-07-15T23:18:08.971688790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dmcjn,Uid:ad6dda87-ca8a-4489-9124-9f24cca00875,Namespace:kube-system,Attempt:0,} returns sandbox id \"05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6\"" Jul 15 23:18:08.979831 containerd[1538]: time="2025-07-15T23:18:08.978845278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-98fhj,Uid:d247f82a-56a6-47df-ba27-76b7bfeb6863,Namespace:kube-system,Attempt:0,}" Jul 15 23:18:08.982693 containerd[1538]: time="2025-07-15T23:18:08.982483841Z" level=info msg="CreateContainer within sandbox \"05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 23:18:09.058971 containerd[1538]: time="2025-07-15T23:18:09.058918550Z" level=info msg="Container 92dd7ef71be94af19ff8b421bee852092fbb86c59fd308013169a7ddae19b3ad: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:09.059942 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2483559044.mount: Deactivated successfully. 
Jul 15 23:18:09.092614 containerd[1538]: time="2025-07-15T23:18:09.092012617Z" level=info msg="CreateContainer within sandbox \"05d3bf9b2a5b110b5611709551dc28573e964cb6b7f8f57029583825d78197a6\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"92dd7ef71be94af19ff8b421bee852092fbb86c59fd308013169a7ddae19b3ad\"" Jul 15 23:18:09.095806 containerd[1538]: time="2025-07-15T23:18:09.095236904Z" level=info msg="StartContainer for \"92dd7ef71be94af19ff8b421bee852092fbb86c59fd308013169a7ddae19b3ad\"" Jul 15 23:18:09.099839 containerd[1538]: time="2025-07-15T23:18:09.099451222Z" level=info msg="connecting to shim 92dd7ef71be94af19ff8b421bee852092fbb86c59fd308013169a7ddae19b3ad" address="unix:///run/containerd/s/5ad4a595498885f02b1b719a92bec5324906c390d8aa884132fbbecca5c25c3e" protocol=ttrpc version=3 Jul 15 23:18:09.179122 systemd[1]: Started cri-containerd-92dd7ef71be94af19ff8b421bee852092fbb86c59fd308013169a7ddae19b3ad.scope - libcontainer container 92dd7ef71be94af19ff8b421bee852092fbb86c59fd308013169a7ddae19b3ad. 
Jul 15 23:18:09.181110 containerd[1538]: time="2025-07-15T23:18:09.180939442Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-799c9d5b45-zftqn,Uid:abc468d0-2605-48f4-bc68-9c53f0bb8be8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a\"" Jul 15 23:18:09.272160 containerd[1538]: time="2025-07-15T23:18:09.272093524Z" level=info msg="StartContainer for \"92dd7ef71be94af19ff8b421bee852092fbb86c59fd308013169a7ddae19b3ad\" returns successfully" Jul 15 23:18:09.380826 systemd-networkd[1422]: cali07db3fd375b: Gained IPv6LL Jul 15 23:18:09.459848 systemd-networkd[1422]: cali63d21bb9dbe: Link UP Jul 15 23:18:09.463803 systemd-networkd[1422]: cali63d21bb9dbe: Gained carrier Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.109 [INFO][4554] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.161 [INFO][4554] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0 coredns-674b8bbfcf- kube-system d247f82a-56a6-47df-ba27-76b7bfeb6863 811 0 2025-07-15 23:17:28 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-0-1-n-21be50a87e coredns-674b8bbfcf-98fhj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali63d21bb9dbe [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Namespace="kube-system" Pod="coredns-674b8bbfcf-98fhj" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.161 [INFO][4554] cni-plugin/k8s.go 74: Extracted identifiers for 
CmdAddK8s ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Namespace="kube-system" Pod="coredns-674b8bbfcf-98fhj" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.299 [INFO][4594] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" HandleID="k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Workload="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.300 [INFO][4594] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" HandleID="k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Workload="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000349420), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-0-1-n-21be50a87e", "pod":"coredns-674b8bbfcf-98fhj", "timestamp":"2025-07-15 23:18:09.299533528 +0000 UTC"}, Hostname:"ci-4372-0-1-n-21be50a87e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.300 [INFO][4594] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.300 [INFO][4594] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.300 [INFO][4594] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-21be50a87e' Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.327 [INFO][4594] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.363 [INFO][4594] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.378 [INFO][4594] ipam/ipam.go 511: Trying affinity for 192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.387 [INFO][4594] ipam/ipam.go 158: Attempting to load block cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.394 [INFO][4594] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.395 [INFO][4594] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.110.192/26 handle="k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.401 [INFO][4594] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.411 [INFO][4594] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.433 [INFO][4594] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.110.199/26] block=192.168.110.192/26 handle="k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.435 [INFO][4594] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.110.199/26] handle="k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.435 [INFO][4594] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:18:09.492307 containerd[1538]: 2025-07-15 23:18:09.435 [INFO][4594] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.199/26] IPv6=[] ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" HandleID="k8s-pod-network.24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Workload="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" Jul 15 23:18:09.493274 containerd[1538]: 2025-07-15 23:18:09.445 [INFO][4554] cni-plugin/k8s.go 418: Populated endpoint ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Namespace="kube-system" Pod="coredns-674b8bbfcf-98fhj" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d247f82a-56a6-47df-ba27-76b7bfeb6863", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"", Pod:"coredns-674b8bbfcf-98fhj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.110.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali63d21bb9dbe", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:09.493274 containerd[1538]: 2025-07-15 23:18:09.446 [INFO][4554] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.110.199/32] ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Namespace="kube-system" Pod="coredns-674b8bbfcf-98fhj" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" Jul 15 23:18:09.493274 containerd[1538]: 2025-07-15 23:18:09.446 [INFO][4554] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali63d21bb9dbe ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Namespace="kube-system" Pod="coredns-674b8bbfcf-98fhj" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" Jul 15 23:18:09.493274 containerd[1538]: 2025-07-15 23:18:09.459 [INFO][4554] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Namespace="kube-system" Pod="coredns-674b8bbfcf-98fhj" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" Jul 15 23:18:09.493274 containerd[1538]: 2025-07-15 23:18:09.460 [INFO][4554] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Namespace="kube-system" Pod="coredns-674b8bbfcf-98fhj" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"d247f82a-56a6-47df-ba27-76b7bfeb6863", ResourceVersion:"811", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 28, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f", Pod:"coredns-674b8bbfcf-98fhj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.110.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali63d21bb9dbe", 
MAC:"9a:27:c5:14:24:ce", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:09.493274 containerd[1538]: 2025-07-15 23:18:09.487 [INFO][4554] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" Namespace="kube-system" Pod="coredns-674b8bbfcf-98fhj" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-coredns--674b8bbfcf--98fhj-eth0" Jul 15 23:18:09.568658 containerd[1538]: time="2025-07-15T23:18:09.568543381Z" level=info msg="connecting to shim 24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f" address="unix:///run/containerd/s/45fd5c08c5971acc762c16a34f7bd19995b9383a680752a65331e2a88f393bd9" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:18:09.607875 systemd[1]: Started cri-containerd-24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f.scope - libcontainer container 24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f. 
Jul 15 23:18:09.678252 containerd[1538]: time="2025-07-15T23:18:09.678074718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-98fhj,Uid:d247f82a-56a6-47df-ba27-76b7bfeb6863,Namespace:kube-system,Attempt:0,} returns sandbox id \"24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f\"" Jul 15 23:18:09.691080 containerd[1538]: time="2025-07-15T23:18:09.689951839Z" level=info msg="CreateContainer within sandbox \"24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 23:18:09.703514 containerd[1538]: time="2025-07-15T23:18:09.703463903Z" level=info msg="Container 390f32795dd02506ada24850750e551a78ebce56fb4d89bcf476cf7eec3c1dae: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:09.722825 containerd[1538]: time="2025-07-15T23:18:09.722117115Z" level=info msg="CreateContainer within sandbox \"24f1aeb591ae9ab6a3c19ea3e1b0e408ed4c0667d05f7cc45bb9396b18789c8f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"390f32795dd02506ada24850750e551a78ebce56fb4d89bcf476cf7eec3c1dae\"" Jul 15 23:18:09.724900 containerd[1538]: time="2025-07-15T23:18:09.724863608Z" level=info msg="StartContainer for \"390f32795dd02506ada24850750e551a78ebce56fb4d89bcf476cf7eec3c1dae\"" Jul 15 23:18:09.726900 containerd[1538]: time="2025-07-15T23:18:09.726849228Z" level=info msg="connecting to shim 390f32795dd02506ada24850750e551a78ebce56fb4d89bcf476cf7eec3c1dae" address="unix:///run/containerd/s/45fd5c08c5971acc762c16a34f7bd19995b9383a680752a65331e2a88f393bd9" protocol=ttrpc version=3 Jul 15 23:18:09.763772 systemd-networkd[1422]: cali682edd34373: Gained IPv6LL Jul 15 23:18:09.765842 systemd[1]: Started cri-containerd-390f32795dd02506ada24850750e551a78ebce56fb4d89bcf476cf7eec3c1dae.scope - libcontainer container 390f32795dd02506ada24850750e551a78ebce56fb4d89bcf476cf7eec3c1dae. 
Jul 15 23:18:09.827791 containerd[1538]: time="2025-07-15T23:18:09.827560734Z" level=info msg="StartContainer for \"390f32795dd02506ada24850750e551a78ebce56fb4d89bcf476cf7eec3c1dae\" returns successfully" Jul 15 23:18:10.323433 kubelet[2742]: I0715 23:18:10.323363 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dmcjn" podStartSLOduration=42.323258614 podStartE2EDuration="42.323258614s" podCreationTimestamp="2025-07-15 23:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:18:10.319899888 +0000 UTC m=+47.526665562" watchObservedRunningTime="2025-07-15 23:18:10.323258614 +0000 UTC m=+47.530024288" Jul 15 23:18:10.368983 kubelet[2742]: I0715 23:18:10.368902 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-98fhj" podStartSLOduration=42.368875719 podStartE2EDuration="42.368875719s" podCreationTimestamp="2025-07-15 23:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:18:10.367084257 +0000 UTC m=+47.573849971" watchObservedRunningTime="2025-07-15 23:18:10.368875719 +0000 UTC m=+47.575641393" Jul 15 23:18:10.380610 containerd[1538]: time="2025-07-15T23:18:10.380445364Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:10.383242 containerd[1538]: time="2025-07-15T23:18:10.382780620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149" Jul 15 23:18:10.388535 containerd[1538]: time="2025-07-15T23:18:10.388364045Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" Jul 15 23:18:10.391390 containerd[1538]: time="2025-07-15T23:18:10.390672822Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 3.549615768s" Jul 15 23:18:10.395145 containerd[1538]: time="2025-07-15T23:18:10.393134517Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 15 23:18:10.395359 containerd[1538]: time="2025-07-15T23:18:10.393115437Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:10.400892 containerd[1538]: time="2025-07-15T23:18:10.400519603Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 15 23:18:10.406340 containerd[1538]: time="2025-07-15T23:18:10.406147347Z" level=info msg="CreateContainer within sandbox \"b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 23:18:10.439688 kubelet[2742]: I0715 23:18:10.439640 2742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:18:10.447617 containerd[1538]: time="2025-07-15T23:18:10.447235417Z" level=info msg="Container b923230996c96fd7e86789a0a70214ca424f78c41abe9ed3ab7d2a0dc4298b72: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:10.461084 containerd[1538]: time="2025-07-15T23:18:10.460962561Z" level=info msg="CreateContainer within sandbox \"b9b021e3a62743a5b0a3c04d6e23296a641bb175506ab761b20a4004ad3f7578\" for 
&ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b923230996c96fd7e86789a0a70214ca424f78c41abe9ed3ab7d2a0dc4298b72\"" Jul 15 23:18:10.462615 containerd[1538]: time="2025-07-15T23:18:10.462388746Z" level=info msg="StartContainer for \"b923230996c96fd7e86789a0a70214ca424f78c41abe9ed3ab7d2a0dc4298b72\"" Jul 15 23:18:10.466423 containerd[1538]: time="2025-07-15T23:18:10.466367107Z" level=info msg="connecting to shim b923230996c96fd7e86789a0a70214ca424f78c41abe9ed3ab7d2a0dc4298b72" address="unix:///run/containerd/s/c39a105c714d17ce29068cbd6070fc0cb6b87880023a6666446bd87c980346fc" protocol=ttrpc version=3 Jul 15 23:18:10.502837 systemd[1]: Started cri-containerd-b923230996c96fd7e86789a0a70214ca424f78c41abe9ed3ab7d2a0dc4298b72.scope - libcontainer container b923230996c96fd7e86789a0a70214ca424f78c41abe9ed3ab7d2a0dc4298b72. Jul 15 23:18:10.531895 systemd-networkd[1422]: calibd46cc538fe: Gained IPv6LL Jul 15 23:18:10.616631 containerd[1538]: time="2025-07-15T23:18:10.616467169Z" level=info msg="StartContainer for \"b923230996c96fd7e86789a0a70214ca424f78c41abe9ed3ab7d2a0dc4298b72\" returns successfully" Jul 15 23:18:10.787801 systemd-networkd[1422]: caliba501fde9fb: Gained IPv6LL Jul 15 23:18:10.979156 containerd[1538]: time="2025-07-15T23:18:10.979010433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-trqcc,Uid:7821b0da-67bb-46bb-abaf-a7c6ac82a39c,Namespace:calico-system,Attempt:0,}" Jul 15 23:18:11.228965 systemd-networkd[1422]: cali463cd909e33: Link UP Jul 15 23:18:11.231833 systemd-networkd[1422]: cali463cd909e33: Gained carrier Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.021 [INFO][4815] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.048 [INFO][4815] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0 
csi-node-driver- calico-system 7821b0da-67bb-46bb-abaf-a7c6ac82a39c 718 0 2025-07-15 23:17:46 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372-0-1-n-21be50a87e csi-node-driver-trqcc eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali463cd909e33 [] [] }} ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Namespace="calico-system" Pod="csi-node-driver-trqcc" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.048 [INFO][4815] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Namespace="calico-system" Pod="csi-node-driver-trqcc" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.086 [INFO][4828] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" HandleID="k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Workload="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.087 [INFO][4828] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" HandleID="k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Workload="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3a90), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-0-1-n-21be50a87e", "pod":"csi-node-driver-trqcc", "timestamp":"2025-07-15 23:18:11.086404409 +0000 UTC"}, Hostname:"ci-4372-0-1-n-21be50a87e", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.087 [INFO][4828] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.087 [INFO][4828] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.087 [INFO][4828] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-21be50a87e' Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.101 [INFO][4828] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.164 [INFO][4828] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.175 [INFO][4828] ipam/ipam.go 511: Trying affinity for 192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.179 [INFO][4828] ipam/ipam.go 158: Attempting to load block cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.188 [INFO][4828] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.110.192/26 host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.188 [INFO][4828] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.110.192/26 
handle="k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.192 [INFO][4828] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58 Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.200 [INFO][4828] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.110.192/26 handle="k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.216 [INFO][4828] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.110.200/26] block=192.168.110.192/26 handle="k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.218 [INFO][4828] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.110.200/26] handle="k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" host="ci-4372-0-1-n-21be50a87e" Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.218 [INFO][4828] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 23:18:11.266101 containerd[1538]: 2025-07-15 23:18:11.218 [INFO][4828] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.110.200/26] IPv6=[] ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" HandleID="k8s-pod-network.42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Workload="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" Jul 15 23:18:11.266952 containerd[1538]: 2025-07-15 23:18:11.221 [INFO][4815] cni-plugin/k8s.go 418: Populated endpoint ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Namespace="calico-system" Pod="csi-node-driver-trqcc" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7821b0da-67bb-46bb-abaf-a7c6ac82a39c", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"", Pod:"csi-node-driver-trqcc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.110.200/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali463cd909e33", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:11.266952 containerd[1538]: 2025-07-15 23:18:11.221 [INFO][4815] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.110.200/32] ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Namespace="calico-system" Pod="csi-node-driver-trqcc" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" Jul 15 23:18:11.266952 containerd[1538]: 2025-07-15 23:18:11.221 [INFO][4815] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali463cd909e33 ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Namespace="calico-system" Pod="csi-node-driver-trqcc" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" Jul 15 23:18:11.266952 containerd[1538]: 2025-07-15 23:18:11.234 [INFO][4815] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Namespace="calico-system" Pod="csi-node-driver-trqcc" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" Jul 15 23:18:11.266952 containerd[1538]: 2025-07-15 23:18:11.246 [INFO][4815] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Namespace="calico-system" Pod="csi-node-driver-trqcc" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7821b0da-67bb-46bb-abaf-a7c6ac82a39c", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-21be50a87e", ContainerID:"42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58", Pod:"csi-node-driver-trqcc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.110.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali463cd909e33", MAC:"ca:75:5a:a6:35:9f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:18:11.266952 containerd[1538]: 2025-07-15 23:18:11.262 [INFO][4815] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" Namespace="calico-system" Pod="csi-node-driver-trqcc" WorkloadEndpoint="ci--4372--0--1--n--21be50a87e-k8s-csi--node--driver--trqcc-eth0" Jul 15 23:18:11.316863 containerd[1538]: time="2025-07-15T23:18:11.316158737Z" level=info msg="connecting to shim 42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58" 
address="unix:///run/containerd/s/d68d4232b29dea82f0997f3f615381f49ea09000774ebc8f7b677601399e75d0" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:18:11.354831 kubelet[2742]: I0715 23:18:11.354739 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-799c9d5b45-brbbx" podStartSLOduration=27.505943446 podStartE2EDuration="31.354703076s" podCreationTimestamp="2025-07-15 23:17:40 +0000 UTC" firstStartedPulling="2025-07-15 23:18:06.550848023 +0000 UTC m=+43.757613657" lastFinishedPulling="2025-07-15 23:18:10.399607533 +0000 UTC m=+47.606373287" observedRunningTime="2025-07-15 23:18:11.354681476 +0000 UTC m=+48.561447190" watchObservedRunningTime="2025-07-15 23:18:11.354703076 +0000 UTC m=+48.561468750" Jul 15 23:18:11.359444 systemd[1]: Started cri-containerd-42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58.scope - libcontainer container 42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58. Jul 15 23:18:11.428803 systemd-networkd[1422]: cali63d21bb9dbe: Gained IPv6LL Jul 15 23:18:11.533694 containerd[1538]: time="2025-07-15T23:18:11.532545437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-trqcc,Uid:7821b0da-67bb-46bb-abaf-a7c6ac82a39c,Namespace:calico-system,Attempt:0,} returns sandbox id \"42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58\"" Jul 15 23:18:11.913095 systemd-networkd[1422]: vxlan.calico: Link UP Jul 15 23:18:11.913103 systemd-networkd[1422]: vxlan.calico: Gained carrier Jul 15 23:18:12.319751 kubelet[2742]: I0715 23:18:12.319710 2742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:18:13.155802 systemd-networkd[1422]: cali463cd909e33: Gained IPv6LL Jul 15 23:18:13.603755 systemd-networkd[1422]: vxlan.calico: Gained IPv6LL Jul 15 23:18:13.883568 containerd[1538]: time="2025-07-15T23:18:13.883326752Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:13.885737 containerd[1538]: time="2025-07-15T23:18:13.885523984Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Jul 15 23:18:13.886672 containerd[1538]: time="2025-07-15T23:18:13.886574540Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:13.890622 containerd[1538]: time="2025-07-15T23:18:13.890406365Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:13.891666 containerd[1538]: time="2025-07-15T23:18:13.891631601Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 3.489766931s" Jul 15 23:18:13.891896 containerd[1538]: time="2025-07-15T23:18:13.891793080Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Jul 15 23:18:13.894522 containerd[1538]: time="2025-07-15T23:18:13.894142592Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 15 23:18:13.918829 containerd[1538]: time="2025-07-15T23:18:13.918787740Z" level=info msg="CreateContainer within sandbox \"6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6\" for container 
&ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 15 23:18:13.930873 containerd[1538]: time="2025-07-15T23:18:13.930805616Z" level=info msg="Container b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:13.944080 containerd[1538]: time="2025-07-15T23:18:13.943995287Z" level=info msg="CreateContainer within sandbox \"6a52e7f219f630fc48045fd5a91a810a9cf1f8fec51f163d862b492fa3d83dc6\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\"" Jul 15 23:18:13.945816 containerd[1538]: time="2025-07-15T23:18:13.945501361Z" level=info msg="StartContainer for \"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\"" Jul 15 23:18:13.947937 containerd[1538]: time="2025-07-15T23:18:13.947829352Z" level=info msg="connecting to shim b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699" address="unix:///run/containerd/s/c10a24f8361a25e892040bde13bcadb9ddbaf4abf2260df897d3ab74a086a059" protocol=ttrpc version=3 Jul 15 23:18:13.983174 systemd[1]: Started cri-containerd-b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699.scope - libcontainer container b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699. 
Jul 15 23:18:14.037271 containerd[1538]: time="2025-07-15T23:18:14.037222987Z" level=info msg="StartContainer for \"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" returns successfully" Jul 15 23:18:14.366479 kubelet[2742]: I0715 23:18:14.365969 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-68db8888c-tzmd7" podStartSLOduration=23.222106473 podStartE2EDuration="28.365948297s" podCreationTimestamp="2025-07-15 23:17:46 +0000 UTC" firstStartedPulling="2025-07-15 23:18:08.749090572 +0000 UTC m=+45.955856246" lastFinishedPulling="2025-07-15 23:18:13.892932396 +0000 UTC m=+51.099698070" observedRunningTime="2025-07-15 23:18:14.354569978 +0000 UTC m=+51.561335652" watchObservedRunningTime="2025-07-15 23:18:14.365948297 +0000 UTC m=+51.572713971" Jul 15 23:18:14.386616 containerd[1538]: time="2025-07-15T23:18:14.386543956Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" id:\"9dca9467b2e9f4163cf10dd6cf9041a823e22ad45dcc15fecc463eac0b27db84\" pid:5043 exited_at:{seconds:1752621494 nanos:386019293}" Jul 15 23:18:16.628735 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2237459424.mount: Deactivated successfully. 
Jul 15 23:18:17.180639 containerd[1538]: time="2025-07-15T23:18:17.180032925Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:17.188883 containerd[1538]: time="2025-07-15T23:18:17.188823127Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Jul 15 23:18:17.192084 containerd[1538]: time="2025-07-15T23:18:17.191970057Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:17.198554 containerd[1538]: time="2025-07-15T23:18:17.198063148Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:17.199381 containerd[1538]: time="2025-07-15T23:18:17.199330480Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 3.305134609s" Jul 15 23:18:17.199381 containerd[1538]: time="2025-07-15T23:18:17.199376442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Jul 15 23:18:17.204036 containerd[1538]: time="2025-07-15T23:18:17.203886108Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 23:18:17.208470 containerd[1538]: time="2025-07-15T23:18:17.208111322Z" level=info msg="CreateContainer within sandbox 
\"a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 15 23:18:17.228003 containerd[1538]: time="2025-07-15T23:18:17.227910259Z" level=info msg="Container 446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:17.241435 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2981536438.mount: Deactivated successfully. Jul 15 23:18:17.251323 containerd[1538]: time="2025-07-15T23:18:17.251112135Z" level=info msg="CreateContainer within sandbox \"a7d16da8207e91e9eb6b6c5c459116702ba1f8b1ff715bfb73f8d55afaa0d519\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\"" Jul 15 23:18:17.255610 containerd[1538]: time="2025-07-15T23:18:17.252974732Z" level=info msg="StartContainer for \"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\"" Jul 15 23:18:17.257319 containerd[1538]: time="2025-07-15T23:18:17.256733887Z" level=info msg="connecting to shim 446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda" address="unix:///run/containerd/s/f242f1f9c1d0b7329081d447b8dcb430198c1c8f4f4267a7d0314a421051c46f" protocol=ttrpc version=3 Jul 15 23:18:17.300879 systemd[1]: Started cri-containerd-446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda.scope - libcontainer container 446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda. 
Jul 15 23:18:17.416774 containerd[1538]: time="2025-07-15T23:18:17.416722322Z" level=info msg="StartContainer for \"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" returns successfully" Jul 15 23:18:17.609664 containerd[1538]: time="2025-07-15T23:18:17.609373703Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:17.611022 containerd[1538]: time="2025-07-15T23:18:17.610750280Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 15 23:18:17.613724 containerd[1538]: time="2025-07-15T23:18:17.613525874Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 409.300632ms" Jul 15 23:18:17.613979 containerd[1538]: time="2025-07-15T23:18:17.613913130Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 15 23:18:17.616441 containerd[1538]: time="2025-07-15T23:18:17.616090460Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 15 23:18:17.622749 containerd[1538]: time="2025-07-15T23:18:17.622709013Z" level=info msg="CreateContainer within sandbox \"0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 23:18:17.664636 containerd[1538]: time="2025-07-15T23:18:17.663775066Z" level=info msg="Container 142eacdf522feb091d687fe8d09bf8cda0dba06dfd93add90a0b38835c9460e2: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:17.706188 containerd[1538]: 
time="2025-07-15T23:18:17.706135292Z" level=info msg="CreateContainer within sandbox \"0e7a63ee76d35a6b97c9d5f5a92b1141635f0cd160f86de4fc9d80fb5e1e184a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"142eacdf522feb091d687fe8d09bf8cda0dba06dfd93add90a0b38835c9460e2\"" Jul 15 23:18:17.708488 containerd[1538]: time="2025-07-15T23:18:17.708416946Z" level=info msg="StartContainer for \"142eacdf522feb091d687fe8d09bf8cda0dba06dfd93add90a0b38835c9460e2\"" Jul 15 23:18:17.711620 containerd[1538]: time="2025-07-15T23:18:17.711303225Z" level=info msg="connecting to shim 142eacdf522feb091d687fe8d09bf8cda0dba06dfd93add90a0b38835c9460e2" address="unix:///run/containerd/s/4d1f928b972aa73987083bba97618719e5caa5f09cbcea1e7ef564f2e28a2d67" protocol=ttrpc version=3 Jul 15 23:18:17.742036 systemd[1]: Started cri-containerd-142eacdf522feb091d687fe8d09bf8cda0dba06dfd93add90a0b38835c9460e2.scope - libcontainer container 142eacdf522feb091d687fe8d09bf8cda0dba06dfd93add90a0b38835c9460e2. 
Jul 15 23:18:17.799758 containerd[1538]: time="2025-07-15T23:18:17.799636586Z" level=info msg="StartContainer for \"142eacdf522feb091d687fe8d09bf8cda0dba06dfd93add90a0b38835c9460e2\" returns successfully" Jul 15 23:18:18.414464 kubelet[2742]: I0715 23:18:18.414378 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-799c9d5b45-zftqn" podStartSLOduration=29.985202048 podStartE2EDuration="38.414354641s" podCreationTimestamp="2025-07-15 23:17:40 +0000 UTC" firstStartedPulling="2025-07-15 23:18:09.18603895 +0000 UTC m=+46.392804584" lastFinishedPulling="2025-07-15 23:18:17.615191503 +0000 UTC m=+54.821957177" observedRunningTime="2025-07-15 23:18:18.382148157 +0000 UTC m=+55.588913831" watchObservedRunningTime="2025-07-15 23:18:18.414354641 +0000 UTC m=+55.621120315" Jul 15 23:18:18.560051 containerd[1538]: time="2025-07-15T23:18:18.559910242Z" level=info msg="TaskExit event in podsandbox handler container_id:\"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" id:\"eceaba5554b05ca96fa37a550f405f68d2e9a50154442a39151ffe0bda74d433\" pid:5150 exit_status:1 exited_at:{seconds:1752621498 nanos:557229975}" Jul 15 23:18:19.036096 containerd[1538]: time="2025-07-15T23:18:19.035802202Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:19.038205 containerd[1538]: time="2025-07-15T23:18:19.037537069Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Jul 15 23:18:19.039400 containerd[1538]: time="2025-07-15T23:18:19.039182292Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:19.045352 containerd[1538]: time="2025-07-15T23:18:19.045031037Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:19.047126 containerd[1538]: time="2025-07-15T23:18:19.046166641Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 1.430023499s" Jul 15 23:18:19.047126 containerd[1538]: time="2025-07-15T23:18:19.046201322Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Jul 15 23:18:19.058697 containerd[1538]: time="2025-07-15T23:18:19.058491116Z" level=info msg="CreateContainer within sandbox \"42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 15 23:18:19.081952 containerd[1538]: time="2025-07-15T23:18:19.081898378Z" level=info msg="Container 51e6ae4d15b545493f14b2f6178121125e014fbd96af2e884cb96fc3d81ecbf3: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:19.099707 containerd[1538]: time="2025-07-15T23:18:19.099659022Z" level=info msg="CreateContainer within sandbox \"42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"51e6ae4d15b545493f14b2f6178121125e014fbd96af2e884cb96fc3d81ecbf3\"" Jul 15 23:18:19.102927 containerd[1538]: time="2025-07-15T23:18:19.101631978Z" level=info msg="StartContainer for \"51e6ae4d15b545493f14b2f6178121125e014fbd96af2e884cb96fc3d81ecbf3\"" Jul 15 23:18:19.106164 containerd[1538]: time="2025-07-15T23:18:19.106100790Z" level=info msg="connecting to shim 
51e6ae4d15b545493f14b2f6178121125e014fbd96af2e884cb96fc3d81ecbf3" address="unix:///run/containerd/s/d68d4232b29dea82f0997f3f615381f49ea09000774ebc8f7b677601399e75d0" protocol=ttrpc version=3 Jul 15 23:18:19.148989 systemd[1]: Started cri-containerd-51e6ae4d15b545493f14b2f6178121125e014fbd96af2e884cb96fc3d81ecbf3.scope - libcontainer container 51e6ae4d15b545493f14b2f6178121125e014fbd96af2e884cb96fc3d81ecbf3. Jul 15 23:18:19.233364 containerd[1538]: time="2025-07-15T23:18:19.233266369Z" level=info msg="StartContainer for \"51e6ae4d15b545493f14b2f6178121125e014fbd96af2e884cb96fc3d81ecbf3\" returns successfully" Jul 15 23:18:19.236389 containerd[1538]: time="2025-07-15T23:18:19.236285885Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 15 23:18:19.373464 kubelet[2742]: I0715 23:18:19.372615 2742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:18:19.498965 containerd[1538]: time="2025-07-15T23:18:19.498918804Z" level=info msg="TaskExit event in podsandbox handler container_id:\"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" id:\"d42e5608c9403f9d46eec5db06f3d06ce5823e881f388b8a64e94afdb1b3caa3\" pid:5211 exit_status:1 exited_at:{seconds:1752621499 nanos:497360024}" Jul 15 23:18:20.966704 containerd[1538]: time="2025-07-15T23:18:20.965294578Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:20.967453 containerd[1538]: time="2025-07-15T23:18:20.967411096Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366" Jul 15 23:18:20.970705 containerd[1538]: time="2025-07-15T23:18:20.970654937Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:20.973843 
containerd[1538]: time="2025-07-15T23:18:20.973792374Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:18:20.975837 containerd[1538]: time="2025-07-15T23:18:20.975171545Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 1.738753295s" Jul 15 23:18:20.975837 containerd[1538]: time="2025-07-15T23:18:20.975241628Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\"" Jul 15 23:18:20.989957 containerd[1538]: time="2025-07-15T23:18:20.989913094Z" level=info msg="CreateContainer within sandbox \"42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 15 23:18:21.015804 containerd[1538]: time="2025-07-15T23:18:21.015751959Z" level=info msg="Container 24f60b89ab69c4c0101746cdfbd9d7dac92b68fe5c0e146be31ee19b207c1817: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:18:21.032990 containerd[1538]: time="2025-07-15T23:18:21.032889376Z" level=info msg="CreateContainer within sandbox \"42a731eb2d3790dd3f11a3d46919890eafdb18ca723a5e493005591e07ff4f58\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"24f60b89ab69c4c0101746cdfbd9d7dac92b68fe5c0e146be31ee19b207c1817\"" Jul 15 23:18:21.034676 containerd[1538]: time="2025-07-15T23:18:21.034337988Z" level=info msg="StartContainer for 
\"24f60b89ab69c4c0101746cdfbd9d7dac92b68fe5c0e146be31ee19b207c1817\"" Jul 15 23:18:21.039609 containerd[1538]: time="2025-07-15T23:18:21.039423411Z" level=info msg="connecting to shim 24f60b89ab69c4c0101746cdfbd9d7dac92b68fe5c0e146be31ee19b207c1817" address="unix:///run/containerd/s/d68d4232b29dea82f0997f3f615381f49ea09000774ebc8f7b677601399e75d0" protocol=ttrpc version=3 Jul 15 23:18:21.081023 systemd[1]: Started cri-containerd-24f60b89ab69c4c0101746cdfbd9d7dac92b68fe5c0e146be31ee19b207c1817.scope - libcontainer container 24f60b89ab69c4c0101746cdfbd9d7dac92b68fe5c0e146be31ee19b207c1817. Jul 15 23:18:21.162283 containerd[1538]: time="2025-07-15T23:18:21.162070786Z" level=info msg="StartContainer for \"24f60b89ab69c4c0101746cdfbd9d7dac92b68fe5c0e146be31ee19b207c1817\" returns successfully" Jul 15 23:18:21.421859 kubelet[2742]: I0715 23:18:21.420548 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-fn5px" podStartSLOduration=27.088788448 podStartE2EDuration="35.420524329s" podCreationTimestamp="2025-07-15 23:17:46 +0000 UTC" firstStartedPulling="2025-07-15 23:18:08.870886375 +0000 UTC m=+46.077652009" lastFinishedPulling="2025-07-15 23:18:17.202622176 +0000 UTC m=+54.409387890" observedRunningTime="2025-07-15 23:18:18.415184394 +0000 UTC m=+55.621950068" watchObservedRunningTime="2025-07-15 23:18:21.420524329 +0000 UTC m=+58.627290043" Jul 15 23:18:22.123411 kubelet[2742]: I0715 23:18:22.123228 2742 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 15 23:18:22.132226 kubelet[2742]: I0715 23:18:22.131876 2742 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 15 23:18:23.622752 kubelet[2742]: I0715 23:18:23.622628 2742 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Jul 15 23:18:23.665952 kubelet[2742]: I0715 23:18:23.665884 2742 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:18:23.669438 kubelet[2742]: I0715 23:18:23.669097 2742 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-trqcc" podStartSLOduration=28.227954278 podStartE2EDuration="37.669062875s" podCreationTimestamp="2025-07-15 23:17:46 +0000 UTC" firstStartedPulling="2025-07-15 23:18:11.536468198 +0000 UTC m=+48.743233872" lastFinishedPulling="2025-07-15 23:18:20.977576795 +0000 UTC m=+58.184342469" observedRunningTime="2025-07-15 23:18:21.426179453 +0000 UTC m=+58.632945127" watchObservedRunningTime="2025-07-15 23:18:23.669062875 +0000 UTC m=+60.875828549" Jul 15 23:18:34.128331 containerd[1538]: time="2025-07-15T23:18:34.128272079Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\" id:\"886c28575f8fcd78333d6964d414da30d5df4be466e48473fb692a0e7b7a29c7\" pid:5296 exited_at:{seconds:1752621514 nanos:127855310}" Jul 15 23:18:44.403505 containerd[1538]: time="2025-07-15T23:18:44.403433464Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" id:\"2a126983e02c69b2e10e3de5fd4c7ad7430676bfed2bf913b81b989a8a2228e3\" pid:5320 exited_at:{seconds:1752621524 nanos:402987857}" Jul 15 23:18:49.614711 containerd[1538]: time="2025-07-15T23:18:49.614650387Z" level=info msg="TaskExit event in podsandbox handler container_id:\"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" id:\"1100652a018d8af54bcb9e294fd263202474d0c19db8b045a7df7ab9bba524ff\" pid:5343 exited_at:{seconds:1752621529 nanos:613480212}" Jul 15 23:18:50.835401 containerd[1538]: time="2025-07-15T23:18:50.835330246Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" id:\"a8ec73ac61c42e7312f75ad5cca2dde45aa383df377d529cf014e6b11ccd8270\" pid:5368 exited_at:{seconds:1752621530 nanos:834695918}" Jul 15 23:18:59.941610 containerd[1538]: time="2025-07-15T23:18:59.941486282Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" id:\"cf01bfdc4ef34ba271ce271b4e5f0892e6ead498692236190cd076adf8df9e74\" pid:5401 exited_at:{seconds:1752621539 nanos:941105278}" Jul 15 23:19:04.164746 containerd[1538]: time="2025-07-15T23:19:04.164555017Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\" id:\"90aa1f513439f58ac910b5b75e8df511c071c481a8daadce858df9c6d1e402d1\" pid:5423 exited_at:{seconds:1752621544 nanos:163690611}" Jul 15 23:19:14.567278 containerd[1538]: time="2025-07-15T23:19:14.567230668Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" id:\"26ac9a942fb2a6c506093846f9c9507c966eaf782fa87705c20f3e49a7d58f68\" pid:5450 exited_at:{seconds:1752621554 nanos:566758506}" Jul 15 23:19:14.757152 systemd[1]: Started sshd@7-91.99.212.32:22-139.178.68.195:49408.service - OpenSSH per-connection server daemon (139.178.68.195:49408). Jul 15 23:19:15.794248 sshd[5460]: Accepted publickey for core from 139.178.68.195 port 49408 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:15.797373 sshd-session[5460]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:15.810565 systemd-logind[1480]: New session 8 of user core. Jul 15 23:19:15.817948 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jul 15 23:19:16.627479 sshd[5462]: Connection closed by 139.178.68.195 port 49408 Jul 15 23:19:16.628272 sshd-session[5460]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:16.636176 systemd[1]: sshd@7-91.99.212.32:22-139.178.68.195:49408.service: Deactivated successfully. Jul 15 23:19:16.642127 systemd[1]: session-8.scope: Deactivated successfully. Jul 15 23:19:16.645800 systemd-logind[1480]: Session 8 logged out. Waiting for processes to exit. Jul 15 23:19:16.649417 systemd-logind[1480]: Removed session 8. Jul 15 23:19:19.601124 containerd[1538]: time="2025-07-15T23:19:19.601072720Z" level=info msg="TaskExit event in podsandbox handler container_id:\"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" id:\"5db6af4230bc3426e05c3a0c7a2255cf30915b9ec2f27d5709fe0eedeac0343a\" pid:5490 exited_at:{seconds:1752621559 nanos:599134354}" Jul 15 23:19:21.805875 systemd[1]: Started sshd@8-91.99.212.32:22-139.178.68.195:33970.service - OpenSSH per-connection server daemon (139.178.68.195:33970). Jul 15 23:19:22.831278 sshd[5504]: Accepted publickey for core from 139.178.68.195 port 33970 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:22.834290 sshd-session[5504]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:22.848327 systemd-logind[1480]: New session 9 of user core. Jul 15 23:19:22.855007 systemd[1]: Started session-9.scope - Session 9 of User core. Jul 15 23:19:23.633698 sshd[5506]: Connection closed by 139.178.68.195 port 33970 Jul 15 23:19:23.634007 sshd-session[5504]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:23.641788 systemd[1]: sshd@8-91.99.212.32:22-139.178.68.195:33970.service: Deactivated successfully. Jul 15 23:19:23.650452 systemd[1]: session-9.scope: Deactivated successfully. Jul 15 23:19:23.653835 systemd-logind[1480]: Session 9 logged out. Waiting for processes to exit. Jul 15 23:19:23.657267 systemd-logind[1480]: Removed session 9. 
Jul 15 23:19:28.817001 systemd[1]: Started sshd@9-91.99.212.32:22-139.178.68.195:33976.service - OpenSSH per-connection server daemon (139.178.68.195:33976). Jul 15 23:19:29.841669 sshd[5525]: Accepted publickey for core from 139.178.68.195 port 33976 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:29.844357 sshd-session[5525]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:29.854202 systemd-logind[1480]: New session 10 of user core. Jul 15 23:19:29.862887 systemd[1]: Started session-10.scope - Session 10 of User core. Jul 15 23:19:30.656761 sshd[5529]: Connection closed by 139.178.68.195 port 33976 Jul 15 23:19:30.659103 sshd-session[5525]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:30.670543 systemd[1]: sshd@9-91.99.212.32:22-139.178.68.195:33976.service: Deactivated successfully. Jul 15 23:19:30.676325 systemd[1]: session-10.scope: Deactivated successfully. Jul 15 23:19:30.678168 systemd-logind[1480]: Session 10 logged out. Waiting for processes to exit. Jul 15 23:19:30.681515 systemd-logind[1480]: Removed session 10. Jul 15 23:19:30.834965 systemd[1]: Started sshd@10-91.99.212.32:22-139.178.68.195:45748.service - OpenSSH per-connection server daemon (139.178.68.195:45748). Jul 15 23:19:31.860835 sshd[5542]: Accepted publickey for core from 139.178.68.195 port 45748 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:31.863316 sshd-session[5542]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:31.871210 systemd-logind[1480]: New session 11 of user core. Jul 15 23:19:31.874830 systemd[1]: Started session-11.scope - Session 11 of User core. Jul 15 23:19:32.696258 sshd[5544]: Connection closed by 139.178.68.195 port 45748 Jul 15 23:19:32.696137 sshd-session[5542]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:32.702465 systemd-logind[1480]: Session 11 logged out. 
Waiting for processes to exit. Jul 15 23:19:32.702848 systemd[1]: sshd@10-91.99.212.32:22-139.178.68.195:45748.service: Deactivated successfully. Jul 15 23:19:32.706863 systemd[1]: session-11.scope: Deactivated successfully. Jul 15 23:19:32.709198 systemd-logind[1480]: Removed session 11. Jul 15 23:19:32.904232 systemd[1]: Started sshd@11-91.99.212.32:22-139.178.68.195:45758.service - OpenSSH per-connection server daemon (139.178.68.195:45758). Jul 15 23:19:34.018314 sshd[5562]: Accepted publickey for core from 139.178.68.195 port 45758 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:34.021776 sshd-session[5562]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:34.029115 systemd-logind[1480]: New session 12 of user core. Jul 15 23:19:34.037990 systemd[1]: Started session-12.scope - Session 12 of User core. Jul 15 23:19:34.128541 containerd[1538]: time="2025-07-15T23:19:34.128443828Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\" id:\"b973b32a2b8d068083f5ff3d0ed158889587055ecd962bf04b9416cd315c60cd\" pid:5575 exited_at:{seconds:1752621574 nanos:127865708}" Jul 15 23:19:34.876224 sshd[5576]: Connection closed by 139.178.68.195 port 45758 Jul 15 23:19:34.877390 sshd-session[5562]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:34.884502 systemd[1]: sshd@11-91.99.212.32:22-139.178.68.195:45758.service: Deactivated successfully. Jul 15 23:19:34.888460 systemd[1]: session-12.scope: Deactivated successfully. Jul 15 23:19:34.890043 systemd-logind[1480]: Session 12 logged out. Waiting for processes to exit. Jul 15 23:19:34.892307 systemd-logind[1480]: Removed session 12. Jul 15 23:19:40.039383 systemd[1]: Started sshd@12-91.99.212.32:22-139.178.68.195:45764.service - OpenSSH per-connection server daemon (139.178.68.195:45764). 
Jul 15 23:19:41.060381 sshd[5600]: Accepted publickey for core from 139.178.68.195 port 45764 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:41.062827 sshd-session[5600]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:41.069950 systemd-logind[1480]: New session 13 of user core. Jul 15 23:19:41.076891 systemd[1]: Started session-13.scope - Session 13 of User core. Jul 15 23:19:41.843390 sshd[5602]: Connection closed by 139.178.68.195 port 45764 Jul 15 23:19:41.844218 sshd-session[5600]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:41.850880 systemd-logind[1480]: Session 13 logged out. Waiting for processes to exit. Jul 15 23:19:41.852260 systemd[1]: sshd@12-91.99.212.32:22-139.178.68.195:45764.service: Deactivated successfully. Jul 15 23:19:41.857254 systemd[1]: session-13.scope: Deactivated successfully. Jul 15 23:19:41.861951 systemd-logind[1480]: Removed session 13. Jul 15 23:19:42.030583 systemd[1]: Started sshd@13-91.99.212.32:22-139.178.68.195:59294.service - OpenSSH per-connection server daemon (139.178.68.195:59294). Jul 15 23:19:43.056190 sshd[5615]: Accepted publickey for core from 139.178.68.195 port 59294 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:43.058523 sshd-session[5615]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:43.067019 systemd-logind[1480]: New session 14 of user core. Jul 15 23:19:43.069848 systemd[1]: Started session-14.scope - Session 14 of User core. Jul 15 23:19:44.057978 sshd[5617]: Connection closed by 139.178.68.195 port 59294 Jul 15 23:19:44.060396 sshd-session[5615]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:44.070731 systemd[1]: sshd@13-91.99.212.32:22-139.178.68.195:59294.service: Deactivated successfully. Jul 15 23:19:44.079182 systemd[1]: session-14.scope: Deactivated successfully. 
Jul 15 23:19:44.082287 systemd-logind[1480]: Session 14 logged out. Waiting for processes to exit. Jul 15 23:19:44.090959 systemd-logind[1480]: Removed session 14. Jul 15 23:19:44.227377 systemd[1]: Started sshd@14-91.99.212.32:22-139.178.68.195:59300.service - OpenSSH per-connection server daemon (139.178.68.195:59300). Jul 15 23:19:44.376284 containerd[1538]: time="2025-07-15T23:19:44.376156894Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" id:\"9abf24ef37afd9a68d4fd367ae0c997659735ca139a1ecd6ea77cbba95a0fce2\" pid:5641 exited_at:{seconds:1752621584 nanos:375783854}" Jul 15 23:19:45.247177 sshd[5627]: Accepted publickey for core from 139.178.68.195 port 59300 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:45.250266 sshd-session[5627]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:45.257900 systemd-logind[1480]: New session 15 of user core. Jul 15 23:19:45.265044 systemd[1]: Started session-15.scope - Session 15 of User core. Jul 15 23:19:47.187612 sshd[5651]: Connection closed by 139.178.68.195 port 59300 Jul 15 23:19:47.187229 sshd-session[5627]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:47.193743 systemd[1]: sshd@14-91.99.212.32:22-139.178.68.195:59300.service: Deactivated successfully. Jul 15 23:19:47.202223 systemd[1]: session-15.scope: Deactivated successfully. Jul 15 23:19:47.205023 systemd-logind[1480]: Session 15 logged out. Waiting for processes to exit. Jul 15 23:19:47.208038 systemd-logind[1480]: Removed session 15. Jul 15 23:19:47.373895 systemd[1]: Started sshd@15-91.99.212.32:22-139.178.68.195:59316.service - OpenSSH per-connection server daemon (139.178.68.195:59316). 
Jul 15 23:19:48.393570 sshd[5668]: Accepted publickey for core from 139.178.68.195 port 59316 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:48.394520 sshd-session[5668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:48.406842 systemd-logind[1480]: New session 16 of user core. Jul 15 23:19:48.413912 systemd[1]: Started session-16.scope - Session 16 of User core. Jul 15 23:19:49.373496 sshd[5671]: Connection closed by 139.178.68.195 port 59316 Jul 15 23:19:49.374536 sshd-session[5668]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:49.386921 systemd[1]: sshd@15-91.99.212.32:22-139.178.68.195:59316.service: Deactivated successfully. Jul 15 23:19:49.393522 systemd[1]: session-16.scope: Deactivated successfully. Jul 15 23:19:49.394781 systemd-logind[1480]: Session 16 logged out. Waiting for processes to exit. Jul 15 23:19:49.403444 systemd-logind[1480]: Removed session 16. Jul 15 23:19:49.542388 systemd[1]: Started sshd@16-91.99.212.32:22-139.178.68.195:59328.service - OpenSSH per-connection server daemon (139.178.68.195:59328). Jul 15 23:19:49.757333 containerd[1538]: time="2025-07-15T23:19:49.757261560Z" level=info msg="TaskExit event in podsandbox handler container_id:\"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" id:\"5aaccc37d8a6967bc0bac62187c89052d2f80698d05dc761faa4023bb9f8e0d1\" pid:5705 exited_at:{seconds:1752621589 nanos:756775681}" Jul 15 23:19:50.552202 sshd[5712]: Accepted publickey for core from 139.178.68.195 port 59328 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:50.555342 sshd-session[5712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:50.565023 systemd-logind[1480]: New session 17 of user core. Jul 15 23:19:50.572845 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jul 15 23:19:50.835376 containerd[1538]: time="2025-07-15T23:19:50.834845840Z" level=info msg="TaskExit event in podsandbox handler container_id:\"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" id:\"be1a85b6cfe243a44e19f17227372d5cf8e53fed791641ef065cd1fc54fb6d91\" pid:5742 exited_at:{seconds:1752621590 nanos:833339840}" Jul 15 23:19:51.338298 sshd[5727]: Connection closed by 139.178.68.195 port 59328 Jul 15 23:19:51.340905 sshd-session[5712]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:51.347295 systemd[1]: sshd@16-91.99.212.32:22-139.178.68.195:59328.service: Deactivated successfully. Jul 15 23:19:51.350914 systemd[1]: session-17.scope: Deactivated successfully. Jul 15 23:19:51.354698 systemd-logind[1480]: Session 17 logged out. Waiting for processes to exit. Jul 15 23:19:51.358246 systemd-logind[1480]: Removed session 17. Jul 15 23:19:56.510346 systemd[1]: Started sshd@17-91.99.212.32:22-139.178.68.195:53076.service - OpenSSH per-connection server daemon (139.178.68.195:53076). Jul 15 23:19:57.510332 sshd[5765]: Accepted publickey for core from 139.178.68.195 port 53076 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:19:57.512546 sshd-session[5765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:19:57.522344 systemd-logind[1480]: New session 18 of user core. Jul 15 23:19:57.528966 systemd[1]: Started session-18.scope - Session 18 of User core. Jul 15 23:19:58.318441 sshd[5767]: Connection closed by 139.178.68.195 port 53076 Jul 15 23:19:58.319197 sshd-session[5765]: pam_unix(sshd:session): session closed for user core Jul 15 23:19:58.326267 systemd[1]: sshd@17-91.99.212.32:22-139.178.68.195:53076.service: Deactivated successfully. Jul 15 23:19:58.330009 systemd[1]: session-18.scope: Deactivated successfully. Jul 15 23:19:58.333147 systemd-logind[1480]: Session 18 logged out. Waiting for processes to exit. 
Jul 15 23:19:58.336652 systemd-logind[1480]: Removed session 18. Jul 15 23:19:59.965958 containerd[1538]: time="2025-07-15T23:19:59.965888227Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" id:\"ab998f6edbe2d5de42cf613330e204b116d0255466c2091ef57b051b02ef94ad\" pid:5793 exited_at:{seconds:1752621599 nanos:965477827}" Jul 15 23:20:03.498908 systemd[1]: Started sshd@18-91.99.212.32:22-139.178.68.195:60734.service - OpenSSH per-connection server daemon (139.178.68.195:60734). Jul 15 23:20:04.160943 containerd[1538]: time="2025-07-15T23:20:04.160894842Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\" id:\"8bf52e033f304bd6179ddc91c884b1cff426397e38fb846d46d0589ce59f1552\" pid:5819 exited_at:{seconds:1752621604 nanos:160248243}" Jul 15 23:20:04.488765 sshd[5804]: Accepted publickey for core from 139.178.68.195 port 60734 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:20:04.489397 sshd-session[5804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:20:04.501161 systemd-logind[1480]: New session 19 of user core. Jul 15 23:20:04.504879 systemd[1]: Started session-19.scope - Session 19 of User core. Jul 15 23:20:05.312536 sshd[5831]: Connection closed by 139.178.68.195 port 60734 Jul 15 23:20:05.313801 sshd-session[5804]: pam_unix(sshd:session): session closed for user core Jul 15 23:20:05.323426 systemd[1]: sshd@18-91.99.212.32:22-139.178.68.195:60734.service: Deactivated successfully. Jul 15 23:20:05.331406 systemd[1]: session-19.scope: Deactivated successfully. Jul 15 23:20:05.338306 systemd-logind[1480]: Session 19 logged out. Waiting for processes to exit. Jul 15 23:20:05.340722 systemd-logind[1480]: Removed session 19. 
Jul 15 23:20:14.377319 containerd[1538]: time="2025-07-15T23:20:14.377106474Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" id:\"a5bb63cf4cc06d39a7b156808e9bad54a51562d50380afe63cd6ab790ea1c97c\" pid:5853 exited_at:{seconds:1752621614 nanos:376183475}" Jul 15 23:20:19.487167 containerd[1538]: time="2025-07-15T23:20:19.487081474Z" level=info msg="TaskExit event in podsandbox handler container_id:\"446a4fc28af36ed9c963e40f64384eb82d67f6c24f711bfe27ee3abdfd90dcda\" id:\"d3498b29281b8109d106ff61620ccf1d6baf37010560ac15ed32c1e216188149\" pid:5875 exited_at:{seconds:1752621619 nanos:486481795}" Jul 15 23:20:21.747394 systemd[1]: cri-containerd-a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b.scope: Deactivated successfully. Jul 15 23:20:21.747843 systemd[1]: cri-containerd-a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b.scope: Consumed 5.521s CPU time, 62.1M memory peak, 2.9M read from disk. Jul 15 23:20:21.757995 containerd[1538]: time="2025-07-15T23:20:21.756574681Z" level=info msg="received exit event container_id:\"a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b\" id:\"a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b\" pid:2593 exit_status:1 exited_at:{seconds:1752621621 nanos:754876989}" Jul 15 23:20:21.758548 containerd[1538]: time="2025-07-15T23:20:21.757794930Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b\" id:\"a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b\" pid:2593 exit_status:1 exited_at:{seconds:1752621621 nanos:754876989}" Jul 15 23:20:21.790845 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b-rootfs.mount: Deactivated successfully. 
Jul 15 23:20:21.874629 kubelet[2742]: I0715 23:20:21.874139 2742 scope.go:117] "RemoveContainer" containerID="a9f1189c471a8d23ea8d6e758b311752acce1decc60b30e7a99179125097a57b" Jul 15 23:20:21.879255 containerd[1538]: time="2025-07-15T23:20:21.879181484Z" level=info msg="CreateContainer within sandbox \"035bd7a13ce1829bfd5e5a640c54dd1d2989714f8b94aad3548e065a4bf5d268\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jul 15 23:20:21.895622 containerd[1538]: time="2025-07-15T23:20:21.892324499Z" level=info msg="Container d0f0184a7a2c595897f544e7b1cc756ef8ae837034f85d5067129e6acb549969: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:20:21.908381 containerd[1538]: time="2025-07-15T23:20:21.907992852Z" level=info msg="CreateContainer within sandbox \"035bd7a13ce1829bfd5e5a640c54dd1d2989714f8b94aad3548e065a4bf5d268\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"d0f0184a7a2c595897f544e7b1cc756ef8ae837034f85d5067129e6acb549969\"" Jul 15 23:20:21.909245 containerd[1538]: time="2025-07-15T23:20:21.909175980Z" level=info msg="StartContainer for \"d0f0184a7a2c595897f544e7b1cc756ef8ae837034f85d5067129e6acb549969\"" Jul 15 23:20:21.912495 containerd[1538]: time="2025-07-15T23:20:21.912362443Z" level=info msg="connecting to shim d0f0184a7a2c595897f544e7b1cc756ef8ae837034f85d5067129e6acb549969" address="unix:///run/containerd/s/9eb7abb380bea2817c8885cd72e005de429ae51793fc3bb9fe6997b9732ea444" protocol=ttrpc version=3 Jul 15 23:20:21.915285 systemd[1]: cri-containerd-80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9.scope: Deactivated successfully. Jul 15 23:20:21.917551 systemd[1]: cri-containerd-80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9.scope: Consumed 24.438s CPU time, 110.2M memory peak, 4M read from disk. 
Jul 15 23:20:21.924552 containerd[1538]: time="2025-07-15T23:20:21.924513291Z" level=info msg="received exit event container_id:\"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\" id:\"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\" pid:3066 exit_status:1 exited_at:{seconds:1752621621 nanos:917246399}" Jul 15 23:20:21.925385 containerd[1538]: time="2025-07-15T23:20:21.925100415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\" id:\"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\" pid:3066 exit_status:1 exited_at:{seconds:1752621621 nanos:917246399}" Jul 15 23:20:21.958816 systemd[1]: Started cri-containerd-d0f0184a7a2c595897f544e7b1cc756ef8ae837034f85d5067129e6acb549969.scope - libcontainer container d0f0184a7a2c595897f544e7b1cc756ef8ae837034f85d5067129e6acb549969. Jul 15 23:20:21.977647 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9-rootfs.mount: Deactivated successfully. 
Jul 15 23:20:22.031094 containerd[1538]: time="2025-07-15T23:20:22.031033342Z" level=info msg="StartContainer for \"d0f0184a7a2c595897f544e7b1cc756ef8ae837034f85d5067129e6acb549969\" returns successfully" Jul 15 23:20:22.184226 kubelet[2742]: E0715 23:20:22.184124 2742 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:44978->10.0.0.2:2379: read: connection timed out" Jul 15 23:20:22.887953 kubelet[2742]: I0715 23:20:22.887811 2742 scope.go:117] "RemoveContainer" containerID="80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9" Jul 15 23:20:22.890794 containerd[1538]: time="2025-07-15T23:20:22.890742567Z" level=info msg="CreateContainer within sandbox \"f94139efeb3dd63b0ce76faf66299fc34c890dcb5db1f6bd592a8be24efc8631\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jul 15 23:20:22.905543 containerd[1538]: time="2025-07-15T23:20:22.905290941Z" level=info msg="Container 06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:20:22.912268 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount891811534.mount: Deactivated successfully. 
Jul 15 23:20:22.920016 containerd[1538]: time="2025-07-15T23:20:22.919380040Z" level=info msg="CreateContainer within sandbox \"f94139efeb3dd63b0ce76faf66299fc34c890dcb5db1f6bd592a8be24efc8631\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c\"" Jul 15 23:20:22.921953 containerd[1538]: time="2025-07-15T23:20:22.921918878Z" level=info msg="StartContainer for \"06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c\"" Jul 15 23:20:22.923874 containerd[1538]: time="2025-07-15T23:20:22.923838227Z" level=info msg="connecting to shim 06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c" address="unix:///run/containerd/s/7def99427fc5bd2a3803f591d51a229ea917aa23c7de48a75244d6342997f69f" protocol=ttrpc version=3 Jul 15 23:20:22.969996 systemd[1]: Started cri-containerd-06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c.scope - libcontainer container 06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c. 
Jul 15 23:20:22.977515 kubelet[2742]: I0715 23:20:22.977468 2742 scope.go:117] "RemoveContainer" containerID="80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9" Jul 15 23:20:22.985926 containerd[1538]: time="2025-07-15T23:20:22.985659127Z" level=info msg="RemoveContainer for \"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\"" Jul 15 23:20:23.063979 containerd[1538]: time="2025-07-15T23:20:23.063932080Z" level=info msg="RemoveContainer for \"80747840e2dfec3fc11ab0da6664cb31ba0201b0674a167830afcea74c3a9ac9\" returns successfully" Jul 15 23:20:23.071018 containerd[1538]: time="2025-07-15T23:20:23.070952339Z" level=info msg="StartContainer for \"06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c\" returns successfully" Jul 15 23:20:26.033350 kubelet[2742]: E0715 23:20:26.029837 2742 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:44820->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4372-0-1-n-21be50a87e.18529012d52eedaf kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4372-0-1-n-21be50a87e,UID:2a90ba5db8158141543146688d6c9d0a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4372-0-1-n-21be50a87e,},FirstTimestamp:2025-07-15 23:20:15.560977839 +0000 UTC m=+172.767743513,LastTimestamp:2025-07-15 23:20:15.560977839 +0000 UTC m=+172.767743513,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-0-1-n-21be50a87e,}" Jul 15 23:20:27.889034 systemd[1]: cri-containerd-fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3.scope: Deactivated 
successfully. Jul 15 23:20:27.889480 systemd[1]: cri-containerd-fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3.scope: Consumed 3.530s CPU time, 22.9M memory peak, 3.3M read from disk. Jul 15 23:20:27.893184 containerd[1538]: time="2025-07-15T23:20:27.891823462Z" level=info msg="received exit event container_id:\"fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3\" id:\"fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3\" pid:2601 exit_status:1 exited_at:{seconds:1752621627 nanos:891247820}" Jul 15 23:20:27.893184 containerd[1538]: time="2025-07-15T23:20:27.892491190Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3\" id:\"fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3\" pid:2601 exit_status:1 exited_at:{seconds:1752621627 nanos:891247820}" Jul 15 23:20:27.926571 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3-rootfs.mount: Deactivated successfully. 
Jul 15 23:20:28.920281 kubelet[2742]: I0715 23:20:28.920241 2742 scope.go:117] "RemoveContainer" containerID="fa80c6f39abf9c1f535e6b8679137f97502d73abe80580d1b87b65639f961ac3" Jul 15 23:20:28.923923 containerd[1538]: time="2025-07-15T23:20:28.923877773Z" level=info msg="CreateContainer within sandbox \"db057986d6b140e72e4b16f247f39621137ef9f3727458d3505b0fcbec6704be\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Jul 15 23:20:28.938637 containerd[1538]: time="2025-07-15T23:20:28.937413060Z" level=info msg="Container 1a41a525ef4fe52958a8c4ac2eedc23be501bfe1495888165a466fc9802ce06f: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:20:28.952144 containerd[1538]: time="2025-07-15T23:20:28.952000984Z" level=info msg="CreateContainer within sandbox \"db057986d6b140e72e4b16f247f39621137ef9f3727458d3505b0fcbec6704be\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"1a41a525ef4fe52958a8c4ac2eedc23be501bfe1495888165a466fc9802ce06f\"" Jul 15 23:20:28.952953 containerd[1538]: time="2025-07-15T23:20:28.952868486Z" level=info msg="StartContainer for \"1a41a525ef4fe52958a8c4ac2eedc23be501bfe1495888165a466fc9802ce06f\"" Jul 15 23:20:28.954297 containerd[1538]: time="2025-07-15T23:20:28.954244344Z" level=info msg="connecting to shim 1a41a525ef4fe52958a8c4ac2eedc23be501bfe1495888165a466fc9802ce06f" address="unix:///run/containerd/s/8fdc545fbcf759094beee139d8932843e4506103278934e68468fa4ef43e9194" protocol=ttrpc version=3 Jul 15 23:20:28.978718 systemd[1]: Started cri-containerd-1a41a525ef4fe52958a8c4ac2eedc23be501bfe1495888165a466fc9802ce06f.scope - libcontainer container 1a41a525ef4fe52958a8c4ac2eedc23be501bfe1495888165a466fc9802ce06f. 
Jul 15 23:20:29.032813 containerd[1538]: time="2025-07-15T23:20:29.032701402Z" level=info msg="StartContainer for \"1a41a525ef4fe52958a8c4ac2eedc23be501bfe1495888165a466fc9802ce06f\" returns successfully" Jul 15 23:20:32.038497 kubelet[2742]: I0715 23:20:32.038326 2742 status_manager.go:895] "Failed to get status for pod" podUID="2a90ba5db8158141543146688d6c9d0a" pod="kube-system/kube-apiserver-ci-4372-0-1-n-21be50a87e" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:44926->10.0.0.2:2379: read: connection timed out" Jul 15 23:20:32.185351 kubelet[2742]: E0715 23:20:32.185222 2742 controller.go:195] "Failed to update lease" err="Put \"https://91.99.212.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-0-1-n-21be50a87e?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jul 15 23:20:34.152997 containerd[1538]: time="2025-07-15T23:20:34.152897156Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d3261160ea98124dd71ea19467519976d4184aec8c47397154429269b6d85b2f\" id:\"7371f35c090dc0b73729a338062053125fe21402b270c719c2d0846fcfb88983\" pid:6039 exited_at:{seconds:1752621634 nanos:151948774}" Jul 15 23:20:34.356068 systemd[1]: cri-containerd-06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c.scope: Deactivated successfully. Jul 15 23:20:34.357304 systemd[1]: cri-containerd-06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c.scope: Consumed 277ms CPU time, 44.6M memory peak, 1.6M read from disk. 
Jul 15 23:20:34.368308 containerd[1538]: time="2025-07-15T23:20:34.368070519Z" level=info msg="received exit event container_id:\"06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c\" id:\"06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c\" pid:5957 exit_status:1 exited_at:{seconds:1752621634 nanos:367658171}" Jul 15 23:20:34.368308 containerd[1538]: time="2025-07-15T23:20:34.368190006Z" level=info msg="TaskExit event in podsandbox handler container_id:\"06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c\" id:\"06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c\" pid:5957 exit_status:1 exited_at:{seconds:1752621634 nanos:367658171}" Jul 15 23:20:34.404200 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c-rootfs.mount: Deactivated successfully. Jul 15 23:20:34.947457 kubelet[2742]: I0715 23:20:34.947380 2742 scope.go:117] "RemoveContainer" containerID="06206aa24a40152cd64de8dc3199c3f3d331c19d08ec4e7c520c5ef928a1374c" Jul 15 23:20:34.948116 kubelet[2742]: E0715 23:20:34.947625 2742 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-747864d56d-q7g7s_tigera-operator(34099849-9cf7-4598-984c-1c53ad94a99c)\"" pod="tigera-operator/tigera-operator-747864d56d-q7g7s" podUID="34099849-9cf7-4598-984c-1c53ad94a99c" Jul 15 23:20:42.192446 kubelet[2742]: E0715 23:20:42.186711 2742 controller.go:195] "Failed to update lease" err="Put \"https://91.99.212.32:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-0-1-n-21be50a87e?timeout=10s\": context deadline exceeded" Jul 15 23:20:44.379999 containerd[1538]: time="2025-07-15T23:20:44.379914564Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"b59f0eb9bfb1fa2dbfb9f20c5aae10af21588a9a682f748878d3ba512a91b699\" id:\"2ecd6454157cf88aac97215f8cdf1acc5f2619ef1337bf6c470651fa89cd1034\" pid:6075 exit_status:1 exited_at:{seconds:1752621644 nanos:379303089}"