Sep 9 23:55:23.823689 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 9 23:55:23.823712 kernel: Linux version 6.12.45-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Tue Sep 9 22:10:22 -00 2025
Sep 9 23:55:23.823722 kernel: KASLR enabled
Sep 9 23:55:23.823727 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Sep 9 23:55:23.823733 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Sep 9 23:55:23.823738 kernel: random: crng init done
Sep 9 23:55:23.823745 kernel: secureboot: Secure boot disabled
Sep 9 23:55:23.823750 kernel: ACPI: Early table checksum verification disabled
Sep 9 23:55:23.823756 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Sep 9 23:55:23.823762 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Sep 9 23:55:23.823769 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823775 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823781 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823787 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823794 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823801 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823807 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823813 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823819 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 9 23:55:23.823826 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 9 23:55:23.823832 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Sep 9 23:55:23.823850 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 9 23:55:23.823856 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Sep 9 23:55:23.823862 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Sep 9 23:55:23.823868 kernel: Zone ranges:
Sep 9 23:55:23.823874 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Sep 9 23:55:23.823882 kernel: DMA32 empty
Sep 9 23:55:23.823888 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Sep 9 23:55:23.823894 kernel: Device empty
Sep 9 23:55:23.823900 kernel: Movable zone start for each node
Sep 9 23:55:23.823906 kernel: Early memory node ranges
Sep 9 23:55:23.823912 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Sep 9 23:55:23.823918 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Sep 9 23:55:23.823924 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Sep 9 23:55:23.823930 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Sep 9 23:55:23.823936 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Sep 9 23:55:23.823942 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Sep 9 23:55:23.823948 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Sep 9 23:55:23.823955 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Sep 9 23:55:23.823961 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Sep 9 23:55:23.823970 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Sep 9 23:55:23.823977 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Sep 9 23:55:23.823983 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Sep 9 23:55:23.823991 kernel: psci: probing for conduit method from ACPI.
Sep 9 23:55:23.823998 kernel: psci: PSCIv1.1 detected in firmware.
Sep 9 23:55:23.824004 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 9 23:55:23.824010 kernel: psci: Trusted OS migration not required
Sep 9 23:55:23.824017 kernel: psci: SMC Calling Convention v1.1
Sep 9 23:55:23.824023 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 9 23:55:23.824030 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 9 23:55:23.824036 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 9 23:55:23.824043 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 9 23:55:23.824049 kernel: Detected PIPT I-cache on CPU0
Sep 9 23:55:23.824055 kernel: CPU features: detected: GIC system register CPU interface
Sep 9 23:55:23.824063 kernel: CPU features: detected: Spectre-v4
Sep 9 23:55:23.824069 kernel: CPU features: detected: Spectre-BHB
Sep 9 23:55:23.824076 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 9 23:55:23.824082 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 9 23:55:23.824089 kernel: CPU features: detected: ARM erratum 1418040
Sep 9 23:55:23.824095 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 9 23:55:23.824101 kernel: alternatives: applying boot alternatives
Sep 9 23:55:23.824109 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=fc7b279c2d918629032c01551b74c66c198cf923a976f9b3bc0d959e7c2302db
Sep 9 23:55:23.824116 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 9 23:55:23.824122 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 9 23:55:23.824130 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 9 23:55:23.824137 kernel: Fallback order for Node 0: 0
Sep 9 23:55:23.824143 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Sep 9 23:55:23.824149 kernel: Policy zone: Normal
Sep 9 23:55:23.824155 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 9 23:55:23.824162 kernel: software IO TLB: area num 2.
Sep 9 23:55:23.824168 kernel: software IO TLB: mapped [mem 0x00000000f46d0000-0x00000000f86d0000] (64MB)
Sep 9 23:55:23.824175 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 9 23:55:23.824181 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 9 23:55:23.824188 kernel: rcu: RCU event tracing is enabled.
Sep 9 23:55:23.824195 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 9 23:55:23.824202 kernel: Trampoline variant of Tasks RCU enabled.
Sep 9 23:55:23.824210 kernel: Tracing variant of Tasks RCU enabled.
Sep 9 23:55:23.824217 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 9 23:55:23.824223 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 9 23:55:23.824230 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 23:55:23.824237 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 9 23:55:23.824243 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 9 23:55:23.824249 kernel: GICv3: 256 SPIs implemented
Sep 9 23:55:23.824256 kernel: GICv3: 0 Extended SPIs implemented
Sep 9 23:55:23.824262 kernel: Root IRQ handler: gic_handle_irq
Sep 9 23:55:23.824269 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 9 23:55:23.824275 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 9 23:55:23.824327 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 9 23:55:23.824338 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 9 23:55:23.824345 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Sep 9 23:55:23.824351 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Sep 9 23:55:23.824358 kernel: GICv3: using LPI property table @0x0000000100120000
Sep 9 23:55:23.824364 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Sep 9 23:55:23.824371 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 9 23:55:23.824377 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 23:55:23.824384 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 9 23:55:23.824390 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 9 23:55:23.824397 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 9 23:55:23.824404 kernel: Console: colour dummy device 80x25
Sep 9 23:55:23.824412 kernel: ACPI: Core revision 20240827
Sep 9 23:55:23.824419 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 9 23:55:23.824425 kernel: pid_max: default: 32768 minimum: 301
Sep 9 23:55:23.824432 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 9 23:55:23.824439 kernel: landlock: Up and running.
Sep 9 23:55:23.824446 kernel: SELinux: Initializing.
Sep 9 23:55:23.824452 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 23:55:23.824459 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 9 23:55:23.824466 kernel: rcu: Hierarchical SRCU implementation.
Sep 9 23:55:23.824474 kernel: rcu: Max phase no-delay instances is 400.
Sep 9 23:55:23.824481 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 9 23:55:23.824489 kernel: Remapping and enabling EFI services.
Sep 9 23:55:23.824495 kernel: smp: Bringing up secondary CPUs ...
Sep 9 23:55:23.824502 kernel: Detected PIPT I-cache on CPU1
Sep 9 23:55:23.824509 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 9 23:55:23.824515 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Sep 9 23:55:23.824522 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 9 23:55:23.824529 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 9 23:55:23.824537 kernel: smp: Brought up 1 node, 2 CPUs
Sep 9 23:55:23.824548 kernel: SMP: Total of 2 processors activated.
Sep 9 23:55:23.824555 kernel: CPU: All CPU(s) started at EL1
Sep 9 23:55:23.824564 kernel: CPU features: detected: 32-bit EL0 Support
Sep 9 23:55:23.824572 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 9 23:55:23.824579 kernel: CPU features: detected: Common not Private translations
Sep 9 23:55:23.824586 kernel: CPU features: detected: CRC32 instructions
Sep 9 23:55:23.824593 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 9 23:55:23.824602 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 9 23:55:23.824609 kernel: CPU features: detected: LSE atomic instructions
Sep 9 23:55:23.824616 kernel: CPU features: detected: Privileged Access Never
Sep 9 23:55:23.824623 kernel: CPU features: detected: RAS Extension Support
Sep 9 23:55:23.824630 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 9 23:55:23.824637 kernel: alternatives: applying system-wide alternatives
Sep 9 23:55:23.824644 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 9 23:55:23.824652 kernel: Memory: 3859620K/4096000K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38912K init, 1038K bss, 214900K reserved, 16384K cma-reserved)
Sep 9 23:55:23.824659 kernel: devtmpfs: initialized
Sep 9 23:55:23.824667 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 9 23:55:23.824674 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 9 23:55:23.824681 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 9 23:55:23.824688 kernel: 0 pages in range for non-PLT usage
Sep 9 23:55:23.824695 kernel: 508576 pages in range for PLT usage
Sep 9 23:55:23.824702 kernel: pinctrl core: initialized pinctrl subsystem
Sep 9 23:55:23.824709 kernel: SMBIOS 3.0.0 present.
Sep 9 23:55:23.824717 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Sep 9 23:55:23.824724 kernel: DMI: Memory slots populated: 1/1
Sep 9 23:55:23.824732 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 9 23:55:23.824739 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 9 23:55:23.824747 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 9 23:55:23.824754 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 9 23:55:23.824761 kernel: audit: initializing netlink subsys (disabled)
Sep 9 23:55:23.824768 kernel: audit: type=2000 audit(0.014:1): state=initialized audit_enabled=0 res=1
Sep 9 23:55:23.824775 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 9 23:55:23.824782 kernel: cpuidle: using governor menu
Sep 9 23:55:23.824789 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 9 23:55:23.824798 kernel: ASID allocator initialised with 32768 entries
Sep 9 23:55:23.824805 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 9 23:55:23.824812 kernel: Serial: AMBA PL011 UART driver
Sep 9 23:55:23.824819 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 9 23:55:23.825549 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 9 23:55:23.825558 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 9 23:55:23.825566 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 9 23:55:23.825573 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 9 23:55:23.825580 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 9 23:55:23.825593 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 9 23:55:23.825600 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 9 23:55:23.825607 kernel: ACPI: Added _OSI(Module Device)
Sep 9 23:55:23.825614 kernel: ACPI: Added _OSI(Processor Device)
Sep 9 23:55:23.825622 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 9 23:55:23.825629 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 9 23:55:23.825636 kernel: ACPI: Interpreter enabled
Sep 9 23:55:23.825643 kernel: ACPI: Using GIC for interrupt routing
Sep 9 23:55:23.825650 kernel: ACPI: MCFG table detected, 1 entries
Sep 9 23:55:23.825659 kernel: ACPI: CPU0 has been hot-added
Sep 9 23:55:23.825666 kernel: ACPI: CPU1 has been hot-added
Sep 9 23:55:23.825673 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 9 23:55:23.825680 kernel: printk: legacy console [ttyAMA0] enabled
Sep 9 23:55:23.825687 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 9 23:55:23.826175 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 9 23:55:23.826243 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 9 23:55:23.826373 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 9 23:55:23.826449 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 9 23:55:23.826506 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 9 23:55:23.826516 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 9 23:55:23.826523 kernel: PCI host bridge to bus 0000:00
Sep 9 23:55:23.829049 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 9 23:55:23.829134 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 9 23:55:23.829204 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 9 23:55:23.829272 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 9 23:55:23.829442 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 9 23:55:23.829609 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Sep 9 23:55:23.829677 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Sep 9 23:55:23.829737 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 9 23:55:23.829810 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.831779 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Sep 9 23:55:23.831918 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 9 23:55:23.831983 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Sep 9 23:55:23.832040 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Sep 9 23:55:23.832113 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.832173 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Sep 9 23:55:23.832232 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 9 23:55:23.832316 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Sep 9 23:55:23.832387 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.832449 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Sep 9 23:55:23.832542 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 9 23:55:23.832618 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Sep 9 23:55:23.832677 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Sep 9 23:55:23.832746 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.832809 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Sep 9 23:55:23.832915 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 9 23:55:23.832978 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Sep 9 23:55:23.833035 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Sep 9 23:55:23.833107 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.833219 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Sep 9 23:55:23.833300 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 9 23:55:23.833555 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 9 23:55:23.833635 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Sep 9 23:55:23.833705 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.833763 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Sep 9 23:55:23.833820 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 9 23:55:23.836360 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Sep 9 23:55:23.836479 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Sep 9 23:55:23.836563 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.836623 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Sep 9 23:55:23.836681 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 9 23:55:23.836739 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Sep 9 23:55:23.836796 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Sep 9 23:55:23.836898 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.836960 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Sep 9 23:55:23.837021 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 9 23:55:23.837078 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Sep 9 23:55:23.837144 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 9 23:55:23.837203 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Sep 9 23:55:23.837260 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 9 23:55:23.840091 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Sep 9 23:55:23.840215 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Sep 9 23:55:23.840471 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Sep 9 23:55:23.840766 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 9 23:55:23.840930 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Sep 9 23:55:23.841021 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 9 23:55:23.841197 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Sep 9 23:55:23.841439 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Sep 9 23:55:23.841526 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Sep 9 23:55:23.841598 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Sep 9 23:55:23.841659 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Sep 9 23:55:23.841719 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Sep 9 23:55:23.841796 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Sep 9 23:55:23.843020 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Sep 9 23:55:23.843131 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Sep 9 23:55:23.843197 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Sep 9 23:55:23.843268 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Sep 9 23:55:23.843641 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Sep 9 23:55:23.843737 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 9 23:55:23.843816 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 9 23:55:23.845994 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Sep 9 23:55:23.846096 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Sep 9 23:55:23.846161 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Sep 9 23:55:23.846228 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Sep 9 23:55:23.846346 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Sep 9 23:55:23.846415 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Sep 9 23:55:23.846480 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Sep 9 23:55:23.846539 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Sep 9 23:55:23.846600 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Sep 9 23:55:23.846663 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 9 23:55:23.846720 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Sep 9 23:55:23.846777 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Sep 9 23:55:23.846864 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 9 23:55:23.846930 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Sep 9 23:55:23.846992 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Sep 9 23:55:23.847056 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 9 23:55:23.847114 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Sep 9 23:55:23.847171 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Sep 9 23:55:23.847234 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 9 23:55:23.847307 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Sep 9 23:55:23.847368 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Sep 9 23:55:23.847434 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 9 23:55:23.847496 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Sep 9 23:55:23.847553 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Sep 9 23:55:23.847615 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 9 23:55:23.847673 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Sep 9 23:55:23.847730 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Sep 9 23:55:23.847792 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 9 23:55:23.847875 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Sep 9 23:55:23.847935 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Sep 9 23:55:23.847993 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Sep 9 23:55:23.848051 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Sep 9 23:55:23.848110 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Sep 9 23:55:23.848166 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Sep 9 23:55:23.848234 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Sep 9 23:55:23.848334 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Sep 9 23:55:23.848396 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Sep 9 23:55:23.848454 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Sep 9 23:55:23.848717 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Sep 9 23:55:23.848796 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Sep 9 23:55:23.849485 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Sep 9 23:55:23.849561 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Sep 9 23:55:23.849624 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Sep 9 23:55:23.849689 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Sep 9 23:55:23.849752 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Sep 9 23:55:23.849810 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Sep 9 23:55:23.849905 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Sep 9 23:55:23.849967 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Sep 9 23:55:23.850031 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Sep 9 23:55:23.850090 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Sep 9 23:55:23.850151 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Sep 9 23:55:23.850213 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Sep 9 23:55:23.850274 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Sep 9 23:55:23.850354 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Sep 9 23:55:23.850417 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Sep 9 23:55:23.850480 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Sep 9 23:55:23.850545 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Sep 9 23:55:23.850603 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Sep 9 23:55:23.850663 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Sep 9 23:55:23.850721 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Sep 9 23:55:23.850779 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Sep 9 23:55:23.850904 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Sep 9 23:55:23.850977 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Sep 9 23:55:23.851039 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Sep 9 23:55:23.851098 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Sep 9 23:55:23.851155 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Sep 9 23:55:23.851213 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Sep 9 23:55:23.851382 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Sep 9 23:55:23.851468 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Sep 9 23:55:23.851537 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Sep 9 23:55:23.851597 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 9 23:55:23.851662 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Sep 9 23:55:23.851721 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 9 23:55:23.851781 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 9 23:55:23.851858 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Sep 9 23:55:23.851928 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 9 23:55:23.851996 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Sep 9 23:55:23.852058 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 9 23:55:23.852119 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 9 23:55:23.852177 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Sep 9 23:55:23.852234 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 9 23:55:23.852323 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Sep 9 23:55:23.852388 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Sep 9 23:55:23.852450 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 9 23:55:23.852512 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 9 23:55:23.852573 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Sep 9 23:55:23.852630 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 9 23:55:23.852697 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Sep 9 23:55:23.852756 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 9 23:55:23.852814 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 9 23:55:23.855006 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Sep 9 23:55:23.855100 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 9 23:55:23.855180 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Sep 9 23:55:23.855243 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 9 23:55:23.855372 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 9 23:55:23.855437 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 9 23:55:23.855495 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 9 23:55:23.855564 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Sep 9 23:55:23.855625 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Sep 9 23:55:23.855702 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 9 23:55:23.855765 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 9 23:55:23.855852 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Sep 9 23:55:23.855914 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 9 23:55:23.855983 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Sep 9 23:55:23.856044 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Sep 9 23:55:23.856104 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Sep 9 23:55:23.856168 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 9 23:55:23.856227 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 9 23:55:23.856302 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Sep 9 23:55:23.856366 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 9 23:55:23.856431 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 9 23:55:23.856489 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 9 23:55:23.856547 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Sep 9 23:55:23.856605 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 9 23:55:23.856667 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 9 23:55:23.856726 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Sep 9 23:55:23.856783 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Sep 9 23:55:23.857997 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 9 23:55:23.858116 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 9 23:55:23.858172 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 9 23:55:23.858226 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 9 23:55:23.858366 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 9 23:55:23.858429 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Sep 9 23:55:23.858490 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 9 23:55:23.858553 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Sep 9 23:55:23.858696 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Sep 9 23:55:23.858770 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 9 23:55:23.858854 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Sep 9 23:55:23.858913 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Sep 9 23:55:23.858968 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 9 23:55:23.859037 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Sep 9 23:55:23.859091 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Sep 9 23:55:23.859144 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 9 23:55:23.859210 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Sep 9 23:55:23.859264 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Sep 9 23:55:23.859338 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 9 23:55:23.859428 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Sep 9 23:55:23.859487 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Sep 9 23:55:23.859582 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 9 23:55:23.859650 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Sep 9 23:55:23.859710 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
Sep 9 23:55:23.859769 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 9 23:55:23.859846 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff]
Sep 9 23:55:23.859931 kernel: pci_bus
0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Sep 9 23:55:23.859988 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Sep 9 23:55:23.860050 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Sep 9 23:55:23.860104 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Sep 9 23:55:23.860157 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Sep 9 23:55:23.860167 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 9 23:55:23.860175 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 9 23:55:23.860182 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 9 23:55:23.860192 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 9 23:55:23.860199 kernel: iommu: Default domain type: Translated Sep 9 23:55:23.860207 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 9 23:55:23.860215 kernel: efivars: Registered efivars operations Sep 9 23:55:23.860222 kernel: vgaarb: loaded Sep 9 23:55:23.860229 kernel: clocksource: Switched to clocksource arch_sys_counter Sep 9 23:55:23.860237 kernel: VFS: Disk quotas dquot_6.6.0 Sep 9 23:55:23.860245 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 9 23:55:23.860252 kernel: pnp: PnP ACPI init Sep 9 23:55:23.860348 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Sep 9 23:55:23.860360 kernel: pnp: PnP ACPI: found 1 devices Sep 9 23:55:23.860368 kernel: NET: Registered PF_INET protocol family Sep 9 23:55:23.860375 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 9 23:55:23.860383 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 9 23:55:23.860390 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 9 23:55:23.860398 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 9 23:55:23.860405 kernel: TCP bind hash 
table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 9 23:55:23.860431 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 9 23:55:23.860440 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 9 23:55:23.860448 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 9 23:55:23.860456 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 9 23:55:23.863026 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Sep 9 23:55:23.863062 kernel: PCI: CLS 0 bytes, default 64 Sep 9 23:55:23.863071 kernel: kvm [1]: HYP mode not available Sep 9 23:55:23.863079 kernel: Initialise system trusted keyrings Sep 9 23:55:23.863087 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 9 23:55:23.863101 kernel: Key type asymmetric registered Sep 9 23:55:23.863109 kernel: Asymmetric key parser 'x509' registered Sep 9 23:55:23.863116 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Sep 9 23:55:23.863124 kernel: io scheduler mq-deadline registered Sep 9 23:55:23.863131 kernel: io scheduler kyber registered Sep 9 23:55:23.863139 kernel: io scheduler bfq registered Sep 9 23:55:23.863147 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Sep 9 23:55:23.863248 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Sep 9 23:55:23.863404 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Sep 9 23:55:23.863476 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.863542 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Sep 9 23:55:23.863604 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Sep 9 23:55:23.863680 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.863755 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Sep 9 
23:55:23.863817 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Sep 9 23:55:23.863942 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.864013 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Sep 9 23:55:23.864088 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Sep 9 23:55:23.864149 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.864213 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Sep 9 23:55:23.864275 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Sep 9 23:55:23.864362 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.864493 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Sep 9 23:55:23.864561 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Sep 9 23:55:23.864624 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.864687 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Sep 9 23:55:23.864752 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Sep 9 23:55:23.864959 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.865046 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Sep 9 23:55:23.865153 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Sep 9 23:55:23.865220 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.865231 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Sep 9 23:55:23.865317 kernel: pcieport 0000:00:03.0: 
PME: Signaling with IRQ 58 Sep 9 23:55:23.865381 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Sep 9 23:55:23.865440 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 9 23:55:23.865450 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 9 23:55:23.865458 kernel: ACPI: button: Power Button [PWRB] Sep 9 23:55:23.865466 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 9 23:55:23.865533 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Sep 9 23:55:23.865600 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Sep 9 23:55:23.865614 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 9 23:55:23.865622 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Sep 9 23:55:23.865686 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Sep 9 23:55:23.865697 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Sep 9 23:55:23.865704 kernel: thunder_xcv, ver 1.0 Sep 9 23:55:23.865712 kernel: thunder_bgx, ver 1.0 Sep 9 23:55:23.865720 kernel: nicpf, ver 1.0 Sep 9 23:55:23.865727 kernel: nicvf, ver 1.0 Sep 9 23:55:23.865804 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 9 23:55:23.867985 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-09T23:55:23 UTC (1757462123) Sep 9 23:55:23.868018 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 9 23:55:23.868026 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Sep 9 23:55:23.868034 kernel: watchdog: NMI not fully supported Sep 9 23:55:23.868042 kernel: watchdog: Hard watchdog permanently disabled Sep 9 23:55:23.868050 kernel: NET: Registered PF_INET6 protocol family Sep 9 23:55:23.868057 kernel: Segment Routing with IPv6 Sep 9 23:55:23.868065 kernel: In-situ OAM (IOAM) with IPv6 Sep 9 23:55:23.868073 kernel: NET: Registered PF_PACKET protocol family Sep 
9 23:55:23.868088 kernel: Key type dns_resolver registered Sep 9 23:55:23.868096 kernel: registered taskstats version 1 Sep 9 23:55:23.868104 kernel: Loading compiled-in X.509 certificates Sep 9 23:55:23.868111 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.45-flatcar: 61217a1897415238555e2058a4e44c51622b0f87' Sep 9 23:55:23.868118 kernel: Demotion targets for Node 0: null Sep 9 23:55:23.868126 kernel: Key type .fscrypt registered Sep 9 23:55:23.868133 kernel: Key type fscrypt-provisioning registered Sep 9 23:55:23.868141 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 9 23:55:23.868150 kernel: ima: Allocated hash algorithm: sha1 Sep 9 23:55:23.868158 kernel: ima: No architecture policies found Sep 9 23:55:23.868165 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 9 23:55:23.868172 kernel: clk: Disabling unused clocks Sep 9 23:55:23.868180 kernel: PM: genpd: Disabling unused power domains Sep 9 23:55:23.868188 kernel: Warning: unable to open an initial console. Sep 9 23:55:23.868196 kernel: Freeing unused kernel memory: 38912K Sep 9 23:55:23.868203 kernel: Run /init as init process Sep 9 23:55:23.868211 kernel: with arguments: Sep 9 23:55:23.868220 kernel: /init Sep 9 23:55:23.868227 kernel: with environment: Sep 9 23:55:23.868234 kernel: HOME=/ Sep 9 23:55:23.868242 kernel: TERM=linux Sep 9 23:55:23.868250 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 9 23:55:23.868258 systemd[1]: Successfully made /usr/ read-only. Sep 9 23:55:23.868269 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Sep 9 23:55:23.868278 systemd[1]: Detected virtualization kvm. Sep 9 23:55:23.868299 systemd[1]: Detected architecture arm64. 
Sep 9 23:55:23.868307 systemd[1]: Running in initrd.
Sep 9 23:55:23.868315 systemd[1]: No hostname configured, using default hostname.
Sep 9 23:55:23.868323 systemd[1]: Hostname set to .
Sep 9 23:55:23.868331 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 23:55:23.868338 systemd[1]: Queued start job for default target initrd.target.
Sep 9 23:55:23.868347 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:55:23.868355 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:55:23.868366 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 9 23:55:23.868374 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 23:55:23.868384 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 9 23:55:23.868392 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 9 23:55:23.868401 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 9 23:55:23.868410 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 9 23:55:23.868418 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:55:23.868427 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:55:23.868435 systemd[1]: Reached target paths.target - Path Units.
Sep 9 23:55:23.868443 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 23:55:23.868450 systemd[1]: Reached target swap.target - Swaps.
Sep 9 23:55:23.868458 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 23:55:23.868466 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 23:55:23.868474 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 23:55:23.868482 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 9 23:55:23.868491 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 9 23:55:23.868499 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:55:23.868507 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:55:23.868515 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:55:23.868523 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 23:55:23.868530 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 9 23:55:23.868538 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 23:55:23.868546 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 9 23:55:23.868555 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 9 23:55:23.868564 systemd[1]: Starting systemd-fsck-usr.service...
Sep 9 23:55:23.868572 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 23:55:23.868580 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 23:55:23.868588 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:55:23.868595 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 9 23:55:23.868631 systemd-journald[244]: Collecting audit messages is disabled.
Sep 9 23:55:23.868652 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:55:23.868660 systemd[1]: Finished systemd-fsck-usr.service.
Sep 9 23:55:23.868670 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 9 23:55:23.868678 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 9 23:55:23.868685 kernel: Bridge firewalling registered
Sep 9 23:55:23.868693 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:55:23.868701 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 23:55:23.868709 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:55:23.868717 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 9 23:55:23.868726 systemd-journald[244]: Journal started
Sep 9 23:55:23.868746 systemd-journald[244]: Runtime Journal (/run/log/journal/ac21d2dbc3064c8eae80e379d8bfb066) is 8M, max 76.5M, 68.5M free.
Sep 9 23:55:23.821037 systemd-modules-load[246]: Inserted module 'overlay'
Sep 9 23:55:23.845578 systemd-modules-load[246]: Inserted module 'br_netfilter'
Sep 9 23:55:23.873102 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 23:55:23.875185 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 9 23:55:23.881004 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:55:23.889063 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 23:55:23.894115 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 23:55:23.898944 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 23:55:23.902986 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 9 23:55:23.911535 systemd-tmpfiles[273]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 9 23:55:23.916897 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:55:23.919049 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:55:23.923216 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 23:55:23.934154 dracut-cmdline[280]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=fc7b279c2d918629032c01551b74c66c198cf923a976f9b3bc0d959e7c2302db
Sep 9 23:55:23.976233 systemd-resolved[288]: Positive Trust Anchors:
Sep 9 23:55:23.976249 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 23:55:23.976336 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 23:55:23.981859 systemd-resolved[288]: Defaulting to hostname 'linux'.
Sep 9 23:55:23.983078 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 23:55:23.985550 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 23:55:24.036887 kernel: SCSI subsystem initialized
Sep 9 23:55:24.041980 kernel: Loading iSCSI transport class v2.0-870.
Sep 9 23:55:24.049889 kernel: iscsi: registered transport (tcp)
Sep 9 23:55:24.063884 kernel: iscsi: registered transport (qla4xxx)
Sep 9 23:55:24.063970 kernel: QLogic iSCSI HBA Driver
Sep 9 23:55:24.086311 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 23:55:24.122439 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:55:24.125257 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 23:55:24.181221 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 9 23:55:24.183239 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 9 23:55:24.246876 kernel: raid6: neonx8 gen() 15657 MB/s
Sep 9 23:55:24.262896 kernel: raid6: neonx4 gen() 12921 MB/s
Sep 9 23:55:24.279888 kernel: raid6: neonx2 gen() 13148 MB/s
Sep 9 23:55:24.296885 kernel: raid6: neonx1 gen() 10435 MB/s
Sep 9 23:55:24.314013 kernel: raid6: int64x8 gen() 6874 MB/s
Sep 9 23:55:24.330888 kernel: raid6: int64x4 gen() 7313 MB/s
Sep 9 23:55:24.347878 kernel: raid6: int64x2 gen() 6079 MB/s
Sep 9 23:55:24.364887 kernel: raid6: int64x1 gen() 5022 MB/s
Sep 9 23:55:24.364943 kernel: raid6: using algorithm neonx8 gen() 15657 MB/s
Sep 9 23:55:24.381890 kernel: raid6: .... xor() 11964 MB/s, rmw enabled
Sep 9 23:55:24.381962 kernel: raid6: using neon recovery algorithm
Sep 9 23:55:24.386938 kernel: xor: measuring software checksum speed
Sep 9 23:55:24.386990 kernel: 8regs : 20029 MB/sec
Sep 9 23:55:24.388232 kernel: 32regs : 21681 MB/sec
Sep 9 23:55:24.388263 kernel: arm64_neon : 27965 MB/sec
Sep 9 23:55:24.388300 kernel: xor: using function: arm64_neon (27965 MB/sec)
Sep 9 23:55:24.442921 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 9 23:55:24.453680 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 23:55:24.456625 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:55:24.479080 systemd-udevd[493]: Using default interface naming scheme 'v255'.
Sep 9 23:55:24.483663 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:55:24.489883 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 9 23:55:24.521028 dracut-pre-trigger[503]: rd.md=0: removing MD RAID activation
Sep 9 23:55:24.556491 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 23:55:24.561028 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 23:55:24.626899 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:55:24.630456 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 9 23:55:24.757012 kernel: ACPI: bus type USB registered
Sep 9 23:55:24.757076 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Sep 9 23:55:24.759009 kernel: scsi host0: Virtio SCSI HBA
Sep 9 23:55:24.760333 kernel: usbcore: registered new interface driver usbfs
Sep 9 23:55:24.760389 kernel: usbcore: registered new interface driver hub
Sep 9 23:55:24.760399 kernel: usbcore: registered new device driver usb
Sep 9 23:55:24.775861 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 9 23:55:24.776850 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 9 23:55:24.785203 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 23:55:24.786048 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:55:24.788519 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:55:24.792444 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:55:24.809405 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 9 23:55:24.810595 kernel: sr 0:0:0:0: Power-on or device reset occurred
Sep 9 23:55:24.810743 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Sep 9 23:55:24.810823 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 9 23:55:24.812384 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 9 23:55:24.812568 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 9 23:55:24.812679 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 9 23:55:24.815918 kernel: sd 0:0:0:1: Power-on or device reset occurred
Sep 9 23:55:24.817966 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 9 23:55:24.818143 kernel: sd 0:0:0:1: [sda] Write Protect is off
Sep 9 23:55:24.818219 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Sep 9 23:55:24.819860 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 9 23:55:24.823367 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 9 23:55:24.823594 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 9 23:55:24.825896 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 9 23:55:24.827035 kernel: hub 1-0:1.0: USB hub found
Sep 9 23:55:24.833127 kernel: hub 1-0:1.0: 4 ports detected
Sep 9 23:55:24.833384 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 9 23:55:24.833397 kernel: GPT:17805311 != 80003071
Sep 9 23:55:24.833406 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 9 23:55:24.831819 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:55:24.835055 kernel: GPT:17805311 != 80003071
Sep 9 23:55:24.835091 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 9 23:55:24.837245 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 9 23:55:24.837324 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 9 23:55:24.837335 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Sep 9 23:55:24.838859 kernel: hub 2-0:1.0: USB hub found
Sep 9 23:55:24.839124 kernel: hub 2-0:1.0: 4 ports detected
Sep 9 23:55:24.908213 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Sep 9 23:55:24.929272 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Sep 9 23:55:24.938091 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Sep 9 23:55:24.939010 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Sep 9 23:55:24.953642 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 9 23:55:24.954925 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 9 23:55:24.961889 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 23:55:24.962561 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:55:24.964482 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 23:55:24.967989 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 9 23:55:24.971671 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 9 23:55:24.990452 disk-uuid[601]: Primary Header is updated.
Sep 9 23:55:24.990452 disk-uuid[601]: Secondary Entries is updated.
Sep 9 23:55:24.990452 disk-uuid[601]: Secondary Header is updated.
Sep 9 23:55:25.002363 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 23:55:25.006871 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 9 23:55:25.072886 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Sep 9 23:55:25.204713 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Sep 9 23:55:25.205024 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Sep 9 23:55:25.205363 kernel: usbcore: registered new interface driver usbhid
Sep 9 23:55:25.205900 kernel: usbhid: USB HID core driver
Sep 9 23:55:25.309939 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Sep 9 23:55:25.437871 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Sep 9 23:55:25.489868 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Sep 9 23:55:26.028402 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 9 23:55:26.028458 disk-uuid[602]: The operation has completed successfully.
Sep 9 23:55:26.120185 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 9 23:55:26.121886 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 9 23:55:26.131259 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 9 23:55:26.155128 sh[625]: Success
Sep 9 23:55:26.173371 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 9 23:55:26.173450 kernel: device-mapper: uevent: version 1.0.3
Sep 9 23:55:26.173481 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 9 23:55:26.184330 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 9 23:55:26.243465 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 9 23:55:26.247090 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 9 23:55:26.271785 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 9 23:55:26.283888 kernel: BTRFS: device fsid 2bc16190-0dd5-44d6-b331-3d703f5a1d1f devid 1 transid 40 /dev/mapper/usr (254:0) scanned by mount (637)
Sep 9 23:55:26.286874 kernel: BTRFS info (device dm-0): first mount of filesystem 2bc16190-0dd5-44d6-b331-3d703f5a1d1f
Sep 9 23:55:26.286950 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:55:26.295944 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 9 23:55:26.296030 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 9 23:55:26.296059 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 9 23:55:26.298164 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 9 23:55:26.299294 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 23:55:26.300062 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 9 23:55:26.300955 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 9 23:55:26.304131 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 9 23:55:26.341866 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (669)
Sep 9 23:55:26.343435 kernel: BTRFS info (device sda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:55:26.343502 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:55:26.348398 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 9 23:55:26.348462 kernel: BTRFS info (device sda6): turning on async discard
Sep 9 23:55:26.348473 kernel: BTRFS info (device sda6): enabling free space tree
Sep 9 23:55:26.353997 kernel: BTRFS info (device sda6): last unmount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:55:26.356043 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 9 23:55:26.358731 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 9 23:55:26.456858 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 23:55:26.460054 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 23:55:26.497869 systemd-networkd[809]: lo: Link UP
Sep 9 23:55:26.497883 systemd-networkd[809]: lo: Gained carrier
Sep 9 23:55:26.499937 systemd-networkd[809]: Enumeration completed
Sep 9 23:55:26.500072 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 23:55:26.500816 systemd[1]: Reached target network.target - Network.
Sep 9 23:55:26.504136 systemd-networkd[809]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:55:26.504140 systemd-networkd[809]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 23:55:26.505589 systemd-networkd[809]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:55:26.505592 systemd-networkd[809]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 23:55:26.506192 systemd-networkd[809]: eth0: Link UP
Sep 9 23:55:26.506346 systemd-networkd[809]: eth1: Link UP
Sep 9 23:55:26.506480 systemd-networkd[809]: eth0: Gained carrier
Sep 9 23:55:26.506494 systemd-networkd[809]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:55:26.516336 systemd-networkd[809]: eth1: Gained carrier
Sep 9 23:55:26.516352 systemd-networkd[809]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:55:26.537504 ignition[719]: Ignition 2.21.0
Sep 9 23:55:26.537523 ignition[719]: Stage: fetch-offline
Sep 9 23:55:26.537577 ignition[719]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:55:26.537585 ignition[719]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 23:55:26.540926 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 23:55:26.537762 ignition[719]: parsed url from cmdline: ""
Sep 9 23:55:26.543085 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 9 23:55:26.537766 ignition[719]: no config URL provided
Sep 9 23:55:26.537771 ignition[719]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 23:55:26.537777 ignition[719]: no config at "/usr/lib/ignition/user.ign"
Sep 9 23:55:26.537783 ignition[719]: failed to fetch config: resource requires networking
Sep 9 23:55:26.538089 ignition[719]: Ignition finished successfully
Sep 9 23:55:26.559946 systemd-networkd[809]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 9 23:55:26.565930 systemd-networkd[809]: eth0: DHCPv4 address 91.99.154.191/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 9 23:55:26.576438 ignition[820]: Ignition 2.21.0
Sep 9 23:55:26.576457 ignition[820]: Stage: fetch
Sep 9 23:55:26.576623 ignition[820]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:55:26.576636 ignition[820]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 23:55:26.576913 ignition[820]: parsed url from cmdline: ""
Sep 9 23:55:26.576917 ignition[820]: no config URL provided
Sep 9 23:55:26.576921 ignition[820]: reading system config file "/usr/lib/ignition/user.ign"
Sep 9 23:55:26.576929 ignition[820]: no config at "/usr/lib/ignition/user.ign"
Sep 9 23:55:26.577041 ignition[820]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Sep 9 23:55:26.583949 ignition[820]: GET result: OK
Sep 9 23:55:26.584075 ignition[820]: parsing config with SHA512: 619ba1372e0db5e3bad2d9fd10dfda58ba30ce6dea75aabe4706e3edc688d44ac2732078b4387ceea304a88bc62191518e32585f9de4aff9e31bb06cbe9e7f26
Sep 9 23:55:26.590500 unknown[820]: fetched base config from "system"
Sep 9 23:55:26.591098 unknown[820]: fetched base config from "system"
Sep 9 23:55:26.591505 ignition[820]: fetch: fetch complete
Sep 9 23:55:26.591105 unknown[820]: fetched user config from "hetzner"
Sep 9 23:55:26.591511 ignition[820]: fetch: fetch passed
Sep 9 23:55:26.591577 ignition[820]: Ignition finished successfully
Sep 9 23:55:26.594910 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 9 23:55:26.596590 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 9 23:55:26.632555 ignition[827]: Ignition 2.21.0
Sep 9 23:55:26.632575 ignition[827]: Stage: kargs
Sep 9 23:55:26.632724 ignition[827]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:55:26.632733 ignition[827]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 23:55:26.638770 ignition[827]: kargs: kargs passed
Sep 9 23:55:26.639450 ignition[827]: Ignition finished successfully
Sep 9 23:55:26.641099 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 9 23:55:26.643959 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 9 23:55:26.678040 ignition[834]: Ignition 2.21.0
Sep 9 23:55:26.678061 ignition[834]: Stage: disks
Sep 9 23:55:26.678510 ignition[834]: no configs at "/usr/lib/ignition/base.d"
Sep 9 23:55:26.678528 ignition[834]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 23:55:26.680684 ignition[834]: disks: disks passed
Sep 9 23:55:26.680808 ignition[834]: Ignition finished successfully
Sep 9 23:55:26.684902 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 9 23:55:26.686419 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 9 23:55:26.687256 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 9 23:55:26.687982 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 23:55:26.688646 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 23:55:26.689971 systemd[1]: Reached target basic.target - Basic System.
Sep 9 23:55:26.692828 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 9 23:55:26.727093 systemd-fsck[842]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Sep 9 23:55:26.733368 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 9 23:55:26.738315 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 9 23:55:26.815890 kernel: EXT4-fs (sda9): mounted filesystem 7cc0d7f3-e4a1-4dc4-8b58-ceece0d874c1 r/w with ordered data mode. Quota mode: none.
Sep 9 23:55:26.818228 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 9 23:55:26.821946 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 9 23:55:26.826102 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 23:55:26.830899 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 9 23:55:26.839306 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 9 23:55:26.843109 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 9 23:55:26.843178 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 23:55:26.859920 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (850)
Sep 9 23:55:26.863866 kernel: BTRFS info (device sda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:55:26.863987 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:55:26.865747 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 9 23:55:26.875055 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 9 23:55:26.888219 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 9 23:55:26.888250 kernel: BTRFS info (device sda6): turning on async discard
Sep 9 23:55:26.888264 kernel: BTRFS info (device sda6): enabling free space tree
Sep 9 23:55:26.890555 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 23:55:26.920500 coreos-metadata[852]: Sep 09 23:55:26.920 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Sep 9 23:55:26.922751 coreos-metadata[852]: Sep 09 23:55:26.922 INFO Fetch successful
Sep 9 23:55:26.923726 coreos-metadata[852]: Sep 09 23:55:26.923 INFO wrote hostname ci-4426-0-0-n-d8dd570c6c to /sysroot/etc/hostname
Sep 9 23:55:26.931510 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 9 23:55:26.938172 initrd-setup-root[878]: cut: /sysroot/etc/passwd: No such file or directory
Sep 9 23:55:26.945673 initrd-setup-root[885]: cut: /sysroot/etc/group: No such file or directory
Sep 9 23:55:26.951465 initrd-setup-root[892]: cut: /sysroot/etc/shadow: No such file or directory
Sep 9 23:55:26.956985 initrd-setup-root[899]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 9 23:55:27.069457 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 9 23:55:27.072633 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 9 23:55:27.074831 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 9 23:55:27.099922 kernel: BTRFS info (device sda6): last unmount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:55:27.119939 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 9 23:55:27.129956 ignition[967]: INFO : Ignition 2.21.0
Sep 9 23:55:27.129956 ignition[967]: INFO : Stage: mount
Sep 9 23:55:27.132106 ignition[967]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:55:27.132106 ignition[967]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 23:55:27.132106 ignition[967]: INFO : mount: mount passed
Sep 9 23:55:27.132106 ignition[967]: INFO : Ignition finished successfully
Sep 9 23:55:27.135733 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 9 23:55:27.138191 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 9 23:55:27.285687 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 9 23:55:27.287607 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 9 23:55:27.310375 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (979)
Sep 9 23:55:27.310436 kernel: BTRFS info (device sda6): first mount of filesystem 3a7d3e29-58a5-4f0c-ac69-b528108338f5
Sep 9 23:55:27.311861 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 9 23:55:27.317026 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 9 23:55:27.317092 kernel: BTRFS info (device sda6): turning on async discard
Sep 9 23:55:27.317103 kernel: BTRFS info (device sda6): enabling free space tree
Sep 9 23:55:27.319824 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 9 23:55:27.359245 ignition[996]: INFO : Ignition 2.21.0
Sep 9 23:55:27.359245 ignition[996]: INFO : Stage: files
Sep 9 23:55:27.359245 ignition[996]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:55:27.361987 ignition[996]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 23:55:27.363652 ignition[996]: DEBUG : files: compiled without relabeling support, skipping
Sep 9 23:55:27.366049 ignition[996]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 9 23:55:27.366049 ignition[996]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 9 23:55:27.370142 ignition[996]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 9 23:55:27.372208 ignition[996]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 9 23:55:27.374163 unknown[996]: wrote ssh authorized keys file for user: core
Sep 9 23:55:27.376167 ignition[996]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 9 23:55:27.377946 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 23:55:27.379156 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1
Sep 9 23:55:27.428863 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 9 23:55:27.670377 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz"
Sep 9 23:55:27.670377 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 9 23:55:27.670377 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 9 23:55:27.670377 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 23:55:27.670377 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 9 23:55:27.670377 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 23:55:27.670377 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 9 23:55:27.670377 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 23:55:27.682361 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 9 23:55:27.682361 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 23:55:27.682361 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 9 23:55:27.682361 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:55:27.686812 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:55:27.686812 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:55:27.686812 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1
Sep 9 23:55:27.957592 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 9 23:55:28.008113 systemd-networkd[809]: eth1: Gained IPv6LL
Sep 9 23:55:28.173420 ignition[996]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw"
Sep 9 23:55:28.173420 ignition[996]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 9 23:55:28.176857 ignition[996]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 23:55:28.180185 ignition[996]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 9 23:55:28.180185 ignition[996]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 9 23:55:28.180185 ignition[996]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 9 23:55:28.180185 ignition[996]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 9 23:55:28.189488 ignition[996]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 9 23:55:28.189488 ignition[996]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 9 23:55:28.189488 ignition[996]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Sep 9 23:55:28.189488 ignition[996]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Sep 9 23:55:28.189488 ignition[996]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 23:55:28.189488 ignition[996]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 9 23:55:28.189488 ignition[996]: INFO : files: files passed
Sep 9 23:55:28.189488 ignition[996]: INFO : Ignition finished successfully
Sep 9 23:55:28.186060 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 9 23:55:28.191235 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 9 23:55:28.196180 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 9 23:55:28.210447 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 9 23:55:28.211114 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 9 23:55:28.219881 initrd-setup-root-after-ignition[1025]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:55:28.219881 initrd-setup-root-after-ignition[1025]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:55:28.222992 initrd-setup-root-after-ignition[1029]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 9 23:55:28.225711 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 23:55:28.227553 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 9 23:55:28.229117 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 9 23:55:28.291876 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 9 23:55:28.292150 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 9 23:55:28.294665 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 9 23:55:28.295629 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 9 23:55:28.296815 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 9 23:55:28.297788 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 9 23:55:28.324516 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 23:55:28.327756 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 9 23:55:28.353437 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 9 23:55:28.354993 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:55:28.355820 systemd[1]: Stopped target timers.target - Timer Units.
Sep 9 23:55:28.357031 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 9 23:55:28.357209 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 9 23:55:28.358831 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 9 23:55:28.360208 systemd[1]: Stopped target basic.target - Basic System.
Sep 9 23:55:28.361065 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 9 23:55:28.362036 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 9 23:55:28.363053 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 9 23:55:28.364066 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 9 23:55:28.365066 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 9 23:55:28.366031 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 9 23:55:28.367022 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 9 23:55:28.368006 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 9 23:55:28.368922 systemd[1]: Stopped target swap.target - Swaps.
Sep 9 23:55:28.369726 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 9 23:55:28.369924 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 9 23:55:28.371160 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:55:28.372288 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:55:28.373318 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 9 23:55:28.374915 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:55:28.375715 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 9 23:55:28.375920 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 9 23:55:28.377489 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 9 23:55:28.377643 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 9 23:55:28.378904 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 9 23:55:28.379049 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 9 23:55:28.380135 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 9 23:55:28.380297 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 9 23:55:28.384109 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 9 23:55:28.384645 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 9 23:55:28.384825 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:55:28.388012 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 9 23:55:28.388550 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 9 23:55:28.388685 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:55:28.393769 systemd-networkd[809]: eth0: Gained IPv6LL
Sep 9 23:55:28.394761 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 9 23:55:28.394899 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 9 23:55:28.402099 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 9 23:55:28.403117 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 9 23:55:28.418562 ignition[1049]: INFO : Ignition 2.21.0
Sep 9 23:55:28.418562 ignition[1049]: INFO : Stage: umount
Sep 9 23:55:28.418562 ignition[1049]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 9 23:55:28.418562 ignition[1049]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 9 23:55:28.418562 ignition[1049]: INFO : umount: umount passed
Sep 9 23:55:28.418562 ignition[1049]: INFO : Ignition finished successfully
Sep 9 23:55:28.418949 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 9 23:55:28.423978 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 9 23:55:28.424965 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 9 23:55:28.428528 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 9 23:55:28.428650 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 9 23:55:28.431859 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 9 23:55:28.431925 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 9 23:55:28.434830 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 9 23:55:28.434964 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 9 23:55:28.435820 systemd[1]: Stopped target network.target - Network.
Sep 9 23:55:28.438501 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 9 23:55:28.438579 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 9 23:55:28.444666 systemd[1]: Stopped target paths.target - Path Units.
Sep 9 23:55:28.446536 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 9 23:55:28.449906 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:55:28.454658 systemd[1]: Stopped target slices.target - Slice Units.
Sep 9 23:55:28.455514 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 9 23:55:28.457540 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 9 23:55:28.457606 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 9 23:55:28.458693 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 9 23:55:28.458736 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 9 23:55:28.460006 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 9 23:55:28.460087 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 9 23:55:28.461463 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 9 23:55:28.461565 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 9 23:55:28.462569 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 9 23:55:28.463597 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 9 23:55:28.465462 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 9 23:55:28.466899 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 9 23:55:28.469526 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 9 23:55:28.469639 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 9 23:55:28.471352 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 9 23:55:28.471488 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 9 23:55:28.476408 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 9 23:55:28.476697 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 9 23:55:28.476740 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:55:28.479453 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 9 23:55:28.482970 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 9 23:55:28.483104 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 9 23:55:28.486731 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 9 23:55:28.487575 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 9 23:55:28.489195 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 9 23:55:28.489239 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:55:28.491637 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 9 23:55:28.493968 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 9 23:55:28.494055 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 9 23:55:28.495310 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 9 23:55:28.495361 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:55:28.499197 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 9 23:55:28.499256 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:55:28.499904 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:55:28.506711 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 9 23:55:28.523624 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 9 23:55:28.523902 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:55:28.527522 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 9 23:55:28.527625 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 9 23:55:28.531192 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 9 23:55:28.531330 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:55:28.532169 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 9 23:55:28.532218 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:55:28.533854 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 9 23:55:28.533918 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 9 23:55:28.536469 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 9 23:55:28.536533 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 9 23:55:28.538485 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 9 23:55:28.538547 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 9 23:55:28.541259 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 9 23:55:28.543979 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 9 23:55:28.544070 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:55:28.548036 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 9 23:55:28.548105 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:55:28.552058 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 23:55:28.552126 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:55:28.563559 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 9 23:55:28.563732 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 9 23:55:28.565153 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 9 23:55:28.568588 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 9 23:55:28.591480 systemd[1]: Switching root.
Sep 9 23:55:28.642443 systemd-journald[244]: Journal stopped
Sep 9 23:55:29.648889 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Sep 9 23:55:29.648973 kernel: SELinux: policy capability network_peer_controls=1
Sep 9 23:55:29.648986 kernel: SELinux: policy capability open_perms=1
Sep 9 23:55:29.648994 kernel: SELinux: policy capability extended_socket_class=1
Sep 9 23:55:29.649004 kernel: SELinux: policy capability always_check_network=0
Sep 9 23:55:29.649012 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 9 23:55:29.649022 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 9 23:55:29.649035 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 9 23:55:29.649044 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 9 23:55:29.649053 kernel: SELinux: policy capability userspace_initial_context=0
Sep 9 23:55:29.649062 kernel: audit: type=1403 audit(1757462128.797:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 9 23:55:29.649072 systemd[1]: Successfully loaded SELinux policy in 69.177ms.
Sep 9 23:55:29.649092 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.056ms.
Sep 9 23:55:29.649107 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 9 23:55:29.649118 systemd[1]: Detected virtualization kvm.
Sep 9 23:55:29.649128 systemd[1]: Detected architecture arm64.
Sep 9 23:55:29.649138 systemd[1]: Detected first boot.
Sep 9 23:55:29.649150 systemd[1]: Hostname set to .
Sep 9 23:55:29.649160 systemd[1]: Initializing machine ID from VM UUID.
Sep 9 23:55:29.649170 zram_generator::config[1093]: No configuration found.
Sep 9 23:55:29.649180 kernel: NET: Registered PF_VSOCK protocol family
Sep 9 23:55:29.649190 systemd[1]: Populated /etc with preset unit settings.
Sep 9 23:55:29.649201 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 9 23:55:29.649214 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 9 23:55:29.649225 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 9 23:55:29.649235 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 9 23:55:29.649245 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 9 23:55:29.649255 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 9 23:55:29.649282 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 9 23:55:29.649296 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 9 23:55:29.649306 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 9 23:55:29.649316 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 9 23:55:29.649326 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 9 23:55:29.649339 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 9 23:55:29.649353 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 9 23:55:29.649363 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 9 23:55:29.649373 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 9 23:55:29.649383 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 9 23:55:29.649394 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 9 23:55:29.649404 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 9 23:55:29.649414 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 9 23:55:29.649424 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 9 23:55:29.649435 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 9 23:55:29.649444 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 9 23:55:29.649456 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 9 23:55:29.649466 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 9 23:55:29.649475 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 9 23:55:29.649485 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 9 23:55:29.649495 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 9 23:55:29.649505 systemd[1]: Reached target slices.target - Slice Units.
Sep 9 23:55:29.649515 systemd[1]: Reached target swap.target - Swaps.
Sep 9 23:55:29.649525 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 9 23:55:29.649537 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 9 23:55:29.649547 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 9 23:55:29.649559 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 9 23:55:29.649570 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 9 23:55:29.649580 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 9 23:55:29.649590 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 9 23:55:29.649600 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 9 23:55:29.649610 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 9 23:55:29.649619 systemd[1]: Mounting media.mount - External Media Directory...
Sep 9 23:55:29.649629 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 9 23:55:29.649639 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 9 23:55:29.649650 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 9 23:55:29.649660 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 9 23:55:29.649670 systemd[1]: Reached target machines.target - Containers.
Sep 9 23:55:29.649680 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 9 23:55:29.649690 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:55:29.649700 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 9 23:55:29.649710 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 9 23:55:29.649722 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 23:55:29.649734 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 23:55:29.649744 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 23:55:29.649754 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 9 23:55:29.649765 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 23:55:29.649776 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 9 23:55:29.649787 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 9 23:55:29.649797 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 9 23:55:29.649807 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 9 23:55:29.649818 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 9 23:55:29.649829 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:55:29.653395 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 9 23:55:29.653428 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 9 23:55:29.653441 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 9 23:55:29.653453 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 9 23:55:29.653463 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 9 23:55:29.653473 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 9 23:55:29.653483 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 9 23:55:29.653493 systemd[1]: Stopped verity-setup.service.
Sep 9 23:55:29.653503 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 9 23:55:29.653513 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 9 23:55:29.653525 systemd[1]: Mounted media.mount - External Media Directory.
Sep 9 23:55:29.653536 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 9 23:55:29.653546 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 9 23:55:29.653556 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 9 23:55:29.653565 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 9 23:55:29.653575 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 9 23:55:29.653587 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 9 23:55:29.653597 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 23:55:29.653607 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 23:55:29.653617 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 23:55:29.653628 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 23:55:29.653638 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 9 23:55:29.653648 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 9 23:55:29.653658 kernel: loop: module loaded
Sep 9 23:55:29.653670 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 9 23:55:29.653681 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 23:55:29.653691 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 23:55:29.653701 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 9 23:55:29.653711 kernel: fuse: init (API version 7.41)
Sep 9 23:55:29.653720 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 9 23:55:29.653731 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 9 23:55:29.653742 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 9 23:55:29.653752 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 9 23:55:29.653762 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 9 23:55:29.653774 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:55:29.653822 systemd-journald[1161]: Collecting audit messages is disabled.
Sep 9 23:55:29.655202 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 9 23:55:29.655236 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 23:55:29.655248 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 9 23:55:29.655307 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 23:55:29.655327 systemd-journald[1161]: Journal started
Sep 9 23:55:29.655354 systemd-journald[1161]: Runtime Journal (/run/log/journal/ac21d2dbc3064c8eae80e379d8bfb066) is 8M, max 76.5M, 68.5M free.
Sep 9 23:55:29.667293 kernel: ACPI: bus type drm_connector registered
Sep 9 23:55:29.667369 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 9 23:55:29.306043 systemd[1]: Queued start job for default target multi-user.target.
Sep 9 23:55:29.332081 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 9 23:55:29.332928 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 9 23:55:29.681339 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 9 23:55:29.681472 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 9 23:55:29.680629 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 9 23:55:29.683900 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 23:55:29.686976 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 23:55:29.689392 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 9 23:55:29.689585 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 9 23:55:29.691046 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 9 23:55:29.692421 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 9 23:55:29.695889 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 9 23:55:29.733536 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 9 23:55:29.738048 kernel: loop0: detected capacity change from 0 to 100608
Sep 9 23:55:29.739232 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 9 23:55:29.748010 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 9 23:55:29.752710 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 9 23:55:29.760643 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 9 23:55:29.764875 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 9 23:55:29.787898 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 9 23:55:29.800631 systemd-journald[1161]: Time spent on flushing to /var/log/journal/ac21d2dbc3064c8eae80e379d8bfb066 is 38.765ms for 1172 entries.
Sep 9 23:55:29.800631 systemd-journald[1161]: System Journal (/var/log/journal/ac21d2dbc3064c8eae80e379d8bfb066) is 8M, max 584.8M, 576.8M free.
Sep 9 23:55:29.868602 systemd-journald[1161]: Received client request to flush runtime journal.
Sep 9 23:55:29.869138 kernel: loop1: detected capacity change from 0 to 207008
Sep 9 23:55:29.869531 kernel: loop2: detected capacity change from 0 to 119320
Sep 9 23:55:29.825602 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 9 23:55:29.838371 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 9 23:55:29.844761 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 9 23:55:29.874231 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 9 23:55:29.889548 systemd-tmpfiles[1226]: ACLs are not supported, ignoring.
Sep 9 23:55:29.889569 systemd-tmpfiles[1226]: ACLs are not supported, ignoring.
Sep 9 23:55:29.899356 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 9 23:55:29.912797 kernel: loop3: detected capacity change from 0 to 8
Sep 9 23:55:29.940862 kernel: loop4: detected capacity change from 0 to 100608
Sep 9 23:55:29.960905 kernel: loop5: detected capacity change from 0 to 207008
Sep 9 23:55:29.983902 kernel: loop6: detected capacity change from 0 to 119320
Sep 9 23:55:30.002154 kernel: loop7: detected capacity change from 0 to 8
Sep 9 23:55:30.002933 (sd-merge)[1233]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 9 23:55:30.003756 (sd-merge)[1233]: Merged extensions into '/usr'.
Sep 9 23:55:30.013898 systemd[1]: Reload requested from client PID 1192 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 9 23:55:30.013917 systemd[1]: Reloading...
Sep 9 23:55:30.165892 zram_generator::config[1265]: No configuration found.
Sep 9 23:55:30.248860 ldconfig[1185]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 9 23:55:30.377038 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 9 23:55:30.377366 systemd[1]: Reloading finished in 362 ms.
Sep 9 23:55:30.392556 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 9 23:55:30.394880 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 9 23:55:30.401243 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 9 23:55:30.412100 systemd[1]: Starting ensure-sysext.service...
Sep 9 23:55:30.415093 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 9 23:55:30.425691 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 9 23:55:30.449431 systemd[1]: Reload requested from client PID 1297 ('systemctl') (unit ensure-sysext.service)...
Sep 9 23:55:30.449451 systemd[1]: Reloading...
Sep 9 23:55:30.461781 systemd-tmpfiles[1298]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 9 23:55:30.462948 systemd-tmpfiles[1298]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 9 23:55:30.463277 systemd-tmpfiles[1298]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 9 23:55:30.463475 systemd-tmpfiles[1298]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 9 23:55:30.464156 systemd-tmpfiles[1298]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 9 23:55:30.464371 systemd-tmpfiles[1298]: ACLs are not supported, ignoring.
Sep 9 23:55:30.464416 systemd-tmpfiles[1298]: ACLs are not supported, ignoring.
Sep 9 23:55:30.471739 systemd-tmpfiles[1298]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 23:55:30.471915 systemd-tmpfiles[1298]: Skipping /boot
Sep 9 23:55:30.481486 systemd-tmpfiles[1298]: Detected autofs mount point /boot during canonicalization of boot.
Sep 9 23:55:30.481648 systemd-tmpfiles[1298]: Skipping /boot
Sep 9 23:55:30.541940 zram_generator::config[1326]: No configuration found.
Sep 9 23:55:30.697038 systemd[1]: Reloading finished in 247 ms.
Sep 9 23:55:30.708966 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 9 23:55:30.730962 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 9 23:55:30.741781 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 23:55:30.748418 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 9 23:55:30.752230 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 9 23:55:30.758132 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 9 23:55:30.765819 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 9 23:55:30.774519 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 9 23:55:30.779154 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:55:30.781278 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 9 23:55:30.789281 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 9 23:55:30.802126 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 9 23:55:30.812740 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:55:30.812919 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:55:30.815763 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:55:30.815940 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:55:30.816025 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:55:30.824375 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 9 23:55:30.835462 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 9 23:55:30.840464 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 9 23:55:30.841971 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 9 23:55:30.844047 systemd[1]: Finished ensure-sysext.service.
Sep 9 23:55:30.852211 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 9 23:55:30.854620 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 9 23:55:30.857044 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 9 23:55:30.857111 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 9 23:55:30.860408 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 9 23:55:30.861820 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 9 23:55:30.863244 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 9 23:55:30.863975 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 9 23:55:30.866453 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 9 23:55:30.866631 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 9 23:55:30.868783 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 9 23:55:30.871002 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 9 23:55:30.873326 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 9 23:55:30.886653 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 9 23:55:30.887971 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 9 23:55:30.902021 systemd-udevd[1370]: Using default interface naming scheme 'v255'.
Sep 9 23:55:30.920521 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 9 23:55:30.937812 augenrules[1406]: No rules
Sep 9 23:55:30.939430 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 23:55:30.939701 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 23:55:30.942097 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 9 23:55:30.943447 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 9 23:55:30.946217 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 9 23:55:30.951603 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 9 23:55:30.957093 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 9 23:55:31.108472 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 9 23:55:31.199723 systemd-resolved[1368]: Positive Trust Anchors:
Sep 9 23:55:31.200887 systemd-resolved[1368]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 9 23:55:31.201169 systemd-resolved[1368]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 9 23:55:31.205775 systemd-networkd[1423]: lo: Link UP
Sep 9 23:55:31.205797 systemd-networkd[1423]: lo: Gained carrier
Sep 9 23:55:31.206956 systemd-networkd[1423]: Enumeration completed
Sep 9 23:55:31.207069 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 9 23:55:31.209707 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 9 23:55:31.212571 systemd-resolved[1368]: Using system hostname 'ci-4426-0-0-n-d8dd570c6c'.
Sep 9 23:55:31.216043 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 9 23:55:31.216861 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 9 23:55:31.217998 systemd[1]: Reached target network.target - Network.
Sep 9 23:55:31.218911 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 9 23:55:31.244713 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 9 23:55:31.245600 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 9 23:55:31.246417 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 9 23:55:31.247119 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 9 23:55:31.247821 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 9 23:55:31.248524 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 9 23:55:31.248555 systemd[1]: Reached target paths.target - Path Units.
Sep 9 23:55:31.249091 systemd[1]: Reached target time-set.target - System Time Set.
Sep 9 23:55:31.249786 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 9 23:55:31.250801 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 9 23:55:31.251616 systemd[1]: Reached target timers.target - Timer Units.
Sep 9 23:55:31.254919 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 9 23:55:31.257995 systemd-networkd[1423]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:55:31.258006 systemd-networkd[1423]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 23:55:31.258778 systemd-networkd[1423]: eth1: Link UP
Sep 9 23:55:31.258911 systemd-networkd[1423]: eth1: Gained carrier
Sep 9 23:55:31.258931 systemd-networkd[1423]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:55:31.266858 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 9 23:55:31.269782 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 9 23:55:31.271570 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 9 23:55:31.272515 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 9 23:55:31.276110 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 9 23:55:31.277298 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 9 23:55:31.278817 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 9 23:55:31.279648 systemd[1]: Reached target sockets.target - Socket Units.
Sep 9 23:55:31.280296 systemd[1]: Reached target basic.target - Basic System.
Sep 9 23:55:31.280869 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 9 23:55:31.280897 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 9 23:55:31.283999 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 9 23:55:31.286072 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 9 23:55:31.293067 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 9 23:55:31.295453 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 9 23:55:31.298457 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 9 23:55:31.311217 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 9 23:55:31.312917 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 9 23:55:31.314275 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 9 23:55:31.315897 systemd-networkd[1423]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 9 23:55:31.316570 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Sep 9 23:55:31.317649 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 9 23:55:31.320738 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 9 23:55:31.324105 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 9 23:55:31.329941 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 9 23:55:31.331535 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 9 23:55:31.332392 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 9 23:55:31.340827 systemd[1]: Starting update-engine.service - Update Engine...
Sep 9 23:55:31.342649 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 9 23:55:31.345908 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 9 23:55:31.354197 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 9 23:55:31.362693 jq[1474]: false
Sep 9 23:55:31.365404 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 9 23:55:31.365902 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 9 23:55:31.370828 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 9 23:55:31.371067 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 9 23:55:31.379083 jq[1483]: true
Sep 9 23:55:31.394456 systemd-networkd[1423]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:55:31.394471 systemd-networkd[1423]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 9 23:55:31.396381 systemd-networkd[1423]: eth0: Link UP
Sep 9 23:55:31.396880 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Sep 9 23:55:31.397874 systemd-networkd[1423]: eth0: Gained carrier
Sep 9 23:55:31.397905 systemd-networkd[1423]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 9 23:55:31.405996 dbus-daemon[1472]: [system] SELinux support is enabled
Sep 9 23:55:31.406278 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 9 23:55:31.410188 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 9 23:55:31.410227 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 9 23:55:31.412059 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 9 23:55:31.412087 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 9 23:55:31.425586 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Sep 9 23:55:31.427427 update_engine[1482]: I20250909 23:55:31.426573 1482 main.cc:92] Flatcar Update Engine starting
Sep 9 23:55:31.435801 jq[1493]: true
Sep 9 23:55:31.438456 systemd[1]: Started update-engine.service - Update Engine.
Sep 9 23:55:31.443720 update_engine[1482]: I20250909 23:55:31.441085 1482 update_check_scheduler.cc:74] Next update check in 9m39s
Sep 9 23:55:31.464349 coreos-metadata[1471]: Sep 09 23:55:31.464 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Sep 9 23:55:31.466013 coreos-metadata[1471]: Sep 09 23:55:31.465 INFO Failed to fetch: error sending request for url (http://169.254.169.254/hetzner/v1/metadata)
Sep 9 23:55:31.466480 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 9 23:55:31.473227 extend-filesystems[1475]: Found /dev/sda6
Sep 9 23:55:31.469017 systemd-networkd[1423]: eth0: DHCPv4 address 91.99.154.191/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 9 23:55:31.469411 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Sep 9 23:55:31.469415 (ntainerd)[1499]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 9 23:55:31.469759 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Sep 9 23:55:31.504548 extend-filesystems[1475]: Found /dev/sda9
Sep 9 23:55:31.509822 extend-filesystems[1475]: Checking size of /dev/sda9
Sep 9 23:55:31.512393 systemd[1]: motdgen.service: Deactivated successfully.
Sep 9 23:55:31.514805 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 9 23:55:31.525011 bash[1527]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 23:55:31.526810 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 9 23:55:31.532001 systemd[1]: Starting sshkeys.service...
Sep 9 23:55:31.535500 tar[1508]: linux-arm64/LICENSE
Sep 9 23:55:31.535751 tar[1508]: linux-arm64/helm
Sep 9 23:55:31.566879 extend-filesystems[1475]: Resized partition /dev/sda9
Sep 9 23:55:31.575785 extend-filesystems[1537]: resize2fs 1.47.2 (1-Jan-2025)
Sep 9 23:55:31.578978 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Sep 9 23:55:31.584239 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Sep 9 23:55:31.589893 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Sep 9 23:55:31.608104 systemd-logind[1481]: New seat seat0.
Sep 9 23:55:31.609517 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 9 23:55:31.637061 kernel: mousedev: PS/2 mouse device common for all mice
Sep 9 23:55:31.743204 coreos-metadata[1538]: Sep 09 23:55:31.743 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Sep 9 23:55:31.750409 coreos-metadata[1538]: Sep 09 23:55:31.748 INFO Fetch successful
Sep 9 23:55:31.758202 unknown[1538]: wrote ssh authorized keys file for user: core
Sep 9 23:55:31.826686 locksmithd[1501]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 9 23:55:31.832865 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Sep 9 23:55:31.847945 extend-filesystems[1537]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Sep 9 23:55:31.847945 extend-filesystems[1537]: old_desc_blocks = 1, new_desc_blocks = 5
Sep 9 23:55:31.847945 extend-filesystems[1537]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Sep 9 23:55:31.854198 extend-filesystems[1475]: Resized filesystem in /dev/sda9
Sep 9 23:55:31.850118 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 9 23:55:31.850372 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 9 23:55:31.860523 update-ssh-keys[1550]: Updated "/home/core/.ssh/authorized_keys"
Sep 9 23:55:31.862648 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Sep 9 23:55:31.867327 systemd[1]: Finished sshkeys.service.
Sep 9 23:55:31.885420 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 9 23:55:31.891099 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 9 23:55:31.894861 containerd[1499]: time="2025-09-09T23:55:31Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 9 23:55:31.897473 containerd[1499]: time="2025-09-09T23:55:31.897422640Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 9 23:55:31.908995 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Sep 9 23:55:31.909109 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 9 23:55:31.909122 kernel: [drm] features: -context_init
Sep 9 23:55:31.912867 kernel: [drm] number of scanouts: 1
Sep 9 23:55:31.912969 kernel: [drm] number of cap sets: 0
Sep 9 23:55:31.914880 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Sep 9 23:55:31.921029 kernel: Console: switching to colour frame buffer device 160x50
Sep 9 23:55:31.933862 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 9 23:55:31.937423 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Sep 9 23:55:31.940074 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Sep 9 23:55:31.959380 containerd[1499]: time="2025-09-09T23:55:31.959328640Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.08µs"
Sep 9 23:55:31.959550 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 9 23:55:31.960587 containerd[1499]: time="2025-09-09T23:55:31.960544840Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 9 23:55:31.960705 containerd[1499]: time="2025-09-09T23:55:31.960690520Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 9 23:55:31.962374 containerd[1499]: time="2025-09-09T23:55:31.961823640Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 9 23:55:31.962491 containerd[1499]: time="2025-09-09T23:55:31.962470880Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 9 23:55:31.962660 containerd[1499]: time="2025-09-09T23:55:31.962643040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 23:55:31.963175 containerd[1499]: time="2025-09-09T23:55:31.963149040Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 9 23:55:31.963543 containerd[1499]: time="2025-09-09T23:55:31.963517120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 23:55:31.963929 containerd[1499]: time="2025-09-09T23:55:31.963904920Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 9 23:55:31.964072 containerd[1499]: time="2025-09-09T23:55:31.964054280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 23:55:31.964593 containerd[1499]: time="2025-09-09T23:55:31.964570360Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 9 23:55:31.966519 containerd[1499]: time="2025-09-09T23:55:31.964794720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 9 23:55:31.966519 containerd[1499]: time="2025-09-09T23:55:31.964949840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 9 23:55:31.966519 containerd[1499]: time="2025-09-09T23:55:31.965165480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 23:55:31.966519 containerd[1499]: time="2025-09-09T23:55:31.965197960Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 9 23:55:31.966519 containerd[1499]: time="2025-09-09T23:55:31.965210040Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 9 23:55:31.966519 containerd[1499]: time="2025-09-09T23:55:31.965271640Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 9 23:55:31.966519 containerd[1499]: time="2025-09-09T23:55:31.965514240Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 9 23:55:31.966519 containerd[1499]: time="2025-09-09T23:55:31.965584880Z" level=info msg="metadata content store policy set" policy=shared
Sep 9 23:55:31.971598 containerd[1499]: time="2025-09-09T23:55:31.971553480Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 9 23:55:31.971927 containerd[1499]: time="2025-09-09T23:55:31.971904080Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 9 23:55:31.972200 containerd[1499]: time="2025-09-09T23:55:31.972177240Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 9 23:55:31.972338 containerd[1499]: time="2025-09-09T23:55:31.972318520Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 9 23:55:31.972399 containerd[1499]: time="2025-09-09T23:55:31.972384480Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 9 23:55:31.972640 containerd[1499]: time="2025-09-09T23:55:31.972622040Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 9 23:55:31.972707 containerd[1499]: time="2025-09-09T23:55:31.972694360Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 9 23:55:31.972756 containerd[1499]: time="2025-09-09T23:55:31.972745440Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 9 23:55:31.972899 containerd[1499]: time="2025-09-09T23:55:31.972880920Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 9 23:55:31.973050 containerd[1499]: time="2025-09-09T23:55:31.973032080Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 9 23:55:31.973115 containerd[1499]: time="2025-09-09T23:55:31.973102000Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 9 23:55:31.973220 containerd[1499]: time="2025-09-09T23:55:31.973204520Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 9 23:55:31.973560 containerd[1499]: time="2025-09-09T23:55:31.973529840Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 9 23:55:31.973938 containerd[1499]: time="2025-09-09T23:55:31.973916440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 9 23:55:31.974204 containerd[1499]: time="2025-09-09T23:55:31.974183200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 9 23:55:31.974350 containerd[1499]: time="2025-09-09T23:55:31.974330800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974533680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974557360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974572920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974584920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974597320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974609240Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974620960Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974877360Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974897720Z" level=info msg="Start snapshots syncer"
Sep 9 23:55:31.975867 containerd[1499]: time="2025-09-09T23:55:31.974934400Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 9 23:55:31.976095 containerd[1499]: time="2025-09-09T23:55:31.975159880Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 9 23:55:31.976095 containerd[1499]: time="2025-09-09T23:55:31.975209960Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975307880Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975464840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975489040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975501000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975513520Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975525640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975538320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975549720Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975576160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975587720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975599240Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975624840Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975640160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 9 23:55:31.976199 containerd[1499]: time="2025-09-09T23:55:31.975649120Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 23:55:31.976439 containerd[1499]: time="2025-09-09T23:55:31.975658640Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 9 23:55:31.976439 containerd[1499]: time="2025-09-09T23:55:31.975668080Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 9 23:55:31.976439 containerd[1499]: time="2025-09-09T23:55:31.975677600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 9 23:55:31.976439 containerd[1499]: time="2025-09-09T23:55:31.975688640Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 9 23:55:31.976439 containerd[1499]: time="2025-09-09T23:55:31.975804400Z" level=info msg="runtime interface created"
Sep 9 23:55:31.976439 containerd[1499]: time="2025-09-09T23:55:31.975811720Z" level=info msg="created NRI interface"
Sep 9 23:55:31.976439 containerd[1499]: time="2025-09-09T23:55:31.975820760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 9 23:55:31.978640 containerd[1499]: time="2025-09-09T23:55:31.978608120Z" level=info msg="Connect containerd service"
Sep 9 23:55:31.978967 containerd[1499]: time="2025-09-09T23:55:31.978947720Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 9 23:55:31.983049 containerd[1499]: time="2025-09-09T23:55:31.982369840Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.161603680Z" level=info msg="Start subscribing containerd event"
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.161765520Z" level=info msg="Start recovering state"
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.161977400Z" level=info msg="Start event monitor"
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.162002840Z" level=info msg="Start cni network conf syncer for default"
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.162042080Z" level=info msg="Start streaming server"
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.162063920Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.162080320Z" level=info msg="runtime interface starting up..."
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.162111200Z" level=info msg="starting plugins..."
Sep 9 23:55:32.162143 containerd[1499]: time="2025-09-09T23:55:32.162142160Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 9 23:55:32.163559 containerd[1499]: time="2025-09-09T23:55:32.163280240Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 9 23:55:32.164221 containerd[1499]: time="2025-09-09T23:55:32.164191440Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 9 23:55:32.164508 containerd[1499]: time="2025-09-09T23:55:32.164490760Z" level=info msg="containerd successfully booted in 0.274852s"
Sep 9 23:55:32.164622 systemd[1]: Started containerd.service - containerd container runtime.
Sep 9 23:55:32.252199 systemd-logind[1481]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Sep 9 23:55:32.269969 systemd-logind[1481]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 9 23:55:32.313189 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:55:32.424063 systemd-networkd[1423]: eth0: Gained IPv6LL
Sep 9 23:55:32.424620 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Sep 9 23:55:32.429688 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 9 23:55:32.429967 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:55:32.446032 tar[1508]: linux-arm64/README.md
Sep 9 23:55:32.448505 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 9 23:55:32.449962 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 9 23:55:32.457522 systemd[1]: Reached target network-online.target - Network is Online.
Sep 9 23:55:32.461988 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:55:32.466951 coreos-metadata[1471]: Sep 09 23:55:32.466 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #2
Sep 9 23:55:32.470311 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 9 23:55:32.472674 coreos-metadata[1471]: Sep 09 23:55:32.472 INFO Fetch successful
Sep 9 23:55:32.472677 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 9 23:55:32.473643 coreos-metadata[1471]: Sep 09 23:55:32.473 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Sep 9 23:55:32.474751 coreos-metadata[1471]: Sep 09 23:55:32.474 INFO Fetch successful
Sep 9 23:55:32.531922 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 9 23:55:32.604728 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 9 23:55:32.617583 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 9 23:55:32.644888 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 9 23:55:32.646969 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 9 23:55:32.679989 systemd-networkd[1423]: eth1: Gained IPv6LL
Sep 9 23:55:32.680454 systemd-timesyncd[1390]: Network configuration changed, trying to establish connection.
Sep 9 23:55:32.820291 sshd_keygen[1496]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 9 23:55:32.847001 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 9 23:55:32.851713 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 9 23:55:32.875195 systemd[1]: issuegen.service: Deactivated successfully.
Sep 9 23:55:32.875911 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 9 23:55:32.881117 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 9 23:55:32.905903 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 9 23:55:32.909330 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 9 23:55:32.914952 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 9 23:55:32.915894 systemd[1]: Reached target getty.target - Login Prompts.
Sep 9 23:55:33.448060 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:55:33.450561 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 9 23:55:33.454211 systemd[1]: Startup finished in 2.435s (kernel) + 5.164s (initrd) + 4.724s (userspace) = 12.325s.
Sep 9 23:55:33.469488 (kubelet)[1654]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:55:34.019273 kubelet[1654]: E0909 23:55:34.019161 1654 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:55:34.023675 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:55:34.023869 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:55:34.024639 systemd[1]: kubelet.service: Consumed 908ms CPU time, 255M memory peak.
Sep 9 23:55:44.274737 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 9 23:55:44.279635 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:55:44.450606 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:55:44.471045 (kubelet)[1673]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:55:44.520971 kubelet[1673]: E0909 23:55:44.520921 1673 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:55:44.524762 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:55:44.524935 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:55:44.525565 systemd[1]: kubelet.service: Consumed 172ms CPU time, 105.5M memory peak.
Sep 9 23:55:54.626611 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 9 23:55:54.630786 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:55:54.814566 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:55:54.828656 (kubelet)[1688]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:55:54.873862 kubelet[1688]: E0909 23:55:54.873780 1688 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:55:54.876552 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:55:54.876822 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:55:54.877525 systemd[1]: kubelet.service: Consumed 172ms CPU time, 108.9M memory peak.
Sep 9 23:56:03.046273 systemd-timesyncd[1390]: Contacted time server 31.209.85.243:123 (2.flatcar.pool.ntp.org).
Sep 9 23:56:03.047322 systemd-timesyncd[1390]: Initial clock synchronization to Tue 2025-09-09 23:56:02.830483 UTC.
Sep 9 23:56:05.126012 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 9 23:56:05.130164 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:56:05.314593 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:56:05.330631 (kubelet)[1703]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:56:05.378443 kubelet[1703]: E0909 23:56:05.378309 1703 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:56:05.380806 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:56:05.380967 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:56:05.382942 systemd[1]: kubelet.service: Consumed 183ms CPU time, 107.2M memory peak.
Sep 9 23:56:07.211464 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 9 23:56:07.213983 systemd[1]: Started sshd@0-91.99.154.191:22-139.178.68.195:42698.service - OpenSSH per-connection server daemon (139.178.68.195:42698).
Sep 9 23:56:08.223458 sshd[1711]: Accepted publickey for core from 139.178.68.195 port 42698 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:56:08.228658 sshd-session[1711]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:56:08.241662 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 9 23:56:08.247720 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 9 23:56:08.263507 systemd-logind[1481]: New session 1 of user core.
Sep 9 23:56:08.279373 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 9 23:56:08.284031 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 9 23:56:08.302925 (systemd)[1716]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 9 23:56:08.306284 systemd-logind[1481]: New session c1 of user core.
Sep 9 23:56:08.455138 systemd[1716]: Queued start job for default target default.target.
Sep 9 23:56:08.464799 systemd[1716]: Created slice app.slice - User Application Slice.
Sep 9 23:56:08.464879 systemd[1716]: Reached target paths.target - Paths.
Sep 9 23:56:08.464948 systemd[1716]: Reached target timers.target - Timers.
Sep 9 23:56:08.467441 systemd[1716]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 9 23:56:08.501265 systemd[1716]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 9 23:56:08.501741 systemd[1716]: Reached target sockets.target - Sockets.
Sep 9 23:56:08.502007 systemd[1716]: Reached target basic.target - Basic System.
Sep 9 23:56:08.502263 systemd[1716]: Reached target default.target - Main User Target.
Sep 9 23:56:08.502345 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 9 23:56:08.502585 systemd[1716]: Startup finished in 184ms.
Sep 9 23:56:08.510120 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 9 23:56:09.194229 systemd[1]: Started sshd@1-91.99.154.191:22-139.178.68.195:42704.service - OpenSSH per-connection server daemon (139.178.68.195:42704).
Sep 9 23:56:10.189900 sshd[1727]: Accepted publickey for core from 139.178.68.195 port 42704 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:56:10.189568 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:56:10.202669 systemd-logind[1481]: New session 2 of user core.
Sep 9 23:56:10.214173 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 9 23:56:10.864904 sshd[1730]: Connection closed by 139.178.68.195 port 42704
Sep 9 23:56:10.865475 sshd-session[1727]: pam_unix(sshd:session): session closed for user core
Sep 9 23:56:10.869922 systemd[1]: sshd@1-91.99.154.191:22-139.178.68.195:42704.service: Deactivated successfully.
Sep 9 23:56:10.872462 systemd[1]: session-2.scope: Deactivated successfully.
Sep 9 23:56:10.873817 systemd-logind[1481]: Session 2 logged out. Waiting for processes to exit.
Sep 9 23:56:10.875424 systemd-logind[1481]: Removed session 2.
Sep 9 23:56:11.046266 systemd[1]: Started sshd@2-91.99.154.191:22-139.178.68.195:35258.service - OpenSSH per-connection server daemon (139.178.68.195:35258).
Sep 9 23:56:12.043919 sshd[1736]: Accepted publickey for core from 139.178.68.195 port 35258 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:56:12.046222 sshd-session[1736]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:56:12.051903 systemd-logind[1481]: New session 3 of user core.
Sep 9 23:56:12.060182 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 9 23:56:12.723869 sshd[1739]: Connection closed by 139.178.68.195 port 35258
Sep 9 23:56:12.725217 sshd-session[1736]: pam_unix(sshd:session): session closed for user core
Sep 9 23:56:12.729668 systemd[1]: sshd@2-91.99.154.191:22-139.178.68.195:35258.service: Deactivated successfully.
Sep 9 23:56:12.732361 systemd[1]: session-3.scope: Deactivated successfully.
Sep 9 23:56:12.734426 systemd-logind[1481]: Session 3 logged out. Waiting for processes to exit.
Sep 9 23:56:12.736608 systemd-logind[1481]: Removed session 3.
Sep 9 23:56:12.888703 systemd[1]: Started sshd@3-91.99.154.191:22-139.178.68.195:35274.service - OpenSSH per-connection server daemon (139.178.68.195:35274).
Sep 9 23:56:13.882531 sshd[1745]: Accepted publickey for core from 139.178.68.195 port 35274 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:56:13.885556 sshd-session[1745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:56:13.892788 systemd-logind[1481]: New session 4 of user core.
Sep 9 23:56:13.903107 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 9 23:56:14.561631 sshd[1748]: Connection closed by 139.178.68.195 port 35274
Sep 9 23:56:14.562708 sshd-session[1745]: pam_unix(sshd:session): session closed for user core
Sep 9 23:56:14.571567 systemd[1]: sshd@3-91.99.154.191:22-139.178.68.195:35274.service: Deactivated successfully.
Sep 9 23:56:14.577211 systemd[1]: session-4.scope: Deactivated successfully.
Sep 9 23:56:14.586888 systemd-logind[1481]: Session 4 logged out. Waiting for processes to exit.
Sep 9 23:56:14.588928 systemd-logind[1481]: Removed session 4.
Sep 9 23:56:14.745194 systemd[1]: Started sshd@4-91.99.154.191:22-139.178.68.195:35282.service - OpenSSH per-connection server daemon (139.178.68.195:35282).
Sep 9 23:56:15.606782 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 9 23:56:15.614105 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:56:15.778735 sshd[1754]: Accepted publickey for core from 139.178.68.195 port 35282 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:56:15.781382 sshd-session[1754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:56:15.795259 systemd-logind[1481]: New session 5 of user core.
Sep 9 23:56:15.797313 systemd[1]: Started session-5.scope - Session 5 of User core.
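The "Scheduled restart job, restart counter is at N" entries let you measure how fast the unit is crash-looping. A small sketch computing the spacing between the attempts logged here (timestamps copied from this log; the year is an assumption, since journald's short timestamp omits it):

```python
from datetime import datetime

# Restart attempts as logged above: (journald timestamp, restart counter)
events = [
    ("Sep 9 23:56:05.126012", 3),
    ("Sep 9 23:56:15.606782", 4),
]

def parse(ts: str) -> datetime:
    # Pin an assumed year so the short journald timestamp can be parsed
    return datetime.strptime("2025 " + ts, "%Y %b %d %H:%M:%S.%f")

deltas = [(parse(b[0]) - parse(a[0])).total_seconds()
          for a, b in zip(events, events[1:])]
print(deltas)  # roughly 10.5 s between attempts
```

The ~10.5 s spacing is consistent with a fixed restart delay (e.g. `RestartSec=10`) plus the few hundred milliseconds each attempt runs before exiting.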
Sep 9 23:56:15.817366 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:56:15.829081 (kubelet)[1766]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:56:15.880934 kubelet[1766]: E0909 23:56:15.880865 1766 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:56:15.883680 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:56:15.883914 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:56:15.884783 systemd[1]: kubelet.service: Consumed 208ms CPU time, 107.1M memory peak.
Sep 9 23:56:16.320968 sudo[1773]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 9 23:56:16.321247 sudo[1773]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:56:16.338626 sudo[1773]: pam_unix(sudo:session): session closed for user root
Sep 9 23:56:16.500984 sshd[1764]: Connection closed by 139.178.68.195 port 35282
Sep 9 23:56:16.501926 sshd-session[1754]: pam_unix(sshd:session): session closed for user core
Sep 9 23:56:16.512013 systemd[1]: sshd@4-91.99.154.191:22-139.178.68.195:35282.service: Deactivated successfully.
Sep 9 23:56:16.515415 systemd[1]: session-5.scope: Deactivated successfully.
Sep 9 23:56:16.517759 systemd-logind[1481]: Session 5 logged out. Waiting for processes to exit.
Sep 9 23:56:16.522191 systemd-logind[1481]: Removed session 5.
Sep 9 23:56:16.677814 systemd[1]: Started sshd@5-91.99.154.191:22-139.178.68.195:35288.service - OpenSSH per-connection server daemon (139.178.68.195:35288).
Sep 9 23:56:17.129884 update_engine[1482]: I20250909 23:56:17.129525 1482 update_attempter.cc:509] Updating boot flags...
Sep 9 23:56:17.741846 sshd[1779]: Accepted publickey for core from 139.178.68.195 port 35288 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:56:17.744028 sshd-session[1779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:56:17.751457 systemd-logind[1481]: New session 6 of user core.
Sep 9 23:56:17.762177 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 9 23:56:18.302074 sudo[1800]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 9 23:56:18.302731 sudo[1800]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:56:18.311225 sudo[1800]: pam_unix(sudo:session): session closed for user root
Sep 9 23:56:18.319478 sudo[1799]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 9 23:56:18.320535 sudo[1799]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:56:18.341902 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 9 23:56:18.402797 augenrules[1822]: No rules
Sep 9 23:56:18.404612 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 9 23:56:18.405078 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 9 23:56:18.408429 sudo[1799]: pam_unix(sudo:session): session closed for user root
Sep 9 23:56:18.578254 sshd[1798]: Connection closed by 139.178.68.195 port 35288
Sep 9 23:56:18.578099 sshd-session[1779]: pam_unix(sshd:session): session closed for user core
Sep 9 23:56:18.582314 systemd[1]: sshd@5-91.99.154.191:22-139.178.68.195:35288.service: Deactivated successfully.
Sep 9 23:56:18.584571 systemd[1]: session-6.scope: Deactivated successfully.
Sep 9 23:56:18.588075 systemd-logind[1481]: Session 6 logged out. Waiting for processes to exit.
Sep 9 23:56:18.589443 systemd-logind[1481]: Removed session 6.
Sep 9 23:56:18.751085 systemd[1]: Started sshd@6-91.99.154.191:22-139.178.68.195:35290.service - OpenSSH per-connection server daemon (139.178.68.195:35290).
Sep 9 23:56:19.749728 sshd[1831]: Accepted publickey for core from 139.178.68.195 port 35290 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:56:19.751976 sshd-session[1831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:56:19.758779 systemd-logind[1481]: New session 7 of user core.
Sep 9 23:56:19.766368 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 9 23:56:20.270472 sudo[1835]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 9 23:56:20.271115 sudo[1835]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 9 23:56:20.616102 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 9 23:56:20.631529 (dockerd)[1854]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 9 23:56:20.877467 dockerd[1854]: time="2025-09-09T23:56:20.877418153Z" level=info msg="Starting up"
Sep 9 23:56:20.881040 dockerd[1854]: time="2025-09-09T23:56:20.881009013Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 9 23:56:20.895631 dockerd[1854]: time="2025-09-09T23:56:20.895565079Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 9 23:56:20.917814 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport2045203338-merged.mount: Deactivated successfully.
Sep 9 23:56:20.932984 dockerd[1854]: time="2025-09-09T23:56:20.932934019Z" level=info msg="Loading containers: start."
Sep 9 23:56:20.946882 kernel: Initializing XFRM netlink socket
Sep 9 23:56:21.203094 systemd-networkd[1423]: docker0: Link UP
Sep 9 23:56:21.210852 dockerd[1854]: time="2025-09-09T23:56:21.209808472Z" level=info msg="Loading containers: done."
Sep 9 23:56:21.231182 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck38799922-merged.mount: Deactivated successfully.
Sep 9 23:56:21.232868 dockerd[1854]: time="2025-09-09T23:56:21.232398176Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 9 23:56:21.233381 dockerd[1854]: time="2025-09-09T23:56:21.233316971Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 9 23:56:21.233719 dockerd[1854]: time="2025-09-09T23:56:21.233616335Z" level=info msg="Initializing buildkit"
Sep 9 23:56:21.268873 dockerd[1854]: time="2025-09-09T23:56:21.268810974Z" level=info msg="Completed buildkit initialization"
Sep 9 23:56:21.279441 dockerd[1854]: time="2025-09-09T23:56:21.279372134Z" level=info msg="Daemon has completed initialization"
Sep 9 23:56:21.279828 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 9 23:56:21.280759 dockerd[1854]: time="2025-09-09T23:56:21.279923395Z" level=info msg="API listen on /run/docker.sock"
Sep 9 23:56:22.370459 containerd[1499]: time="2025-09-09T23:56:22.370412454Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\""
Sep 9 23:56:22.940260 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3282783298.mount: Deactivated successfully.
Sep 9 23:56:24.033870 containerd[1499]: time="2025-09-09T23:56:24.033204755Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:24.034927 containerd[1499]: time="2025-09-09T23:56:24.034878708Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.8: active requests=0, bytes read=26328449"
Sep 9 23:56:24.036417 containerd[1499]: time="2025-09-09T23:56:24.036393797Z" level=info msg="ImageCreate event name:\"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:24.040012 containerd[1499]: time="2025-09-09T23:56:24.039976743Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:24.041046 containerd[1499]: time="2025-09-09T23:56:24.041002547Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.8\" with image id \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:6e1a2f9b24f69ee77d0c0edaf32b31fdbb5e1a613f4476272197e6e1e239050b\", size \"26325157\" in 1.670540289s"
Sep 9 23:56:24.041113 containerd[1499]: time="2025-09-09T23:56:24.041049100Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.8\" returns image reference \"sha256:61d628eec7e2101b908b4476f1e8e620490a9e8754184860c8eed25183acaa8a\""
Sep 9 23:56:24.042317 containerd[1499]: time="2025-09-09T23:56:24.042239276Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\""
Sep 9 23:56:25.526693 containerd[1499]: time="2025-09-09T23:56:25.526624775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:25.528059 containerd[1499]: time="2025-09-09T23:56:25.528025246Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.8: active requests=0, bytes read=22528572"
Sep 9 23:56:25.529501 containerd[1499]: time="2025-09-09T23:56:25.529445605Z" level=info msg="ImageCreate event name:\"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:25.534053 containerd[1499]: time="2025-09-09T23:56:25.533930515Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:25.536427 containerd[1499]: time="2025-09-09T23:56:25.536368810Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.8\" with image id \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:8788ccd28ceed9e2e5f8fc31375ef5771df8ea6e518b362c9a06f3cc709cd6c7\", size \"24065666\" in 1.494087249s"
Sep 9 23:56:25.536974 containerd[1499]: time="2025-09-09T23:56:25.536432666Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.8\" returns image reference \"sha256:f17de36e40fc7cc372be0021b2c58ad61f05d3ebe4d430551bc5e4cd9ed2a061\""
Sep 9 23:56:25.537466 containerd[1499]: time="2025-09-09T23:56:25.537238549Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\""
Sep 9 23:56:26.126744 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 9 23:56:26.131489 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:56:26.287174 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:56:26.301697 (kubelet)[2131]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:56:26.348097 kubelet[2131]: E0909 23:56:26.348012 2131 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:56:26.351690 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:56:26.351854 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:56:26.352816 systemd[1]: kubelet.service: Consumed 166ms CPU time, 106.9M memory peak.
Sep 9 23:56:27.002962 containerd[1499]: time="2025-09-09T23:56:27.002898898Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:27.005205 containerd[1499]: time="2025-09-09T23:56:27.005160110Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.8: active requests=0, bytes read=17483547"
Sep 9 23:56:27.006681 containerd[1499]: time="2025-09-09T23:56:27.006628034Z" level=info msg="ImageCreate event name:\"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:27.011904 containerd[1499]: time="2025-09-09T23:56:27.011425153Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:27.012801 containerd[1499]: time="2025-09-09T23:56:27.012747979Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.8\" with image id \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:43c58bcbd1c7812dd19f8bfa5ae11093ebefd28699453ce86fc710869e155cd4\", size \"19020659\" in 1.475455027s"
Sep 9 23:56:27.012890 containerd[1499]: time="2025-09-09T23:56:27.012799194Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.8\" returns image reference \"sha256:fe86d26bce3ccd5f0c4057c205b63fde1c8c752778025aea4605ffc3b0f80211\""
Sep 9 23:56:27.013758 containerd[1499]: time="2025-09-09T23:56:27.013703983Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\""
Sep 9 23:56:28.039501 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2199111983.mount: Deactivated successfully.
Sep 9 23:56:28.373810 containerd[1499]: time="2025-09-09T23:56:28.373613837Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:28.376230 containerd[1499]: time="2025-09-09T23:56:28.376004660Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.8: active requests=0, bytes read=27376750"
Sep 9 23:56:28.377768 containerd[1499]: time="2025-09-09T23:56:28.377735366Z" level=info msg="ImageCreate event name:\"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:28.381807 containerd[1499]: time="2025-09-09T23:56:28.381760841Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:28.383225 containerd[1499]: time="2025-09-09T23:56:28.382952138Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.8\" with image id \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\", repo tag \"registry.k8s.io/kube-proxy:v1.32.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:adc1335b480ddd833aac3b0bd20f68ff0f3c3cf7a0bd337933b006d9f5cec40a\", size \"27375743\" in 1.369003619s"
Sep 9 23:56:28.383225 containerd[1499]: time="2025-09-09T23:56:28.383001963Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.8\" returns image reference \"sha256:2cf30e39f99f8f4ee1a736a4f3175cc2d8d3f58936d8fa83ec5523658fdc7b8b\""
Sep 9 23:56:28.384054 containerd[1499]: time="2025-09-09T23:56:28.384011618Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 9 23:56:28.928059 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount789119652.mount: Deactivated successfully.
Sep 9 23:56:29.605292 containerd[1499]: time="2025-09-09T23:56:29.605225830Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:29.606847 containerd[1499]: time="2025-09-09T23:56:29.606754926Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714"
Sep 9 23:56:29.608131 containerd[1499]: time="2025-09-09T23:56:29.608068029Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:29.612118 containerd[1499]: time="2025-09-09T23:56:29.612027079Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:29.614149 containerd[1499]: time="2025-09-09T23:56:29.614062490Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.229887929s"
Sep 9 23:56:29.614149 containerd[1499]: time="2025-09-09T23:56:29.614112802Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 9 23:56:29.614708 containerd[1499]: time="2025-09-09T23:56:29.614578236Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 9 23:56:30.136079 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3848971210.mount: Deactivated successfully.
Sep 9 23:56:30.146806 containerd[1499]: time="2025-09-09T23:56:30.146114038Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:56:30.147552 containerd[1499]: time="2025-09-09T23:56:30.147509949Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Sep 9 23:56:30.150814 containerd[1499]: time="2025-09-09T23:56:30.150762264Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:56:30.154074 containerd[1499]: time="2025-09-09T23:56:30.154021055Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 9 23:56:30.155089 containerd[1499]: time="2025-09-09T23:56:30.155052231Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 540.444342ms"
Sep 9 23:56:30.155270 containerd[1499]: time="2025-09-09T23:56:30.155251464Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 9 23:56:30.155874 containerd[1499]: time="2025-09-09T23:56:30.155771628Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
Sep 9 23:56:30.761936 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3144959370.mount: Deactivated successfully.
Sep 9 23:56:33.045876 containerd[1499]: time="2025-09-09T23:56:33.045161238Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:33.046979 containerd[1499]: time="2025-09-09T23:56:33.046944638Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=67943239"
Sep 9 23:56:33.049031 containerd[1499]: time="2025-09-09T23:56:33.048986732Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:33.054709 containerd[1499]: time="2025-09-09T23:56:33.054655872Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:56:33.056338 containerd[1499]: time="2025-09-09T23:56:33.056284478Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 2.900476756s"
Sep 9 23:56:33.056525 containerd[1499]: time="2025-09-09T23:56:33.056499958Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\""
Sep 9 23:56:36.376179 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Sep 9 23:56:36.381111 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:56:36.540022 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:56:36.549345 (kubelet)[2288]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 9 23:56:36.597847 kubelet[2288]: E0909 23:56:36.595903 2288 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 9 23:56:36.598731 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 9 23:56:36.599155 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 9 23:56:36.599762 systemd[1]: kubelet.service: Consumed 167ms CPU time, 106.2M memory peak.
Sep 9 23:56:38.649876 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:56:38.650028 systemd[1]: kubelet.service: Consumed 167ms CPU time, 106.2M memory peak.
Sep 9 23:56:38.653674 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:56:38.694303 systemd[1]: Reload requested from client PID 2302 ('systemctl') (unit session-7.scope)...
Sep 9 23:56:38.694468 systemd[1]: Reloading...
Sep 9 23:56:38.832875 zram_generator::config[2364]: No configuration found.
Sep 9 23:56:39.015946 systemd[1]: Reloading finished in 321 ms.
Sep 9 23:56:39.082684 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 9 23:56:39.082774 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 9 23:56:39.084009 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:56:39.084103 systemd[1]: kubelet.service: Consumed 120ms CPU time, 95M memory peak.
Sep 9 23:56:39.086986 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 9 23:56:39.241316 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 9 23:56:39.253184 (kubelet)[2394]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 9 23:56:39.313297 kubelet[2394]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 23:56:39.313297 kubelet[2394]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Sep 9 23:56:39.313297 kubelet[2394]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 9 23:56:39.313297 kubelet[2394]: I0909 23:56:39.312187 2394 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 9 23:56:40.740873 kubelet[2394]: I0909 23:56:40.740721 2394 server.go:520] "Kubelet version" kubeletVersion="v1.32.4"
Sep 9 23:56:40.740873 kubelet[2394]: I0909 23:56:40.740768 2394 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 9 23:56:40.742102 kubelet[2394]: I0909 23:56:40.742069 2394 server.go:954] "Client rotation is on, will bootstrap in background"
Sep 9 23:56:40.778095 kubelet[2394]: E0909 23:56:40.778045 2394 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://91.99.154.191:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.99.154.191:6443: connect: connection refused" logger="UnhandledError"
Sep 9 23:56:40.780398 kubelet[2394]: I0909 23:56:40.780303 2394 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 9 23:56:40.790171 kubelet[2394]: I0909 23:56:40.790134 2394 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 9 23:56:40.799744 kubelet[2394]: I0909 23:56:40.799694 2394 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 9 23:56:40.799992 kubelet[2394]: I0909 23:56:40.799953 2394 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 9 23:56:40.800186 kubelet[2394]: I0909 23:56:40.799993 2394 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426-0-0-n-d8dd570c6c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 9 23:56:40.800334 kubelet[2394]: I0909 23:56:40.800239 2394 topology_manager.go:138] "Creating topology manager with none policy"
Sep 9 23:56:40.800334 kubelet[2394]: I0909 23:56:40.800250 2394 container_manager_linux.go:304] "Creating device plugin manager"
Sep 9 23:56:40.800908 kubelet[2394]: I0909 23:56:40.800491 2394 state_mem.go:36] "Initialized new in-memory state store"
Sep 9 23:56:40.804376 kubelet[2394]: I0909 23:56:40.804311 2394 kubelet.go:446] "Attempting to sync node with API server"
Sep 9 23:56:40.804564 kubelet[2394]: I0909 23:56:40.804457 2394 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 9 23:56:40.804564 kubelet[2394]: I0909 23:56:40.804492 2394 kubelet.go:352] "Adding apiserver pod source"
Sep 9 23:56:40.804564 kubelet[2394]: I0909 23:56:40.804507 2394 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 9 23:56:40.809586 kubelet[2394]: I0909 23:56:40.809538 2394 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 9 23:56:40.810250 kubelet[2394]: I0909 23:56:40.810221 2394 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 9 23:56:40.810378 kubelet[2394]: W0909 23:56:40.810361 2394 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 9 23:56:40.811323 kubelet[2394]: I0909 23:56:40.811288 2394 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 23:56:40.811663 kubelet[2394]: I0909 23:56:40.811333 2394 server.go:1287] "Started kubelet" Sep 9 23:56:40.811663 kubelet[2394]: W0909 23:56:40.811513 2394 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.99.154.191:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-0-0-n-d8dd570c6c&limit=500&resourceVersion=0": dial tcp 91.99.154.191:6443: connect: connection refused Sep 9 23:56:40.811663 kubelet[2394]: E0909 23:56:40.811566 2394 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://91.99.154.191:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-0-0-n-d8dd570c6c&limit=500&resourceVersion=0\": dial tcp 91.99.154.191:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:56:40.820345 kubelet[2394]: E0909 23:56:40.820079 2394 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.154.191:6443/api/v1/namespaces/default/events\": dial tcp 91.99.154.191:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426-0-0-n-d8dd570c6c.1863c28f6988ceab default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426-0-0-n-d8dd570c6c,UID:ci-4426-0-0-n-d8dd570c6c,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426-0-0-n-d8dd570c6c,},FirstTimestamp:2025-09-09 23:56:40.811310763 +0000 UTC m=+1.551656921,LastTimestamp:2025-09-09 23:56:40.811310763 +0000 UTC m=+1.551656921,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426-0-0-n-d8dd570c6c,}" Sep 9 23:56:40.821586 kubelet[2394]: I0909 23:56:40.821486 2394 
fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 23:56:40.825865 kubelet[2394]: W0909 23:56:40.824880 2394 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://91.99.154.191:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 91.99.154.191:6443: connect: connection refused Sep 9 23:56:40.825865 kubelet[2394]: E0909 23:56:40.824958 2394 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://91.99.154.191:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.154.191:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:56:40.826412 kubelet[2394]: I0909 23:56:40.826349 2394 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 23:56:40.827626 kubelet[2394]: I0909 23:56:40.827592 2394 server.go:479] "Adding debug handlers to kubelet server" Sep 9 23:56:40.827753 kubelet[2394]: I0909 23:56:40.827723 2394 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 23:56:40.828039 kubelet[2394]: E0909 23:56:40.828011 2394 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" Sep 9 23:56:40.829025 kubelet[2394]: I0909 23:56:40.827696 2394 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 23:56:40.831020 kubelet[2394]: I0909 23:56:40.830936 2394 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 23:56:40.831482 kubelet[2394]: I0909 23:56:40.831458 2394 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 23:56:40.832005 kubelet[2394]: I0909 23:56:40.831980 2394 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 23:56:40.833295 kubelet[2394]: I0909 23:56:40.833270 2394 reconciler.go:26] "Reconciler: start to sync state" Sep 9 23:56:40.833910 kubelet[2394]: E0909 23:56:40.833871 2394 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.154.191:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-0-0-n-d8dd570c6c?timeout=10s\": dial tcp 91.99.154.191:6443: connect: connection refused" interval="200ms" Sep 9 23:56:40.834824 kubelet[2394]: W0909 23:56:40.834776 2394 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.99.154.191:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.154.191:6443: connect: connection refused Sep 9 23:56:40.835007 kubelet[2394]: E0909 23:56:40.834985 2394 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://91.99.154.191:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.154.191:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:56:40.837314 kubelet[2394]: E0909 23:56:40.837278 2394 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 23:56:40.838884 kubelet[2394]: I0909 23:56:40.837661 2394 factory.go:221] Registration of the containerd container factory successfully Sep 9 23:56:40.838884 kubelet[2394]: I0909 23:56:40.837678 2394 factory.go:221] Registration of the systemd container factory successfully Sep 9 23:56:40.838884 kubelet[2394]: I0909 23:56:40.837779 2394 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 23:56:40.846951 kubelet[2394]: I0909 23:56:40.846905 2394 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 23:56:40.848807 kubelet[2394]: I0909 23:56:40.848770 2394 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 9 23:56:40.849013 kubelet[2394]: I0909 23:56:40.848999 2394 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 23:56:40.849105 kubelet[2394]: I0909 23:56:40.849093 2394 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Sep 9 23:56:40.849172 kubelet[2394]: I0909 23:56:40.849162 2394 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 23:56:40.849348 kubelet[2394]: E0909 23:56:40.849314 2394 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 23:56:40.857766 kubelet[2394]: W0909 23:56:40.857716 2394 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.99.154.191:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.154.191:6443: connect: connection refused Sep 9 23:56:40.857955 kubelet[2394]: E0909 23:56:40.857932 2394 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://91.99.154.191:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.154.191:6443: connect: connection refused" logger="UnhandledError" Sep 9 23:56:40.870826 kubelet[2394]: I0909 23:56:40.870797 2394 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 23:56:40.871012 kubelet[2394]: I0909 23:56:40.871001 2394 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 23:56:40.871072 kubelet[2394]: I0909 23:56:40.871064 2394 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:56:40.873339 kubelet[2394]: I0909 23:56:40.873313 2394 policy_none.go:49] "None policy: Start" Sep 9 23:56:40.873509 kubelet[2394]: I0909 23:56:40.873497 2394 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 23:56:40.873564 kubelet[2394]: I0909 23:56:40.873557 2394 state_mem.go:35] "Initializing new in-memory state store" Sep 9 23:56:40.879570 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 9 23:56:40.899871 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Sep 9 23:56:40.905682 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 9 23:56:40.925259 kubelet[2394]: I0909 23:56:40.925087 2394 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 23:56:40.925468 kubelet[2394]: I0909 23:56:40.925374 2394 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 23:56:40.925468 kubelet[2394]: I0909 23:56:40.925389 2394 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 23:56:40.926077 kubelet[2394]: I0909 23:56:40.925786 2394 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 23:56:40.929073 kubelet[2394]: E0909 23:56:40.929028 2394 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 23:56:40.929073 kubelet[2394]: E0909 23:56:40.929094 2394 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426-0-0-n-d8dd570c6c\" not found" Sep 9 23:56:40.966931 systemd[1]: Created slice kubepods-burstable-pod3ee02ca6703fcd8d38d1f2ae1cb497b0.slice - libcontainer container kubepods-burstable-pod3ee02ca6703fcd8d38d1f2ae1cb497b0.slice. Sep 9 23:56:40.982865 kubelet[2394]: E0909 23:56:40.982794 2394 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:40.988017 systemd[1]: Created slice kubepods-burstable-poddc48acc3107a7538c12cc34489d1548f.slice - libcontainer container kubepods-burstable-poddc48acc3107a7538c12cc34489d1548f.slice. 
Sep 9 23:56:40.990502 kubelet[2394]: E0909 23:56:40.990418 2394 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:40.992637 systemd[1]: Created slice kubepods-burstable-pod561e0aca45e54ee0fc3691690fbed0fa.slice - libcontainer container kubepods-burstable-pod561e0aca45e54ee0fc3691690fbed0fa.slice. Sep 9 23:56:40.996341 kubelet[2394]: E0909 23:56:40.996288 2394 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.029611 kubelet[2394]: I0909 23:56:41.029492 2394 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.030600 kubelet[2394]: E0909 23:56:41.030553 2394 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.154.191:6443/api/v1/nodes\": dial tcp 91.99.154.191:6443: connect: connection refused" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.033953 kubelet[2394]: I0909 23:56:41.033829 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.033953 kubelet[2394]: I0909 23:56:41.033893 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-ca-certs\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " 
pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.033953 kubelet[2394]: I0909 23:56:41.033926 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-flexvolume-dir\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.034168 kubelet[2394]: I0909 23:56:41.033970 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-kubeconfig\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.034168 kubelet[2394]: I0909 23:56:41.034028 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-k8s-certs\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.034168 kubelet[2394]: I0909 23:56:41.034070 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/561e0aca45e54ee0fc3691690fbed0fa-kubeconfig\") pod \"kube-scheduler-ci-4426-0-0-n-d8dd570c6c\" (UID: \"561e0aca45e54ee0fc3691690fbed0fa\") " pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.034510 kubelet[2394]: I0909 23:56:41.034356 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/3ee02ca6703fcd8d38d1f2ae1cb497b0-ca-certs\") pod \"kube-apiserver-ci-4426-0-0-n-d8dd570c6c\" (UID: \"3ee02ca6703fcd8d38d1f2ae1cb497b0\") " pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.034510 kubelet[2394]: I0909 23:56:41.034433 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3ee02ca6703fcd8d38d1f2ae1cb497b0-k8s-certs\") pod \"kube-apiserver-ci-4426-0-0-n-d8dd570c6c\" (UID: \"3ee02ca6703fcd8d38d1f2ae1cb497b0\") " pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.034510 kubelet[2394]: I0909 23:56:41.034468 2394 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3ee02ca6703fcd8d38d1f2ae1cb497b0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426-0-0-n-d8dd570c6c\" (UID: \"3ee02ca6703fcd8d38d1f2ae1cb497b0\") " pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.034766 kubelet[2394]: E0909 23:56:41.034674 2394 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.154.191:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-0-0-n-d8dd570c6c?timeout=10s\": dial tcp 91.99.154.191:6443: connect: connection refused" interval="400ms" Sep 9 23:56:41.233584 kubelet[2394]: I0909 23:56:41.233531 2394 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.234323 kubelet[2394]: E0909 23:56:41.234280 2394 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.154.191:6443/api/v1/nodes\": dial tcp 91.99.154.191:6443: connect: connection refused" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.285305 containerd[1499]: time="2025-09-09T23:56:41.285176556Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4426-0-0-n-d8dd570c6c,Uid:3ee02ca6703fcd8d38d1f2ae1cb497b0,Namespace:kube-system,Attempt:0,}" Sep 9 23:56:41.293637 containerd[1499]: time="2025-09-09T23:56:41.293368829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426-0-0-n-d8dd570c6c,Uid:dc48acc3107a7538c12cc34489d1548f,Namespace:kube-system,Attempt:0,}" Sep 9 23:56:41.298695 containerd[1499]: time="2025-09-09T23:56:41.298512174Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426-0-0-n-d8dd570c6c,Uid:561e0aca45e54ee0fc3691690fbed0fa,Namespace:kube-system,Attempt:0,}" Sep 9 23:56:41.331714 containerd[1499]: time="2025-09-09T23:56:41.331658929Z" level=info msg="connecting to shim dd0389e8dfc6a226dbbc6cb1baad383d855bba5f93b28898dbf24d08c7652579" address="unix:///run/containerd/s/b990d394cb243cce06353763dbb8b2ec46cb91d09aef1ced050a5fe6ffb221e1" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:56:41.339419 containerd[1499]: time="2025-09-09T23:56:41.339330949Z" level=info msg="connecting to shim 22b721e31c071ab765952cec3853b842774dd9557b13a47976200e82c5d5a7d4" address="unix:///run/containerd/s/f49cf132cfdd832dcb4dfb3bf197624359f8f919abd12502baed239fc6e70961" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:56:41.360107 containerd[1499]: time="2025-09-09T23:56:41.360048201Z" level=info msg="connecting to shim 07e265f45fefb6f1bcf35378949cda9964d1ee1e5ce09675ba3b7294716f09c8" address="unix:///run/containerd/s/f06b77e487d9bb1641c77a2beb73bfa6259914b38b2f7c66f9adb6e379a87191" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:56:41.381263 systemd[1]: Started cri-containerd-22b721e31c071ab765952cec3853b842774dd9557b13a47976200e82c5d5a7d4.scope - libcontainer container 22b721e31c071ab765952cec3853b842774dd9557b13a47976200e82c5d5a7d4. 
Sep 9 23:56:41.403093 systemd[1]: Started cri-containerd-dd0389e8dfc6a226dbbc6cb1baad383d855bba5f93b28898dbf24d08c7652579.scope - libcontainer container dd0389e8dfc6a226dbbc6cb1baad383d855bba5f93b28898dbf24d08c7652579. Sep 9 23:56:41.407938 systemd[1]: Started cri-containerd-07e265f45fefb6f1bcf35378949cda9964d1ee1e5ce09675ba3b7294716f09c8.scope - libcontainer container 07e265f45fefb6f1bcf35378949cda9964d1ee1e5ce09675ba3b7294716f09c8. Sep 9 23:56:41.436568 kubelet[2394]: E0909 23:56:41.436270 2394 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.154.191:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-0-0-n-d8dd570c6c?timeout=10s\": dial tcp 91.99.154.191:6443: connect: connection refused" interval="800ms" Sep 9 23:56:41.463663 containerd[1499]: time="2025-09-09T23:56:41.463547105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426-0-0-n-d8dd570c6c,Uid:3ee02ca6703fcd8d38d1f2ae1cb497b0,Namespace:kube-system,Attempt:0,} returns sandbox id \"22b721e31c071ab765952cec3853b842774dd9557b13a47976200e82c5d5a7d4\"" Sep 9 23:56:41.470230 containerd[1499]: time="2025-09-09T23:56:41.469734678Z" level=info msg="CreateContainer within sandbox \"22b721e31c071ab765952cec3853b842774dd9557b13a47976200e82c5d5a7d4\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 9 23:56:41.478344 containerd[1499]: time="2025-09-09T23:56:41.478199858Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426-0-0-n-d8dd570c6c,Uid:dc48acc3107a7538c12cc34489d1548f,Namespace:kube-system,Attempt:0,} returns sandbox id \"dd0389e8dfc6a226dbbc6cb1baad383d855bba5f93b28898dbf24d08c7652579\"" Sep 9 23:56:41.483676 containerd[1499]: time="2025-09-09T23:56:41.483614670Z" level=info msg="CreateContainer within sandbox \"dd0389e8dfc6a226dbbc6cb1baad383d855bba5f93b28898dbf24d08c7652579\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 9 
23:56:41.493631 containerd[1499]: time="2025-09-09T23:56:41.493591055Z" level=info msg="Container af937c4a67c0d2b2ee44cc345f4771ea0b978552413fc66fbf6fd129208531fb: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:56:41.495731 containerd[1499]: time="2025-09-09T23:56:41.495675271Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426-0-0-n-d8dd570c6c,Uid:561e0aca45e54ee0fc3691690fbed0fa,Namespace:kube-system,Attempt:0,} returns sandbox id \"07e265f45fefb6f1bcf35378949cda9964d1ee1e5ce09675ba3b7294716f09c8\"" Sep 9 23:56:41.498184 containerd[1499]: time="2025-09-09T23:56:41.498104911Z" level=info msg="Container ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:56:41.502884 containerd[1499]: time="2025-09-09T23:56:41.501983998Z" level=info msg="CreateContainer within sandbox \"07e265f45fefb6f1bcf35378949cda9964d1ee1e5ce09675ba3b7294716f09c8\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 9 23:56:41.505179 containerd[1499]: time="2025-09-09T23:56:41.505118803Z" level=info msg="CreateContainer within sandbox \"22b721e31c071ab765952cec3853b842774dd9557b13a47976200e82c5d5a7d4\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"af937c4a67c0d2b2ee44cc345f4771ea0b978552413fc66fbf6fd129208531fb\"" Sep 9 23:56:41.506349 containerd[1499]: time="2025-09-09T23:56:41.506317743Z" level=info msg="StartContainer for \"af937c4a67c0d2b2ee44cc345f4771ea0b978552413fc66fbf6fd129208531fb\"" Sep 9 23:56:41.508558 containerd[1499]: time="2025-09-09T23:56:41.508515154Z" level=info msg="CreateContainer within sandbox \"dd0389e8dfc6a226dbbc6cb1baad383d855bba5f93b28898dbf24d08c7652579\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb\"" Sep 9 23:56:41.509902 containerd[1499]: time="2025-09-09T23:56:41.509358512Z" level=info msg="StartContainer 
for \"ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb\"" Sep 9 23:56:41.509902 containerd[1499]: time="2025-09-09T23:56:41.509798050Z" level=info msg="connecting to shim af937c4a67c0d2b2ee44cc345f4771ea0b978552413fc66fbf6fd129208531fb" address="unix:///run/containerd/s/f49cf132cfdd832dcb4dfb3bf197624359f8f919abd12502baed239fc6e70961" protocol=ttrpc version=3 Sep 9 23:56:41.511656 containerd[1499]: time="2025-09-09T23:56:41.511597681Z" level=info msg="connecting to shim ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb" address="unix:///run/containerd/s/b990d394cb243cce06353763dbb8b2ec46cb91d09aef1ced050a5fe6ffb221e1" protocol=ttrpc version=3 Sep 9 23:56:41.522906 containerd[1499]: time="2025-09-09T23:56:41.522831084Z" level=info msg="Container 517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:56:41.546597 containerd[1499]: time="2025-09-09T23:56:41.545745747Z" level=info msg="CreateContainer within sandbox \"07e265f45fefb6f1bcf35378949cda9964d1ee1e5ce09675ba3b7294716f09c8\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d\"" Sep 9 23:56:41.548625 containerd[1499]: time="2025-09-09T23:56:41.548554687Z" level=info msg="StartContainer for \"517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d\"" Sep 9 23:56:41.550250 containerd[1499]: time="2025-09-09T23:56:41.549808145Z" level=info msg="connecting to shim 517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d" address="unix:///run/containerd/s/f06b77e487d9bb1641c77a2beb73bfa6259914b38b2f7c66f9adb6e379a87191" protocol=ttrpc version=3 Sep 9 23:56:41.552505 systemd[1]: Started cri-containerd-af937c4a67c0d2b2ee44cc345f4771ea0b978552413fc66fbf6fd129208531fb.scope - libcontainer container af937c4a67c0d2b2ee44cc345f4771ea0b978552413fc66fbf6fd129208531fb. 
Sep 9 23:56:41.578131 systemd[1]: Started cri-containerd-ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb.scope - libcontainer container ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb. Sep 9 23:56:41.595817 systemd[1]: Started cri-containerd-517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d.scope - libcontainer container 517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d. Sep 9 23:56:41.629432 containerd[1499]: time="2025-09-09T23:56:41.629387516Z" level=info msg="StartContainer for \"af937c4a67c0d2b2ee44cc345f4771ea0b978552413fc66fbf6fd129208531fb\" returns successfully" Sep 9 23:56:41.637534 kubelet[2394]: I0909 23:56:41.637182 2394 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.638168 kubelet[2394]: E0909 23:56:41.638046 2394 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.154.191:6443/api/v1/nodes\": dial tcp 91.99.154.191:6443: connect: connection refused" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.675110 containerd[1499]: time="2025-09-09T23:56:41.675073370Z" level=info msg="StartContainer for \"ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb\" returns successfully" Sep 9 23:56:41.685220 containerd[1499]: time="2025-09-09T23:56:41.684974758Z" level=info msg="StartContainer for \"517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d\" returns successfully" Sep 9 23:56:41.870955 kubelet[2394]: E0909 23:56:41.869975 2394 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.877396 kubelet[2394]: E0909 23:56:41.877143 2394 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:41.880827 
kubelet[2394]: E0909 23:56:41.880775 2394 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:42.441852 kubelet[2394]: I0909 23:56:42.440870 2394 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:42.884367 kubelet[2394]: E0909 23:56:42.884150 2394 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:42.884979 kubelet[2394]: E0909 23:56:42.884797 2394 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.277978 kubelet[2394]: E0909 23:56:45.277926 2394 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426-0-0-n-d8dd570c6c\" not found" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.338672 kubelet[2394]: I0909 23:56:45.338623 2394 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.429169 kubelet[2394]: I0909 23:56:45.429015 2394 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.449357 kubelet[2394]: E0909 23:56:45.449249 2394 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426-0-0-n-d8dd570c6c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.449357 kubelet[2394]: I0909 23:56:45.449351 2394 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.454416 kubelet[2394]: E0909 23:56:45.454362 2394 kubelet.go:3196] "Failed creating 
a mirror pod" err="pods \"kube-apiserver-ci-4426-0-0-n-d8dd570c6c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.454416 kubelet[2394]: I0909 23:56:45.454402 2394 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.461652 kubelet[2394]: E0909 23:56:45.461609 2394 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:45.821713 kubelet[2394]: I0909 23:56:45.821303 2394 apiserver.go:52] "Watching apiserver" Sep 9 23:56:45.828462 kubelet[2394]: I0909 23:56:45.828430 2394 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 23:56:46.298642 kubelet[2394]: I0909 23:56:46.298221 2394 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:47.320508 kubelet[2394]: I0909 23:56:47.320460 2394 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:47.716433 systemd[1]: Reload requested from client PID 2661 ('systemctl') (unit session-7.scope)... Sep 9 23:56:47.716450 systemd[1]: Reloading... Sep 9 23:56:47.826864 zram_generator::config[2705]: No configuration found. Sep 9 23:56:48.029548 systemd[1]: Reloading finished in 312 ms. Sep 9 23:56:48.059121 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:56:48.077340 systemd[1]: kubelet.service: Deactivated successfully. Sep 9 23:56:48.078933 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:56:48.079036 systemd[1]: kubelet.service: Consumed 2.051s CPU time, 128.2M memory peak. 
Sep 9 23:56:48.082177 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 9 23:56:48.250232 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 9 23:56:48.265702 (kubelet)[2750]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 9 23:56:48.333900 kubelet[2750]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:56:48.333900 kubelet[2750]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Sep 9 23:56:48.333900 kubelet[2750]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 9 23:56:48.333900 kubelet[2750]: I0909 23:56:48.333211 2750 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 9 23:56:48.341310 kubelet[2750]: I0909 23:56:48.341051 2750 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Sep 9 23:56:48.341310 kubelet[2750]: I0909 23:56:48.341135 2750 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 9 23:56:48.341580 kubelet[2750]: I0909 23:56:48.341538 2750 server.go:954] "Client rotation is on, will bootstrap in background" Sep 9 23:56:48.345623 kubelet[2750]: I0909 23:56:48.345514 2750 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Sep 9 23:56:48.349222 kubelet[2750]: I0909 23:56:48.348895 2750 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 9 23:56:48.355623 kubelet[2750]: I0909 23:56:48.355435 2750 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 9 23:56:48.358758 kubelet[2750]: I0909 23:56:48.358727 2750 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Sep 9 23:56:48.359129 kubelet[2750]: I0909 23:56:48.359078 2750 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 9 23:56:48.359355 kubelet[2750]: I0909 23:56:48.359130 2750 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426-0-0-n-d8dd570c6c","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none"
,"CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 9 23:56:48.359444 kubelet[2750]: I0909 23:56:48.359368 2750 topology_manager.go:138] "Creating topology manager with none policy" Sep 9 23:56:48.359444 kubelet[2750]: I0909 23:56:48.359379 2750 container_manager_linux.go:304] "Creating device plugin manager" Sep 9 23:56:48.359444 kubelet[2750]: I0909 23:56:48.359436 2750 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:56:48.359642 kubelet[2750]: I0909 23:56:48.359627 2750 kubelet.go:446] "Attempting to sync node with API server" Sep 9 23:56:48.359687 kubelet[2750]: I0909 23:56:48.359656 2750 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 9 23:56:48.359716 kubelet[2750]: I0909 23:56:48.359696 2750 kubelet.go:352] "Adding apiserver pod source" Sep 9 23:56:48.359716 kubelet[2750]: I0909 23:56:48.359710 2750 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 9 23:56:48.362856 kubelet[2750]: I0909 23:56:48.362447 2750 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 9 23:56:48.363117 kubelet[2750]: I0909 23:56:48.363091 2750 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 9 23:56:48.364852 kubelet[2750]: I0909 23:56:48.363643 2750 watchdog_linux.go:99] "Systemd watchdog is not enabled" Sep 9 23:56:48.364852 kubelet[2750]: I0909 23:56:48.363694 2750 server.go:1287] "Started kubelet" Sep 9 23:56:48.367854 kubelet[2750]: I0909 23:56:48.365735 2750 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 9 23:56:48.370509 kubelet[2750]: I0909 23:56:48.370452 2750 
server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Sep 9 23:56:48.371511 kubelet[2750]: I0909 23:56:48.371474 2750 server.go:479] "Adding debug handlers to kubelet server" Sep 9 23:56:48.376354 kubelet[2750]: I0909 23:56:48.376139 2750 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 9 23:56:48.376464 kubelet[2750]: I0909 23:56:48.376404 2750 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 9 23:56:48.378850 kubelet[2750]: I0909 23:56:48.378121 2750 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 9 23:56:48.380729 kubelet[2750]: I0909 23:56:48.380686 2750 volume_manager.go:297] "Starting Kubelet Volume Manager" Sep 9 23:56:48.381026 kubelet[2750]: E0909 23:56:48.380998 2750 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-0-0-n-d8dd570c6c\" not found" Sep 9 23:56:48.391720 kubelet[2750]: I0909 23:56:48.391686 2750 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Sep 9 23:56:48.392855 kubelet[2750]: I0909 23:56:48.391879 2750 reconciler.go:26] "Reconciler: start to sync state" Sep 9 23:56:48.396380 kubelet[2750]: I0909 23:56:48.396339 2750 factory.go:221] Registration of the systemd container factory successfully Sep 9 23:56:48.396493 kubelet[2750]: I0909 23:56:48.396467 2750 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 9 23:56:48.406322 kubelet[2750]: I0909 23:56:48.406129 2750 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 9 23:56:48.409586 kubelet[2750]: I0909 23:56:48.409546 2750 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 9 23:56:48.409586 kubelet[2750]: I0909 23:56:48.409579 2750 status_manager.go:227] "Starting to sync pod status with apiserver" Sep 9 23:56:48.409586 kubelet[2750]: I0909 23:56:48.409601 2750 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Sep 9 23:56:48.409586 kubelet[2750]: I0909 23:56:48.409607 2750 kubelet.go:2382] "Starting kubelet main sync loop" Sep 9 23:56:48.409801 kubelet[2750]: E0909 23:56:48.409660 2750 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 9 23:56:48.410206 kubelet[2750]: I0909 23:56:48.410183 2750 factory.go:221] Registration of the containerd container factory successfully Sep 9 23:56:48.417791 kubelet[2750]: E0909 23:56:48.417751 2750 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 9 23:56:48.473749 kubelet[2750]: I0909 23:56:48.473723 2750 cpu_manager.go:221] "Starting CPU manager" policy="none" Sep 9 23:56:48.473897 kubelet[2750]: I0909 23:56:48.473742 2750 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Sep 9 23:56:48.473897 kubelet[2750]: I0909 23:56:48.473794 2750 state_mem.go:36] "Initialized new in-memory state store" Sep 9 23:56:48.474008 kubelet[2750]: I0909 23:56:48.473980 2750 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 9 23:56:48.474037 kubelet[2750]: I0909 23:56:48.474008 2750 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 9 23:56:48.474037 kubelet[2750]: I0909 23:56:48.474028 2750 policy_none.go:49] "None policy: Start" Sep 9 23:56:48.474037 kubelet[2750]: I0909 23:56:48.474038 2750 memory_manager.go:186] "Starting memorymanager" policy="None" Sep 9 23:56:48.474124 kubelet[2750]: I0909 23:56:48.474060 2750 state_mem.go:35] "Initializing new in-memory state store" Sep 9 
23:56:48.474175 kubelet[2750]: I0909 23:56:48.474163 2750 state_mem.go:75] "Updated machine memory state" Sep 9 23:56:48.480503 kubelet[2750]: I0909 23:56:48.480470 2750 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 9 23:56:48.480864 kubelet[2750]: I0909 23:56:48.480824 2750 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 9 23:56:48.480984 kubelet[2750]: I0909 23:56:48.480946 2750 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 9 23:56:48.482119 kubelet[2750]: I0909 23:56:48.482094 2750 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 9 23:56:48.485866 kubelet[2750]: E0909 23:56:48.484715 2750 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Sep 9 23:56:48.510415 kubelet[2750]: I0909 23:56:48.510366 2750 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.510898 kubelet[2750]: I0909 23:56:48.510871 2750 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.511392 kubelet[2750]: I0909 23:56:48.511358 2750 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.522956 kubelet[2750]: E0909 23:56:48.522906 2750 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" already exists" pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.523803 kubelet[2750]: E0909 23:56:48.523740 2750 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426-0-0-n-d8dd570c6c\" already exists" pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 
23:56:48.587945 kubelet[2750]: I0909 23:56:48.586676 2750 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.601081 kubelet[2750]: I0909 23:56:48.601025 2750 kubelet_node_status.go:124] "Node was previously registered" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.601251 kubelet[2750]: I0909 23:56:48.601140 2750 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693436 kubelet[2750]: I0909 23:56:48.693010 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-kubeconfig\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693436 kubelet[2750]: I0909 23:56:48.693102 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-ca-certs\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693436 kubelet[2750]: I0909 23:56:48.693131 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-flexvolume-dir\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693436 kubelet[2750]: I0909 23:56:48.693159 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-k8s-certs\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693436 kubelet[2750]: I0909 23:56:48.693184 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dc48acc3107a7538c12cc34489d1548f-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426-0-0-n-d8dd570c6c\" (UID: \"dc48acc3107a7538c12cc34489d1548f\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693735 kubelet[2750]: I0909 23:56:48.693203 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/561e0aca45e54ee0fc3691690fbed0fa-kubeconfig\") pod \"kube-scheduler-ci-4426-0-0-n-d8dd570c6c\" (UID: \"561e0aca45e54ee0fc3691690fbed0fa\") " pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693735 kubelet[2750]: I0909 23:56:48.693218 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/3ee02ca6703fcd8d38d1f2ae1cb497b0-ca-certs\") pod \"kube-apiserver-ci-4426-0-0-n-d8dd570c6c\" (UID: \"3ee02ca6703fcd8d38d1f2ae1cb497b0\") " pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693735 kubelet[2750]: I0909 23:56:48.693271 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/3ee02ca6703fcd8d38d1f2ae1cb497b0-k8s-certs\") pod \"kube-apiserver-ci-4426-0-0-n-d8dd570c6c\" (UID: \"3ee02ca6703fcd8d38d1f2ae1cb497b0\") " pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:48.693735 kubelet[2750]: I0909 23:56:48.693290 2750 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/3ee02ca6703fcd8d38d1f2ae1cb497b0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426-0-0-n-d8dd570c6c\" (UID: \"3ee02ca6703fcd8d38d1f2ae1cb497b0\") " pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:49.360850 kubelet[2750]: I0909 23:56:49.360668 2750 apiserver.go:52] "Watching apiserver" Sep 9 23:56:49.392467 kubelet[2750]: I0909 23:56:49.392335 2750 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Sep 9 23:56:49.463251 kubelet[2750]: I0909 23:56:49.463211 2750 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:49.471981 kubelet[2750]: E0909 23:56:49.471939 2750 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426-0-0-n-d8dd570c6c\" already exists" pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" Sep 9 23:56:49.492533 kubelet[2750]: I0909 23:56:49.492438 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426-0-0-n-d8dd570c6c" podStartSLOduration=2.492414796 podStartE2EDuration="2.492414796s" podCreationTimestamp="2025-09-09 23:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:56:49.491397989 +0000 UTC m=+1.220308047" watchObservedRunningTime="2025-09-09 23:56:49.492414796 +0000 UTC m=+1.221324574" Sep 9 23:56:49.530670 kubelet[2750]: I0909 23:56:49.530598 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426-0-0-n-d8dd570c6c" podStartSLOduration=3.530579914 podStartE2EDuration="3.530579914s" podCreationTimestamp="2025-09-09 23:56:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:56:49.526360411 +0000 UTC m=+1.255270189" watchObservedRunningTime="2025-09-09 23:56:49.530579914 +0000 UTC m=+1.259489652" Sep 9 23:56:49.530874 kubelet[2750]: I0909 23:56:49.530678 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426-0-0-n-d8dd570c6c" podStartSLOduration=1.530673631 podStartE2EDuration="1.530673631s" podCreationTimestamp="2025-09-09 23:56:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:56:49.512243911 +0000 UTC m=+1.241153649" watchObservedRunningTime="2025-09-09 23:56:49.530673631 +0000 UTC m=+1.259583369" Sep 9 23:56:53.876149 kubelet[2750]: I0909 23:56:53.876102 2750 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 9 23:56:53.877109 containerd[1499]: time="2025-09-09T23:56:53.877032210Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 9 23:56:53.877766 kubelet[2750]: I0909 23:56:53.877678 2750 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 9 23:56:54.601069 systemd[1]: Created slice kubepods-besteffort-pod843d26c4_ce53_4fce_8ef7_f1e374c36bb9.slice - libcontainer container kubepods-besteffort-pod843d26c4_ce53_4fce_8ef7_f1e374c36bb9.slice. 
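The kubelet entries above record the node receiving its PodCIDR ("Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24") and pushing it to the runtime over CRI. As an illustrative sketch (not kubelet source code — the helper name `ip_in_pod_cidr` is invented for this example), the membership check implied by that assignment can be written with Python's standard `ipaddress` module:

```python
# Illustrative sketch, not kubelet source: checking whether a pod IP falls
# inside the node's newly assigned PodCIDR from the log line above.
import ipaddress

def ip_in_pod_cidr(ip: str, cidr: str) -> bool:
    """Return True if the given pod IP belongs to the node's PodCIDR."""
    return ipaddress.ip_address(ip) in ipaddress.ip_network(cidr)

pod_cidr = "192.168.0.0/24"  # value reported by kubelet_network.go above

print(ip_in_pod_cidr("192.168.0.5", pod_cidr))           # inside the /24
print(ip_in_pod_cidr("10.0.0.5", pod_cidr))              # outside the /24
print(ipaddress.ip_network(pod_cidr).num_addresses)      # 256 addresses in a /24
```

A /24 per node is the kubeadm-style default; each node in the cluster would be handed a disjoint slice of the cluster CIDR in the same way.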
Sep 9 23:56:54.638075 kubelet[2750]: I0909 23:56:54.637804 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/843d26c4-ce53-4fce-8ef7-f1e374c36bb9-kube-proxy\") pod \"kube-proxy-gvphj\" (UID: \"843d26c4-ce53-4fce-8ef7-f1e374c36bb9\") " pod="kube-system/kube-proxy-gvphj" Sep 9 23:56:54.638075 kubelet[2750]: I0909 23:56:54.637921 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/843d26c4-ce53-4fce-8ef7-f1e374c36bb9-xtables-lock\") pod \"kube-proxy-gvphj\" (UID: \"843d26c4-ce53-4fce-8ef7-f1e374c36bb9\") " pod="kube-system/kube-proxy-gvphj" Sep 9 23:56:54.638075 kubelet[2750]: I0909 23:56:54.637954 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/843d26c4-ce53-4fce-8ef7-f1e374c36bb9-lib-modules\") pod \"kube-proxy-gvphj\" (UID: \"843d26c4-ce53-4fce-8ef7-f1e374c36bb9\") " pod="kube-system/kube-proxy-gvphj" Sep 9 23:56:54.638075 kubelet[2750]: I0909 23:56:54.637984 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndx9z\" (UniqueName: \"kubernetes.io/projected/843d26c4-ce53-4fce-8ef7-f1e374c36bb9-kube-api-access-ndx9z\") pod \"kube-proxy-gvphj\" (UID: \"843d26c4-ce53-4fce-8ef7-f1e374c36bb9\") " pod="kube-system/kube-proxy-gvphj" Sep 9 23:56:54.912153 containerd[1499]: time="2025-09-09T23:56:54.912113051Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gvphj,Uid:843d26c4-ce53-4fce-8ef7-f1e374c36bb9,Namespace:kube-system,Attempt:0,}" Sep 9 23:56:54.949326 containerd[1499]: time="2025-09-09T23:56:54.949225541Z" level=info msg="connecting to shim 0df9e18317e4623f7b2dca9ed83bac39cddad477efad4f093f30022daa13558b" 
address="unix:///run/containerd/s/711ed3cdfe0f428748b973383a87eb2bbaab0a81e7ba8ed3ef5d6c8fb898fb1f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:56:54.962415 systemd[1]: Created slice kubepods-besteffort-pod1112463b_5b41_49e0_9ebd_b987ef3990b0.slice - libcontainer container kubepods-besteffort-pod1112463b_5b41_49e0_9ebd_b987ef3990b0.slice. Sep 9 23:56:55.001273 systemd[1]: Started cri-containerd-0df9e18317e4623f7b2dca9ed83bac39cddad477efad4f093f30022daa13558b.scope - libcontainer container 0df9e18317e4623f7b2dca9ed83bac39cddad477efad4f093f30022daa13558b. Sep 9 23:56:55.033035 containerd[1499]: time="2025-09-09T23:56:55.032924115Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-gvphj,Uid:843d26c4-ce53-4fce-8ef7-f1e374c36bb9,Namespace:kube-system,Attempt:0,} returns sandbox id \"0df9e18317e4623f7b2dca9ed83bac39cddad477efad4f093f30022daa13558b\"" Sep 9 23:56:55.038161 containerd[1499]: time="2025-09-09T23:56:55.038119868Z" level=info msg="CreateContainer within sandbox \"0df9e18317e4623f7b2dca9ed83bac39cddad477efad4f093f30022daa13558b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 9 23:56:55.040124 kubelet[2750]: I0909 23:56:55.040037 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tznkx\" (UniqueName: \"kubernetes.io/projected/1112463b-5b41-49e0-9ebd-b987ef3990b0-kube-api-access-tznkx\") pod \"tigera-operator-755d956888-x7v9h\" (UID: \"1112463b-5b41-49e0-9ebd-b987ef3990b0\") " pod="tigera-operator/tigera-operator-755d956888-x7v9h" Sep 9 23:56:55.040124 kubelet[2750]: I0909 23:56:55.040086 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/1112463b-5b41-49e0-9ebd-b987ef3990b0-var-lib-calico\") pod \"tigera-operator-755d956888-x7v9h\" (UID: \"1112463b-5b41-49e0-9ebd-b987ef3990b0\") " pod="tigera-operator/tigera-operator-755d956888-x7v9h" Sep 9 
23:56:55.056572 containerd[1499]: time="2025-09-09T23:56:55.056506498Z" level=info msg="Container d64785926aaa6bfe5d63029a35e5e1be4fc52313d5e0ca2900476b18559afda9: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:56:55.069633 containerd[1499]: time="2025-09-09T23:56:55.069487021Z" level=info msg="CreateContainer within sandbox \"0df9e18317e4623f7b2dca9ed83bac39cddad477efad4f093f30022daa13558b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"d64785926aaa6bfe5d63029a35e5e1be4fc52313d5e0ca2900476b18559afda9\"" Sep 9 23:56:55.072432 containerd[1499]: time="2025-09-09T23:56:55.070380479Z" level=info msg="StartContainer for \"d64785926aaa6bfe5d63029a35e5e1be4fc52313d5e0ca2900476b18559afda9\"" Sep 9 23:56:55.073343 containerd[1499]: time="2025-09-09T23:56:55.073309687Z" level=info msg="connecting to shim d64785926aaa6bfe5d63029a35e5e1be4fc52313d5e0ca2900476b18559afda9" address="unix:///run/containerd/s/711ed3cdfe0f428748b973383a87eb2bbaab0a81e7ba8ed3ef5d6c8fb898fb1f" protocol=ttrpc version=3 Sep 9 23:56:55.093308 systemd[1]: Started cri-containerd-d64785926aaa6bfe5d63029a35e5e1be4fc52313d5e0ca2900476b18559afda9.scope - libcontainer container d64785926aaa6bfe5d63029a35e5e1be4fc52313d5e0ca2900476b18559afda9. 
Sep 9 23:56:55.134379 containerd[1499]: time="2025-09-09T23:56:55.134325276Z" level=info msg="StartContainer for \"d64785926aaa6bfe5d63029a35e5e1be4fc52313d5e0ca2900476b18559afda9\" returns successfully" Sep 9 23:56:55.268345 containerd[1499]: time="2025-09-09T23:56:55.268234682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-x7v9h,Uid:1112463b-5b41-49e0-9ebd-b987ef3990b0,Namespace:tigera-operator,Attempt:0,}" Sep 9 23:56:55.297416 containerd[1499]: time="2025-09-09T23:56:55.297369490Z" level=info msg="connecting to shim bd414dd2df0a9b3f866cc7daeb6cefc247d8104602120ab8c783f5032d1523c8" address="unix:///run/containerd/s/221fa533343353e0ccc6ccc8e8c45d704a74d43037806dce145b30904afce858" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:56:55.330109 systemd[1]: Started cri-containerd-bd414dd2df0a9b3f866cc7daeb6cefc247d8104602120ab8c783f5032d1523c8.scope - libcontainer container bd414dd2df0a9b3f866cc7daeb6cefc247d8104602120ab8c783f5032d1523c8. Sep 9 23:56:55.386610 containerd[1499]: time="2025-09-09T23:56:55.386490231Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-755d956888-x7v9h,Uid:1112463b-5b41-49e0-9ebd-b987ef3990b0,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"bd414dd2df0a9b3f866cc7daeb6cefc247d8104602120ab8c783f5032d1523c8\"" Sep 9 23:56:55.389073 containerd[1499]: time="2025-09-09T23:56:55.389031689Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 9 23:56:55.521720 kubelet[2750]: I0909 23:56:55.521576 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-gvphj" podStartSLOduration=1.5215541689999998 podStartE2EDuration="1.521554169s" podCreationTimestamp="2025-09-09 23:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:56:55.520445997 +0000 UTC m=+7.249355735" watchObservedRunningTime="2025-09-09 
23:56:55.521554169 +0000 UTC m=+7.250463907" Sep 9 23:56:58.559417 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount111332178.mount: Deactivated successfully. Sep 9 23:57:01.587860 containerd[1499]: time="2025-09-09T23:57:01.587737489Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:01.590096 containerd[1499]: time="2025-09-09T23:57:01.589652772Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 9 23:57:01.591346 containerd[1499]: time="2025-09-09T23:57:01.591296941Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:01.594535 containerd[1499]: time="2025-09-09T23:57:01.594414322Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:01.595448 containerd[1499]: time="2025-09-09T23:57:01.595413063Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 6.206232257s" Sep 9 23:57:01.595689 containerd[1499]: time="2025-09-09T23:57:01.595569100Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 9 23:57:01.599591 containerd[1499]: time="2025-09-09T23:57:01.599518425Z" level=info msg="CreateContainer within sandbox \"bd414dd2df0a9b3f866cc7daeb6cefc247d8104602120ab8c783f5032d1523c8\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 9 23:57:01.612398 containerd[1499]: time="2025-09-09T23:57:01.611948830Z" level=info msg="Container 9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:57:01.621599 containerd[1499]: time="2025-09-09T23:57:01.621502729Z" level=info msg="CreateContainer within sandbox \"bd414dd2df0a9b3f866cc7daeb6cefc247d8104602120ab8c783f5032d1523c8\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\"" Sep 9 23:57:01.622868 containerd[1499]: time="2025-09-09T23:57:01.622691466Z" level=info msg="StartContainer for \"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\"" Sep 9 23:57:01.624331 containerd[1499]: time="2025-09-09T23:57:01.624229277Z" level=info msg="connecting to shim 9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8" address="unix:///run/containerd/s/221fa533343353e0ccc6ccc8e8c45d704a74d43037806dce145b30904afce858" protocol=ttrpc version=3 Sep 9 23:57:01.649158 systemd[1]: Started cri-containerd-9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8.scope - libcontainer container 9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8. Sep 9 23:57:01.691978 containerd[1499]: time="2025-09-09T23:57:01.691911875Z" level=info msg="StartContainer for \"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\" returns successfully" Sep 9 23:57:08.286420 sudo[1835]: pam_unix(sudo:session): session closed for user root Sep 9 23:57:08.448742 sshd[1834]: Connection closed by 139.178.68.195 port 35290 Sep 9 23:57:08.447480 sshd-session[1831]: pam_unix(sshd:session): session closed for user core Sep 9 23:57:08.455432 systemd[1]: sshd@6-91.99.154.191:22-139.178.68.195:35290.service: Deactivated successfully. Sep 9 23:57:08.460817 systemd[1]: session-7.scope: Deactivated successfully. 
Sep 9 23:57:08.463232 systemd[1]: session-7.scope: Consumed 7.520s CPU time, 226.5M memory peak. Sep 9 23:57:08.469330 systemd-logind[1481]: Session 7 logged out. Waiting for processes to exit. Sep 9 23:57:08.473708 systemd-logind[1481]: Removed session 7. Sep 9 23:57:15.499454 kubelet[2750]: I0909 23:57:15.498905 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-755d956888-x7v9h" podStartSLOduration=15.290224427 podStartE2EDuration="21.498885552s" podCreationTimestamp="2025-09-09 23:56:54 +0000 UTC" firstStartedPulling="2025-09-09 23:56:55.387989355 +0000 UTC m=+7.116899093" lastFinishedPulling="2025-09-09 23:57:01.59665048 +0000 UTC m=+13.325560218" observedRunningTime="2025-09-09 23:57:02.548995877 +0000 UTC m=+14.277905615" watchObservedRunningTime="2025-09-09 23:57:15.498885552 +0000 UTC m=+27.227795410" Sep 9 23:57:15.513103 systemd[1]: Created slice kubepods-besteffort-pod806f7d2f_dd7e_4c5d_9d69_9f8eb39e046f.slice - libcontainer container kubepods-besteffort-pod806f7d2f_dd7e_4c5d_9d69_9f8eb39e046f.slice. 
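The pod_startup_latency_tracker entry above reports both podStartE2EDuration (21.498885552s) and a smaller podStartSLOduration (15.290224427s) for the tigera-operator pod. Consistent with the logged timestamps, the SLO figure equals the end-to-end startup time minus the image-pull window (firstStartedPulling to lastFinishedPulling). A sketch reproducing that arithmetic, with the nanosecond timestamps truncated to microseconds (Python's `%f` accepts at most six fractional digits) and the `parse` helper invented for this example:

```python
# Reproducing the pod-startup arithmetic for tigera-operator-755d956888-x7v9h
# from the timestamps logged above. SLO duration = E2E duration - pull window.
from datetime import datetime, timezone

def parse(ts: str) -> datetime:
    """Parse a log timestamp like '2025-09-09 23:56:55.387989 +0000 UTC'."""
    date, time, *_ = ts.split()
    if "." not in time:
        time += ".000000"
    return datetime.strptime(f"{date} {time}", "%Y-%m-%d %H:%M:%S.%f").replace(
        tzinfo=timezone.utc
    )

created  = parse("2025-09-09 23:56:54 +0000 UTC")          # podCreationTimestamp
running  = parse("2025-09-09 23:57:15.498885 +0000 UTC")   # observedRunningTime
pull_beg = parse("2025-09-09 23:56:55.387989 +0000 UTC")   # firstStartedPulling
pull_end = parse("2025-09-09 23:57:01.596650 +0000 UTC")   # lastFinishedPulling

e2e = (running - created).total_seconds()            # podStartE2EDuration
slo = e2e - (pull_end - pull_beg).total_seconds()    # podStartSLOduration
print(f"E2E={e2e:.6f}s SLO={slo:.6f}s")
```

The ~6.2s pull window matches the "Pulled image quay.io/tigera/operator:v1.38.6 ... in 6.206232257s" containerd entry earlier in the log. The kube-proxy and static pods, by contrast, show sentinel pull timestamps (0001-01-01), so their SLO and E2E durations coincide.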
Sep 9 23:57:15.671347 kubelet[2750]: I0909 23:57:15.671172 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rc5\" (UniqueName: \"kubernetes.io/projected/806f7d2f-dd7e-4c5d-9d69-9f8eb39e046f-kube-api-access-c6rc5\") pod \"calico-typha-595d4944bf-rfnq2\" (UID: \"806f7d2f-dd7e-4c5d-9d69-9f8eb39e046f\") " pod="calico-system/calico-typha-595d4944bf-rfnq2" Sep 9 23:57:15.671347 kubelet[2750]: I0909 23:57:15.671260 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/806f7d2f-dd7e-4c5d-9d69-9f8eb39e046f-tigera-ca-bundle\") pod \"calico-typha-595d4944bf-rfnq2\" (UID: \"806f7d2f-dd7e-4c5d-9d69-9f8eb39e046f\") " pod="calico-system/calico-typha-595d4944bf-rfnq2" Sep 9 23:57:15.671347 kubelet[2750]: I0909 23:57:15.671279 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/806f7d2f-dd7e-4c5d-9d69-9f8eb39e046f-typha-certs\") pod \"calico-typha-595d4944bf-rfnq2\" (UID: \"806f7d2f-dd7e-4c5d-9d69-9f8eb39e046f\") " pod="calico-system/calico-typha-595d4944bf-rfnq2" Sep 9 23:57:15.801537 systemd[1]: Created slice kubepods-besteffort-pod228d603b_00d7_449b_b780_0246b1ffea57.slice - libcontainer container kubepods-besteffort-pod228d603b_00d7_449b_b780_0246b1ffea57.slice. 
Sep 9 23:57:15.821545 containerd[1499]: time="2025-09-09T23:57:15.821226678Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-595d4944bf-rfnq2,Uid:806f7d2f-dd7e-4c5d-9d69-9f8eb39e046f,Namespace:calico-system,Attempt:0,}" Sep 9 23:57:15.858700 containerd[1499]: time="2025-09-09T23:57:15.858635109Z" level=info msg="connecting to shim 367e683e0081c115c756a8ea7737b4643f29815d0ec0f648c5d61808ad23fa15" address="unix:///run/containerd/s/ff42afbb337bcf5e98c5a0d441be6172a2039798ea954f7732c309eb8e4bb991" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:57:15.913101 systemd[1]: Started cri-containerd-367e683e0081c115c756a8ea7737b4643f29815d0ec0f648c5d61808ad23fa15.scope - libcontainer container 367e683e0081c115c756a8ea7737b4643f29815d0ec0f648c5d61808ad23fa15. Sep 9 23:57:15.973345 kubelet[2750]: I0909 23:57:15.972845 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/228d603b-00d7-449b-b780-0246b1ffea57-tigera-ca-bundle\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973345 kubelet[2750]: I0909 23:57:15.972894 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-xtables-lock\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973345 kubelet[2750]: I0909 23:57:15.972911 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fg6s9\" (UniqueName: \"kubernetes.io/projected/228d603b-00d7-449b-b780-0246b1ffea57-kube-api-access-fg6s9\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973345 kubelet[2750]: I0909 
23:57:15.972930 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-flexvol-driver-host\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973345 kubelet[2750]: I0909 23:57:15.972947 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-cni-log-dir\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973711 kubelet[2750]: I0909 23:57:15.972964 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/228d603b-00d7-449b-b780-0246b1ffea57-node-certs\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973711 kubelet[2750]: I0909 23:57:15.973012 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-policysync\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973711 kubelet[2750]: I0909 23:57:15.973033 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-var-run-calico\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973711 kubelet[2750]: I0909 23:57:15.973054 2750 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-var-lib-calico\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973711 kubelet[2750]: I0909 23:57:15.973069 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-cni-bin-dir\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973827 kubelet[2750]: I0909 23:57:15.973085 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-cni-net-dir\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.973827 kubelet[2750]: I0909 23:57:15.973104 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/228d603b-00d7-449b-b780-0246b1ffea57-lib-modules\") pod \"calico-node-bww8f\" (UID: \"228d603b-00d7-449b-b780-0246b1ffea57\") " pod="calico-system/calico-node-bww8f" Sep 9 23:57:15.977664 kubelet[2750]: E0909 23:57:15.977067 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9n6x" podUID="78e1ea12-0c8b-4e75-9773-e6539bdf3c00" Sep 9 23:57:16.074673 containerd[1499]: time="2025-09-09T23:57:16.074529616Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-typha-595d4944bf-rfnq2,Uid:806f7d2f-dd7e-4c5d-9d69-9f8eb39e046f,Namespace:calico-system,Attempt:0,} returns sandbox id \"367e683e0081c115c756a8ea7737b4643f29815d0ec0f648c5d61808ad23fa15\"" Sep 9 23:57:16.080114 kubelet[2750]: E0909 23:57:16.080005 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.080114 kubelet[2750]: W0909 23:57:16.080041 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.080114 kubelet[2750]: E0909 23:57:16.080068 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.085862 containerd[1499]: time="2025-09-09T23:57:16.083725748Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 9 23:57:16.088943 kubelet[2750]: E0909 23:57:16.088735 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.088943 kubelet[2750]: W0909 23:57:16.088880 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.088943 kubelet[2750]: E0909 23:57:16.088904 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.132980 kubelet[2750]: E0909 23:57:16.132944 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.132980 kubelet[2750]: W0909 23:57:16.132968 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.133156 kubelet[2750]: E0909 23:57:16.132990 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.174119 kubelet[2750]: E0909 23:57:16.173979 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.174119 kubelet[2750]: W0909 23:57:16.174006 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.174119 kubelet[2750]: E0909 23:57:16.174027 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.174119 kubelet[2750]: I0909 23:57:16.174060 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78e1ea12-0c8b-4e75-9773-e6539bdf3c00-registration-dir\") pod \"csi-node-driver-p9n6x\" (UID: \"78e1ea12-0c8b-4e75-9773-e6539bdf3c00\") " pod="calico-system/csi-node-driver-p9n6x" Sep 9 23:57:16.174830 kubelet[2750]: E0909 23:57:16.174621 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.174830 kubelet[2750]: W0909 23:57:16.174709 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.174830 kubelet[2750]: E0909 23:57:16.174739 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.174830 kubelet[2750]: I0909 23:57:16.174766 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtf7w\" (UniqueName: \"kubernetes.io/projected/78e1ea12-0c8b-4e75-9773-e6539bdf3c00-kube-api-access-rtf7w\") pod \"csi-node-driver-p9n6x\" (UID: \"78e1ea12-0c8b-4e75-9773-e6539bdf3c00\") " pod="calico-system/csi-node-driver-p9n6x" Sep 9 23:57:16.175181 kubelet[2750]: E0909 23:57:16.175164 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.175362 kubelet[2750]: W0909 23:57:16.175237 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.175362 kubelet[2750]: E0909 23:57:16.175273 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.175362 kubelet[2750]: I0909 23:57:16.175297 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78e1ea12-0c8b-4e75-9773-e6539bdf3c00-socket-dir\") pod \"csi-node-driver-p9n6x\" (UID: \"78e1ea12-0c8b-4e75-9773-e6539bdf3c00\") " pod="calico-system/csi-node-driver-p9n6x" Sep 9 23:57:16.175695 kubelet[2750]: E0909 23:57:16.175677 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.175890 kubelet[2750]: W0909 23:57:16.175761 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.175890 kubelet[2750]: E0909 23:57:16.175851 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.176016 kubelet[2750]: I0909 23:57:16.175898 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/78e1ea12-0c8b-4e75-9773-e6539bdf3c00-varrun\") pod \"csi-node-driver-p9n6x\" (UID: \"78e1ea12-0c8b-4e75-9773-e6539bdf3c00\") " pod="calico-system/csi-node-driver-p9n6x" Sep 9 23:57:16.176241 kubelet[2750]: E0909 23:57:16.176225 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.176361 kubelet[2750]: W0909 23:57:16.176301 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.176478 kubelet[2750]: E0909 23:57:16.176371 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.176669 kubelet[2750]: E0909 23:57:16.176653 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.176787 kubelet[2750]: W0909 23:57:16.176724 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.176787 kubelet[2750]: E0909 23:57:16.176773 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.177005 kubelet[2750]: E0909 23:57:16.176988 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.177124 kubelet[2750]: W0909 23:57:16.177067 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.177124 kubelet[2750]: E0909 23:57:16.177113 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.177336 kubelet[2750]: E0909 23:57:16.177322 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.177470 kubelet[2750]: W0909 23:57:16.177409 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.177470 kubelet[2750]: E0909 23:57:16.177462 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.177470 kubelet[2750]: I0909 23:57:16.177501 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78e1ea12-0c8b-4e75-9773-e6539bdf3c00-kubelet-dir\") pod \"csi-node-driver-p9n6x\" (UID: \"78e1ea12-0c8b-4e75-9773-e6539bdf3c00\") " pod="calico-system/csi-node-driver-p9n6x" Sep 9 23:57:16.177813 kubelet[2750]: E0909 23:57:16.177721 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.177813 kubelet[2750]: W0909 23:57:16.177736 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.177813 kubelet[2750]: E0909 23:57:16.177799 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.178338 kubelet[2750]: E0909 23:57:16.178246 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.178338 kubelet[2750]: W0909 23:57:16.178266 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.178338 kubelet[2750]: E0909 23:57:16.178284 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.178954 kubelet[2750]: E0909 23:57:16.178929 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.178954 kubelet[2750]: W0909 23:57:16.178951 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.179054 kubelet[2750]: E0909 23:57:16.178970 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.179932 kubelet[2750]: E0909 23:57:16.179903 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.179932 kubelet[2750]: W0909 23:57:16.179925 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.180049 kubelet[2750]: E0909 23:57:16.179944 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.181138 kubelet[2750]: E0909 23:57:16.181069 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.181138 kubelet[2750]: W0909 23:57:16.181135 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.181285 kubelet[2750]: E0909 23:57:16.181159 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.181493 kubelet[2750]: E0909 23:57:16.181476 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.181493 kubelet[2750]: W0909 23:57:16.181492 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.181553 kubelet[2750]: E0909 23:57:16.181506 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.182183 kubelet[2750]: E0909 23:57:16.182156 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.182183 kubelet[2750]: W0909 23:57:16.182180 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.182294 kubelet[2750]: E0909 23:57:16.182195 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.280567 kubelet[2750]: E0909 23:57:16.280526 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.280567 kubelet[2750]: W0909 23:57:16.280564 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.280883 kubelet[2750]: E0909 23:57:16.280596 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.281122 kubelet[2750]: E0909 23:57:16.281097 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.281189 kubelet[2750]: W0909 23:57:16.281130 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.281261 kubelet[2750]: E0909 23:57:16.281164 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.281716 kubelet[2750]: E0909 23:57:16.281593 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.281716 kubelet[2750]: W0909 23:57:16.281615 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.281716 kubelet[2750]: E0909 23:57:16.281640 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.283135 kubelet[2750]: E0909 23:57:16.283063 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.283135 kubelet[2750]: W0909 23:57:16.283083 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.283135 kubelet[2750]: E0909 23:57:16.283109 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.283454 kubelet[2750]: E0909 23:57:16.283433 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.283454 kubelet[2750]: W0909 23:57:16.283454 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.283517 kubelet[2750]: E0909 23:57:16.283475 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.283864 kubelet[2750]: E0909 23:57:16.283817 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.283864 kubelet[2750]: W0909 23:57:16.283850 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.284058 kubelet[2750]: E0909 23:57:16.283914 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.284261 kubelet[2750]: E0909 23:57:16.284242 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.284261 kubelet[2750]: W0909 23:57:16.284259 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.284384 kubelet[2750]: E0909 23:57:16.284322 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.284888 kubelet[2750]: E0909 23:57:16.284810 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.285065 kubelet[2750]: W0909 23:57:16.284830 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.285257 kubelet[2750]: E0909 23:57:16.285149 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.285507 kubelet[2750]: E0909 23:57:16.285485 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.285507 kubelet[2750]: W0909 23:57:16.285505 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.285711 kubelet[2750]: E0909 23:57:16.285626 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.285939 kubelet[2750]: E0909 23:57:16.285917 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.285939 kubelet[2750]: W0909 23:57:16.285936 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.286079 kubelet[2750]: E0909 23:57:16.286000 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.286315 kubelet[2750]: E0909 23:57:16.286296 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.286315 kubelet[2750]: W0909 23:57:16.286312 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.286467 kubelet[2750]: E0909 23:57:16.286440 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.286790 kubelet[2750]: E0909 23:57:16.286685 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.286790 kubelet[2750]: W0909 23:57:16.286702 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.286790 kubelet[2750]: E0909 23:57:16.286732 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.287071 kubelet[2750]: E0909 23:57:16.287051 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.287071 kubelet[2750]: W0909 23:57:16.287068 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.287184 kubelet[2750]: E0909 23:57:16.287124 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.287476 kubelet[2750]: E0909 23:57:16.287456 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.287476 kubelet[2750]: W0909 23:57:16.287475 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.287606 kubelet[2750]: E0909 23:57:16.287535 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.287912 kubelet[2750]: E0909 23:57:16.287890 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.287958 kubelet[2750]: W0909 23:57:16.287910 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.287958 kubelet[2750]: E0909 23:57:16.287974 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.288211 kubelet[2750]: E0909 23:57:16.288193 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.288211 kubelet[2750]: W0909 23:57:16.288209 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.288338 kubelet[2750]: E0909 23:57:16.288266 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.289278 kubelet[2750]: E0909 23:57:16.289247 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.289278 kubelet[2750]: W0909 23:57:16.289270 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.289566 kubelet[2750]: E0909 23:57:16.289351 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.289888 kubelet[2750]: E0909 23:57:16.289862 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.289888 kubelet[2750]: W0909 23:57:16.289883 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.290954 kubelet[2750]: E0909 23:57:16.289952 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.291343 kubelet[2750]: E0909 23:57:16.291314 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.291343 kubelet[2750]: W0909 23:57:16.291339 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.291553 kubelet[2750]: E0909 23:57:16.291464 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.291620 kubelet[2750]: E0909 23:57:16.291600 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.291620 kubelet[2750]: W0909 23:57:16.291612 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.291790 kubelet[2750]: E0909 23:57:16.291695 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.291977 kubelet[2750]: E0909 23:57:16.291957 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.291977 kubelet[2750]: W0909 23:57:16.291973 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.292153 kubelet[2750]: E0909 23:57:16.292055 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.293636 kubelet[2750]: E0909 23:57:16.293600 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.293636 kubelet[2750]: W0909 23:57:16.293626 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.293636 kubelet[2750]: E0909 23:57:16.293654 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.294299 kubelet[2750]: E0909 23:57:16.294139 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.294299 kubelet[2750]: W0909 23:57:16.294155 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.294299 kubelet[2750]: E0909 23:57:16.294192 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.294750 kubelet[2750]: E0909 23:57:16.294732 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.294852 kubelet[2750]: W0909 23:57:16.294821 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.295003 kubelet[2750]: E0909 23:57:16.294974 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.295210 kubelet[2750]: E0909 23:57:16.295197 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.295286 kubelet[2750]: W0909 23:57:16.295273 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.295569 kubelet[2750]: E0909 23:57:16.295336 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:16.318116 kubelet[2750]: E0909 23:57:16.318020 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:16.318116 kubelet[2750]: W0909 23:57:16.318044 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:16.318116 kubelet[2750]: E0909 23:57:16.318069 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:16.425845 containerd[1499]: time="2025-09-09T23:57:16.425733978Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bww8f,Uid:228d603b-00d7-449b-b780-0246b1ffea57,Namespace:calico-system,Attempt:0,}" Sep 9 23:57:16.456744 containerd[1499]: time="2025-09-09T23:57:16.456619976Z" level=info msg="connecting to shim 5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e" address="unix:///run/containerd/s/388c3d68d5c7ec67239d1fb5b28556f2879574c742a8d1336cf1019dfe515419" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:57:16.498623 systemd[1]: Started cri-containerd-5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e.scope - libcontainer container 5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e. 
Sep 9 23:57:16.648442 containerd[1499]: time="2025-09-09T23:57:16.648344248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bww8f,Uid:228d603b-00d7-449b-b780-0246b1ffea57,Namespace:calico-system,Attempt:0,} returns sandbox id \"5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e\"" Sep 9 23:57:17.410362 kubelet[2750]: E0909 23:57:17.410282 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9n6x" podUID="78e1ea12-0c8b-4e75-9773-e6539bdf3c00" Sep 9 23:57:17.653557 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount109455802.mount: Deactivated successfully. Sep 9 23:57:18.682965 containerd[1499]: time="2025-09-09T23:57:18.682299243Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:18.683429 containerd[1499]: time="2025-09-09T23:57:18.683346311Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 9 23:57:18.684142 containerd[1499]: time="2025-09-09T23:57:18.684113503Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:18.686451 containerd[1499]: time="2025-09-09T23:57:18.686342158Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:18.687152 containerd[1499]: time="2025-09-09T23:57:18.687118909Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id 
\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.603344882s" Sep 9 23:57:18.687152 containerd[1499]: time="2025-09-09T23:57:18.687151509Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 9 23:57:18.689723 containerd[1499]: time="2025-09-09T23:57:18.689428483Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 9 23:57:18.707379 containerd[1499]: time="2025-09-09T23:57:18.707262284Z" level=info msg="CreateContainer within sandbox \"367e683e0081c115c756a8ea7737b4643f29815d0ec0f648c5d61808ad23fa15\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 9 23:57:18.720007 containerd[1499]: time="2025-09-09T23:57:18.719965982Z" level=info msg="Container a8e37023476ed3630dc25833279118663bc83469677da8597943d7ae939693e8: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:57:18.733045 containerd[1499]: time="2025-09-09T23:57:18.732992676Z" level=info msg="CreateContainer within sandbox \"367e683e0081c115c756a8ea7737b4643f29815d0ec0f648c5d61808ad23fa15\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a8e37023476ed3630dc25833279118663bc83469677da8597943d7ae939693e8\"" Sep 9 23:57:18.733911 containerd[1499]: time="2025-09-09T23:57:18.733878346Z" level=info msg="StartContainer for \"a8e37023476ed3630dc25833279118663bc83469677da8597943d7ae939693e8\"" Sep 9 23:57:18.743214 containerd[1499]: time="2025-09-09T23:57:18.743114643Z" level=info msg="connecting to shim a8e37023476ed3630dc25833279118663bc83469677da8597943d7ae939693e8" address="unix:///run/containerd/s/ff42afbb337bcf5e98c5a0d441be6172a2039798ea954f7732c309eb8e4bb991" protocol=ttrpc version=3 Sep 9 
23:57:18.779118 systemd[1]: Started cri-containerd-a8e37023476ed3630dc25833279118663bc83469677da8597943d7ae939693e8.scope - libcontainer container a8e37023476ed3630dc25833279118663bc83469677da8597943d7ae939693e8. Sep 9 23:57:18.848594 containerd[1499]: time="2025-09-09T23:57:18.848432864Z" level=info msg="StartContainer for \"a8e37023476ed3630dc25833279118663bc83469677da8597943d7ae939693e8\" returns successfully" Sep 9 23:57:19.410060 kubelet[2750]: E0909 23:57:19.410000 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9n6x" podUID="78e1ea12-0c8b-4e75-9773-e6539bdf3c00" Sep 9 23:57:19.599278 kubelet[2750]: E0909 23:57:19.599122 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.599278 kubelet[2750]: W0909 23:57:19.599149 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.599278 kubelet[2750]: E0909 23:57:19.599171 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.600489 kubelet[2750]: E0909 23:57:19.600221 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.600489 kubelet[2750]: W0909 23:57:19.600246 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.600489 kubelet[2750]: E0909 23:57:19.600372 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.600706 kubelet[2750]: E0909 23:57:19.600692 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.601076 kubelet[2750]: W0909 23:57:19.601056 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.601952 kubelet[2750]: E0909 23:57:19.601161 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.602187 kubelet[2750]: E0909 23:57:19.602172 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.602459 kubelet[2750]: W0909 23:57:19.602311 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.602459 kubelet[2750]: E0909 23:57:19.602333 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.602627 kubelet[2750]: E0909 23:57:19.602614 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.603956 kubelet[2750]: W0909 23:57:19.603792 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.603956 kubelet[2750]: E0909 23:57:19.603827 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.604234 kubelet[2750]: E0909 23:57:19.604127 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.604234 kubelet[2750]: W0909 23:57:19.604141 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.604234 kubelet[2750]: E0909 23:57:19.604153 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.604792 kubelet[2750]: E0909 23:57:19.604766 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.604908 kubelet[2750]: W0909 23:57:19.604893 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.604982 kubelet[2750]: E0909 23:57:19.604971 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.605196 kubelet[2750]: E0909 23:57:19.605184 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.605378 kubelet[2750]: W0909 23:57:19.605249 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.605378 kubelet[2750]: E0909 23:57:19.605267 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.605722 kubelet[2750]: I0909 23:57:19.605662 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-595d4944bf-rfnq2" podStartSLOduration=2.000438998 podStartE2EDuration="4.605645619s" podCreationTimestamp="2025-09-09 23:57:15 +0000 UTC" firstStartedPulling="2025-09-09 23:57:16.083032676 +0000 UTC m=+27.811942414" lastFinishedPulling="2025-09-09 23:57:18.688239297 +0000 UTC m=+30.417149035" observedRunningTime="2025-09-09 23:57:19.604782148 +0000 UTC m=+31.333691926" watchObservedRunningTime="2025-09-09 23:57:19.605645619 +0000 UTC m=+31.334555357" Sep 9 23:57:19.606733 kubelet[2750]: E0909 23:57:19.606272 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.606733 kubelet[2750]: W0909 23:57:19.606604 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.606733 kubelet[2750]: E0909 23:57:19.606629 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.607116 kubelet[2750]: E0909 23:57:19.607074 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.607451 kubelet[2750]: W0909 23:57:19.607223 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.607451 kubelet[2750]: E0909 23:57:19.607244 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.607909 kubelet[2750]: E0909 23:57:19.607798 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.607909 kubelet[2750]: W0909 23:57:19.607817 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.608425 kubelet[2750]: E0909 23:57:19.607830 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.609475 kubelet[2750]: E0909 23:57:19.609040 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.609475 kubelet[2750]: W0909 23:57:19.609060 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.609475 kubelet[2750]: E0909 23:57:19.609075 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.610245 kubelet[2750]: E0909 23:57:19.610222 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.610406 kubelet[2750]: W0909 23:57:19.610334 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.610473 kubelet[2750]: E0909 23:57:19.610461 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.610752 kubelet[2750]: E0909 23:57:19.610737 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.610949 kubelet[2750]: W0909 23:57:19.610808 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.610949 kubelet[2750]: E0909 23:57:19.610823 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.611112 kubelet[2750]: E0909 23:57:19.611098 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.611428 kubelet[2750]: W0909 23:57:19.611154 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.611428 kubelet[2750]: E0909 23:57:19.611168 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.611646 kubelet[2750]: E0909 23:57:19.611630 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.611711 kubelet[2750]: W0909 23:57:19.611698 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.611770 kubelet[2750]: E0909 23:57:19.611760 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.612223 kubelet[2750]: E0909 23:57:19.612208 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.612291 kubelet[2750]: W0909 23:57:19.612281 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.612885 kubelet[2750]: E0909 23:57:19.612398 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.612885 kubelet[2750]: E0909 23:57:19.612643 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.612885 kubelet[2750]: W0909 23:57:19.612652 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.612885 kubelet[2750]: E0909 23:57:19.612682 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.613111 kubelet[2750]: E0909 23:57:19.613098 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.613304 kubelet[2750]: W0909 23:57:19.613164 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.613304 kubelet[2750]: E0909 23:57:19.613196 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.613571 kubelet[2750]: E0909 23:57:19.613557 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.613634 kubelet[2750]: W0909 23:57:19.613623 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.613733 kubelet[2750]: E0909 23:57:19.613696 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.613958 kubelet[2750]: E0909 23:57:19.613945 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.614031 kubelet[2750]: W0909 23:57:19.614019 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.614117 kubelet[2750]: E0909 23:57:19.614094 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.614504 kubelet[2750]: E0909 23:57:19.614396 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.614504 kubelet[2750]: W0909 23:57:19.614411 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.614504 kubelet[2750]: E0909 23:57:19.614440 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.614685 kubelet[2750]: E0909 23:57:19.614673 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.614917 kubelet[2750]: W0909 23:57:19.614732 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.614917 kubelet[2750]: E0909 23:57:19.614759 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.615105 kubelet[2750]: E0909 23:57:19.615093 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.615164 kubelet[2750]: W0909 23:57:19.615153 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.615238 kubelet[2750]: E0909 23:57:19.615225 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.616688 kubelet[2750]: E0909 23:57:19.616655 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.616688 kubelet[2750]: W0909 23:57:19.616686 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.617395 kubelet[2750]: E0909 23:57:19.616724 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.617395 kubelet[2750]: E0909 23:57:19.617164 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.617395 kubelet[2750]: W0909 23:57:19.617178 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.617395 kubelet[2750]: E0909 23:57:19.617371 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.617806 kubelet[2750]: E0909 23:57:19.617790 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.617890 kubelet[2750]: W0909 23:57:19.617807 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.617890 kubelet[2750]: E0909 23:57:19.617863 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.618194 kubelet[2750]: E0909 23:57:19.618177 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.618232 kubelet[2750]: W0909 23:57:19.618194 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.618286 kubelet[2750]: E0909 23:57:19.618270 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.618556 kubelet[2750]: E0909 23:57:19.618498 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.618556 kubelet[2750]: W0909 23:57:19.618513 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.618556 kubelet[2750]: E0909 23:57:19.618531 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.619640 kubelet[2750]: E0909 23:57:19.619572 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.619640 kubelet[2750]: W0909 23:57:19.619594 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.619817 kubelet[2750]: E0909 23:57:19.619797 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.620147 kubelet[2750]: E0909 23:57:19.620128 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.620147 kubelet[2750]: W0909 23:57:19.620149 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.620147 kubelet[2750]: E0909 23:57:19.620169 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:19.620581 kubelet[2750]: E0909 23:57:19.620534 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.620581 kubelet[2750]: W0909 23:57:19.620578 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.620581 kubelet[2750]: E0909 23:57:19.620591 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:19.621105 kubelet[2750]: E0909 23:57:19.621075 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:19.621105 kubelet[2750]: W0909 23:57:19.621092 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:19.621194 kubelet[2750]: E0909 23:57:19.621121 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.617164 kubelet[2750]: E0909 23:57:20.617111 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.617164 kubelet[2750]: W0909 23:57:20.617147 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.617613 kubelet[2750]: E0909 23:57:20.617178 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.617613 kubelet[2750]: E0909 23:57:20.617471 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.617613 kubelet[2750]: W0909 23:57:20.617482 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.617613 kubelet[2750]: E0909 23:57:20.617495 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.617734 kubelet[2750]: E0909 23:57:20.617669 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.617734 kubelet[2750]: W0909 23:57:20.617678 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.617734 kubelet[2750]: E0909 23:57:20.617688 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.617920 kubelet[2750]: E0909 23:57:20.617904 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.617955 kubelet[2750]: W0909 23:57:20.617920 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.617955 kubelet[2750]: E0909 23:57:20.617933 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.618242 kubelet[2750]: E0909 23:57:20.618219 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.618242 kubelet[2750]: W0909 23:57:20.618239 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.618387 kubelet[2750]: E0909 23:57:20.618253 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.618608 kubelet[2750]: E0909 23:57:20.618571 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.618608 kubelet[2750]: W0909 23:57:20.618594 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.618608 kubelet[2750]: E0909 23:57:20.618608 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.618881 kubelet[2750]: E0909 23:57:20.618861 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.618881 kubelet[2750]: W0909 23:57:20.618876 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.618958 kubelet[2750]: E0909 23:57:20.618891 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.619291 kubelet[2750]: E0909 23:57:20.619272 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.619291 kubelet[2750]: W0909 23:57:20.619291 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.619401 kubelet[2750]: E0909 23:57:20.619305 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.619613 kubelet[2750]: E0909 23:57:20.619596 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.619613 kubelet[2750]: W0909 23:57:20.619612 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.619685 kubelet[2750]: E0909 23:57:20.619624 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.620026 kubelet[2750]: E0909 23:57:20.619887 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.620026 kubelet[2750]: W0909 23:57:20.619975 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.620026 kubelet[2750]: E0909 23:57:20.619991 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.620302 kubelet[2750]: E0909 23:57:20.620263 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.620351 kubelet[2750]: W0909 23:57:20.620302 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.620408 kubelet[2750]: E0909 23:57:20.620320 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.620721 kubelet[2750]: E0909 23:57:20.620702 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.620763 kubelet[2750]: W0909 23:57:20.620722 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.620763 kubelet[2750]: E0909 23:57:20.620735 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.621229 kubelet[2750]: E0909 23:57:20.621210 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.621229 kubelet[2750]: W0909 23:57:20.621227 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.621320 kubelet[2750]: E0909 23:57:20.621239 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.621462 kubelet[2750]: E0909 23:57:20.621423 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.621462 kubelet[2750]: W0909 23:57:20.621438 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.621462 kubelet[2750]: E0909 23:57:20.621447 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.621675 kubelet[2750]: E0909 23:57:20.621661 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.621675 kubelet[2750]: W0909 23:57:20.621672 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.621736 kubelet[2750]: E0909 23:57:20.621680 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.621951 kubelet[2750]: E0909 23:57:20.621936 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.621951 kubelet[2750]: W0909 23:57:20.621949 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.622018 kubelet[2750]: E0909 23:57:20.621959 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.622210 kubelet[2750]: E0909 23:57:20.622196 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.622210 kubelet[2750]: W0909 23:57:20.622208 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.622276 kubelet[2750]: E0909 23:57:20.622224 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.622438 kubelet[2750]: E0909 23:57:20.622423 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.622438 kubelet[2750]: W0909 23:57:20.622436 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.622500 kubelet[2750]: E0909 23:57:20.622457 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.622670 kubelet[2750]: E0909 23:57:20.622654 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.622670 kubelet[2750]: W0909 23:57:20.622668 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.622824 kubelet[2750]: E0909 23:57:20.622683 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.623030 kubelet[2750]: E0909 23:57:20.623010 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.623030 kubelet[2750]: W0909 23:57:20.623028 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.623095 kubelet[2750]: E0909 23:57:20.623042 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.623235 kubelet[2750]: E0909 23:57:20.623219 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.623281 kubelet[2750]: W0909 23:57:20.623232 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.623308 kubelet[2750]: E0909 23:57:20.623285 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.623576 kubelet[2750]: E0909 23:57:20.623538 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.623576 kubelet[2750]: W0909 23:57:20.623568 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.623657 kubelet[2750]: E0909 23:57:20.623645 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.623753 kubelet[2750]: E0909 23:57:20.623739 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.623782 kubelet[2750]: W0909 23:57:20.623753 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.623915 kubelet[2750]: E0909 23:57:20.623842 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.624017 kubelet[2750]: E0909 23:57:20.624004 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.624017 kubelet[2750]: W0909 23:57:20.624015 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.624076 kubelet[2750]: E0909 23:57:20.624030 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.624198 kubelet[2750]: E0909 23:57:20.624185 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.624198 kubelet[2750]: W0909 23:57:20.624197 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.624260 kubelet[2750]: E0909 23:57:20.624214 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.624449 kubelet[2750]: E0909 23:57:20.624433 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.624449 kubelet[2750]: W0909 23:57:20.624447 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.624511 kubelet[2750]: E0909 23:57:20.624461 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.624875 kubelet[2750]: E0909 23:57:20.624790 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.624875 kubelet[2750]: W0909 23:57:20.624807 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.624875 kubelet[2750]: E0909 23:57:20.624820 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.625229 kubelet[2750]: E0909 23:57:20.625199 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.625229 kubelet[2750]: W0909 23:57:20.625214 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.625313 kubelet[2750]: E0909 23:57:20.625254 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.625461 kubelet[2750]: E0909 23:57:20.625444 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.625461 kubelet[2750]: W0909 23:57:20.625457 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.625592 kubelet[2750]: E0909 23:57:20.625540 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.625679 kubelet[2750]: E0909 23:57:20.625667 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.625679 kubelet[2750]: W0909 23:57:20.625677 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.625730 kubelet[2750]: E0909 23:57:20.625691 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.625883 kubelet[2750]: E0909 23:57:20.625863 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.625931 kubelet[2750]: W0909 23:57:20.625875 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.625931 kubelet[2750]: E0909 23:57:20.625896 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.626184 kubelet[2750]: E0909 23:57:20.626165 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.626184 kubelet[2750]: W0909 23:57:20.626181 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.626247 kubelet[2750]: E0909 23:57:20.626190 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 9 23:57:20.627006 kubelet[2750]: E0909 23:57:20.626985 2750 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 9 23:57:20.627006 kubelet[2750]: W0909 23:57:20.627002 2750 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 9 23:57:20.627006 kubelet[2750]: E0909 23:57:20.627014 2750 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 9 23:57:20.936854 containerd[1499]: time="2025-09-09T23:57:20.936767662Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:20.940083 containerd[1499]: time="2025-09-09T23:57:20.939623831Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 9 23:57:20.940670 containerd[1499]: time="2025-09-09T23:57:20.940624020Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:20.948465 containerd[1499]: time="2025-09-09T23:57:20.948399937Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:20.950822 containerd[1499]: time="2025-09-09T23:57:20.950743072Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 2.261273589s" Sep 9 23:57:20.950822 containerd[1499]: time="2025-09-09T23:57:20.950803751Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 9 23:57:20.955065 containerd[1499]: time="2025-09-09T23:57:20.955017226Z" level=info msg="CreateContainer within sandbox \"5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 9 23:57:20.973639 containerd[1499]: time="2025-09-09T23:57:20.973580067Z" level=info msg="Container 7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:57:20.986181 containerd[1499]: time="2025-09-09T23:57:20.986104093Z" level=info msg="CreateContainer within sandbox \"5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1\"" Sep 9 23:57:20.987045 containerd[1499]: time="2025-09-09T23:57:20.987002323Z" level=info msg="StartContainer for \"7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1\"" Sep 9 23:57:20.989332 containerd[1499]: time="2025-09-09T23:57:20.989232419Z" level=info msg="connecting to shim 7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1" address="unix:///run/containerd/s/388c3d68d5c7ec67239d1fb5b28556f2879574c742a8d1336cf1019dfe515419" protocol=ttrpc version=3 Sep 9 23:57:21.021067 systemd[1]: Started cri-containerd-7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1.scope - libcontainer container 7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1.
Sep 9 23:57:21.085640 containerd[1499]: time="2025-09-09T23:57:21.085469566Z" level=info msg="StartContainer for \"7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1\" returns successfully" Sep 9 23:57:21.104641 systemd[1]: cri-containerd-7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1.scope: Deactivated successfully. Sep 9 23:57:21.114553 containerd[1499]: time="2025-09-09T23:57:21.114309823Z" level=info msg="received exit event container_id:\"7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1\" id:\"7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1\" pid:3429 exited_at:{seconds:1757462241 nanos:113820228}" Sep 9 23:57:21.115068 containerd[1499]: time="2025-09-09T23:57:21.115017095Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1\" id:\"7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1\" pid:3429 exited_at:{seconds:1757462241 nanos:113820228}" Sep 9 23:57:21.150082 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7c55ed2f0adea69b39998018bbe3d86f14b7ae0aeb632b5445a1fd3abfdeacf1-rootfs.mount: Deactivated successfully. 
Sep 9 23:57:21.411486 kubelet[2750]: E0909 23:57:21.410965 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9n6x" podUID="78e1ea12-0c8b-4e75-9773-e6539bdf3c00" Sep 9 23:57:21.599182 containerd[1499]: time="2025-09-09T23:57:21.599127490Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 9 23:57:23.410657 kubelet[2750]: E0909 23:57:23.410269 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9n6x" podUID="78e1ea12-0c8b-4e75-9773-e6539bdf3c00" Sep 9 23:57:25.274633 containerd[1499]: time="2025-09-09T23:57:25.273910115Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:25.276749 containerd[1499]: time="2025-09-09T23:57:25.276621328Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 9 23:57:25.278954 containerd[1499]: time="2025-09-09T23:57:25.278398591Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:25.282311 containerd[1499]: time="2025-09-09T23:57:25.282214753Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:25.283191 containerd[1499]: time="2025-09-09T23:57:25.283135304Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 3.683960935s" Sep 9 23:57:25.283327 containerd[1499]: time="2025-09-09T23:57:25.283308823Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 9 23:57:25.288212 containerd[1499]: time="2025-09-09T23:57:25.288165935Z" level=info msg="CreateContainer within sandbox \"5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 9 23:57:25.301952 containerd[1499]: time="2025-09-09T23:57:25.298195757Z" level=info msg="Container d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:57:25.311334 containerd[1499]: time="2025-09-09T23:57:25.311243270Z" level=info msg="CreateContainer within sandbox \"5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f\"" Sep 9 23:57:25.312360 containerd[1499]: time="2025-09-09T23:57:25.312310699Z" level=info msg="StartContainer for \"d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f\"" Sep 9 23:57:25.315031 containerd[1499]: time="2025-09-09T23:57:25.314820435Z" level=info msg="connecting to shim d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f" address="unix:///run/containerd/s/388c3d68d5c7ec67239d1fb5b28556f2879574c742a8d1336cf1019dfe515419" protocol=ttrpc version=3 Sep 9 23:57:25.342304 systemd[1]: Started cri-containerd-d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f.scope - libcontainer container d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f. Sep 9 23:57:25.389076 containerd[1499]: time="2025-09-09T23:57:25.388977150Z" level=info msg="StartContainer for \"d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f\" returns successfully" Sep 9 23:57:25.411070 kubelet[2750]: E0909 23:57:25.411013 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-p9n6x" podUID="78e1ea12-0c8b-4e75-9773-e6539bdf3c00" Sep 9 23:57:25.948117 containerd[1499]: time="2025-09-09T23:57:25.948065244Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 9 23:57:25.950963 systemd[1]: cri-containerd-d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f.scope: Deactivated successfully. Sep 9 23:57:25.951560 systemd[1]: cri-containerd-d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f.scope: Consumed 542ms CPU time, 185.6M memory peak, 165.8M written to disk.
Sep 9 23:57:25.955007 containerd[1499]: time="2025-09-09T23:57:25.954963576Z" level=info msg="received exit event container_id:\"d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f\" id:\"d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f\" pid:3487 exited_at:{seconds:1757462245 nanos:954656259}" Sep 9 23:57:25.955444 containerd[1499]: time="2025-09-09T23:57:25.955413332Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f\" id:\"d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f\" pid:3487 exited_at:{seconds:1757462245 nanos:954656259}" Sep 9 23:57:25.982513 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d9fc9025fae830e53e892399e4557dca7ae831f5e87a1873ba4628130139da4f-rootfs.mount: Deactivated successfully. Sep 9 23:57:26.015859 kubelet[2750]: I0909 23:57:26.015419 2750 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Sep 9 23:57:26.063484 kubelet[2750]: I0909 23:57:26.063362 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgzd\" (UniqueName: \"kubernetes.io/projected/22f16259-3f09-42f8-a728-4b51e247e528-kube-api-access-pwgzd\") pod \"coredns-668d6bf9bc-v2h8b\" (UID: \"22f16259-3f09-42f8-a728-4b51e247e528\") " pod="kube-system/coredns-668d6bf9bc-v2h8b" Sep 9 23:57:26.063484 kubelet[2750]: I0909 23:57:26.063466 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/22f16259-3f09-42f8-a728-4b51e247e528-config-volume\") pod \"coredns-668d6bf9bc-v2h8b\" (UID: \"22f16259-3f09-42f8-a728-4b51e247e528\") " pod="kube-system/coredns-668d6bf9bc-v2h8b" Sep 9 23:57:26.073535 systemd[1]: Created slice kubepods-burstable-pod22f16259_3f09_42f8_a728_4b51e247e528.slice - libcontainer container kubepods-burstable-pod22f16259_3f09_42f8_a728_4b51e247e528.slice. Sep 9 23:57:26.092793 systemd[1]: Created slice kubepods-besteffort-pod934bf4ae_b87a_4084_b4c1_444253f9295c.slice - libcontainer container kubepods-besteffort-pod934bf4ae_b87a_4084_b4c1_444253f9295c.slice. Sep 9 23:57:26.117222 systemd[1]: Created slice kubepods-besteffort-pod56657f5b_5f65_4c76_a8c6_274344b1f6c6.slice - libcontainer container kubepods-besteffort-pod56657f5b_5f65_4c76_a8c6_274344b1f6c6.slice. Sep 9 23:57:26.131702 systemd[1]: Created slice kubepods-besteffort-pod2b493a96_3c4e_400a_b1bc_669373df4971.slice - libcontainer container kubepods-besteffort-pod2b493a96_3c4e_400a_b1bc_669373df4971.slice. Sep 9 23:57:26.141637 systemd[1]: Created slice kubepods-besteffort-podd88e065c_f4c4_4c79_aec9_35cb6eb39df3.slice - libcontainer container kubepods-besteffort-podd88e065c_f4c4_4c79_aec9_35cb6eb39df3.slice. Sep 9 23:57:26.150876 systemd[1]: Created slice kubepods-besteffort-pod8a4f4bc9_5ecb_484f_bb86_7e44b0d39f72.slice - libcontainer container kubepods-besteffort-pod8a4f4bc9_5ecb_484f_bb86_7e44b0d39f72.slice. Sep 9 23:57:26.161819 systemd[1]: Created slice kubepods-burstable-pod5e89e89d_d9f4_4ee7_afac_d5ef87dca67a.slice - libcontainer container kubepods-burstable-pod5e89e89d_d9f4_4ee7_afac_d5ef87dca67a.slice.
Sep 9 23:57:26.164036 kubelet[2750]: I0909 23:57:26.163774 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b493a96-3c4e-400a-b1bc-669373df4971-config\") pod \"goldmane-54d579b49d-fw5ss\" (UID: \"2b493a96-3c4e-400a-b1bc-669373df4971\") " pod="calico-system/goldmane-54d579b49d-fw5ss"
Sep 9 23:57:26.164036 kubelet[2750]: I0909 23:57:26.163818 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d88e065c-f4c4-4c79-aec9-35cb6eb39df3-tigera-ca-bundle\") pod \"calico-kube-controllers-749fff5dd9-2hjps\" (UID: \"d88e065c-f4c4-4c79-aec9-35cb6eb39df3\") " pod="calico-system/calico-kube-controllers-749fff5dd9-2hjps"
Sep 9 23:57:26.164036 kubelet[2750]: I0909 23:57:26.164034 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2b493a96-3c4e-400a-b1bc-669373df4971-goldmane-ca-bundle\") pod \"goldmane-54d579b49d-fw5ss\" (UID: \"2b493a96-3c4e-400a-b1bc-669373df4971\") " pod="calico-system/goldmane-54d579b49d-fw5ss"
Sep 9 23:57:26.164590 kubelet[2750]: I0909 23:57:26.164361 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-whisker-ca-bundle\") pod \"whisker-59885bbdc9-jlcjj\" (UID: \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\") " pod="calico-system/whisker-59885bbdc9-jlcjj"
Sep 9 23:57:26.165044 kubelet[2750]: I0909 23:57:26.164604 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvlq\" (UniqueName: \"kubernetes.io/projected/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-kube-api-access-kqvlq\") pod \"whisker-59885bbdc9-jlcjj\" (UID: \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\") " pod="calico-system/whisker-59885bbdc9-jlcjj"
Sep 9 23:57:26.165044 kubelet[2750]: I0909 23:57:26.165066 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/56657f5b-5f65-4c76-a8c6-274344b1f6c6-calico-apiserver-certs\") pod \"calico-apiserver-54cdc7764d-sgvm7\" (UID: \"56657f5b-5f65-4c76-a8c6-274344b1f6c6\") " pod="calico-apiserver/calico-apiserver-54cdc7764d-sgvm7"
Sep 9 23:57:26.165196 kubelet[2750]: I0909 23:57:26.165110 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2b493a96-3c4e-400a-b1bc-669373df4971-goldmane-key-pair\") pod \"goldmane-54d579b49d-fw5ss\" (UID: \"2b493a96-3c4e-400a-b1bc-669373df4971\") " pod="calico-system/goldmane-54d579b49d-fw5ss"
Sep 9 23:57:26.165196 kubelet[2750]: I0909 23:57:26.165131 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5e89e89d-d9f4-4ee7-afac-d5ef87dca67a-config-volume\") pod \"coredns-668d6bf9bc-8js5x\" (UID: \"5e89e89d-d9f4-4ee7-afac-d5ef87dca67a\") " pod="kube-system/coredns-668d6bf9bc-8js5x"
Sep 9 23:57:26.165196 kubelet[2750]: I0909 23:57:26.165152 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skt4c\" (UniqueName: \"kubernetes.io/projected/56657f5b-5f65-4c76-a8c6-274344b1f6c6-kube-api-access-skt4c\") pod \"calico-apiserver-54cdc7764d-sgvm7\" (UID: \"56657f5b-5f65-4c76-a8c6-274344b1f6c6\") " pod="calico-apiserver/calico-apiserver-54cdc7764d-sgvm7"
Sep 9 23:57:26.165196 kubelet[2750]: I0909 23:57:26.165177 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/934bf4ae-b87a-4084-b4c1-444253f9295c-calico-apiserver-certs\") pod \"calico-apiserver-54cdc7764d-2x7kh\" (UID: \"934bf4ae-b87a-4084-b4c1-444253f9295c\") " pod="calico-apiserver/calico-apiserver-54cdc7764d-2x7kh"
Sep 9 23:57:26.165347 kubelet[2750]: I0909 23:57:26.165212 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lxbq\" (UniqueName: \"kubernetes.io/projected/d88e065c-f4c4-4c79-aec9-35cb6eb39df3-kube-api-access-6lxbq\") pod \"calico-kube-controllers-749fff5dd9-2hjps\" (UID: \"d88e065c-f4c4-4c79-aec9-35cb6eb39df3\") " pod="calico-system/calico-kube-controllers-749fff5dd9-2hjps"
Sep 9 23:57:26.165347 kubelet[2750]: I0909 23:57:26.165232 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-whisker-backend-key-pair\") pod \"whisker-59885bbdc9-jlcjj\" (UID: \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\") " pod="calico-system/whisker-59885bbdc9-jlcjj"
Sep 9 23:57:26.165347 kubelet[2750]: I0909 23:57:26.165273 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwv8l\" (UniqueName: \"kubernetes.io/projected/5e89e89d-d9f4-4ee7-afac-d5ef87dca67a-kube-api-access-kwv8l\") pod \"coredns-668d6bf9bc-8js5x\" (UID: \"5e89e89d-d9f4-4ee7-afac-d5ef87dca67a\") " pod="kube-system/coredns-668d6bf9bc-8js5x"
Sep 9 23:57:26.165347 kubelet[2750]: I0909 23:57:26.165292 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghftx\" (UniqueName: \"kubernetes.io/projected/2b493a96-3c4e-400a-b1bc-669373df4971-kube-api-access-ghftx\") pod \"goldmane-54d579b49d-fw5ss\" (UID: \"2b493a96-3c4e-400a-b1bc-669373df4971\") " pod="calico-system/goldmane-54d579b49d-fw5ss"
Sep 9 23:57:26.165347 kubelet[2750]: I0909 23:57:26.165308 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spnws\" (UniqueName: \"kubernetes.io/projected/934bf4ae-b87a-4084-b4c1-444253f9295c-kube-api-access-spnws\") pod \"calico-apiserver-54cdc7764d-2x7kh\" (UID: \"934bf4ae-b87a-4084-b4c1-444253f9295c\") " pod="calico-apiserver/calico-apiserver-54cdc7764d-2x7kh"
Sep 9 23:57:26.384957 containerd[1499]: time="2025-09-09T23:57:26.384899512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-v2h8b,Uid:22f16259-3f09-42f8-a728-4b51e247e528,Namespace:kube-system,Attempt:0,}"
Sep 9 23:57:26.405376 containerd[1499]: time="2025-09-09T23:57:26.405226597Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cdc7764d-2x7kh,Uid:934bf4ae-b87a-4084-b4c1-444253f9295c,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 23:57:26.427712 containerd[1499]: time="2025-09-09T23:57:26.427626221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cdc7764d-sgvm7,Uid:56657f5b-5f65-4c76-a8c6-274344b1f6c6,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 23:57:26.440025 containerd[1499]: time="2025-09-09T23:57:26.439939343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fw5ss,Uid:2b493a96-3c4e-400a-b1bc-669373df4971,Namespace:calico-system,Attempt:0,}"
Sep 9 23:57:26.446661 containerd[1499]: time="2025-09-09T23:57:26.446606639Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-749fff5dd9-2hjps,Uid:d88e065c-f4c4-4c79-aec9-35cb6eb39df3,Namespace:calico-system,Attempt:0,}"
Sep 9 23:57:26.467505 containerd[1499]: time="2025-09-09T23:57:26.467141201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59885bbdc9-jlcjj,Uid:8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72,Namespace:calico-system,Attempt:0,}"
Sep 9 23:57:26.474537 containerd[1499]: time="2025-09-09T23:57:26.474326212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8js5x,Uid:5e89e89d-d9f4-4ee7-afac-d5ef87dca67a,Namespace:kube-system,Attempt:0,}"
Sep 9 23:57:26.598891 containerd[1499]: time="2025-09-09T23:57:26.598811614Z" level=error msg="Failed to destroy network for sandbox \"6d1f636d6ff0282e0b164011c66966596516132001360453b07a06e27af1313a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.602622 containerd[1499]: time="2025-09-09T23:57:26.602547258Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cdc7764d-2x7kh,Uid:934bf4ae-b87a-4084-b4c1-444253f9295c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d1f636d6ff0282e0b164011c66966596516132001360453b07a06e27af1313a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.603498 kubelet[2750]: E0909 23:57:26.603446 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d1f636d6ff0282e0b164011c66966596516132001360453b07a06e27af1313a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.604958 kubelet[2750]: E0909 23:57:26.603525 2750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d1f636d6ff0282e0b164011c66966596516132001360453b07a06e27af1313a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54cdc7764d-2x7kh"
Sep 9 23:57:26.604958 kubelet[2750]: E0909 23:57:26.603549 2750 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6d1f636d6ff0282e0b164011c66966596516132001360453b07a06e27af1313a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54cdc7764d-2x7kh"
Sep 9 23:57:26.604958 kubelet[2750]: E0909 23:57:26.603588 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-54cdc7764d-2x7kh_calico-apiserver(934bf4ae-b87a-4084-b4c1-444253f9295c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-54cdc7764d-2x7kh_calico-apiserver(934bf4ae-b87a-4084-b4c1-444253f9295c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6d1f636d6ff0282e0b164011c66966596516132001360453b07a06e27af1313a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54cdc7764d-2x7kh" podUID="934bf4ae-b87a-4084-b4c1-444253f9295c"
Sep 9 23:57:26.613575 containerd[1499]: time="2025-09-09T23:57:26.613502593Z" level=error msg="Failed to destroy network for sandbox \"713f83185b5fe14465d8a8dc78cca03946a01c3e10a871e5bea7d5e09ae59296\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.616642 containerd[1499]: time="2025-09-09T23:57:26.616582203Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-v2h8b,Uid:22f16259-3f09-42f8-a728-4b51e247e528,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"713f83185b5fe14465d8a8dc78cca03946a01c3e10a871e5bea7d5e09ae59296\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.618271 kubelet[2750]: E0909 23:57:26.616829 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"713f83185b5fe14465d8a8dc78cca03946a01c3e10a871e5bea7d5e09ae59296\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.618271 kubelet[2750]: E0909 23:57:26.616912 2750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"713f83185b5fe14465d8a8dc78cca03946a01c3e10a871e5bea7d5e09ae59296\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-v2h8b"
Sep 9 23:57:26.618271 kubelet[2750]: E0909 23:57:26.616931 2750 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"713f83185b5fe14465d8a8dc78cca03946a01c3e10a871e5bea7d5e09ae59296\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-v2h8b"
Sep 9 23:57:26.618393 kubelet[2750]: E0909 23:57:26.616978 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-v2h8b_kube-system(22f16259-3f09-42f8-a728-4b51e247e528)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-v2h8b_kube-system(22f16259-3f09-42f8-a728-4b51e247e528)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"713f83185b5fe14465d8a8dc78cca03946a01c3e10a871e5bea7d5e09ae59296\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-v2h8b" podUID="22f16259-3f09-42f8-a728-4b51e247e528"
Sep 9 23:57:26.638290 containerd[1499]: time="2025-09-09T23:57:26.638144796Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\""
Sep 9 23:57:26.693136 containerd[1499]: time="2025-09-09T23:57:26.693073107Z" level=error msg="Failed to destroy network for sandbox \"90ffe65a0c3870208c7c4c50e24e75a5bdaf1604ab00b2566f470c45a691a998\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.694925 containerd[1499]: time="2025-09-09T23:57:26.694829090Z" level=error msg="Failed to destroy network for sandbox \"18f281e684c4adcf63123f37e5354225c2fa1d3fb9a312769ec30e0edfb4e7a4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.696400 containerd[1499]: time="2025-09-09T23:57:26.696330396Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cdc7764d-sgvm7,Uid:56657f5b-5f65-4c76-a8c6-274344b1f6c6,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"90ffe65a0c3870208c7c4c50e24e75a5bdaf1604ab00b2566f470c45a691a998\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.696720 kubelet[2750]: E0909 23:57:26.696683 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90ffe65a0c3870208c7c4c50e24e75a5bdaf1604ab00b2566f470c45a691a998\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.696781 kubelet[2750]: E0909 23:57:26.696758 2750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90ffe65a0c3870208c7c4c50e24e75a5bdaf1604ab00b2566f470c45a691a998\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54cdc7764d-sgvm7"
Sep 9 23:57:26.696808 kubelet[2750]: E0909 23:57:26.696788 2750 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90ffe65a0c3870208c7c4c50e24e75a5bdaf1604ab00b2566f470c45a691a998\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-54cdc7764d-sgvm7"
Sep 9 23:57:26.696891 kubelet[2750]: E0909 23:57:26.696862 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-54cdc7764d-sgvm7_calico-apiserver(56657f5b-5f65-4c76-a8c6-274344b1f6c6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-54cdc7764d-sgvm7_calico-apiserver(56657f5b-5f65-4c76-a8c6-274344b1f6c6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"90ffe65a0c3870208c7c4c50e24e75a5bdaf1604ab00b2566f470c45a691a998\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-54cdc7764d-sgvm7" podUID="56657f5b-5f65-4c76-a8c6-274344b1f6c6"
Sep 9 23:57:26.697909 containerd[1499]: time="2025-09-09T23:57:26.697307386Z" level=error msg="Failed to destroy network for sandbox \"a3af6c4a88926a3df0dde2f26665cf20df5bd48d6d3b18a29800a61800a5e10b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.699227 containerd[1499]: time="2025-09-09T23:57:26.699177129Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-749fff5dd9-2hjps,Uid:d88e065c-f4c4-4c79-aec9-35cb6eb39df3,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"18f281e684c4adcf63123f37e5354225c2fa1d3fb9a312769ec30e0edfb4e7a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.700736 kubelet[2750]: E0909 23:57:26.700669 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18f281e684c4adcf63123f37e5354225c2fa1d3fb9a312769ec30e0edfb4e7a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.700925 kubelet[2750]: E0909 23:57:26.700883 2750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18f281e684c4adcf63123f37e5354225c2fa1d3fb9a312769ec30e0edfb4e7a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-749fff5dd9-2hjps"
Sep 9 23:57:26.701438 kubelet[2750]: E0909 23:57:26.700925 2750 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18f281e684c4adcf63123f37e5354225c2fa1d3fb9a312769ec30e0edfb4e7a4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-749fff5dd9-2hjps"
Sep 9 23:57:26.701438 kubelet[2750]: E0909 23:57:26.701117 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-749fff5dd9-2hjps_calico-system(d88e065c-f4c4-4c79-aec9-35cb6eb39df3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-749fff5dd9-2hjps_calico-system(d88e065c-f4c4-4c79-aec9-35cb6eb39df3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"18f281e684c4adcf63123f37e5354225c2fa1d3fb9a312769ec30e0edfb4e7a4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-749fff5dd9-2hjps" podUID="d88e065c-f4c4-4c79-aec9-35cb6eb39df3"
Sep 9 23:57:26.702521 containerd[1499]: time="2025-09-09T23:57:26.701978462Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fw5ss,Uid:2b493a96-3c4e-400a-b1bc-669373df4971,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3af6c4a88926a3df0dde2f26665cf20df5bd48d6d3b18a29800a61800a5e10b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.703703 kubelet[2750]: E0909 23:57:26.702693 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3af6c4a88926a3df0dde2f26665cf20df5bd48d6d3b18a29800a61800a5e10b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.703703 kubelet[2750]: E0909 23:57:26.702761 2750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3af6c4a88926a3df0dde2f26665cf20df5bd48d6d3b18a29800a61800a5e10b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-fw5ss"
Sep 9 23:57:26.703703 kubelet[2750]: E0909 23:57:26.702780 2750 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3af6c4a88926a3df0dde2f26665cf20df5bd48d6d3b18a29800a61800a5e10b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-54d579b49d-fw5ss"
Sep 9 23:57:26.703895 kubelet[2750]: E0909 23:57:26.702858 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-54d579b49d-fw5ss_calico-system(2b493a96-3c4e-400a-b1bc-669373df4971)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-54d579b49d-fw5ss_calico-system(2b493a96-3c4e-400a-b1bc-669373df4971)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3af6c4a88926a3df0dde2f26665cf20df5bd48d6d3b18a29800a61800a5e10b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-54d579b49d-fw5ss" podUID="2b493a96-3c4e-400a-b1bc-669373df4971"
Sep 9 23:57:26.716898 containerd[1499]: time="2025-09-09T23:57:26.716811399Z" level=error msg="Failed to destroy network for sandbox \"efd6f77793867f5f0306fdf7d1b75fb83caca60e26c62780abd4c4754c823f72\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.720573 containerd[1499]: time="2025-09-09T23:57:26.720409924Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-59885bbdc9-jlcjj,Uid:8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"efd6f77793867f5f0306fdf7d1b75fb83caca60e26c62780abd4c4754c823f72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.721696 kubelet[2750]: E0909 23:57:26.720921 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efd6f77793867f5f0306fdf7d1b75fb83caca60e26c62780abd4c4754c823f72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.721696 kubelet[2750]: E0909 23:57:26.721017 2750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efd6f77793867f5f0306fdf7d1b75fb83caca60e26c62780abd4c4754c823f72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59885bbdc9-jlcjj"
Sep 9 23:57:26.721696 kubelet[2750]: E0909 23:57:26.721044 2750 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"efd6f77793867f5f0306fdf7d1b75fb83caca60e26c62780abd4c4754c823f72\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-59885bbdc9-jlcjj"
Sep 9 23:57:26.721939 kubelet[2750]: E0909 23:57:26.721098 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-59885bbdc9-jlcjj_calico-system(8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-59885bbdc9-jlcjj_calico-system(8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"efd6f77793867f5f0306fdf7d1b75fb83caca60e26c62780abd4c4754c823f72\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-59885bbdc9-jlcjj" podUID="8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72"
Sep 9 23:57:26.725462 containerd[1499]: time="2025-09-09T23:57:26.725380836Z" level=error msg="Failed to destroy network for sandbox \"27689c87fbdcf044b544e9dc4a148c64f3134a5299e36255e67c7e8a0411047c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.727217 containerd[1499]: time="2025-09-09T23:57:26.727169099Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8js5x,Uid:5e89e89d-d9f4-4ee7-afac-d5ef87dca67a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"27689c87fbdcf044b544e9dc4a148c64f3134a5299e36255e67c7e8a0411047c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.728371 kubelet[2750]: E0909 23:57:26.727618 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27689c87fbdcf044b544e9dc4a148c64f3134a5299e36255e67c7e8a0411047c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Sep 9 23:57:26.728371 kubelet[2750]: E0909 23:57:26.727817 2750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27689c87fbdcf044b544e9dc4a148c64f3134a5299e36255e67c7e8a0411047c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-8js5x"
Sep 9 23:57:26.728589 kubelet[2750]: E0909 23:57:26.728544 2750 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"27689c87fbdcf044b544e9dc4a148c64f3134a5299e36255e67c7e8a0411047c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-8js5x"
Sep 9 23:57:26.728701 kubelet[2750]: E0909 23:57:26.728675 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-8js5x_kube-system(5e89e89d-d9f4-4ee7-afac-d5ef87dca67a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-8js5x_kube-system(5e89e89d-d9f4-4ee7-afac-d5ef87dca67a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"27689c87fbdcf044b544e9dc4a148c64f3134a5299e36255e67c7e8a0411047c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-8js5x" podUID="5e89e89d-d9f4-4ee7-afac-d5ef87dca67a"
Sep 9 23:57:27.302696 systemd[1]: run-netns-cni\x2da1bc679f\x2d9317\x2dd109\x2de6a9\x2d83f488996d0d.mount: Deactivated successfully.
Sep 9 23:57:27.417924 systemd[1]: Created slice kubepods-besteffort-pod78e1ea12_0c8b_4e75_9773_e6539bdf3c00.slice - libcontainer container kubepods-besteffort-pod78e1ea12_0c8b_4e75_9773_e6539bdf3c00.slice.
Sep 9 23:57:27.422483 containerd[1499]: time="2025-09-09T23:57:27.422436670Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9n6x,Uid:78e1ea12-0c8b-4e75-9773-e6539bdf3c00,Namespace:calico-system,Attempt:0,}" Sep 9 23:57:27.486763 containerd[1499]: time="2025-09-09T23:57:27.486477824Z" level=error msg="Failed to destroy network for sandbox \"f52d63476805a3ea441b69f6c45248904467a076e6e80bb1d9f54b81081fceb3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:57:27.490775 systemd[1]: run-netns-cni\x2d51665ff8\x2d3598\x2d8c3e\x2ddd8b\x2d52e34a1c8390.mount: Deactivated successfully. Sep 9 23:57:27.492380 containerd[1499]: time="2025-09-09T23:57:27.492324408Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9n6x,Uid:78e1ea12-0c8b-4e75-9773-e6539bdf3c00,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f52d63476805a3ea441b69f6c45248904467a076e6e80bb1d9f54b81081fceb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:57:27.494505 kubelet[2750]: E0909 23:57:27.493335 2750 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f52d63476805a3ea441b69f6c45248904467a076e6e80bb1d9f54b81081fceb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 9 23:57:27.495332 kubelet[2750]: E0909 23:57:27.494588 2750 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f52d63476805a3ea441b69f6c45248904467a076e6e80bb1d9f54b81081fceb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p9n6x" Sep 9 23:57:27.495332 kubelet[2750]: E0909 23:57:27.494624 2750 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f52d63476805a3ea441b69f6c45248904467a076e6e80bb1d9f54b81081fceb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-p9n6x" Sep 9 23:57:27.495332 kubelet[2750]: E0909 23:57:27.495137 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-p9n6x_calico-system(78e1ea12-0c8b-4e75-9773-e6539bdf3c00)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-p9n6x_calico-system(78e1ea12-0c8b-4e75-9773-e6539bdf3c00)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f52d63476805a3ea441b69f6c45248904467a076e6e80bb1d9f54b81081fceb3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-p9n6x" podUID="78e1ea12-0c8b-4e75-9773-e6539bdf3c00" Sep 9 23:57:33.659926 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3790187740.mount: Deactivated successfully. 
Sep 9 23:57:33.689454 containerd[1499]: time="2025-09-09T23:57:33.689387354Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:33.690891 containerd[1499]: time="2025-09-09T23:57:33.690695822Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 9 23:57:33.692045 containerd[1499]: time="2025-09-09T23:57:33.691966771Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:33.694281 containerd[1499]: time="2025-09-09T23:57:33.694206551Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:33.695253 containerd[1499]: time="2025-09-09T23:57:33.694813146Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 7.056621991s" Sep 9 23:57:33.695253 containerd[1499]: time="2025-09-09T23:57:33.694888905Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 9 23:57:33.719362 containerd[1499]: time="2025-09-09T23:57:33.719305011Z" level=info msg="CreateContainer within sandbox \"5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 9 23:57:33.740867 containerd[1499]: time="2025-09-09T23:57:33.737128614Z" level=info msg="Container 
eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:57:33.740763 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3872312029.mount: Deactivated successfully. Sep 9 23:57:33.760986 containerd[1499]: time="2025-09-09T23:57:33.760944205Z" level=info msg="CreateContainer within sandbox \"5032c647937ce33b6ad00e55d59499687707b3540c05077bf50c1fe9dbefd86e\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\"" Sep 9 23:57:33.763247 containerd[1499]: time="2025-09-09T23:57:33.763123986Z" level=info msg="StartContainer for \"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\"" Sep 9 23:57:33.767032 containerd[1499]: time="2025-09-09T23:57:33.766957633Z" level=info msg="connecting to shim eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b" address="unix:///run/containerd/s/388c3d68d5c7ec67239d1fb5b28556f2879574c742a8d1336cf1019dfe515419" protocol=ttrpc version=3 Sep 9 23:57:33.813061 systemd[1]: Started cri-containerd-eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b.scope - libcontainer container eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b. Sep 9 23:57:33.876871 containerd[1499]: time="2025-09-09T23:57:33.876359952Z" level=info msg="StartContainer for \"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" returns successfully" Sep 9 23:57:34.019310 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 9 23:57:34.019431 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 9 23:57:34.235719 kubelet[2750]: I0909 23:57:34.235677 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-whisker-backend-key-pair\") pod \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\" (UID: \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\") " Sep 9 23:57:34.236239 kubelet[2750]: I0909 23:57:34.235747 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-whisker-ca-bundle\") pod \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\" (UID: \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\") " Sep 9 23:57:34.236239 kubelet[2750]: I0909 23:57:34.235769 2750 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqvlq\" (UniqueName: \"kubernetes.io/projected/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-kube-api-access-kqvlq\") pod \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\" (UID: \"8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72\") " Sep 9 23:57:34.242864 kubelet[2750]: I0909 23:57:34.242736 2750 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72" (UID: "8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Sep 9 23:57:34.249117 kubelet[2750]: I0909 23:57:34.249052 2750 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72" (UID: "8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Sep 9 23:57:34.252210 kubelet[2750]: I0909 23:57:34.252038 2750 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-kube-api-access-kqvlq" (OuterVolumeSpecName: "kube-api-access-kqvlq") pod "8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72" (UID: "8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72"). InnerVolumeSpecName "kube-api-access-kqvlq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Sep 9 23:57:34.336393 kubelet[2750]: I0909 23:57:34.336343 2750 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-whisker-backend-key-pair\") on node \"ci-4426-0-0-n-d8dd570c6c\" DevicePath \"\"" Sep 9 23:57:34.336393 kubelet[2750]: I0909 23:57:34.336387 2750 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-whisker-ca-bundle\") on node \"ci-4426-0-0-n-d8dd570c6c\" DevicePath \"\"" Sep 9 23:57:34.336393 kubelet[2750]: I0909 23:57:34.336398 2750 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kqvlq\" (UniqueName: \"kubernetes.io/projected/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72-kube-api-access-kqvlq\") on node \"ci-4426-0-0-n-d8dd570c6c\" DevicePath \"\"" Sep 9 23:57:34.420479 systemd[1]: Removed slice kubepods-besteffort-pod8a4f4bc9_5ecb_484f_bb86_7e44b0d39f72.slice - libcontainer container kubepods-besteffort-pod8a4f4bc9_5ecb_484f_bb86_7e44b0d39f72.slice. Sep 9 23:57:34.663906 systemd[1]: var-lib-kubelet-pods-8a4f4bc9\x2d5ecb\x2d484f\x2dbb86\x2d7e44b0d39f72-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dkqvlq.mount: Deactivated successfully. 
Sep 9 23:57:34.664604 systemd[1]: var-lib-kubelet-pods-8a4f4bc9\x2d5ecb\x2d484f\x2dbb86\x2d7e44b0d39f72-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 9 23:57:34.719606 kubelet[2750]: I0909 23:57:34.719501 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-bww8f" podStartSLOduration=2.674801135 podStartE2EDuration="19.719482817s" podCreationTimestamp="2025-09-09 23:57:15 +0000 UTC" firstStartedPulling="2025-09-09 23:57:16.650982577 +0000 UTC m=+28.379892315" lastFinishedPulling="2025-09-09 23:57:33.695664259 +0000 UTC m=+45.424573997" observedRunningTime="2025-09-09 23:57:34.699629509 +0000 UTC m=+46.428539247" watchObservedRunningTime="2025-09-09 23:57:34.719482817 +0000 UTC m=+46.448392555" Sep 9 23:57:34.795094 systemd[1]: Created slice kubepods-besteffort-pod431b288c_2469_4b6f_b343_c85b97de0307.slice - libcontainer container kubepods-besteffort-pod431b288c_2469_4b6f_b343_c85b97de0307.slice. 
Sep 9 23:57:34.839813 kubelet[2750]: I0909 23:57:34.839665 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87g8v\" (UniqueName: \"kubernetes.io/projected/431b288c-2469-4b6f-b343-c85b97de0307-kube-api-access-87g8v\") pod \"whisker-5f6d768486-fqb2s\" (UID: \"431b288c-2469-4b6f-b343-c85b97de0307\") " pod="calico-system/whisker-5f6d768486-fqb2s" Sep 9 23:57:34.840852 kubelet[2750]: I0909 23:57:34.840810 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/431b288c-2469-4b6f-b343-c85b97de0307-whisker-ca-bundle\") pod \"whisker-5f6d768486-fqb2s\" (UID: \"431b288c-2469-4b6f-b343-c85b97de0307\") " pod="calico-system/whisker-5f6d768486-fqb2s" Sep 9 23:57:34.841041 kubelet[2750]: I0909 23:57:34.841017 2750 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/431b288c-2469-4b6f-b343-c85b97de0307-whisker-backend-key-pair\") pod \"whisker-5f6d768486-fqb2s\" (UID: \"431b288c-2469-4b6f-b343-c85b97de0307\") " pod="calico-system/whisker-5f6d768486-fqb2s" Sep 9 23:57:34.868209 containerd[1499]: time="2025-09-09T23:57:34.868112965Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" id:\"729f66e182cb081735c2d3e55ec48e746d74a381560b5bb44da229dcd70653b3\" pid:3822 exit_status:1 exited_at:{seconds:1757462254 nanos:867679249}" Sep 9 23:57:35.103513 containerd[1499]: time="2025-09-09T23:57:35.102747616Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5f6d768486-fqb2s,Uid:431b288c-2469-4b6f-b343-c85b97de0307,Namespace:calico-system,Attempt:0,}" Sep 9 23:57:35.291117 systemd-networkd[1423]: cali23b915cded7: Link UP Sep 9 23:57:35.292560 systemd-networkd[1423]: cali23b915cded7: Gained carrier Sep 9 
23:57:35.313280 containerd[1499]: 2025-09-09 23:57:35.135 [INFO][3835] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 9 23:57:35.313280 containerd[1499]: 2025-09-09 23:57:35.172 [INFO][3835] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0 whisker-5f6d768486- calico-system 431b288c-2469-4b6f-b343-c85b97de0307 918 0 2025-09-09 23:57:34 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5f6d768486 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426-0-0-n-d8dd570c6c whisker-5f6d768486-fqb2s eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali23b915cded7 [] [] }} ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Namespace="calico-system" Pod="whisker-5f6d768486-fqb2s" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-" Sep 9 23:57:35.313280 containerd[1499]: 2025-09-09 23:57:35.172 [INFO][3835] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Namespace="calico-system" Pod="whisker-5f6d768486-fqb2s" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" Sep 9 23:57:35.313280 containerd[1499]: 2025-09-09 23:57:35.229 [INFO][3847] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" HandleID="k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.229 [INFO][3847] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" HandleID="k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d800), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-0-0-n-d8dd570c6c", "pod":"whisker-5f6d768486-fqb2s", "timestamp":"2025-09-09 23:57:35.229315087 +0000 UTC"}, Hostname:"ci-4426-0-0-n-d8dd570c6c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.229 [INFO][3847] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.229 [INFO][3847] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.229 [INFO][3847] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-n-d8dd570c6c' Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.242 [INFO][3847] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.250 [INFO][3847] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.257 [INFO][3847] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.260 [INFO][3847] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313521 containerd[1499]: 2025-09-09 23:57:35.263 [INFO][3847] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313698 containerd[1499]: 2025-09-09 23:57:35.263 [INFO][3847] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313698 containerd[1499]: 2025-09-09 23:57:35.265 [INFO][3847] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051 Sep 9 23:57:35.313698 containerd[1499]: 2025-09-09 23:57:35.270 [INFO][3847] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313698 containerd[1499]: 2025-09-09 23:57:35.279 [INFO][3847] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.72.1/26] block=192.168.72.0/26 handle="k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313698 containerd[1499]: 2025-09-09 23:57:35.279 [INFO][3847] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.1/26] handle="k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:35.313698 containerd[1499]: 2025-09-09 23:57:35.279 [INFO][3847] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:57:35.313698 containerd[1499]: 2025-09-09 23:57:35.279 [INFO][3847] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.72.1/26] IPv6=[] ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" HandleID="k8s-pod-network.63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" Sep 9 23:57:35.313821 containerd[1499]: 2025-09-09 23:57:35.282 [INFO][3835] cni-plugin/k8s.go 418: Populated endpoint ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Namespace="calico-system" Pod="whisker-5f6d768486-fqb2s" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0", GenerateName:"whisker-5f6d768486-", Namespace:"calico-system", SelfLink:"", UID:"431b288c-2469-4b6f-b343-c85b97de0307", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5f6d768486", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"", Pod:"whisker-5f6d768486-fqb2s", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali23b915cded7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:35.313821 containerd[1499]: 2025-09-09 23:57:35.282 [INFO][3835] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.1/32] ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Namespace="calico-system" Pod="whisker-5f6d768486-fqb2s" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" Sep 9 23:57:35.314908 containerd[1499]: 2025-09-09 23:57:35.282 [INFO][3835] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali23b915cded7 ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Namespace="calico-system" Pod="whisker-5f6d768486-fqb2s" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" Sep 9 23:57:35.314908 containerd[1499]: 2025-09-09 23:57:35.293 [INFO][3835] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Namespace="calico-system" Pod="whisker-5f6d768486-fqb2s" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" Sep 9 23:57:35.314965 containerd[1499]: 2025-09-09 23:57:35.293 [INFO][3835] cni-plugin/k8s.go 446: Added Mac, interface name, and active container 
ID to endpoint ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Namespace="calico-system" Pod="whisker-5f6d768486-fqb2s" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0", GenerateName:"whisker-5f6d768486-", Namespace:"calico-system", SelfLink:"", UID:"431b288c-2469-4b6f-b343-c85b97de0307", ResourceVersion:"918", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5f6d768486", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051", Pod:"whisker-5f6d768486-fqb2s", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.72.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali23b915cded7", MAC:"7e:51:70:7d:1e:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:35.315021 containerd[1499]: 2025-09-09 23:57:35.309 [INFO][3835] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" Namespace="calico-system" 
Pod="whisker-5f6d768486-fqb2s" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-whisker--5f6d768486--fqb2s-eth0" Sep 9 23:57:35.359780 containerd[1499]: time="2025-09-09T23:57:35.358787014Z" level=info msg="connecting to shim 63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051" address="unix:///run/containerd/s/f3f1693e49cc3598baf5a5f2d8cf42020888dc97385af67b2f2884748b539328" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:57:35.385080 systemd[1]: Started cri-containerd-63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051.scope - libcontainer container 63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051. Sep 9 23:57:35.424543 containerd[1499]: time="2025-09-09T23:57:35.423766535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5f6d768486-fqb2s,Uid:431b288c-2469-4b6f-b343-c85b97de0307,Namespace:calico-system,Attempt:0,} returns sandbox id \"63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051\"" Sep 9 23:57:35.429740 containerd[1499]: time="2025-09-09T23:57:35.429690804Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 9 23:57:35.979109 containerd[1499]: time="2025-09-09T23:57:35.979058279Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" id:\"54a13581b5317f76dfda6a242d52cfd24c16da41f507a5d10fe398f1e85570a1\" pid:4007 exit_status:1 exited_at:{seconds:1757462255 nanos:978429484}" Sep 9 23:57:36.145504 systemd-networkd[1423]: vxlan.calico: Link UP Sep 9 23:57:36.145510 systemd-networkd[1423]: vxlan.calico: Gained carrier Sep 9 23:57:36.414300 kubelet[2750]: I0909 23:57:36.414211 2750 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72" path="/var/lib/kubelet/pods/8a4f4bc9-5ecb-484f-bb86-7e44b0d39f72/volumes" Sep 9 23:57:36.768800 containerd[1499]: time="2025-09-09T23:57:36.768333553Z" level=info msg="TaskExit event in podsandbox 
handler container_id:\"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" id:\"622974bcff9a033ed31509d785abb1061401a8efd254c65d0eda5cd46189a22e\" pid:4137 exit_status:1 exited_at:{seconds:1757462256 nanos:767997876}" Sep 9 23:57:36.776605 systemd-networkd[1423]: cali23b915cded7: Gained IPv6LL Sep 9 23:57:37.159427 containerd[1499]: time="2025-09-09T23:57:37.159305074Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:37.163810 containerd[1499]: time="2025-09-09T23:57:37.163757357Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 9 23:57:37.167235 containerd[1499]: time="2025-09-09T23:57:37.166804331Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:37.170153 containerd[1499]: time="2025-09-09T23:57:37.170063863Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:37.170690 containerd[1499]: time="2025-09-09T23:57:37.170654059Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.740919535s" Sep 9 23:57:37.170862 containerd[1499]: time="2025-09-09T23:57:37.170788377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 9 23:57:37.188904 containerd[1499]: 
time="2025-09-09T23:57:37.188862945Z" level=info msg="CreateContainer within sandbox \"63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051\" for container &ContainerMetadata{Name:whisker,Attempt:0,}"
Sep 9 23:57:37.215879 containerd[1499]: time="2025-09-09T23:57:37.213943573Z" level=info msg="Container 13f2c3ddaa203139ff7f71c351224f05457ef44a11d4b42f7b37f6df60a95c51: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:57:37.224198 systemd-networkd[1423]: vxlan.calico: Gained IPv6LL
Sep 9 23:57:37.245453 containerd[1499]: time="2025-09-09T23:57:37.245403747Z" level=info msg="CreateContainer within sandbox \"63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"13f2c3ddaa203139ff7f71c351224f05457ef44a11d4b42f7b37f6df60a95c51\""
Sep 9 23:57:37.246934 containerd[1499]: time="2025-09-09T23:57:37.246762176Z" level=info msg="StartContainer for \"13f2c3ddaa203139ff7f71c351224f05457ef44a11d4b42f7b37f6df60a95c51\""
Sep 9 23:57:37.249036 containerd[1499]: time="2025-09-09T23:57:37.248988477Z" level=info msg="connecting to shim 13f2c3ddaa203139ff7f71c351224f05457ef44a11d4b42f7b37f6df60a95c51" address="unix:///run/containerd/s/f3f1693e49cc3598baf5a5f2d8cf42020888dc97385af67b2f2884748b539328" protocol=ttrpc version=3
Sep 9 23:57:37.278071 systemd[1]: Started cri-containerd-13f2c3ddaa203139ff7f71c351224f05457ef44a11d4b42f7b37f6df60a95c51.scope - libcontainer container 13f2c3ddaa203139ff7f71c351224f05457ef44a11d4b42f7b37f6df60a95c51.
Sep 9 23:57:37.332451 containerd[1499]: time="2025-09-09T23:57:37.332406413Z" level=info msg="StartContainer for \"13f2c3ddaa203139ff7f71c351224f05457ef44a11d4b42f7b37f6df60a95c51\" returns successfully"
Sep 9 23:57:37.335170 containerd[1499]: time="2025-09-09T23:57:37.335135030Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\""
Sep 9 23:57:37.411476 containerd[1499]: time="2025-09-09T23:57:37.411314307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-749fff5dd9-2hjps,Uid:d88e065c-f4c4-4c79-aec9-35cb6eb39df3,Namespace:calico-system,Attempt:0,}"
Sep 9 23:57:37.575123 systemd-networkd[1423]: cali0f839e01716: Link UP
Sep 9 23:57:37.578621 systemd-networkd[1423]: cali0f839e01716: Gained carrier
Sep 9 23:57:37.601800 containerd[1499]: 2025-09-09 23:57:37.466 [INFO][4184] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0 calico-kube-controllers-749fff5dd9- calico-system d88e065c-f4c4-4c79-aec9-35cb6eb39df3 851 0 2025-09-09 23:57:16 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:749fff5dd9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426-0-0-n-d8dd570c6c calico-kube-controllers-749fff5dd9-2hjps eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali0f839e01716 [] [] }} ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Namespace="calico-system" Pod="calico-kube-controllers-749fff5dd9-2hjps" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-"
Sep 9 23:57:37.601800 containerd[1499]: 2025-09-09 23:57:37.466 [INFO][4184] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Namespace="calico-system" Pod="calico-kube-controllers-749fff5dd9-2hjps" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0"
Sep 9 23:57:37.601800 containerd[1499]: 2025-09-09 23:57:37.508 [INFO][4197] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" HandleID="k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0"
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.509 [INFO][4197] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" HandleID="k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002aa160), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-0-0-n-d8dd570c6c", "pod":"calico-kube-controllers-749fff5dd9-2hjps", "timestamp":"2025-09-09 23:57:37.508604205 +0000 UTC"}, Hostname:"ci-4426-0-0-n-d8dd570c6c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.509 [INFO][4197] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.509 [INFO][4197] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.509 [INFO][4197] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-n-d8dd570c6c'
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.523 [INFO][4197] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.533 [INFO][4197] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.541 [INFO][4197] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.544 [INFO][4197] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602061 containerd[1499]: 2025-09-09 23:57:37.548 [INFO][4197] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602552 containerd[1499]: 2025-09-09 23:57:37.548 [INFO][4197] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602552 containerd[1499]: 2025-09-09 23:57:37.550 [INFO][4197] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3
Sep 9 23:57:37.602552 containerd[1499]: 2025-09-09 23:57:37.556 [INFO][4197] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602552 containerd[1499]: 2025-09-09 23:57:37.565 [INFO][4197] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.72.2/26] block=192.168.72.0/26 handle="k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602552 containerd[1499]: 2025-09-09 23:57:37.565 [INFO][4197] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.2/26] handle="k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:37.602552 containerd[1499]: 2025-09-09 23:57:37.565 [INFO][4197] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 23:57:37.602552 containerd[1499]: 2025-09-09 23:57:37.566 [INFO][4197] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.72.2/26] IPv6=[] ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" HandleID="k8s-pod-network.4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0"
Sep 9 23:57:37.602710 containerd[1499]: 2025-09-09 23:57:37.569 [INFO][4184] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Namespace="calico-system" Pod="calico-kube-controllers-749fff5dd9-2hjps" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0", GenerateName:"calico-kube-controllers-749fff5dd9-", Namespace:"calico-system", SelfLink:"", UID:"d88e065c-f4c4-4c79-aec9-35cb6eb39df3", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 16, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"749fff5dd9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"", Pod:"calico-kube-controllers-749fff5dd9-2hjps", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0f839e01716", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 23:57:37.602771 containerd[1499]: 2025-09-09 23:57:37.569 [INFO][4184] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.2/32] ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Namespace="calico-system" Pod="calico-kube-controllers-749fff5dd9-2hjps" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0"
Sep 9 23:57:37.602771 containerd[1499]: 2025-09-09 23:57:37.569 [INFO][4184] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0f839e01716 ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Namespace="calico-system" Pod="calico-kube-controllers-749fff5dd9-2hjps" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0"
Sep 9 23:57:37.602771 containerd[1499]: 2025-09-09 23:57:37.578 [INFO][4184] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Namespace="calico-system" Pod="calico-kube-controllers-749fff5dd9-2hjps" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0"
Sep 9 23:57:37.603599 containerd[1499]: 2025-09-09 23:57:37.580 [INFO][4184] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Namespace="calico-system" Pod="calico-kube-controllers-749fff5dd9-2hjps" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0", GenerateName:"calico-kube-controllers-749fff5dd9-", Namespace:"calico-system", SelfLink:"", UID:"d88e065c-f4c4-4c79-aec9-35cb6eb39df3", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 16, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"749fff5dd9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3", Pod:"calico-kube-controllers-749fff5dd9-2hjps", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.72.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali0f839e01716", MAC:"f6:ba:24:c6:c3:79", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 23:57:37.603803 containerd[1499]: 2025-09-09 23:57:37.597 [INFO][4184] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" Namespace="calico-system" Pod="calico-kube-controllers-749fff5dd9-2hjps" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--kube--controllers--749fff5dd9--2hjps-eth0"
Sep 9 23:57:37.647503 containerd[1499]: time="2025-09-09T23:57:37.647458713Z" level=info msg="connecting to shim 4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3" address="unix:///run/containerd/s/422201ada65d268797c2ec0d6b51bda8c9faf1e53b8b8cb0021905be02ca3c86" namespace=k8s.io protocol=ttrpc version=3
Sep 9 23:57:37.683079 systemd[1]: Started cri-containerd-4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3.scope - libcontainer container 4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3.
Sep 9 23:57:37.734638 containerd[1499]: time="2025-09-09T23:57:37.734600697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-749fff5dd9-2hjps,Uid:d88e065c-f4c4-4c79-aec9-35cb6eb39df3,Namespace:calico-system,Attempt:0,} returns sandbox id \"4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3\""
Sep 9 23:57:38.413519 containerd[1499]: time="2025-09-09T23:57:38.413201197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fw5ss,Uid:2b493a96-3c4e-400a-b1bc-669373df4971,Namespace:calico-system,Attempt:0,}"
Sep 9 23:57:38.413519 containerd[1499]: time="2025-09-09T23:57:38.413328835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cdc7764d-sgvm7,Uid:56657f5b-5f65-4c76-a8c6-274344b1f6c6,Namespace:calico-apiserver,Attempt:0,}"
Sep 9 23:57:38.414600 containerd[1499]: time="2025-09-09T23:57:38.414553345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-v2h8b,Uid:22f16259-3f09-42f8-a728-4b51e247e528,Namespace:kube-system,Attempt:0,}"
Sep 9 23:57:38.700589 systemd-networkd[1423]: calibbd441cdc2f: Link UP
Sep 9 23:57:38.701971 systemd-networkd[1423]: calibbd441cdc2f: Gained carrier
Sep 9 23:57:38.759041 containerd[1499]: 2025-09-09 23:57:38.540 [INFO][4259] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0 goldmane-54d579b49d- calico-system 2b493a96-3c4e-400a-b1bc-669373df4971 850 0 2025-09-09 23:57:15 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:54d579b49d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426-0-0-n-d8dd570c6c goldmane-54d579b49d-fw5ss eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calibbd441cdc2f [] [] }} ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Namespace="calico-system" Pod="goldmane-54d579b49d-fw5ss" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-"
Sep 9 23:57:38.759041 containerd[1499]: 2025-09-09 23:57:38.542 [INFO][4259] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Namespace="calico-system" Pod="goldmane-54d579b49d-fw5ss" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0"
Sep 9 23:57:38.759041 containerd[1499]: 2025-09-09 23:57:38.604 [INFO][4295] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" HandleID="k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0"
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.604 [INFO][4295] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" HandleID="k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-0-0-n-d8dd570c6c", "pod":"goldmane-54d579b49d-fw5ss", "timestamp":"2025-09-09 23:57:38.604374516 +0000 UTC"}, Hostname:"ci-4426-0-0-n-d8dd570c6c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.604 [INFO][4295] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.604 [INFO][4295] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.604 [INFO][4295] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-n-d8dd570c6c'
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.619 [INFO][4295] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.628 [INFO][4295] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.636 [INFO][4295] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.641 [INFO][4295] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759386 containerd[1499]: 2025-09-09 23:57:38.647 [INFO][4295] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759577 containerd[1499]: 2025-09-09 23:57:38.647 [INFO][4295] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759577 containerd[1499]: 2025-09-09 23:57:38.650 [INFO][4295] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e
Sep 9 23:57:38.759577 containerd[1499]: 2025-09-09 23:57:38.660 [INFO][4295] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759577 containerd[1499]: 2025-09-09 23:57:38.674 [INFO][4295] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.72.3/26] block=192.168.72.0/26 handle="k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759577 containerd[1499]: 2025-09-09 23:57:38.675 [INFO][4295] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.3/26] handle="k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.759577 containerd[1499]: 2025-09-09 23:57:38.675 [INFO][4295] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 23:57:38.759577 containerd[1499]: 2025-09-09 23:57:38.675 [INFO][4295] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.72.3/26] IPv6=[] ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" HandleID="k8s-pod-network.b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0"
Sep 9 23:57:38.759708 containerd[1499]: 2025-09-09 23:57:38.682 [INFO][4259] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Namespace="calico-system" Pod="goldmane-54d579b49d-fw5ss" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"2b493a96-3c4e-400a-b1bc-669373df4971", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 15, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"", Pod:"goldmane-54d579b49d-fw5ss", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calibbd441cdc2f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 23:57:38.759756 containerd[1499]: 2025-09-09 23:57:38.683 [INFO][4259] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.3/32] ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Namespace="calico-system" Pod="goldmane-54d579b49d-fw5ss" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0"
Sep 9 23:57:38.759756 containerd[1499]: 2025-09-09 23:57:38.685 [INFO][4259] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibbd441cdc2f ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Namespace="calico-system" Pod="goldmane-54d579b49d-fw5ss" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0"
Sep 9 23:57:38.759756 containerd[1499]: 2025-09-09 23:57:38.700 [INFO][4259] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Namespace="calico-system" Pod="goldmane-54d579b49d-fw5ss" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0"
Sep 9 23:57:38.759813 containerd[1499]: 2025-09-09 23:57:38.710 [INFO][4259] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Namespace="calico-system" Pod="goldmane-54d579b49d-fw5ss" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0", GenerateName:"goldmane-54d579b49d-", Namespace:"calico-system", SelfLink:"", UID:"2b493a96-3c4e-400a-b1bc-669373df4971", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 15, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"54d579b49d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e", Pod:"goldmane-54d579b49d-fw5ss", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.72.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calibbd441cdc2f", MAC:"c2:af:66:c8:7b:c0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 23:57:38.759878 containerd[1499]: 2025-09-09 23:57:38.750 [INFO][4259] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" Namespace="calico-system" Pod="goldmane-54d579b49d-fw5ss" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-goldmane--54d579b49d--fw5ss-eth0"
Sep 9 23:57:38.801896 containerd[1499]: time="2025-09-09T23:57:38.801470186Z" level=info msg="connecting to shim b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e" address="unix:///run/containerd/s/e0a65e60fae03a4de824e3bd06483a88f0c70af5be18b1d0329be6b85923a3bb" namespace=k8s.io protocol=ttrpc version=3
Sep 9 23:57:38.841129 systemd[1]: Started cri-containerd-b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e.scope - libcontainer container b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e.
Sep 9 23:57:38.889964 systemd-networkd[1423]: calic9c3af2246a: Link UP
Sep 9 23:57:38.891978 systemd-networkd[1423]: calic9c3af2246a: Gained carrier
Sep 9 23:57:38.915434 containerd[1499]: 2025-09-09 23:57:38.536 [INFO][4274] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0 coredns-668d6bf9bc- kube-system 22f16259-3f09-42f8-a728-4b51e247e528 842 0 2025-09-09 23:56:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426-0-0-n-d8dd570c6c coredns-668d6bf9bc-v2h8b eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic9c3af2246a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Namespace="kube-system" Pod="coredns-668d6bf9bc-v2h8b" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-"
Sep 9 23:57:38.915434 containerd[1499]: 2025-09-09 23:57:38.537 [INFO][4274] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Namespace="kube-system" Pod="coredns-668d6bf9bc-v2h8b" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0"
Sep 9 23:57:38.915434 containerd[1499]: 2025-09-09 23:57:38.606 [INFO][4293] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" HandleID="k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0"
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.607 [INFO][4293] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" HandleID="k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb640), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426-0-0-n-d8dd570c6c", "pod":"coredns-668d6bf9bc-v2h8b", "timestamp":"2025-09-09 23:57:38.606792056 +0000 UTC"}, Hostname:"ci-4426-0-0-n-d8dd570c6c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.607 [INFO][4293] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.675 [INFO][4293] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.675 [INFO][4293] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-n-d8dd570c6c'
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.725 [INFO][4293] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.756 [INFO][4293] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.775 [INFO][4293] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.779 [INFO][4293] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916027 containerd[1499]: 2025-09-09 23:57:38.784 [INFO][4293] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916540 containerd[1499]: 2025-09-09 23:57:38.784 [INFO][4293] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916540 containerd[1499]: 2025-09-09 23:57:38.813 [INFO][4293] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438
Sep 9 23:57:38.916540 containerd[1499]: 2025-09-09 23:57:38.846 [INFO][4293] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916540 containerd[1499]: 2025-09-09 23:57:38.875 [INFO][4293] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.72.4/26] block=192.168.72.0/26 handle="k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916540 containerd[1499]: 2025-09-09 23:57:38.875 [INFO][4293] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.4/26] handle="k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" host="ci-4426-0-0-n-d8dd570c6c"
Sep 9 23:57:38.916540 containerd[1499]: 2025-09-09 23:57:38.877 [INFO][4293] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Sep 9 23:57:38.916540 containerd[1499]: 2025-09-09 23:57:38.878 [INFO][4293] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.72.4/26] IPv6=[] ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" HandleID="k8s-pod-network.388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0"
Sep 9 23:57:38.917560 containerd[1499]: 2025-09-09 23:57:38.886 [INFO][4274] cni-plugin/k8s.go 418: Populated endpoint ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Namespace="kube-system" Pod="coredns-668d6bf9bc-v2h8b" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"22f16259-3f09-42f8-a728-4b51e247e528", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 56, 54, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"", Pod:"coredns-668d6bf9bc-v2h8b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic9c3af2246a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 23:57:38.917560 containerd[1499]: 2025-09-09 23:57:38.886 [INFO][4274] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.4/32] ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Namespace="kube-system" Pod="coredns-668d6bf9bc-v2h8b" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0"
Sep 9 23:57:38.917560 containerd[1499]: 2025-09-09 23:57:38.886 [INFO][4274] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic9c3af2246a ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Namespace="kube-system" Pod="coredns-668d6bf9bc-v2h8b" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0"
Sep 9 23:57:38.917560 containerd[1499]: 2025-09-09 23:57:38.890 [INFO][4274] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Namespace="kube-system" Pod="coredns-668d6bf9bc-v2h8b" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0"
Sep 9 23:57:38.917560 containerd[1499]: 2025-09-09 23:57:38.891 [INFO][4274] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Namespace="kube-system" Pod="coredns-668d6bf9bc-v2h8b" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"22f16259-3f09-42f8-a728-4b51e247e528", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 56, 54, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438", Pod:"coredns-668d6bf9bc-v2h8b", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic9c3af2246a", MAC:"b2:df:54:30:13:72", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
Sep 9 23:57:38.917560 containerd[1499]: 2025-09-09 23:57:38.911 [INFO][4274] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" Namespace="kube-system" Pod="coredns-668d6bf9bc-v2h8b" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--v2h8b-eth0"
Sep 9 23:57:38.980568 containerd[1499]: time="2025-09-09T23:57:38.979595335Z" level=info msg="connecting to shim 388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438" address="unix:///run/containerd/s/6c818b01d7a7c044b7cf714eb6d8bc0d3289938deb1f99cccc7cd0e1e953707a" namespace=k8s.io protocol=ttrpc version=3
Sep 9 23:57:39.009477 systemd-networkd[1423]: cali3eab47b1821: Link UP
Sep 9 23:57:39.010938 systemd-networkd[1423]: cali3eab47b1821: Gained carrier
Sep 9 23:57:39.014704 containerd[1499]: time="2025-09-09T23:57:39.014654562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-54d579b49d-fw5ss,Uid:2b493a96-3c4e-400a-b1bc-669373df4971,Namespace:calico-system,Attempt:0,} returns sandbox id \"b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e\""
Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.553 [INFO][4266] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0 calico-apiserver-54cdc7764d- calico-apiserver
56657f5b-5f65-4c76-a8c6-274344b1f6c6 854 0 2025-09-09 23:57:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:54cdc7764d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426-0-0-n-d8dd570c6c calico-apiserver-54cdc7764d-sgvm7 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali3eab47b1821 [] [] }} ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-sgvm7" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.554 [INFO][4266] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-sgvm7" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.617 [INFO][4303] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" HandleID="k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.618 [INFO][4303] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" HandleID="k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d9e0), 
Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426-0-0-n-d8dd570c6c", "pod":"calico-apiserver-54cdc7764d-sgvm7", "timestamp":"2025-09-09 23:57:38.617595245 +0000 UTC"}, Hostname:"ci-4426-0-0-n-d8dd570c6c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.618 [INFO][4303] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.876 [INFO][4303] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.877 [INFO][4303] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-n-d8dd570c6c' Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.914 [INFO][4303] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.928 [INFO][4303] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.941 [INFO][4303] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.946 [INFO][4303] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.949 [INFO][4303] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.950 [INFO][4303] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.72.0/26 
handle="k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.958 [INFO][4303] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.977 [INFO][4303] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.990 [INFO][4303] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.72.5/26] block=192.168.72.0/26 handle="k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.990 [INFO][4303] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.5/26] handle="k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.991 [INFO][4303] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 23:57:39.046049 containerd[1499]: 2025-09-09 23:57:38.991 [INFO][4303] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.72.5/26] IPv6=[] ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" HandleID="k8s-pod-network.c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" Sep 9 23:57:39.046765 containerd[1499]: 2025-09-09 23:57:38.999 [INFO][4266] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-sgvm7" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0", GenerateName:"calico-apiserver-54cdc7764d-", Namespace:"calico-apiserver", SelfLink:"", UID:"56657f5b-5f65-4c76-a8c6-274344b1f6c6", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54cdc7764d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"", Pod:"calico-apiserver-54cdc7764d-sgvm7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3eab47b1821", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:39.046765 containerd[1499]: 2025-09-09 23:57:38.999 [INFO][4266] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.5/32] ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-sgvm7" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" Sep 9 23:57:39.046765 containerd[1499]: 2025-09-09 23:57:38.999 [INFO][4266] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3eab47b1821 ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-sgvm7" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" Sep 9 23:57:39.046765 containerd[1499]: 2025-09-09 23:57:39.015 [INFO][4266] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-sgvm7" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" Sep 9 23:57:39.046765 containerd[1499]: 2025-09-09 23:57:39.020 [INFO][4266] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-sgvm7" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0", GenerateName:"calico-apiserver-54cdc7764d-", Namespace:"calico-apiserver", SelfLink:"", UID:"56657f5b-5f65-4c76-a8c6-274344b1f6c6", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54cdc7764d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf", Pod:"calico-apiserver-54cdc7764d-sgvm7", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali3eab47b1821", MAC:"36:ab:61:e1:32:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:39.046765 containerd[1499]: 2025-09-09 23:57:39.040 [INFO][4266] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-sgvm7" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--sgvm7-eth0" Sep 9 23:57:39.048264 systemd[1]: Started 
cri-containerd-388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438.scope - libcontainer container 388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438. Sep 9 23:57:39.081911 containerd[1499]: time="2025-09-09T23:57:39.081855604Z" level=info msg="connecting to shim c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf" address="unix:///run/containerd/s/5faa39d778305a7f2eea7b29b55354e0fbfcf68a92a310a9fbcbb7f0e91a868c" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:57:39.127269 systemd[1]: Started cri-containerd-c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf.scope - libcontainer container c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf. Sep 9 23:57:39.130107 containerd[1499]: time="2025-09-09T23:57:39.129967404Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-v2h8b,Uid:22f16259-3f09-42f8-a728-4b51e247e528,Namespace:kube-system,Attempt:0,} returns sandbox id \"388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438\"" Sep 9 23:57:39.136550 containerd[1499]: time="2025-09-09T23:57:39.136509990Z" level=info msg="CreateContainer within sandbox \"388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 23:57:39.184383 containerd[1499]: time="2025-09-09T23:57:39.184342593Z" level=info msg="Container b7620bbdd5c10cb68b581352c6ef4952c0e03c5ac8f42cbd2e01e2f54b126aed: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:57:39.187762 containerd[1499]: time="2025-09-09T23:57:39.187694965Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cdc7764d-sgvm7,Uid:56657f5b-5f65-4c76-a8c6-274344b1f6c6,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf\"" Sep 9 23:57:39.193060 containerd[1499]: time="2025-09-09T23:57:39.192988001Z" level=info msg="CreateContainer within sandbox 
\"388f1116c7ee778ae37f7bea608536cc19cb8c1cf7f36c339801976fd5dda438\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b7620bbdd5c10cb68b581352c6ef4952c0e03c5ac8f42cbd2e01e2f54b126aed\"" Sep 9 23:57:39.194673 containerd[1499]: time="2025-09-09T23:57:39.193958193Z" level=info msg="StartContainer for \"b7620bbdd5c10cb68b581352c6ef4952c0e03c5ac8f42cbd2e01e2f54b126aed\"" Sep 9 23:57:39.197345 containerd[1499]: time="2025-09-09T23:57:39.197173966Z" level=info msg="connecting to shim b7620bbdd5c10cb68b581352c6ef4952c0e03c5ac8f42cbd2e01e2f54b126aed" address="unix:///run/containerd/s/6c818b01d7a7c044b7cf714eb6d8bc0d3289938deb1f99cccc7cd0e1e953707a" protocol=ttrpc version=3 Sep 9 23:57:39.210066 systemd-networkd[1423]: cali0f839e01716: Gained IPv6LL Sep 9 23:57:39.242078 systemd[1]: Started cri-containerd-b7620bbdd5c10cb68b581352c6ef4952c0e03c5ac8f42cbd2e01e2f54b126aed.scope - libcontainer container b7620bbdd5c10cb68b581352c6ef4952c0e03c5ac8f42cbd2e01e2f54b126aed. Sep 9 23:57:39.285031 containerd[1499]: time="2025-09-09T23:57:39.284983865Z" level=info msg="StartContainer for \"b7620bbdd5c10cb68b581352c6ef4952c0e03c5ac8f42cbd2e01e2f54b126aed\" returns successfully" Sep 9 23:57:39.755443 kubelet[2750]: I0909 23:57:39.755314 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-v2h8b" podStartSLOduration=45.755293432 podStartE2EDuration="45.755293432s" podCreationTimestamp="2025-09-09 23:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:57:39.753347479 +0000 UTC m=+51.482257217" watchObservedRunningTime="2025-09-09 23:57:39.755293432 +0000 UTC m=+51.484203170" Sep 9 23:57:39.976043 systemd-networkd[1423]: calic9c3af2246a: Gained IPv6LL Sep 9 23:57:40.183389 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount220323804.mount: Deactivated successfully. 
Sep 9 23:57:40.200755 containerd[1499]: time="2025-09-09T23:57:40.200679446Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:40.202354 containerd[1499]: time="2025-09-09T23:57:40.202301017Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 9 23:57:40.203586 containerd[1499]: time="2025-09-09T23:57:40.203346850Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:40.206871 containerd[1499]: time="2025-09-09T23:57:40.206807480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 9 23:57:40.208910 containerd[1499]: time="2025-09-09T23:57:40.208863145Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 2.873651114s" Sep 9 23:57:40.209073 containerd[1499]: time="2025-09-09T23:57:40.209056031Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 9 23:57:40.214286 containerd[1499]: time="2025-09-09T23:57:40.214238314Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 9 23:57:40.215939 containerd[1499]: time="2025-09-09T23:57:40.215852245Z" level=info msg="CreateContainer within sandbox 
\"63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 9 23:57:40.240946 containerd[1499]: time="2025-09-09T23:57:40.240255536Z" level=info msg="Container 1f37e555c7385a25aa03b5689921877402ff0253e15cc6b91387d875b4b39617: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:57:40.266394 containerd[1499]: time="2025-09-09T23:57:40.266343240Z" level=info msg="CreateContainer within sandbox \"63fcd4331535dfcdada44d4995cb064e9de1767ecdd0e9b386c782a0bcebc051\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"1f37e555c7385a25aa03b5689921877402ff0253e15cc6b91387d875b4b39617\"" Sep 9 23:57:40.268749 containerd[1499]: time="2025-09-09T23:57:40.266928299Z" level=info msg="StartContainer for \"1f37e555c7385a25aa03b5689921877402ff0253e15cc6b91387d875b4b39617\"" Sep 9 23:57:40.268749 containerd[1499]: time="2025-09-09T23:57:40.268356424Z" level=info msg="connecting to shim 1f37e555c7385a25aa03b5689921877402ff0253e15cc6b91387d875b4b39617" address="unix:///run/containerd/s/f3f1693e49cc3598baf5a5f2d8cf42020888dc97385af67b2f2884748b539328" protocol=ttrpc version=3 Sep 9 23:57:40.298314 systemd[1]: Started cri-containerd-1f37e555c7385a25aa03b5689921877402ff0253e15cc6b91387d875b4b39617.scope - libcontainer container 1f37e555c7385a25aa03b5689921877402ff0253e15cc6b91387d875b4b39617. 
Sep 9 23:57:40.353893 containerd[1499]: time="2025-09-09T23:57:40.353827684Z" level=info msg="StartContainer for \"1f37e555c7385a25aa03b5689921877402ff0253e15cc6b91387d875b4b39617\" returns successfully" Sep 9 23:57:40.617398 systemd-networkd[1423]: calibbd441cdc2f: Gained IPv6LL Sep 9 23:57:40.765926 kubelet[2750]: I0909 23:57:40.765791 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-5f6d768486-fqb2s" podStartSLOduration=1.981185876 podStartE2EDuration="6.765770256s" podCreationTimestamp="2025-09-09 23:57:34 +0000 UTC" firstStartedPulling="2025-09-09 23:57:35.428407255 +0000 UTC m=+47.157316993" lastFinishedPulling="2025-09-09 23:57:40.212991635 +0000 UTC m=+51.941901373" observedRunningTime="2025-09-09 23:57:40.765726775 +0000 UTC m=+52.494636513" watchObservedRunningTime="2025-09-09 23:57:40.765770256 +0000 UTC m=+52.494679994" Sep 9 23:57:41.000156 systemd-networkd[1423]: cali3eab47b1821: Gained IPv6LL Sep 9 23:57:41.411963 containerd[1499]: time="2025-09-09T23:57:41.411223121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8js5x,Uid:5e89e89d-d9f4-4ee7-afac-d5ef87dca67a,Namespace:kube-system,Attempt:0,}" Sep 9 23:57:41.412384 containerd[1499]: time="2025-09-09T23:57:41.412111548Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9n6x,Uid:78e1ea12-0c8b-4e75-9773-e6539bdf3c00,Namespace:calico-system,Attempt:0,}" Sep 9 23:57:41.412384 containerd[1499]: time="2025-09-09T23:57:41.412370276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cdc7764d-2x7kh,Uid:934bf4ae-b87a-4084-b4c1-444253f9295c,Namespace:calico-apiserver,Attempt:0,}" Sep 9 23:57:41.703401 systemd-networkd[1423]: cali30d3996480c: Link UP Sep 9 23:57:41.705269 systemd-networkd[1423]: cali30d3996480c: Gained carrier Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.534 [INFO][4560] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: 
&{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0 coredns-668d6bf9bc- kube-system 5e89e89d-d9f4-4ee7-afac-d5ef87dca67a 847 0 2025-09-09 23:56:54 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426-0-0-n-d8dd570c6c coredns-668d6bf9bc-8js5x eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali30d3996480c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Namespace="kube-system" Pod="coredns-668d6bf9bc-8js5x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.534 [INFO][4560] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Namespace="kube-system" Pod="coredns-668d6bf9bc-8js5x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.625 [INFO][4614] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" HandleID="k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.627 [INFO][4614] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" HandleID="k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, 
HandleID:(*string)(0x40002d36e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426-0-0-n-d8dd570c6c", "pod":"coredns-668d6bf9bc-8js5x", "timestamp":"2025-09-09 23:57:41.624797796 +0000 UTC"}, Hostname:"ci-4426-0-0-n-d8dd570c6c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.628 [INFO][4614] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.629 [INFO][4614] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.630 [INFO][4614] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-n-d8dd570c6c' Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.642 [INFO][4614] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.651 [INFO][4614] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.659 [INFO][4614] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.663 [INFO][4614] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.668 [INFO][4614] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.668 [INFO][4614] ipam/ipam.go 1220: Attempting to assign 1 addresses from block 
block=192.168.72.0/26 handle="k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.671 [INFO][4614] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.678 [INFO][4614] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.687 [INFO][4614] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.72.6/26] block=192.168.72.0/26 handle="k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.687 [INFO][4614] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.6/26] handle="k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.688 [INFO][4614] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 9 23:57:41.743332 containerd[1499]: 2025-09-09 23:57:41.688 [INFO][4614] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.72.6/26] IPv6=[] ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" HandleID="k8s-pod-network.1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" Sep 9 23:57:41.744386 containerd[1499]: 2025-09-09 23:57:41.693 [INFO][4560] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Namespace="kube-system" Pod="coredns-668d6bf9bc-8js5x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"5e89e89d-d9f4-4ee7-afac-d5ef87dca67a", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 56, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"", Pod:"coredns-668d6bf9bc-8js5x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali30d3996480c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:41.744386 containerd[1499]: 2025-09-09 23:57:41.694 [INFO][4560] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.6/32] ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Namespace="kube-system" Pod="coredns-668d6bf9bc-8js5x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" Sep 9 23:57:41.744386 containerd[1499]: 2025-09-09 23:57:41.694 [INFO][4560] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali30d3996480c ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Namespace="kube-system" Pod="coredns-668d6bf9bc-8js5x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" Sep 9 23:57:41.744386 containerd[1499]: 2025-09-09 23:57:41.704 [INFO][4560] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Namespace="kube-system" Pod="coredns-668d6bf9bc-8js5x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" Sep 9 23:57:41.744386 containerd[1499]: 2025-09-09 23:57:41.704 [INFO][4560] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Namespace="kube-system" Pod="coredns-668d6bf9bc-8js5x" 
WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"5e89e89d-d9f4-4ee7-afac-d5ef87dca67a", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 56, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a", Pod:"coredns-668d6bf9bc-8js5x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.72.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali30d3996480c", MAC:"66:f4:41:9c:76:81", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:41.744386 
containerd[1499]: 2025-09-09 23:57:41.740 [INFO][4560] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" Namespace="kube-system" Pod="coredns-668d6bf9bc-8js5x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-coredns--668d6bf9bc--8js5x-eth0" Sep 9 23:57:41.786503 containerd[1499]: time="2025-09-09T23:57:41.786440008Z" level=info msg="connecting to shim 1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a" address="unix:///run/containerd/s/e60fc4d2a9bd2d4ce1e035b054e5f70b3245824f308c2b66f406a25c94d191f5" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:57:41.835240 systemd-networkd[1423]: cali4fd6d753d88: Link UP Sep 9 23:57:41.836770 systemd-networkd[1423]: cali4fd6d753d88: Gained carrier Sep 9 23:57:41.856381 systemd[1]: Started cri-containerd-1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a.scope - libcontainer container 1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a. 
Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.523 [INFO][4581] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0 calico-apiserver-54cdc7764d- calico-apiserver 934bf4ae-b87a-4084-b4c1-444253f9295c 853 0 2025-09-09 23:57:09 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:54cdc7764d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426-0-0-n-d8dd570c6c calico-apiserver-54cdc7764d-2x7kh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali4fd6d753d88 [] [] }} ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-2x7kh" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.523 [INFO][4581] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-2x7kh" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.632 [INFO][4604] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" HandleID="k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.632 [INFO][4604] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" HandleID="k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000389970), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426-0-0-n-d8dd570c6c", "pod":"calico-apiserver-54cdc7764d-2x7kh", "timestamp":"2025-09-09 23:57:41.631449079 +0000 UTC"}, Hostname:"ci-4426-0-0-n-d8dd570c6c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.632 [INFO][4604] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.688 [INFO][4604] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.688 [INFO][4604] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-n-d8dd570c6c' Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.743 [INFO][4604] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.754 [INFO][4604] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.764 [INFO][4604] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.771 [INFO][4604] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.777 [INFO][4604] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.777 [INFO][4604] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.781 [INFO][4604] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17 Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.792 [INFO][4604] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.811 [INFO][4604] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.72.7/26] block=192.168.72.0/26 handle="k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.812 [INFO][4604] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.7/26] handle="k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.812 [INFO][4604] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:57:41.868026 containerd[1499]: 2025-09-09 23:57:41.812 [INFO][4604] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.72.7/26] IPv6=[] ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" HandleID="k8s-pod-network.9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" Sep 9 23:57:41.869613 containerd[1499]: 2025-09-09 23:57:41.818 [INFO][4581] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-2x7kh" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0", GenerateName:"calico-apiserver-54cdc7764d-", Namespace:"calico-apiserver", SelfLink:"", UID:"934bf4ae-b87a-4084-b4c1-444253f9295c", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"54cdc7764d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"", Pod:"calico-apiserver-54cdc7764d-2x7kh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4fd6d753d88", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:41.869613 containerd[1499]: 2025-09-09 23:57:41.819 [INFO][4581] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.7/32] ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-2x7kh" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" Sep 9 23:57:41.869613 containerd[1499]: 2025-09-09 23:57:41.819 [INFO][4581] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4fd6d753d88 ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-2x7kh" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" Sep 9 23:57:41.869613 containerd[1499]: 2025-09-09 23:57:41.836 [INFO][4581] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-2x7kh" 
WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" Sep 9 23:57:41.869613 containerd[1499]: 2025-09-09 23:57:41.842 [INFO][4581] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-2x7kh" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0", GenerateName:"calico-apiserver-54cdc7764d-", Namespace:"calico-apiserver", SelfLink:"", UID:"934bf4ae-b87a-4084-b4c1-444253f9295c", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"54cdc7764d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17", Pod:"calico-apiserver-54cdc7764d-2x7kh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.72.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali4fd6d753d88", MAC:"86:48:5d:42:6c:9e", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:41.869613 containerd[1499]: 2025-09-09 23:57:41.864 [INFO][4581] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" Namespace="calico-apiserver" Pod="calico-apiserver-54cdc7764d-2x7kh" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-calico--apiserver--54cdc7764d--2x7kh-eth0" Sep 9 23:57:41.911197 containerd[1499]: time="2025-09-09T23:57:41.911142732Z" level=info msg="connecting to shim 9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17" address="unix:///run/containerd/s/60b5690c185e1bf1b609dde1989dd6e517666bdea137fbe3e392025db044405f" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:57:41.953096 systemd[1]: Started cri-containerd-9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17.scope - libcontainer container 9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17. 
Sep 9 23:57:41.967975 systemd-networkd[1423]: caliab2c725b8a3: Link UP Sep 9 23:57:41.969188 systemd-networkd[1423]: caliab2c725b8a3: Gained carrier Sep 9 23:57:41.973272 containerd[1499]: time="2025-09-09T23:57:41.972767172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-8js5x,Uid:5e89e89d-d9f4-4ee7-afac-d5ef87dca67a,Namespace:kube-system,Attempt:0,} returns sandbox id \"1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a\"" Sep 9 23:57:41.982881 containerd[1499]: time="2025-09-09T23:57:41.982100817Z" level=info msg="CreateContainer within sandbox \"1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.534 [INFO][4563] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0 csi-node-driver- calico-system 78e1ea12-0c8b-4e75-9773-e6539bdf3c00 713 0 2025-09-09 23:57:15 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:6c96d95cc7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426-0-0-n-d8dd570c6c csi-node-driver-p9n6x eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] caliab2c725b8a3 [] [] }} ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Namespace="calico-system" Pod="csi-node-driver-p9n6x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.534 [INFO][4563] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Namespace="calico-system" 
Pod="csi-node-driver-p9n6x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.633 [INFO][4612] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" HandleID="k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.635 [INFO][4612] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" HandleID="k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb910), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-0-0-n-d8dd570c6c", "pod":"csi-node-driver-p9n6x", "timestamp":"2025-09-09 23:57:41.633792311 +0000 UTC"}, Hostname:"ci-4426-0-0-n-d8dd570c6c", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.636 [INFO][4612] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.812 [INFO][4612] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.812 [INFO][4612] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-n-d8dd570c6c' Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.844 [INFO][4612] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.855 [INFO][4612] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.876 [INFO][4612] ipam/ipam.go 511: Trying affinity for 192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.881 [INFO][4612] ipam/ipam.go 158: Attempting to load block cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.890 [INFO][4612] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.72.0/26 host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.891 [INFO][4612] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.72.0/26 handle="k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.895 [INFO][4612] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13 Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.906 [INFO][4612] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.72.0/26 handle="k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.924 [INFO][4612] ipam/ipam.go 1256: Successfully claimed IPs: 
[192.168.72.8/26] block=192.168.72.0/26 handle="k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.925 [INFO][4612] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.72.8/26] handle="k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" host="ci-4426-0-0-n-d8dd570c6c" Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.925 [INFO][4612] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 9 23:57:42.008591 containerd[1499]: 2025-09-09 23:57:41.926 [INFO][4612] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.72.8/26] IPv6=[] ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" HandleID="k8s-pod-network.779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Workload="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" Sep 9 23:57:42.009301 containerd[1499]: 2025-09-09 23:57:41.941 [INFO][4563] cni-plugin/k8s.go 418: Populated endpoint ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Namespace="calico-system" Pod="csi-node-driver-p9n6x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"78e1ea12-0c8b-4e75-9773-e6539bdf3c00", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"", Pod:"csi-node-driver-p9n6x", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliab2c725b8a3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:42.009301 containerd[1499]: 2025-09-09 23:57:41.941 [INFO][4563] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.72.8/32] ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Namespace="calico-system" Pod="csi-node-driver-p9n6x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" Sep 9 23:57:42.009301 containerd[1499]: 2025-09-09 23:57:41.941 [INFO][4563] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab2c725b8a3 ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Namespace="calico-system" Pod="csi-node-driver-p9n6x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" Sep 9 23:57:42.009301 containerd[1499]: 2025-09-09 23:57:41.970 [INFO][4563] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Namespace="calico-system" Pod="csi-node-driver-p9n6x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" Sep 9 23:57:42.009301 containerd[1499]: 2025-09-09 23:57:41.971 
[INFO][4563] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Namespace="calico-system" Pod="csi-node-driver-p9n6x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"78e1ea12-0c8b-4e75-9773-e6539bdf3c00", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.September, 9, 23, 57, 15, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"6c96d95cc7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-n-d8dd570c6c", ContainerID:"779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13", Pod:"csi-node-driver-p9n6x", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.72.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"caliab2c725b8a3", MAC:"56:8c:c0:0d:d4:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 9 23:57:42.009301 containerd[1499]: 2025-09-09 23:57:41.998 [INFO][4563] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" Namespace="calico-system" Pod="csi-node-driver-p9n6x" WorkloadEndpoint="ci--4426--0--0--n--d8dd570c6c-k8s-csi--node--driver--p9n6x-eth0" Sep 9 23:57:42.016723 containerd[1499]: time="2025-09-09T23:57:42.016679655Z" level=info msg="Container d74fa07868f96090d79d9beae280e2c234e729a809756186b81c2c2990fb8171: CDI devices from CRI Config.CDIDevices: []" Sep 9 23:57:42.031459 containerd[1499]: time="2025-09-09T23:57:42.031233604Z" level=info msg="CreateContainer within sandbox \"1a2121c1bad16de34338537807ba5c3604ed7b2dd8ab369a6af5e6bb13326a6a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d74fa07868f96090d79d9beae280e2c234e729a809756186b81c2c2990fb8171\"" Sep 9 23:57:42.034643 containerd[1499]: time="2025-09-09T23:57:42.034550902Z" level=info msg="StartContainer for \"d74fa07868f96090d79d9beae280e2c234e729a809756186b81c2c2990fb8171\"" Sep 9 23:57:42.038444 containerd[1499]: time="2025-09-09T23:57:42.038401375Z" level=info msg="connecting to shim d74fa07868f96090d79d9beae280e2c234e729a809756186b81c2c2990fb8171" address="unix:///run/containerd/s/e60fc4d2a9bd2d4ce1e035b054e5f70b3245824f308c2b66f406a25c94d191f5" protocol=ttrpc version=3 Sep 9 23:57:42.069317 containerd[1499]: time="2025-09-09T23:57:42.068911114Z" level=info msg="connecting to shim 779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13" address="unix:///run/containerd/s/c393090204919397cf61dba494d0a12b3b93a9c2bf61cbd11ec695c7657f7bca" namespace=k8s.io protocol=ttrpc version=3 Sep 9 23:57:42.076618 containerd[1499]: time="2025-09-09T23:57:42.076583020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-54cdc7764d-2x7kh,Uid:934bf4ae-b87a-4084-b4c1-444253f9295c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17\"" Sep 9 23:57:42.077049 
systemd[1]: Started cri-containerd-d74fa07868f96090d79d9beae280e2c234e729a809756186b81c2c2990fb8171.scope - libcontainer container d74fa07868f96090d79d9beae280e2c234e729a809756186b81c2c2990fb8171.
Sep 9 23:57:42.119073 systemd[1]: Started cri-containerd-779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13.scope - libcontainer container 779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13.
Sep 9 23:57:42.132491 containerd[1499]: time="2025-09-09T23:57:42.132445386Z" level=info msg="StartContainer for \"d74fa07868f96090d79d9beae280e2c234e729a809756186b81c2c2990fb8171\" returns successfully"
Sep 9 23:57:42.172614 containerd[1499]: time="2025-09-09T23:57:42.172564408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-p9n6x,Uid:78e1ea12-0c8b-4e75-9773-e6539bdf3c00,Namespace:calico-system,Attempt:0,} returns sandbox id \"779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13\""
Sep 9 23:57:42.783860 kubelet[2750]: I0909 23:57:42.782526 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-8js5x" podStartSLOduration=48.782487777 podStartE2EDuration="48.782487777s" podCreationTimestamp="2025-09-09 23:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-09 23:57:42.780334433 +0000 UTC m=+54.509244171" watchObservedRunningTime="2025-09-09 23:57:42.782487777 +0000 UTC m=+54.511397555"
Sep 9 23:57:43.240997 systemd-networkd[1423]: caliab2c725b8a3: Gained IPv6LL
Sep 9 23:57:43.304616 systemd-networkd[1423]: cali30d3996480c: Gained IPv6LL
Sep 9 23:57:43.816918 systemd-networkd[1423]: cali4fd6d753d88: Gained IPv6LL
Sep 9 23:57:44.456160 containerd[1499]: time="2025-09-09T23:57:44.455416261Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:44.456160 containerd[1499]: time="2025-09-09T23:57:44.456099640Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957"
Sep 9 23:57:44.457350 containerd[1499]: time="2025-09-09T23:57:44.457311153Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:44.460214 containerd[1499]: time="2025-09-09T23:57:44.460144511Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:44.475365 containerd[1499]: time="2025-09-09T23:57:44.475290847Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 4.260431074s"
Sep 9 23:57:44.475542 containerd[1499]: time="2025-09-09T23:57:44.475524614Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\""
Sep 9 23:57:44.482758 containerd[1499]: time="2025-09-09T23:57:44.481612501Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\""
Sep 9 23:57:44.507926 containerd[1499]: time="2025-09-09T23:57:44.507750259Z" level=info msg="CreateContainer within sandbox \"4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
Sep 9 23:57:44.523317 containerd[1499]: time="2025-09-09T23:57:44.522005970Z" level=info msg="Container e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:57:44.548735 containerd[1499]: time="2025-09-09T23:57:44.548674783Z" level=info msg="CreateContainer within sandbox \"4bd62c1aca34fb0a1e8cdb395d7bb8d76d55e2cff20ebc7f68bb6345452bede3\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\""
Sep 9 23:57:44.549947 containerd[1499]: time="2025-09-09T23:57:44.549723731Z" level=info msg="StartContainer for \"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\""
Sep 9 23:57:44.552595 containerd[1499]: time="2025-09-09T23:57:44.552552449Z" level=info msg="connecting to shim e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd" address="unix:///run/containerd/s/422201ada65d268797c2ec0d6b51bda8c9faf1e53b8b8cb0021905be02ca3c86" protocol=ttrpc version=3
Sep 9 23:57:44.583612 systemd[1]: Started cri-containerd-e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd.scope - libcontainer container e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd.
Sep 9 23:57:44.670418 containerd[1499]: time="2025-09-09T23:57:44.670315923Z" level=info msg="StartContainer for \"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" returns successfully"
Sep 9 23:57:44.855516 containerd[1499]: time="2025-09-09T23:57:44.854936834Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" id:\"7fc802318b5b56f446c5f3557b0adb74c3e5c1532337a3d62f004b62ff9c5a15\" pid:4887 exit_status:1 exited_at:{seconds:1757462264 nanos:854213054}"
Sep 9 23:57:45.843108 containerd[1499]: time="2025-09-09T23:57:45.842864326Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" id:\"441fd2b60baa8531e4185e8176a8a78e4bb0e8979236654e01351d2a028d5e7f\" pid:4912 exited_at:{seconds:1757462265 nanos:842463036}"
Sep 9 23:57:45.871282 kubelet[2750]: I0909 23:57:45.870634 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-749fff5dd9-2hjps" podStartSLOduration=23.128263806 podStartE2EDuration="29.870613702s" podCreationTimestamp="2025-09-09 23:57:16 +0000 UTC" firstStartedPulling="2025-09-09 23:57:37.736270443 +0000 UTC m=+49.465180181" lastFinishedPulling="2025-09-09 23:57:44.478620339 +0000 UTC m=+56.207530077" observedRunningTime="2025-09-09 23:57:44.800047326 +0000 UTC m=+56.528957184" watchObservedRunningTime="2025-09-09 23:57:45.870613702 +0000 UTC m=+57.599523400"
Sep 9 23:57:47.157783 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1952498735.mount: Deactivated successfully.
Sep 9 23:57:47.585075 containerd[1499]: time="2025-09-09T23:57:47.584213835Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:47.587148 containerd[1499]: time="2025-09-09T23:57:47.587077906Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332"
Sep 9 23:57:47.590471 containerd[1499]: time="2025-09-09T23:57:47.590400468Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:47.593869 containerd[1499]: time="2025-09-09T23:57:47.592363677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:47.597096 containerd[1499]: time="2025-09-09T23:57:47.597043312Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 3.11537965s"
Sep 9 23:57:47.597350 containerd[1499]: time="2025-09-09T23:57:47.597320799Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\""
Sep 9 23:57:47.602304 containerd[1499]: time="2025-09-09T23:57:47.602245441Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 9 23:57:47.604474 containerd[1499]: time="2025-09-09T23:57:47.604429495Z" level=info msg="CreateContainer within sandbox \"b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 9 23:57:47.619729 containerd[1499]: time="2025-09-09T23:57:47.619682271Z" level=info msg="Container d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:57:47.633226 containerd[1499]: time="2025-09-09T23:57:47.633096763Z" level=info msg="CreateContainer within sandbox \"b863af3def73411352acd2b8f8ba01a0a918acf9d18af88f2edd2d1e716abe0e\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\""
Sep 9 23:57:47.634320 containerd[1499]: time="2025-09-09T23:57:47.634292192Z" level=info msg="StartContainer for \"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\""
Sep 9 23:57:47.636188 containerd[1499]: time="2025-09-09T23:57:47.636155678Z" level=info msg="connecting to shim d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7" address="unix:///run/containerd/s/e0a65e60fae03a4de824e3bd06483a88f0c70af5be18b1d0329be6b85923a3bb" protocol=ttrpc version=3
Sep 9 23:57:47.669178 systemd[1]: Started cri-containerd-d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7.scope - libcontainer container d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7.
Sep 9 23:57:47.724793 containerd[1499]: time="2025-09-09T23:57:47.724722026Z" level=info msg="StartContainer for \"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" returns successfully"
Sep 9 23:57:47.818082 kubelet[2750]: I0909 23:57:47.817997 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-54d579b49d-fw5ss" podStartSLOduration=24.240144209 podStartE2EDuration="32.817908847s" podCreationTimestamp="2025-09-09 23:57:15 +0000 UTC" firstStartedPulling="2025-09-09 23:57:39.021985021 +0000 UTC m=+50.750894759" lastFinishedPulling="2025-09-09 23:57:47.599749659 +0000 UTC m=+59.328659397" observedRunningTime="2025-09-09 23:57:47.816660417 +0000 UTC m=+59.545570155" watchObservedRunningTime="2025-09-09 23:57:47.817908847 +0000 UTC m=+59.546818625"
Sep 9 23:57:47.960007 containerd[1499]: time="2025-09-09T23:57:47.959960836Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"87ad5a73d79ea9b360c45259060471c706cf962baf5216069f4ffc795e7194cb\" pid:4985 exit_status:1 exited_at:{seconds:1757462267 nanos:959355261}"
Sep 9 23:57:48.878864 containerd[1499]: time="2025-09-09T23:57:48.878725329Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"36eda610484a0bf0c30548f7e690e2738439b3f42b414eff3a4f4ecc97f4c2db\" pid:5015 exit_status:1 exited_at:{seconds:1757462268 nanos:878412001}"
Sep 9 23:57:50.592547 containerd[1499]: time="2025-09-09T23:57:50.592468640Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:50.593822 containerd[1499]: time="2025-09-09T23:57:50.593767429Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807"
Sep 9 23:57:50.594875 containerd[1499]: time="2025-09-09T23:57:50.594811292Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:50.597888 containerd[1499]: time="2025-09-09T23:57:50.597812599Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:50.598774 containerd[1499]: time="2025-09-09T23:57:50.598585016Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.996052168s"
Sep 9 23:57:50.598774 containerd[1499]: time="2025-09-09T23:57:50.598622857Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 9 23:57:50.601338 containerd[1499]: time="2025-09-09T23:57:50.601091632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 9 23:57:50.603256 containerd[1499]: time="2025-09-09T23:57:50.603212039Z" level=info msg="CreateContainer within sandbox \"c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 23:57:50.615867 containerd[1499]: time="2025-09-09T23:57:50.613225341Z" level=info msg="Container 360e6f93cc8d63762c83647677ec639f6ff71a9acd2847d790c8a1d51ecae3cb: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:57:50.630437 containerd[1499]: time="2025-09-09T23:57:50.630384442Z" level=info msg="CreateContainer within sandbox \"c6ca49440ac01b2db45080cd7f58049b2c3660d80ce115386e4d4c93cf49d7cf\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"360e6f93cc8d63762c83647677ec639f6ff71a9acd2847d790c8a1d51ecae3cb\""
Sep 9 23:57:50.631643 containerd[1499]: time="2025-09-09T23:57:50.631612109Z" level=info msg="StartContainer for \"360e6f93cc8d63762c83647677ec639f6ff71a9acd2847d790c8a1d51ecae3cb\""
Sep 9 23:57:50.633580 containerd[1499]: time="2025-09-09T23:57:50.633537232Z" level=info msg="connecting to shim 360e6f93cc8d63762c83647677ec639f6ff71a9acd2847d790c8a1d51ecae3cb" address="unix:///run/containerd/s/5faa39d778305a7f2eea7b29b55354e0fbfcf68a92a310a9fbcbb7f0e91a868c" protocol=ttrpc version=3
Sep 9 23:57:50.665154 systemd[1]: Started cri-containerd-360e6f93cc8d63762c83647677ec639f6ff71a9acd2847d790c8a1d51ecae3cb.scope - libcontainer container 360e6f93cc8d63762c83647677ec639f6ff71a9acd2847d790c8a1d51ecae3cb.
Sep 9 23:57:50.716499 containerd[1499]: time="2025-09-09T23:57:50.716454831Z" level=info msg="StartContainer for \"360e6f93cc8d63762c83647677ec639f6ff71a9acd2847d790c8a1d51ecae3cb\" returns successfully"
Sep 9 23:57:51.004857 containerd[1499]: time="2025-09-09T23:57:51.004142450Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:51.005175 containerd[1499]: time="2025-09-09T23:57:51.005143312Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 9 23:57:51.008750 containerd[1499]: time="2025-09-09T23:57:51.008685348Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 407.492433ms"
Sep 9 23:57:51.008750 containerd[1499]: time="2025-09-09T23:57:51.008748269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 9 23:57:51.011390 containerd[1499]: time="2025-09-09T23:57:51.011300204Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\""
Sep 9 23:57:51.013802 containerd[1499]: time="2025-09-09T23:57:51.013761936Z" level=info msg="CreateContainer within sandbox \"9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 9 23:57:51.027034 containerd[1499]: time="2025-09-09T23:57:51.026976419Z" level=info msg="Container 4509a57aee73e8fa66bfa4f4f32654eed4173c21226a0c3b98c3ac510b38b06e: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:57:51.038526 containerd[1499]: time="2025-09-09T23:57:51.037936534Z" level=info msg="CreateContainer within sandbox \"9b86f88b2112126cc2072475aa4f217bcf58a3cfcb7d444fe7a61e0248cbed17\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"4509a57aee73e8fa66bfa4f4f32654eed4173c21226a0c3b98c3ac510b38b06e\""
Sep 9 23:57:51.040871 containerd[1499]: time="2025-09-09T23:57:51.040823435Z" level=info msg="StartContainer for \"4509a57aee73e8fa66bfa4f4f32654eed4173c21226a0c3b98c3ac510b38b06e\""
Sep 9 23:57:51.045561 containerd[1499]: time="2025-09-09T23:57:51.045516096Z" level=info msg="connecting to shim 4509a57aee73e8fa66bfa4f4f32654eed4173c21226a0c3b98c3ac510b38b06e" address="unix:///run/containerd/s/60b5690c185e1bf1b609dde1989dd6e517666bdea137fbe3e392025db044405f" protocol=ttrpc version=3
Sep 9 23:57:51.087606 systemd[1]: Started cri-containerd-4509a57aee73e8fa66bfa4f4f32654eed4173c21226a0c3b98c3ac510b38b06e.scope - libcontainer container 4509a57aee73e8fa66bfa4f4f32654eed4173c21226a0c3b98c3ac510b38b06e.
Sep 9 23:57:51.194517 containerd[1499]: time="2025-09-09T23:57:51.194385561Z" level=info msg="StartContainer for \"4509a57aee73e8fa66bfa4f4f32654eed4173c21226a0c3b98c3ac510b38b06e\" returns successfully"
Sep 9 23:57:51.834722 kubelet[2750]: I0909 23:57:51.834636 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-54cdc7764d-sgvm7" podStartSLOduration=31.423895879 podStartE2EDuration="42.83460618s" podCreationTimestamp="2025-09-09 23:57:09 +0000 UTC" firstStartedPulling="2025-09-09 23:57:39.18943203 +0000 UTC m=+50.918341768" lastFinishedPulling="2025-09-09 23:57:50.600142331 +0000 UTC m=+62.329052069" observedRunningTime="2025-09-09 23:57:50.824725913 +0000 UTC m=+62.553635651" watchObservedRunningTime="2025-09-09 23:57:51.83460618 +0000 UTC m=+63.563515918"
Sep 9 23:57:52.825552 kubelet[2750]: I0909 23:57:52.825494 2750 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 9 23:57:53.175939 containerd[1499]: time="2025-09-09T23:57:53.175876856Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:53.177901 containerd[1499]: time="2025-09-09T23:57:53.177736373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489"
Sep 9 23:57:53.179137 containerd[1499]: time="2025-09-09T23:57:53.179064679Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:53.182742 containerd[1499]: time="2025-09-09T23:57:53.182619270Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:53.184008 containerd[1499]: time="2025-09-09T23:57:53.183944417Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 2.172553171s"
Sep 9 23:57:53.184008 containerd[1499]: time="2025-09-09T23:57:53.183992377Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\""
Sep 9 23:57:53.188754 containerd[1499]: time="2025-09-09T23:57:53.188703151Z" level=info msg="CreateContainer within sandbox \"779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 9 23:57:53.209344 containerd[1499]: time="2025-09-09T23:57:53.209019875Z" level=info msg="Container 59a5bba0e0ce304b8b80563547871e05df9673e692f9394b545f003b2c20f264: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:57:53.226119 containerd[1499]: time="2025-09-09T23:57:53.226059055Z" level=info msg="CreateContainer within sandbox \"779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"59a5bba0e0ce304b8b80563547871e05df9673e692f9394b545f003b2c20f264\""
Sep 9 23:57:53.227009 containerd[1499]: time="2025-09-09T23:57:53.226970513Z" level=info msg="StartContainer for \"59a5bba0e0ce304b8b80563547871e05df9673e692f9394b545f003b2c20f264\""
Sep 9 23:57:53.229468 containerd[1499]: time="2025-09-09T23:57:53.229408001Z" level=info msg="connecting to shim 59a5bba0e0ce304b8b80563547871e05df9673e692f9394b545f003b2c20f264" address="unix:///run/containerd/s/c393090204919397cf61dba494d0a12b3b93a9c2bf61cbd11ec695c7657f7bca" protocol=ttrpc version=3
Sep 9 23:57:53.270185 systemd[1]: Started cri-containerd-59a5bba0e0ce304b8b80563547871e05df9673e692f9394b545f003b2c20f264.scope - libcontainer container 59a5bba0e0ce304b8b80563547871e05df9673e692f9394b545f003b2c20f264.
Sep 9 23:57:53.287308 containerd[1499]: time="2025-09-09T23:57:53.287268272Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"9e13783319f97e013c11b40cd84b61bc4628b4c26959cd8ce41fe57e2726ce76\" pid:5124 exited_at:{seconds:1757462273 nanos:286908825}"
Sep 9 23:57:53.368547 kubelet[2750]: I0909 23:57:53.368448 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-54cdc7764d-2x7kh" podStartSLOduration=35.440110842 podStartE2EDuration="44.368425647s" podCreationTimestamp="2025-09-09 23:57:09 +0000 UTC" firstStartedPulling="2025-09-09 23:57:42.081826334 +0000 UTC m=+53.810736072" lastFinishedPulling="2025-09-09 23:57:51.010141139 +0000 UTC m=+62.739050877" observedRunningTime="2025-09-09 23:57:51.833607759 +0000 UTC m=+63.562517497" watchObservedRunningTime="2025-09-09 23:57:53.368425647 +0000 UTC m=+65.097335345"
Sep 9 23:57:53.401862 containerd[1499]: time="2025-09-09T23:57:53.401784511Z" level=info msg="StartContainer for \"59a5bba0e0ce304b8b80563547871e05df9673e692f9394b545f003b2c20f264\" returns successfully"
Sep 9 23:57:53.404695 containerd[1499]: time="2025-09-09T23:57:53.404652488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 9 23:57:55.454075 containerd[1499]: time="2025-09-09T23:57:55.454000713Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:55.455742 containerd[1499]: time="2025-09-09T23:57:55.455690824Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 9 23:57:55.456816 containerd[1499]: time="2025-09-09T23:57:55.456674922Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:55.459383 containerd[1499]: time="2025-09-09T23:57:55.459290530Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 9 23:57:55.460524 containerd[1499]: time="2025-09-09T23:57:55.460332430Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 2.05562962s"
Sep 9 23:57:55.460524 containerd[1499]: time="2025-09-09T23:57:55.460381111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 9 23:57:55.464430 containerd[1499]: time="2025-09-09T23:57:55.464371584Z" level=info msg="CreateContainer within sandbox \"779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 9 23:57:55.476078 containerd[1499]: time="2025-09-09T23:57:55.476025400Z" level=info msg="Container 920f2f9d12d43933d2d7828994fedf1e73a676c31b2dd71e78974c7584c4e601: CDI devices from CRI Config.CDIDevices: []"
Sep 9 23:57:55.496076 containerd[1499]: time="2025-09-09T23:57:55.495936208Z" level=info msg="CreateContainer within sandbox \"779c2f76763606f3db64facd73985666f402c925215141ecba9dc56c659c1e13\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"920f2f9d12d43933d2d7828994fedf1e73a676c31b2dd71e78974c7584c4e601\""
Sep 9 23:57:55.499860 containerd[1499]: time="2025-09-09T23:57:55.498893943Z" level=info msg="StartContainer for \"920f2f9d12d43933d2d7828994fedf1e73a676c31b2dd71e78974c7584c4e601\""
Sep 9 23:57:55.509525 containerd[1499]: time="2025-09-09T23:57:55.509456098Z" level=info msg="connecting to shim 920f2f9d12d43933d2d7828994fedf1e73a676c31b2dd71e78974c7584c4e601" address="unix:///run/containerd/s/c393090204919397cf61dba494d0a12b3b93a9c2bf61cbd11ec695c7657f7bca" protocol=ttrpc version=3
Sep 9 23:57:55.546252 systemd[1]: Started cri-containerd-920f2f9d12d43933d2d7828994fedf1e73a676c31b2dd71e78974c7584c4e601.scope - libcontainer container 920f2f9d12d43933d2d7828994fedf1e73a676c31b2dd71e78974c7584c4e601.
Sep 9 23:57:55.598179 containerd[1499]: time="2025-09-09T23:57:55.598105817Z" level=info msg="StartContainer for \"920f2f9d12d43933d2d7828994fedf1e73a676c31b2dd71e78974c7584c4e601\" returns successfully"
Sep 9 23:57:55.858753 kubelet[2750]: I0909 23:57:55.857908 2750 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-p9n6x" podStartSLOduration=27.571458131 podStartE2EDuration="40.8578805s" podCreationTimestamp="2025-09-09 23:57:15 +0000 UTC" firstStartedPulling="2025-09-09 23:57:42.175077202 +0000 UTC m=+53.903986940" lastFinishedPulling="2025-09-09 23:57:55.461499611 +0000 UTC m=+67.190409309" observedRunningTime="2025-09-09 23:57:55.855478496 +0000 UTC m=+67.584388234" watchObservedRunningTime="2025-09-09 23:57:55.8578805 +0000 UTC m=+67.586790278"
Sep 9 23:57:56.555032 kubelet[2750]: I0909 23:57:56.554953 2750 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 9 23:57:56.555032 kubelet[2750]: I0909 23:57:56.555030 2750 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 9 23:58:06.775760 containerd[1499]: time="2025-09-09T23:58:06.775712099Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" id:\"079565cc6ca2edfda4f6ca7b4c0232f24d92fdb5dce4a964bae56d8c9fb9cb72\" pid:5233 exited_at:{seconds:1757462286 nanos:775330215}"
Sep 9 23:58:15.846957 containerd[1499]: time="2025-09-09T23:58:15.846203255Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" id:\"f9bd8e4a674a707828975d81048cfc8b4013fd91a6bbfb8a41402c04630d306c\" pid:5258 exited_at:{seconds:1757462295 nanos:845600890}"
Sep 9 23:58:18.879983 containerd[1499]: time="2025-09-09T23:58:18.879936406Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"5b090679545c514adbee2cd67757a168256e1ef1714c199a370e2df288d0cd78\" pid:5286 exited_at:{seconds:1757462298 nanos:879310281}"
Sep 9 23:58:23.898403 containerd[1499]: time="2025-09-09T23:58:23.898357783Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" id:\"51e0c5385348784a696eefd28b1244feda6c625d047778eeb0e79acaadbf23d9\" pid:5313 exited_at:{seconds:1757462303 nanos:897805540}"
Sep 9 23:58:36.784539 containerd[1499]: time="2025-09-09T23:58:36.784438329Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" id:\"1e50037c1de4ad46d6a7e8c06a454a6d2b42ca21b44b1735cf07b0d4fbbd30b3\" pid:5340 exited_at:{seconds:1757462316 nanos:784068287}"
Sep 9 23:58:45.821363 containerd[1499]: time="2025-09-09T23:58:45.821194067Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" id:\"4812fda7ca228c2ab9ebdc83475a8103a3a436fd9e73e5427dfc2e46dd5a61b8\" pid:5364 exited_at:{seconds:1757462325 nanos:820213026}"
Sep 9 23:58:48.960636 containerd[1499]: time="2025-09-09T23:58:48.960569297Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"9b6cf02456589a706a2dac6232eb901dc5cdb1d436cb5732888cc7dc6499d4de\" pid:5387 exited_at:{seconds:1757462328 nanos:959978977}"
Sep 9 23:58:53.080901 systemd[1]: Started sshd@7-91.99.154.191:22-139.178.68.195:58440.service - OpenSSH per-connection server daemon (139.178.68.195:58440).
Sep 9 23:58:53.155798 containerd[1499]: time="2025-09-09T23:58:53.155750711Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"de25ecc0acef587328103918339754904856f77c4b9d85c8050d2daa63a7f597\" pid:5412 exited_at:{seconds:1757462333 nanos:155337431}"
Sep 9 23:58:54.115734 sshd[5419]: Accepted publickey for core from 139.178.68.195 port 58440 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:58:54.118862 sshd-session[5419]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:58:54.125055 systemd-logind[1481]: New session 8 of user core.
Sep 9 23:58:54.141256 systemd[1]: Started session-8.scope - Session 8 of User core.
Sep 9 23:58:54.894883 sshd[5427]: Connection closed by 139.178.68.195 port 58440
Sep 9 23:58:54.895632 sshd-session[5419]: pam_unix(sshd:session): session closed for user core
Sep 9 23:58:54.903330 systemd[1]: sshd@7-91.99.154.191:22-139.178.68.195:58440.service: Deactivated successfully.
Sep 9 23:58:54.906432 systemd[1]: session-8.scope: Deactivated successfully.
Sep 9 23:58:54.915530 systemd-logind[1481]: Session 8 logged out. Waiting for processes to exit.
Sep 9 23:58:54.918041 systemd-logind[1481]: Removed session 8.
Sep 9 23:59:00.073187 systemd[1]: Started sshd@8-91.99.154.191:22-139.178.68.195:58456.service - OpenSSH per-connection server daemon (139.178.68.195:58456).
Sep 9 23:59:01.093877 sshd[5448]: Accepted publickey for core from 139.178.68.195 port 58456 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:59:01.096025 sshd-session[5448]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:59:01.105394 systemd-logind[1481]: New session 9 of user core.
Sep 9 23:59:01.112142 systemd[1]: Started session-9.scope - Session 9 of User core.
Sep 9 23:59:01.884341 sshd[5451]: Connection closed by 139.178.68.195 port 58456
Sep 9 23:59:01.884810 sshd-session[5448]: pam_unix(sshd:session): session closed for user core
Sep 9 23:59:01.892269 systemd[1]: sshd@8-91.99.154.191:22-139.178.68.195:58456.service: Deactivated successfully.
Sep 9 23:59:01.892798 systemd-logind[1481]: Session 9 logged out. Waiting for processes to exit.
Sep 9 23:59:01.898332 systemd[1]: session-9.scope: Deactivated successfully.
Sep 9 23:59:01.904335 systemd-logind[1481]: Removed session 9.
Sep 9 23:59:02.056161 systemd[1]: Started sshd@9-91.99.154.191:22-139.178.68.195:48372.service - OpenSSH per-connection server daemon (139.178.68.195:48372).
Sep 9 23:59:03.060895 sshd[5464]: Accepted publickey for core from 139.178.68.195 port 48372 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:59:03.063366 sshd-session[5464]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:59:03.069701 systemd-logind[1481]: New session 10 of user core.
Sep 9 23:59:03.075173 systemd[1]: Started session-10.scope - Session 10 of User core.
Sep 9 23:59:03.880261 sshd[5469]: Connection closed by 139.178.68.195 port 48372
Sep 9 23:59:03.880925 sshd-session[5464]: pam_unix(sshd:session): session closed for user core
Sep 9 23:59:03.886865 systemd[1]: sshd@9-91.99.154.191:22-139.178.68.195:48372.service: Deactivated successfully.
Sep 9 23:59:03.890280 systemd[1]: session-10.scope: Deactivated successfully.
Sep 9 23:59:03.892175 systemd-logind[1481]: Session 10 logged out. Waiting for processes to exit.
Sep 9 23:59:03.895707 systemd-logind[1481]: Removed session 10.
Sep 9 23:59:04.051563 systemd[1]: Started sshd@10-91.99.154.191:22-139.178.68.195:48380.service - OpenSSH per-connection server daemon (139.178.68.195:48380).
Sep 9 23:59:05.056929 sshd[5479]: Accepted publickey for core from 139.178.68.195 port 48380 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:59:05.060231 sshd-session[5479]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:59:05.065760 systemd-logind[1481]: New session 11 of user core.
Sep 9 23:59:05.073145 systemd[1]: Started session-11.scope - Session 11 of User core.
Sep 9 23:59:05.818647 sshd[5482]: Connection closed by 139.178.68.195 port 48380
Sep 9 23:59:05.819422 sshd-session[5479]: pam_unix(sshd:session): session closed for user core
Sep 9 23:59:05.825630 systemd-logind[1481]: Session 11 logged out. Waiting for processes to exit.
Sep 9 23:59:05.826090 systemd[1]: sshd@10-91.99.154.191:22-139.178.68.195:48380.service: Deactivated successfully.
Sep 9 23:59:05.830079 systemd[1]: session-11.scope: Deactivated successfully.
Sep 9 23:59:05.836305 systemd-logind[1481]: Removed session 11.
Sep 9 23:59:06.761006 containerd[1499]: time="2025-09-09T23:59:06.760944175Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" id:\"600a833f410b26a4552e6c76b825f78014028e7a9a2f545289b386567be073e5\" pid:5517 exit_status:1 exited_at:{seconds:1757462346 nanos:760019215}"
Sep 9 23:59:10.997568 systemd[1]: Started sshd@11-91.99.154.191:22-139.178.68.195:38658.service - OpenSSH per-connection server daemon (139.178.68.195:38658).
Sep 9 23:59:12.017895 sshd[5529]: Accepted publickey for core from 139.178.68.195 port 38658 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk
Sep 9 23:59:12.021905 sshd-session[5529]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 9 23:59:12.030627 systemd-logind[1481]: New session 12 of user core.
Sep 9 23:59:12.038887 systemd[1]: Started session-12.scope - Session 12 of User core.
Sep 9 23:59:12.812626 sshd[5532]: Connection closed by 139.178.68.195 port 38658
Sep 9 23:59:12.813634 sshd-session[5529]: pam_unix(sshd:session): session closed for user core
Sep 9 23:59:12.820206 systemd[1]: sshd@11-91.99.154.191:22-139.178.68.195:38658.service: Deactivated successfully.
Sep 9 23:59:12.823266 systemd[1]: session-12.scope: Deactivated successfully.
Sep 9 23:59:12.824945 systemd-logind[1481]: Session 12 logged out. Waiting for processes to exit.
Sep 9 23:59:12.827481 systemd-logind[1481]: Removed session 12.
Sep 9 23:59:15.819076 containerd[1499]: time="2025-09-09T23:59:15.819030680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" id:\"d1b65821483e93263207db3c1b3f714ffacc54c5600df20c0b45d57f50298faf\" pid:5568 exited_at:{seconds:1757462355 nanos:818655361}"
Sep 9 23:59:17.998430 systemd[1]: Started sshd@12-91.99.154.191:22-139.178.68.195:38670.service - OpenSSH per-connection server daemon (139.178.68.195:38670).
Sep 9 23:59:18.900984 containerd[1499]: time="2025-09-09T23:59:18.900714854Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"396a58fae2108cbcbf10cf14d80784f064a20ed56eb75aba6d48ab04d33d9ac0\" pid:5594 exited_at:{seconds:1757462358 nanos:900280335}" Sep 9 23:59:19.005577 sshd[5578]: Accepted publickey for core from 139.178.68.195 port 38670 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk Sep 9 23:59:19.009153 sshd-session[5578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:59:19.015936 systemd-logind[1481]: New session 13 of user core. Sep 9 23:59:19.026445 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 9 23:59:19.770662 sshd[5603]: Connection closed by 139.178.68.195 port 38670 Sep 9 23:59:19.771302 sshd-session[5578]: pam_unix(sshd:session): session closed for user core Sep 9 23:59:19.778474 systemd[1]: sshd@12-91.99.154.191:22-139.178.68.195:38670.service: Deactivated successfully. Sep 9 23:59:19.782479 systemd[1]: session-13.scope: Deactivated successfully. Sep 9 23:59:19.785673 systemd-logind[1481]: Session 13 logged out. Waiting for processes to exit. Sep 9 23:59:19.788374 systemd-logind[1481]: Removed session 13. Sep 9 23:59:23.918071 containerd[1499]: time="2025-09-09T23:59:23.917636624Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" id:\"89c8887f00b97f483bd40b4ca148abe0b0f6673e5c6a8d8ecb374a2681ce571c\" pid:5627 exited_at:{seconds:1757462363 nanos:917328185}" Sep 9 23:59:24.956124 systemd[1]: Started sshd@13-91.99.154.191:22-139.178.68.195:55138.service - OpenSSH per-connection server daemon (139.178.68.195:55138). 
Sep 9 23:59:26.028114 sshd[5638]: Accepted publickey for core from 139.178.68.195 port 55138 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk Sep 9 23:59:26.030156 sshd-session[5638]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:59:26.036978 systemd-logind[1481]: New session 14 of user core. Sep 9 23:59:26.050229 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 9 23:59:26.861546 sshd[5643]: Connection closed by 139.178.68.195 port 55138 Sep 9 23:59:26.862156 sshd-session[5638]: pam_unix(sshd:session): session closed for user core Sep 9 23:59:26.869581 systemd[1]: sshd@13-91.99.154.191:22-139.178.68.195:55138.service: Deactivated successfully. Sep 9 23:59:26.875587 systemd[1]: session-14.scope: Deactivated successfully. Sep 9 23:59:26.877931 systemd-logind[1481]: Session 14 logged out. Waiting for processes to exit. Sep 9 23:59:26.880050 systemd-logind[1481]: Removed session 14. Sep 9 23:59:27.033113 systemd[1]: Started sshd@14-91.99.154.191:22-139.178.68.195:55148.service - OpenSSH per-connection server daemon (139.178.68.195:55148). Sep 9 23:59:28.056605 sshd[5655]: Accepted publickey for core from 139.178.68.195 port 55148 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk Sep 9 23:59:28.060734 sshd-session[5655]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:59:28.068371 systemd-logind[1481]: New session 15 of user core. Sep 9 23:59:28.073137 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 9 23:59:29.001831 sshd[5658]: Connection closed by 139.178.68.195 port 55148 Sep 9 23:59:29.002961 sshd-session[5655]: pam_unix(sshd:session): session closed for user core Sep 9 23:59:29.008907 systemd-logind[1481]: Session 15 logged out. Waiting for processes to exit. Sep 9 23:59:29.012122 systemd[1]: sshd@14-91.99.154.191:22-139.178.68.195:55148.service: Deactivated successfully. 
Sep 9 23:59:29.019720 systemd[1]: session-15.scope: Deactivated successfully. Sep 9 23:59:29.024386 systemd-logind[1481]: Removed session 15. Sep 9 23:59:29.174035 systemd[1]: Started sshd@15-91.99.154.191:22-139.178.68.195:55162.service - OpenSSH per-connection server daemon (139.178.68.195:55162). Sep 9 23:59:30.188646 sshd[5667]: Accepted publickey for core from 139.178.68.195 port 55162 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk Sep 9 23:59:30.189926 sshd-session[5667]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:59:30.199297 systemd-logind[1481]: New session 16 of user core. Sep 9 23:59:30.205575 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 9 23:59:31.692167 sshd[5670]: Connection closed by 139.178.68.195 port 55162 Sep 9 23:59:31.693466 sshd-session[5667]: pam_unix(sshd:session): session closed for user core Sep 9 23:59:31.698958 systemd[1]: sshd@15-91.99.154.191:22-139.178.68.195:55162.service: Deactivated successfully. Sep 9 23:59:31.703182 systemd[1]: session-16.scope: Deactivated successfully. Sep 9 23:59:31.708351 systemd-logind[1481]: Session 16 logged out. Waiting for processes to exit. Sep 9 23:59:31.709872 systemd-logind[1481]: Removed session 16. Sep 9 23:59:31.864969 systemd[1]: Started sshd@16-91.99.154.191:22-139.178.68.195:55610.service - OpenSSH per-connection server daemon (139.178.68.195:55610). Sep 9 23:59:32.878371 sshd[5690]: Accepted publickey for core from 139.178.68.195 port 55610 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk Sep 9 23:59:32.881000 sshd-session[5690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:59:32.886735 systemd-logind[1481]: New session 17 of user core. Sep 9 23:59:32.896120 systemd[1]: Started session-17.scope - Session 17 of User core. 
Sep 9 23:59:33.775132 sshd[5693]: Connection closed by 139.178.68.195 port 55610 Sep 9 23:59:33.776491 sshd-session[5690]: pam_unix(sshd:session): session closed for user core Sep 9 23:59:33.783777 systemd-logind[1481]: Session 17 logged out. Waiting for processes to exit. Sep 9 23:59:33.783787 systemd[1]: sshd@16-91.99.154.191:22-139.178.68.195:55610.service: Deactivated successfully. Sep 9 23:59:33.787775 systemd[1]: session-17.scope: Deactivated successfully. Sep 9 23:59:33.791095 systemd-logind[1481]: Removed session 17. Sep 9 23:59:33.965779 systemd[1]: Started sshd@17-91.99.154.191:22-139.178.68.195:55616.service - OpenSSH per-connection server daemon (139.178.68.195:55616). Sep 9 23:59:35.027438 sshd[5703]: Accepted publickey for core from 139.178.68.195 port 55616 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk Sep 9 23:59:35.029599 sshd-session[5703]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:59:35.035114 systemd-logind[1481]: New session 18 of user core. Sep 9 23:59:35.040049 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 9 23:59:35.824503 sshd[5706]: Connection closed by 139.178.68.195 port 55616 Sep 9 23:59:35.823553 sshd-session[5703]: pam_unix(sshd:session): session closed for user core Sep 9 23:59:35.829593 systemd[1]: sshd@17-91.99.154.191:22-139.178.68.195:55616.service: Deactivated successfully. Sep 9 23:59:35.833686 systemd[1]: session-18.scope: Deactivated successfully. Sep 9 23:59:35.838463 systemd-logind[1481]: Session 18 logged out. Waiting for processes to exit. Sep 9 23:59:35.841553 systemd-logind[1481]: Removed session 18. 
Sep 9 23:59:36.760723 containerd[1499]: time="2025-09-09T23:59:36.760675718Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" id:\"3fe083fa87cacac75769598660bea99843dd7f6ab7e8ec738257deb591030117\" pid:5731 exited_at:{seconds:1757462376 nanos:759947159}" Sep 9 23:59:40.997219 systemd[1]: Started sshd@18-91.99.154.191:22-139.178.68.195:33370.service - OpenSSH per-connection server daemon (139.178.68.195:33370). Sep 9 23:59:42.012804 sshd[5744]: Accepted publickey for core from 139.178.68.195 port 33370 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk Sep 9 23:59:42.015724 sshd-session[5744]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:59:42.024997 systemd-logind[1481]: New session 19 of user core. Sep 9 23:59:42.030220 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 9 23:59:42.773863 sshd[5747]: Connection closed by 139.178.68.195 port 33370 Sep 9 23:59:42.775697 sshd-session[5744]: pam_unix(sshd:session): session closed for user core Sep 9 23:59:42.780210 systemd[1]: sshd@18-91.99.154.191:22-139.178.68.195:33370.service: Deactivated successfully. Sep 9 23:59:42.783363 systemd[1]: session-19.scope: Deactivated successfully. Sep 9 23:59:42.791309 systemd-logind[1481]: Session 19 logged out. Waiting for processes to exit. Sep 9 23:59:42.793563 systemd-logind[1481]: Removed session 19. Sep 9 23:59:45.814768 containerd[1499]: time="2025-09-09T23:59:45.814729064Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e658e28f776bc1bf3f6470efd4e2069a173811d51adfcae9b3c469beaf2853dd\" id:\"9d44a503c4a796de1b959dc8c8bea98671d7f5b6f7686489f2887dd7249a76c4\" pid:5770 exited_at:{seconds:1757462385 nanos:814192426}" Sep 9 23:59:47.948142 systemd[1]: Started sshd@19-91.99.154.191:22-139.178.68.195:33384.service - OpenSSH per-connection server daemon (139.178.68.195:33384). 
Sep 9 23:59:48.887661 containerd[1499]: time="2025-09-09T23:59:48.887594495Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"8b9fd934af87c67258abe20d304c7d53751083f7df69d90adc369bf39f08f0a3\" pid:5797 exited_at:{seconds:1757462388 nanos:887240330}" Sep 9 23:59:48.959170 sshd[5780]: Accepted publickey for core from 139.178.68.195 port 33384 ssh2: RSA SHA256:wI85FBBkRXQquguYbIxcsprF72ScFbxnS2NbrrYOsGk Sep 9 23:59:48.961574 sshd-session[5780]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 9 23:59:48.968925 systemd-logind[1481]: New session 20 of user core. Sep 9 23:59:48.975129 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 9 23:59:49.735127 sshd[5807]: Connection closed by 139.178.68.195 port 33384 Sep 9 23:59:49.735697 sshd-session[5780]: pam_unix(sshd:session): session closed for user core Sep 9 23:59:49.743187 systemd[1]: sshd@19-91.99.154.191:22-139.178.68.195:33384.service: Deactivated successfully. Sep 9 23:59:49.748525 systemd[1]: session-20.scope: Deactivated successfully. Sep 9 23:59:49.750530 systemd-logind[1481]: Session 20 logged out. Waiting for processes to exit. Sep 9 23:59:49.752100 systemd-logind[1481]: Removed session 20. 
Sep 9 23:59:53.046006 containerd[1499]: time="2025-09-09T23:59:53.045948515Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d59eedc96c2104e11b2608643e7d628b3bd942715093589a5bf7cf81929818f7\" id:\"bec2e6ca5f3648034c10e7f659b7fb02249bf3aaddecc9492cde9455a1d677fd\" pid:5832 exited_at:{seconds:1757462393 nanos:45443629}" Sep 10 00:00:04.684586 kubelet[2750]: E0910 00:00:04.684357 2750 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:48798->10.0.0.2:2379: read: connection timed out" Sep 10 00:00:04.694120 systemd[1]: cri-containerd-517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d.scope: Deactivated successfully. Sep 10 00:00:04.695978 systemd[1]: cri-containerd-517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d.scope: Consumed 3.397s CPU time, 26.4M memory peak, 3.4M read from disk. Sep 10 00:00:04.696170 containerd[1499]: time="2025-09-10T00:00:04.695975851Z" level=info msg="received exit event container_id:\"517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d\" id:\"517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d\" pid:2617 exit_status:1 exited_at:{seconds:1757462404 nanos:695654288}" Sep 10 00:00:04.698490 containerd[1499]: time="2025-09-10T00:00:04.697167262Z" level=info msg="TaskExit event in podsandbox handler container_id:\"517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d\" id:\"517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d\" pid:2617 exit_status:1 exited_at:{seconds:1757462404 nanos:695654288}" Sep 10 00:00:04.725633 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d-rootfs.mount: Deactivated successfully. 
Sep 10 00:00:05.290139 kubelet[2750]: I0910 00:00:05.288176 2750 scope.go:117] "RemoveContainer" containerID="517cca4662d582043d65aeb68a2b2ace7dbaea875f2a9c56166b1d7ee94dc23d" Sep 10 00:00:05.295453 containerd[1499]: time="2025-09-10T00:00:05.294794313Z" level=info msg="CreateContainer within sandbox \"07e265f45fefb6f1bcf35378949cda9964d1ee1e5ce09675ba3b7294716f09c8\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 10 00:00:05.311946 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2846524804.mount: Deactivated successfully. Sep 10 00:00:05.335338 containerd[1499]: time="2025-09-10T00:00:05.335285607Z" level=info msg="Container 5fd156a8309d725c12987d9840630f4579a0adaa38ef91d70f78dce616aebb86: CDI devices from CRI Config.CDIDevices: []" Sep 10 00:00:05.344302 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1304476317.mount: Deactivated successfully. Sep 10 00:00:05.354654 containerd[1499]: time="2025-09-10T00:00:05.354569505Z" level=info msg="CreateContainer within sandbox \"07e265f45fefb6f1bcf35378949cda9964d1ee1e5ce09675ba3b7294716f09c8\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"5fd156a8309d725c12987d9840630f4579a0adaa38ef91d70f78dce616aebb86\"" Sep 10 00:00:05.356019 containerd[1499]: time="2025-09-10T00:00:05.355254831Z" level=info msg="StartContainer for \"5fd156a8309d725c12987d9840630f4579a0adaa38ef91d70f78dce616aebb86\"" Sep 10 00:00:05.356918 containerd[1499]: time="2025-09-10T00:00:05.356882486Z" level=info msg="connecting to shim 5fd156a8309d725c12987d9840630f4579a0adaa38ef91d70f78dce616aebb86" address="unix:///run/containerd/s/f06b77e487d9bb1641c77a2beb73bfa6259914b38b2f7c66f9adb6e379a87191" protocol=ttrpc version=3 Sep 10 00:00:05.379052 systemd[1]: Started cri-containerd-5fd156a8309d725c12987d9840630f4579a0adaa38ef91d70f78dce616aebb86.scope - libcontainer container 5fd156a8309d725c12987d9840630f4579a0adaa38ef91d70f78dce616aebb86. 
Sep 10 00:00:05.407647 systemd[1]: cri-containerd-9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8.scope: Deactivated successfully. Sep 10 00:00:05.408651 systemd[1]: cri-containerd-9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8.scope: Consumed 22.602s CPU time, 102.4M memory peak, 4.2M read from disk. Sep 10 00:00:05.412472 containerd[1499]: time="2025-09-10T00:00:05.411801514Z" level=info msg="received exit event container_id:\"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\" id:\"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\" pid:3070 exit_status:1 exited_at:{seconds:1757462405 nanos:411289669}" Sep 10 00:00:05.412870 containerd[1499]: time="2025-09-10T00:00:05.411901115Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\" id:\"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\" pid:3070 exit_status:1 exited_at:{seconds:1757462405 nanos:411289669}" Sep 10 00:00:05.447878 containerd[1499]: time="2025-09-10T00:00:05.447807006Z" level=info msg="StartContainer for \"5fd156a8309d725c12987d9840630f4579a0adaa38ef91d70f78dce616aebb86\" returns successfully" Sep 10 00:00:05.727123 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8-rootfs.mount: Deactivated successfully. Sep 10 00:00:05.901215 systemd[1]: cri-containerd-ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb.scope: Deactivated successfully. Sep 10 00:00:05.902437 systemd[1]: cri-containerd-ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb.scope: Consumed 4.940s CPU time, 64.5M memory peak, 3.5M read from disk. 
Sep 10 00:00:05.906408 containerd[1499]: time="2025-09-10T00:00:05.906051719Z" level=info msg="received exit event container_id:\"ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb\" id:\"ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb\" pid:2608 exit_status:1 exited_at:{seconds:1757462405 nanos:905547154}" Sep 10 00:00:05.908855 containerd[1499]: time="2025-09-10T00:00:05.908107178Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb\" id:\"ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb\" pid:2608 exit_status:1 exited_at:{seconds:1757462405 nanos:905547154}" Sep 10 00:00:05.947806 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb-rootfs.mount: Deactivated successfully. Sep 10 00:00:06.295366 kubelet[2750]: I0910 00:00:06.295322 2750 scope.go:117] "RemoveContainer" containerID="ed12b7c362c53835271d3603c18b013fba31df29863995a3750d8fbc8e2bfffb" Sep 10 00:00:06.300383 containerd[1499]: time="2025-09-10T00:00:06.299374863Z" level=info msg="CreateContainer within sandbox \"dd0389e8dfc6a226dbbc6cb1baad383d855bba5f93b28898dbf24d08c7652579\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Sep 10 00:00:06.303745 kubelet[2750]: I0910 00:00:06.303693 2750 scope.go:117] "RemoveContainer" containerID="9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8" Sep 10 00:00:06.306064 containerd[1499]: time="2025-09-10T00:00:06.306021524Z" level=info msg="CreateContainer within sandbox \"bd414dd2df0a9b3f866cc7daeb6cefc247d8104602120ab8c783f5032d1523c8\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Sep 10 00:00:06.326732 containerd[1499]: time="2025-09-10T00:00:06.326166626Z" level=info msg="Container ac2068f28b99395733bab0714995e65f13e6c59e1ad465e98185699baa9789fd: CDI devices from CRI Config.CDIDevices: []"
Sep 10 00:00:06.332090 containerd[1499]: time="2025-09-10T00:00:06.331948559Z" level=info msg="Container 4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3: CDI devices from CRI Config.CDIDevices: []" Sep 10 00:00:06.344170 containerd[1499]: time="2025-09-10T00:00:06.343968388Z" level=info msg="CreateContainer within sandbox \"dd0389e8dfc6a226dbbc6cb1baad383d855bba5f93b28898dbf24d08c7652579\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"ac2068f28b99395733bab0714995e65f13e6c59e1ad465e98185699baa9789fd\"" Sep 10 00:00:06.345872 containerd[1499]: time="2025-09-10T00:00:06.345294640Z" level=info msg="StartContainer for \"ac2068f28b99395733bab0714995e65f13e6c59e1ad465e98185699baa9789fd\"" Sep 10 00:00:06.346100 containerd[1499]: time="2025-09-10T00:00:06.346061207Z" level=info msg="CreateContainer within sandbox \"bd414dd2df0a9b3f866cc7daeb6cefc247d8104602120ab8c783f5032d1523c8\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3\"" Sep 10 00:00:06.347190 containerd[1499]: time="2025-09-10T00:00:06.347158897Z" level=info msg="StartContainer for \"4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3\"" Sep 10 00:00:06.348809 containerd[1499]: time="2025-09-10T00:00:06.348762911Z" level=info msg="connecting to shim 4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3" address="unix:///run/containerd/s/221fa533343353e0ccc6ccc8e8c45d704a74d43037806dce145b30904afce858" protocol=ttrpc version=3 Sep 10 00:00:06.350273 containerd[1499]: time="2025-09-10T00:00:06.350130204Z" level=info msg="connecting to shim ac2068f28b99395733bab0714995e65f13e6c59e1ad465e98185699baa9789fd" address="unix:///run/containerd/s/b990d394cb243cce06353763dbb8b2ec46cb91d09aef1ced050a5fe6ffb221e1" protocol=ttrpc version=3
Sep 10 00:00:06.388484 systemd[1]: Started cri-containerd-4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3.scope - libcontainer container 4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3. Sep 10 00:00:06.401030 systemd[1]: Started cri-containerd-ac2068f28b99395733bab0714995e65f13e6c59e1ad465e98185699baa9789fd.scope - libcontainer container ac2068f28b99395733bab0714995e65f13e6c59e1ad465e98185699baa9789fd. Sep 10 00:00:06.478997 containerd[1499]: time="2025-09-10T00:00:06.478810251Z" level=info msg="StartContainer for \"ac2068f28b99395733bab0714995e65f13e6c59e1ad465e98185699baa9789fd\" returns successfully" Sep 10 00:00:06.486970 containerd[1499]: time="2025-09-10T00:00:06.486906965Z" level=info msg="StartContainer for \"4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3\" returns successfully" Sep 10 00:00:07.095328 containerd[1499]: time="2025-09-10T00:00:07.095272669Z" level=info msg="TaskExit event in podsandbox handler container_id:\"eafaa11bb30a9b23530edc71be999bcf0522d89adbc86dbb860814a1c473116b\" id:\"134e82022b38ec9b4bc2fc06001a3b3c7840c0bba139de6e1218009598779144\" pid:5986 exited_at:{seconds:1757462407 nanos:94821305}" Sep 10 00:00:09.359564 systemd[1]: cri-containerd-4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3.scope: Deactivated successfully.
Sep 10 00:00:09.369179 containerd[1499]: time="2025-09-10T00:00:09.369034017Z" level=info msg="received exit event container_id:\"4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3\" id:\"4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3\" pid:5938 exit_status:1 exited_at:{seconds:1757462409 nanos:368505692}" Sep 10 00:00:09.369179 containerd[1499]: time="2025-09-10T00:00:09.369120177Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3\" id:\"4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3\" pid:5938 exit_status:1 exited_at:{seconds:1757462409 nanos:368505692}" Sep 10 00:00:09.399884 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3-rootfs.mount: Deactivated successfully. Sep 10 00:00:10.002240 kubelet[2750]: E0910 00:00:09.994637 2750 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:48610->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4426-0-0-n-d8dd570c6c.1863c2bdad33e5f1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4426-0-0-n-d8dd570c6c,UID:3ee02ca6703fcd8d38d1f2ae1cb497b0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4426-0-0-n-d8dd570c6c,},FirstTimestamp:2025-09-09 23:59:59.515092465 +0000 UTC m=+191.244002323,LastTimestamp:2025-09-09 23:59:59.515092465 +0000 UTC m=+191.244002323,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426-0-0-n-d8dd570c6c,}"
Sep 10 00:00:10.342405 kubelet[2750]: I0910 00:00:10.341490 2750 scope.go:117] "RemoveContainer" containerID="9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8" Sep 10 00:00:10.342405 kubelet[2750]: I0910 00:00:10.341916 2750 scope.go:117] "RemoveContainer" containerID="4cca9252ff5706df3c4416588e25b0f9f0ba88da4feeeac7ef33101629149ca3" Sep 10 00:00:10.342405 kubelet[2750]: E0910 00:00:10.342049 2750 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-755d956888-x7v9h_tigera-operator(1112463b-5b41-49e0-9ebd-b987ef3990b0)\"" pod="tigera-operator/tigera-operator-755d956888-x7v9h" podUID="1112463b-5b41-49e0-9ebd-b987ef3990b0" Sep 10 00:00:10.349856 containerd[1499]: time="2025-09-10T00:00:10.348424582Z" level=info msg="RemoveContainer for \"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\"" Sep 10 00:00:10.368825 containerd[1499]: time="2025-09-10T00:00:10.368716673Z" level=info msg="RemoveContainer for \"9d8a32d8b6f93946dd7e1ec44e8138c329d69bd9f5001a16edcff11c8eac9ec8\" returns successfully"