Sep 12 22:04:40.839154 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 12 22:04:40.839176 kernel: Linux version 6.12.47-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Fri Sep 12 20:38:46 -00 2025
Sep 12 22:04:40.839186 kernel: KASLR enabled
Sep 12 22:04:40.839191 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Sep 12 22:04:40.839197 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Sep 12 22:04:40.839202 kernel: random: crng init done
Sep 12 22:04:40.839209 kernel: secureboot: Secure boot disabled
Sep 12 22:04:40.839214 kernel: ACPI: Early table checksum verification disabled
Sep 12 22:04:40.839220 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Sep 12 22:04:40.839226 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Sep 12 22:04:40.839233 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839239 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839245 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839251 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839258 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839265 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839271 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839277 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839283 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 12 22:04:40.839289 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 12 22:04:40.839295 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Sep 12 22:04:40.839301 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 12 22:04:40.839307 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Sep 12 22:04:40.839313 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Sep 12 22:04:40.839319 kernel: Zone ranges:
Sep 12 22:04:40.839325 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Sep 12 22:04:40.839332 kernel: DMA32 empty
Sep 12 22:04:40.839338 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Sep 12 22:04:40.839344 kernel: Device empty
Sep 12 22:04:40.839350 kernel: Movable zone start for each node
Sep 12 22:04:40.839356 kernel: Early memory node ranges
Sep 12 22:04:40.839362 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Sep 12 22:04:40.839368 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Sep 12 22:04:40.839374 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Sep 12 22:04:40.839380 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Sep 12 22:04:40.839386 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Sep 12 22:04:40.839392 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Sep 12 22:04:40.839398 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Sep 12 22:04:40.839405 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Sep 12 22:04:40.839411 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Sep 12 22:04:40.839420 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Sep 12 22:04:40.839427 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Sep 12 22:04:40.839433 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Sep 12 22:04:40.839441 kernel: psci: probing for conduit method from ACPI.
Sep 12 22:04:40.841549 kernel: psci: PSCIv1.1 detected in firmware.
Sep 12 22:04:40.841578 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 12 22:04:40.841585 kernel: psci: Trusted OS migration not required
Sep 12 22:04:40.841591 kernel: psci: SMC Calling Convention v1.1
Sep 12 22:04:40.841598 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 12 22:04:40.841605 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 12 22:04:40.841612 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 12 22:04:40.841619 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 12 22:04:40.841625 kernel: Detected PIPT I-cache on CPU0
Sep 12 22:04:40.841632 kernel: CPU features: detected: GIC system register CPU interface
Sep 12 22:04:40.841644 kernel: CPU features: detected: Spectre-v4
Sep 12 22:04:40.841651 kernel: CPU features: detected: Spectre-BHB
Sep 12 22:04:40.841658 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 12 22:04:40.841664 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 12 22:04:40.841670 kernel: CPU features: detected: ARM erratum 1418040
Sep 12 22:04:40.841677 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 12 22:04:40.841683 kernel: alternatives: applying boot alternatives
Sep 12 22:04:40.841691 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f
Sep 12 22:04:40.841699 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 12 22:04:40.841705 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 12 22:04:40.841713 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 12 22:04:40.841720 kernel: Fallback order for Node 0: 0
Sep 12 22:04:40.841739 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Sep 12 22:04:40.841746 kernel: Policy zone: Normal
Sep 12 22:04:40.841753 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 12 22:04:40.841759 kernel: software IO TLB: area num 2.
Sep 12 22:04:40.841765 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Sep 12 22:04:40.841772 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 12 22:04:40.841778 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 12 22:04:40.841786 kernel: rcu: RCU event tracing is enabled.
Sep 12 22:04:40.841792 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 12 22:04:40.841799 kernel: Trampoline variant of Tasks RCU enabled.
Sep 12 22:04:40.841807 kernel: Tracing variant of Tasks RCU enabled.
Sep 12 22:04:40.841815 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 12 22:04:40.841821 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 12 22:04:40.841827 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 22:04:40.841834 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 12 22:04:40.841841 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 12 22:04:40.841847 kernel: GICv3: 256 SPIs implemented
Sep 12 22:04:40.841853 kernel: GICv3: 0 Extended SPIs implemented
Sep 12 22:04:40.841860 kernel: Root IRQ handler: gic_handle_irq
Sep 12 22:04:40.841866 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 12 22:04:40.841872 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 12 22:04:40.841879 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 12 22:04:40.841887 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 12 22:04:40.841893 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Sep 12 22:04:40.841900 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Sep 12 22:04:40.841906 kernel: GICv3: using LPI property table @0x0000000100120000
Sep 12 22:04:40.841913 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Sep 12 22:04:40.841920 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 12 22:04:40.841926 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 22:04:40.841932 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 12 22:04:40.841939 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 12 22:04:40.841946 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 12 22:04:40.841952 kernel: Console: colour dummy device 80x25
Sep 12 22:04:40.841960 kernel: ACPI: Core revision 20240827
Sep 12 22:04:40.841968 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 12 22:04:40.841974 kernel: pid_max: default: 32768 minimum: 301
Sep 12 22:04:40.841981 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 12 22:04:40.841987 kernel: landlock: Up and running.
Sep 12 22:04:40.841994 kernel: SELinux: Initializing.
Sep 12 22:04:40.842001 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 22:04:40.842007 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 12 22:04:40.842014 kernel: rcu: Hierarchical SRCU implementation.
Sep 12 22:04:40.842022 kernel: rcu: Max phase no-delay instances is 400.
Sep 12 22:04:40.842029 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 12 22:04:40.842036 kernel: Remapping and enabling EFI services.
Sep 12 22:04:40.842042 kernel: smp: Bringing up secondary CPUs ...
Sep 12 22:04:40.842050 kernel: Detected PIPT I-cache on CPU1
Sep 12 22:04:40.842056 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 12 22:04:40.842063 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Sep 12 22:04:40.842070 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 12 22:04:40.842076 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 12 22:04:40.842085 kernel: smp: Brought up 1 node, 2 CPUs
Sep 12 22:04:40.842097 kernel: SMP: Total of 2 processors activated.
Sep 12 22:04:40.842104 kernel: CPU: All CPU(s) started at EL1
Sep 12 22:04:40.842112 kernel: CPU features: detected: 32-bit EL0 Support
Sep 12 22:04:40.842119 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 12 22:04:40.842127 kernel: CPU features: detected: Common not Private translations
Sep 12 22:04:40.842133 kernel: CPU features: detected: CRC32 instructions
Sep 12 22:04:40.842140 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 12 22:04:40.842149 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 12 22:04:40.842156 kernel: CPU features: detected: LSE atomic instructions
Sep 12 22:04:40.842163 kernel: CPU features: detected: Privileged Access Never
Sep 12 22:04:40.842170 kernel: CPU features: detected: RAS Extension Support
Sep 12 22:04:40.842177 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 12 22:04:40.842184 kernel: alternatives: applying system-wide alternatives
Sep 12 22:04:40.842192 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 12 22:04:40.842199 kernel: Memory: 3859556K/4096000K available (11136K kernel code, 2440K rwdata, 9068K rodata, 38976K init, 1038K bss, 214964K reserved, 16384K cma-reserved)
Sep 12 22:04:40.842265 kernel: devtmpfs: initialized
Sep 12 22:04:40.842276 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 12 22:04:40.842284 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 12 22:04:40.842291 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 12 22:04:40.842298 kernel: 0 pages in range for non-PLT usage
Sep 12 22:04:40.842305 kernel: 508560 pages in range for PLT usage
Sep 12 22:04:40.842312 kernel: pinctrl core: initialized pinctrl subsystem
Sep 12 22:04:40.842319 kernel: SMBIOS 3.0.0 present.
Sep 12 22:04:40.842326 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Sep 12 22:04:40.842333 kernel: DMI: Memory slots populated: 1/1
Sep 12 22:04:40.842342 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 12 22:04:40.842349 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 12 22:04:40.842356 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 12 22:04:40.842363 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 12 22:04:40.842370 kernel: audit: initializing netlink subsys (disabled)
Sep 12 22:04:40.842377 kernel: audit: type=2000 audit(0.014:1): state=initialized audit_enabled=0 res=1
Sep 12 22:04:40.842384 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 12 22:04:40.842391 kernel: cpuidle: using governor menu
Sep 12 22:04:40.842398 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 12 22:04:40.842407 kernel: ASID allocator initialised with 32768 entries
Sep 12 22:04:40.842414 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 12 22:04:40.842421 kernel: Serial: AMBA PL011 UART driver
Sep 12 22:04:40.842428 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 12 22:04:40.842435 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 12 22:04:40.842442 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 12 22:04:40.842459 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 12 22:04:40.842467 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 12 22:04:40.842474 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 12 22:04:40.842484 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 12 22:04:40.842491 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 12 22:04:40.842498 kernel: ACPI: Added _OSI(Module Device)
Sep 12 22:04:40.842505 kernel: ACPI: Added _OSI(Processor Device)
Sep 12 22:04:40.842512 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 12 22:04:40.842519 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 12 22:04:40.842526 kernel: ACPI: Interpreter enabled
Sep 12 22:04:40.842533 kernel: ACPI: Using GIC for interrupt routing
Sep 12 22:04:40.842590 kernel: ACPI: MCFG table detected, 1 entries
Sep 12 22:04:40.842600 kernel: ACPI: CPU0 has been hot-added
Sep 12 22:04:40.842607 kernel: ACPI: CPU1 has been hot-added
Sep 12 22:04:40.842614 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 12 22:04:40.842621 kernel: printk: legacy console [ttyAMA0] enabled
Sep 12 22:04:40.842628 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 12 22:04:40.842821 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 12 22:04:40.842892 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 12 22:04:40.842963 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 12 22:04:40.843035 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 12 22:04:40.843092 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 12 22:04:40.843102 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 12 22:04:40.843109 kernel: PCI host bridge to bus 0000:00
Sep 12 22:04:40.843178 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 12 22:04:40.843232 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 12 22:04:40.843285 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 12 22:04:40.843338 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 12 22:04:40.843415 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 12 22:04:40.845241 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Sep 12 22:04:40.845325 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Sep 12 22:04:40.845387 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 12 22:04:40.845475 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.845551 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Sep 12 22:04:40.845611 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 22:04:40.845670 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Sep 12 22:04:40.845746 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Sep 12 22:04:40.845822 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.845882 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Sep 12 22:04:40.845940 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 22:04:40.846013 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Sep 12 22:04:40.846142 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.846206 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Sep 12 22:04:40.846265 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 22:04:40.846323 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Sep 12 22:04:40.846381 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Sep 12 22:04:40.846466 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.846537 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Sep 12 22:04:40.846596 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 22:04:40.846654 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Sep 12 22:04:40.846712 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Sep 12 22:04:40.846839 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.846905 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Sep 12 22:04:40.846985 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 22:04:40.847050 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 12 22:04:40.847109 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Sep 12 22:04:40.847174 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.847234 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Sep 12 22:04:40.847293 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 22:04:40.847354 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Sep 12 22:04:40.847414 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Sep 12 22:04:40.847597 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.847671 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Sep 12 22:04:40.847784 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 22:04:40.847911 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Sep 12 22:04:40.847976 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Sep 12 22:04:40.848051 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.848111 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Sep 12 22:04:40.848173 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 22:04:40.848231 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Sep 12 22:04:40.848298 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 12 22:04:40.848357 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Sep 12 22:04:40.848442 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 22:04:40.848579 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Sep 12 22:04:40.848657 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Sep 12 22:04:40.848734 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Sep 12 22:04:40.848830 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 12 22:04:40.848895 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Sep 12 22:04:40.848956 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 12 22:04:40.849016 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Sep 12 22:04:40.849087 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Sep 12 22:04:40.849151 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Sep 12 22:04:40.849263 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Sep 12 22:04:40.849335 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Sep 12 22:04:40.849397 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Sep 12 22:04:40.849485 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Sep 12 22:04:40.849553 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Sep 12 22:04:40.849622 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Sep 12 22:04:40.849688 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Sep 12 22:04:40.849773 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Sep 12 22:04:40.849881 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Sep 12 22:04:40.849945 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 12 22:04:40.850017 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 12 22:04:40.850079 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Sep 12 22:04:40.850144 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Sep 12 22:04:40.850204 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Sep 12 22:04:40.850267 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Sep 12 22:04:40.850327 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Sep 12 22:04:40.850387 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Sep 12 22:04:40.850469 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Sep 12 22:04:40.850588 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Sep 12 22:04:40.850660 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Sep 12 22:04:40.850765 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 12 22:04:40.850842 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Sep 12 22:04:40.850901 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Sep 12 22:04:40.850963 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 12 22:04:40.851022 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Sep 12 22:04:40.851080 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Sep 12 22:04:40.851147 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 12 22:04:40.851208 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Sep 12 22:04:40.851267 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Sep 12 22:04:40.851342 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 12 22:04:40.851417 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Sep 12 22:04:40.851502 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Sep 12 22:04:40.851570 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 12 22:04:40.851629 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Sep 12 22:04:40.851686 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Sep 12 22:04:40.851760 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 12 22:04:40.851821 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Sep 12 22:04:40.851880 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Sep 12 22:04:40.851940 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 12 22:04:40.852071 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Sep 12 22:04:40.852136 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Sep 12 22:04:40.852200 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Sep 12 22:04:40.852258 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Sep 12 22:04:40.852317 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Sep 12 22:04:40.852376 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Sep 12 22:04:40.852435 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Sep 12 22:04:40.853051 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Sep 12 22:04:40.853133 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Sep 12 22:04:40.853194 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Sep 12 22:04:40.853256 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Sep 12 22:04:40.853314 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Sep 12 22:04:40.853375 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Sep 12 22:04:40.853434 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Sep 12 22:04:40.853524 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Sep 12 22:04:40.853589 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Sep 12 22:04:40.853650 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Sep 12 22:04:40.853709 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Sep 12 22:04:40.853790 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Sep 12 22:04:40.853851 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Sep 12 22:04:40.853918 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Sep 12 22:04:40.853978 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Sep 12 22:04:40.854039 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Sep 12 22:04:40.854157 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Sep 12 22:04:40.854223 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Sep 12 22:04:40.854282 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Sep 12 22:04:40.854342 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Sep 12 22:04:40.854405 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Sep 12 22:04:40.854480 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Sep 12 22:04:40.854543 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Sep 12 22:04:40.854603 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Sep 12 22:04:40.854691 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Sep 12 22:04:40.854784 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Sep 12 22:04:40.854846 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Sep 12 22:04:40.854919 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Sep 12 22:04:40.854982 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Sep 12 22:04:40.855323 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Sep 12 22:04:40.855416 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Sep 12 22:04:40.855515 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Sep 12 22:04:40.855634 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Sep 12 22:04:40.855709 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Sep 12 22:04:40.855830 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Sep 12 22:04:40.855897 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 12 22:04:40.855964 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Sep 12 22:04:40.856025 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 12 22:04:40.856084 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 12 22:04:40.856143 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Sep 12 22:04:40.856253 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 22:04:40.856325 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Sep 12 22:04:40.856387 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 12 22:04:40.856471 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 12 22:04:40.856538 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Sep 12 22:04:40.856597 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 22:04:40.856663 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Sep 12 22:04:40.856736 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Sep 12 22:04:40.856803 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 12 22:04:40.858495 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 12 22:04:40.858611 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Sep 12 22:04:40.858678 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 22:04:40.858768 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Sep 12 22:04:40.858831 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 12 22:04:40.858890 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 12 22:04:40.858951 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Sep 12 22:04:40.859008 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 22:04:40.859077 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Sep 12 22:04:40.859139 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 12 22:04:40.859197 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 12 22:04:40.859329 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 12 22:04:40.859410 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 22:04:40.859502 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Sep 12 22:04:40.859567 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Sep 12 22:04:40.859633 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 12 22:04:40.859693 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 12 22:04:40.859781 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Sep 12 22:04:40.859843 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 22:04:40.859909 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Sep 12 22:04:40.859971 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Sep 12 22:04:40.860031 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Sep 12 22:04:40.860169 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 12 22:04:40.860235 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 12 22:04:40.860298 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Sep 12 22:04:40.860358 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 12 22:04:40.860422 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 12 22:04:40.862297 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 12 22:04:40.862385 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Sep 12 22:04:40.862488 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 12 22:04:40.862571 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 12 22:04:40.862635 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Sep 12 22:04:40.862770 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Sep 12 22:04:40.862847 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 12 22:04:40.862911 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 12 22:04:40.862965 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 12 22:04:40.863017 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 12 22:04:40.863083 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 12 22:04:40.863139 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Sep 12 22:04:40.863193 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 12 22:04:40.863259 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Sep 12 22:04:40.863313 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Sep 12 22:04:40.863367 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 12 22:04:40.863431 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Sep 12 22:04:40.863508 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Sep 12 22:04:40.863565 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 12 22:04:40.863641 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Sep 12 22:04:40.863697 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Sep 12 22:04:40.863804 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 12 22:04:40.863871 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Sep 12 22:04:40.863925 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Sep 12 22:04:40.863979 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 12 22:04:40.864042 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Sep 12 22:04:40.864100 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Sep 12 22:04:40.864156 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 12 22:04:40.864218 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Sep
12 22:04:40.864273 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Sep 12 22:04:40.864329 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Sep 12 22:04:40.864393 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Sep 12 22:04:40.866167 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Sep 12 22:04:40.866378 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Sep 12 22:04:40.866808 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Sep 12 22:04:40.866930 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Sep 12 22:04:40.867030 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Sep 12 22:04:40.867043 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 12 22:04:40.867057 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 12 22:04:40.867065 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 12 22:04:40.867074 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 12 22:04:40.867082 kernel: iommu: Default domain type: Translated Sep 12 22:04:40.867089 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 12 22:04:40.867097 kernel: efivars: Registered efivars operations Sep 12 22:04:40.867104 kernel: vgaarb: loaded Sep 12 22:04:40.867111 kernel: clocksource: Switched to clocksource arch_sys_counter Sep 12 22:04:40.867119 kernel: VFS: Disk quotas dquot_6.6.0 Sep 12 22:04:40.867126 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 12 22:04:40.867134 kernel: pnp: PnP ACPI init Sep 12 22:04:40.867210 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Sep 12 22:04:40.867221 kernel: pnp: PnP ACPI: found 1 devices Sep 12 22:04:40.867229 kernel: NET: Registered PF_INET protocol family Sep 12 22:04:40.867236 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 12 22:04:40.867244 kernel: 
tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 12 22:04:40.867252 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 12 22:04:40.867259 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 12 22:04:40.867267 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 12 22:04:40.867276 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 12 22:04:40.867284 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 22:04:40.867291 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 12 22:04:40.867299 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 12 22:04:40.867369 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Sep 12 22:04:40.867380 kernel: PCI: CLS 0 bytes, default 64 Sep 12 22:04:40.867388 kernel: kvm [1]: HYP mode not available Sep 12 22:04:40.867396 kernel: Initialise system trusted keyrings Sep 12 22:04:40.867404 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 12 22:04:40.867413 kernel: Key type asymmetric registered Sep 12 22:04:40.867421 kernel: Asymmetric key parser 'x509' registered Sep 12 22:04:40.867428 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Sep 12 22:04:40.867436 kernel: io scheduler mq-deadline registered Sep 12 22:04:40.867443 kernel: io scheduler kyber registered Sep 12 22:04:40.868516 kernel: io scheduler bfq registered Sep 12 22:04:40.868529 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Sep 12 22:04:40.868637 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Sep 12 22:04:40.868702 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Sep 12 22:04:40.868787 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.868855 kernel: pcieport 0000:00:02.1: PME: Signaling 
with IRQ 51 Sep 12 22:04:40.868930 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Sep 12 22:04:40.869094 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.869195 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Sep 12 22:04:40.869283 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Sep 12 22:04:40.869370 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.869479 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Sep 12 22:04:40.869581 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Sep 12 22:04:40.869667 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.869797 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Sep 12 22:04:40.869888 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Sep 12 22:04:40.869974 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.870062 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Sep 12 22:04:40.870148 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Sep 12 22:04:40.870232 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.870323 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Sep 12 22:04:40.870409 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Sep 12 22:04:40.870520 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.870611 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Sep 12 
22:04:40.870696 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Sep 12 22:04:40.870803 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.870820 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Sep 12 22:04:40.870913 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Sep 12 22:04:40.871002 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Sep 12 22:04:40.871088 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 12 22:04:40.871103 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 12 22:04:40.871114 kernel: ACPI: button: Power Button [PWRB] Sep 12 22:04:40.871126 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 12 22:04:40.871217 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Sep 12 22:04:40.871315 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Sep 12 22:04:40.871330 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 12 22:04:40.871344 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Sep 12 22:04:40.871432 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Sep 12 22:04:40.871446 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Sep 12 22:04:40.871481 kernel: thunder_xcv, ver 1.0 Sep 12 22:04:40.871492 kernel: thunder_bgx, ver 1.0 Sep 12 22:04:40.871503 kernel: nicpf, ver 1.0 Sep 12 22:04:40.871513 kernel: nicvf, ver 1.0 Sep 12 22:04:40.871622 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 12 22:04:40.871713 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-12T22:04:40 UTC (1757714680) Sep 12 22:04:40.871778 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 12 22:04:40.871791 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Sep 12 
22:04:40.871802 kernel: watchdog: NMI not fully supported Sep 12 22:04:40.871813 kernel: watchdog: Hard watchdog permanently disabled Sep 12 22:04:40.871823 kernel: NET: Registered PF_INET6 protocol family Sep 12 22:04:40.871834 kernel: Segment Routing with IPv6 Sep 12 22:04:40.871845 kernel: In-situ OAM (IOAM) with IPv6 Sep 12 22:04:40.871906 kernel: NET: Registered PF_PACKET protocol family Sep 12 22:04:40.871923 kernel: Key type dns_resolver registered Sep 12 22:04:40.871936 kernel: registered taskstats version 1 Sep 12 22:04:40.871999 kernel: Loading compiled-in X.509 certificates Sep 12 22:04:40.872012 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.47-flatcar: 2d7730e6d35b3fbd1c590cd72a2500b2380c020e' Sep 12 22:04:40.872023 kernel: Demotion targets for Node 0: null Sep 12 22:04:40.872035 kernel: Key type .fscrypt registered Sep 12 22:04:40.872046 kernel: Key type fscrypt-provisioning registered Sep 12 22:04:40.872057 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 12 22:04:40.872072 kernel: ima: Allocated hash algorithm: sha1 Sep 12 22:04:40.872083 kernel: ima: No architecture policies found Sep 12 22:04:40.872095 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 12 22:04:40.872105 kernel: clk: Disabling unused clocks Sep 12 22:04:40.872116 kernel: PM: genpd: Disabling unused power domains Sep 12 22:04:40.872129 kernel: Warning: unable to open an initial console. Sep 12 22:04:40.872140 kernel: Freeing unused kernel memory: 38976K Sep 12 22:04:40.872152 kernel: Run /init as init process Sep 12 22:04:40.872162 kernel: with arguments: Sep 12 22:04:40.872174 kernel: /init Sep 12 22:04:40.872187 kernel: with environment: Sep 12 22:04:40.872198 kernel: HOME=/ Sep 12 22:04:40.872208 kernel: TERM=linux Sep 12 22:04:40.872219 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 12 22:04:40.872232 systemd[1]: Successfully made /usr/ read-only. 
Sep 12 22:04:40.872249 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 22:04:40.872261 systemd[1]: Detected virtualization kvm.
Sep 12 22:04:40.872283 systemd[1]: Detected architecture arm64.
Sep 12 22:04:40.872298 systemd[1]: Running in initrd.
Sep 12 22:04:40.872313 systemd[1]: No hostname configured, using default hostname.
Sep 12 22:04:40.872325 systemd[1]: Hostname set to .
Sep 12 22:04:40.872336 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 22:04:40.872349 systemd[1]: Queued start job for default target initrd.target.
Sep 12 22:04:40.872361 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 22:04:40.872372 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 22:04:40.872387 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 12 22:04:40.872399 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 22:04:40.872410 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 12 22:04:40.872423 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 12 22:04:40.872436 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 12 22:04:40.872470 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 12 22:04:40.872484 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 22:04:40.872498 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 22:04:40.872509 systemd[1]: Reached target paths.target - Path Units.
Sep 12 22:04:40.872520 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 22:04:40.872532 systemd[1]: Reached target swap.target - Swaps.
Sep 12 22:04:40.872543 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 22:04:40.872554 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 22:04:40.872565 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 22:04:40.872577 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 12 22:04:40.872592 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 12 22:04:40.872604 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 22:04:40.872615 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 22:04:40.872626 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 22:04:40.872637 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 22:04:40.872648 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 12 22:04:40.872660 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 22:04:40.872672 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 12 22:04:40.872683 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 12 22:04:40.872697 systemd[1]: Starting systemd-fsck-usr.service...
Sep 12 22:04:40.872709 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 22:04:40.872720 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 22:04:40.872747 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:04:40.872759 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 12 22:04:40.872771 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 22:04:40.872785 systemd[1]: Finished systemd-fsck-usr.service.
Sep 12 22:04:40.872797 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 22:04:40.872851 systemd-journald[244]: Collecting audit messages is disabled.
Sep 12 22:04:40.872882 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 12 22:04:40.872894 kernel: Bridge firewalling registered
Sep 12 22:04:40.872906 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 22:04:40.872917 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:04:40.872929 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 22:04:40.872941 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 12 22:04:40.872953 systemd-journald[244]: Journal started
Sep 12 22:04:40.872983 systemd-journald[244]: Runtime Journal (/run/log/journal/08d9892b751047cbbd8373de6ee40d41) is 8M, max 76.5M, 68.5M free.
Sep 12 22:04:40.832232 systemd-modules-load[246]: Inserted module 'overlay'
Sep 12 22:04:40.858600 systemd-modules-load[246]: Inserted module 'br_netfilter'
Sep 12 22:04:40.877476 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 22:04:40.880494 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 22:04:40.883473 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 22:04:40.890670 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 22:04:40.900501 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 22:04:40.907673 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 22:04:40.911017 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 22:04:40.913671 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 12 22:04:40.918891 systemd-tmpfiles[271]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 12 22:04:40.923650 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 22:04:40.930959 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 22:04:40.943710 dracut-cmdline[284]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=319fa5fb212e5dd8bf766d2f9f0bbb61d6aa6c81f2813f4b5b49defba0af2b2f
Sep 12 22:04:40.983214 systemd-resolved[288]: Positive Trust Anchors:
Sep 12 22:04:40.984082 systemd-resolved[288]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 22:04:40.984118 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 22:04:40.995369 systemd-resolved[288]: Defaulting to hostname 'linux'.
Sep 12 22:04:40.997345 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 22:04:40.998679 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 22:04:41.041530 kernel: SCSI subsystem initialized
Sep 12 22:04:41.046490 kernel: Loading iSCSI transport class v2.0-870.
Sep 12 22:04:41.054538 kernel: iscsi: registered transport (tcp)
Sep 12 22:04:41.068493 kernel: iscsi: registered transport (qla4xxx)
Sep 12 22:04:41.068551 kernel: QLogic iSCSI HBA Driver
Sep 12 22:04:41.092546 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 22:04:41.123186 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 22:04:41.128396 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 22:04:41.179524 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 12 22:04:41.182316 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 12 22:04:41.255522 kernel: raid6: neonx8 gen() 15358 MB/s
Sep 12 22:04:41.272520 kernel: raid6: neonx4 gen() 15361 MB/s
Sep 12 22:04:41.289551 kernel: raid6: neonx2 gen() 12795 MB/s
Sep 12 22:04:41.306515 kernel: raid6: neonx1 gen() 9990 MB/s
Sep 12 22:04:41.323527 kernel: raid6: int64x8 gen() 6782 MB/s
Sep 12 22:04:41.340533 kernel: raid6: int64x4 gen() 7170 MB/s
Sep 12 22:04:41.357565 kernel: raid6: int64x2 gen() 5947 MB/s
Sep 12 22:04:41.374519 kernel: raid6: int64x1 gen() 4965 MB/s
Sep 12 22:04:41.374605 kernel: raid6: using algorithm neonx4 gen() 15361 MB/s
Sep 12 22:04:41.391521 kernel: raid6: .... xor() 12093 MB/s, rmw enabled
Sep 12 22:04:41.391596 kernel: raid6: using neon recovery algorithm
Sep 12 22:04:41.396795 kernel: xor: measuring software checksum speed
Sep 12 22:04:41.396863 kernel: 8regs : 21630 MB/sec
Sep 12 22:04:41.397655 kernel: 32regs : 21681 MB/sec
Sep 12 22:04:41.397701 kernel: arm64_neon : 28070 MB/sec
Sep 12 22:04:41.397715 kernel: xor: using function: arm64_neon (28070 MB/sec)
Sep 12 22:04:41.454932 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 12 22:04:41.463118 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 22:04:41.467278 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 22:04:41.500658 systemd-udevd[494]: Using default interface naming scheme 'v255'.
Sep 12 22:04:41.505578 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 22:04:41.512620 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 12 22:04:41.541324 dracut-pre-trigger[506]: rd.md=0: removing MD RAID activation
Sep 12 22:04:41.572819 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 22:04:41.575600 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 22:04:41.640378 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 22:04:41.643568 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 12 22:04:41.744553 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Sep 12 22:04:41.768500 kernel: scsi host0: Virtio SCSI HBA
Sep 12 22:04:41.781758 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 12 22:04:41.781846 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 12 22:04:41.798479 kernel: ACPI: bus type USB registered
Sep 12 22:04:41.799467 kernel: usbcore: registered new interface driver usbfs
Sep 12 22:04:41.801111 kernel: usbcore: registered new interface driver hub
Sep 12 22:04:41.801153 kernel: usbcore: registered new device driver usb
Sep 12 22:04:41.801711 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 22:04:41.801896 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:04:41.805757 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:04:41.807292 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:04:41.818070 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 22:04:41.822591 kernel: sd 0:0:0:1: Power-on or device reset occurred
Sep 12 22:04:41.822810 kernel: sr 0:0:0:0: Power-on or device reset occurred
Sep 12 22:04:41.822913 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 12 22:04:41.824494 kernel: sd 0:0:0:1: [sda] Write Protect is off
Sep 12 22:04:41.824674 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Sep 12 22:04:41.824820 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Sep 12 22:04:41.825470 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 12 22:04:41.827388 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 12 22:04:41.830503 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 12 22:04:41.838018 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 12 22:04:41.838097 kernel: GPT:17805311 != 80003071
Sep 12 22:04:41.838121 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 12 22:04:41.838818 kernel: GPT:17805311 != 80003071
Sep 12 22:04:41.839681 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 12 22:04:41.840641 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 22:04:41.841892 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Sep 12 22:04:41.848282 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:04:41.851432 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 22:04:41.851599 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 12 22:04:41.851677 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 12 22:04:41.852804 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 12 22:04:41.853508 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 12 22:04:41.854707 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 12 22:04:41.855107 kernel: hub 1-0:1.0: USB hub found
Sep 12 22:04:41.855645 kernel: hub 1-0:1.0: 4 ports detected
Sep 12 22:04:41.856933 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 12 22:04:41.858510 kernel: hub 2-0:1.0: USB hub found
Sep 12 22:04:41.859481 kernel: hub 2-0:1.0: 4 ports detected
Sep 12 22:04:41.915414 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Sep 12 22:04:41.916219 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Sep 12 22:04:41.929232 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Sep 12 22:04:41.942177 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Sep 12 22:04:41.959776 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 12 22:04:41.966615 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 12 22:04:41.971258 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 12 22:04:41.972298 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 22:04:41.973121 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 22:04:41.976937 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 22:04:41.983612 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 12 22:04:41.991404 disk-uuid[604]: Primary Header is updated.
Sep 12 22:04:41.991404 disk-uuid[604]: Secondary Entries is updated.
Sep 12 22:04:41.991404 disk-uuid[604]: Secondary Header is updated.
Sep 12 22:04:42.003484 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 22:04:42.014336 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 22:04:42.099482 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Sep 12 22:04:42.232837 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Sep 12 22:04:42.232894 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Sep 12 22:04:42.234485 kernel: usbcore: registered new interface driver usbhid
Sep 12 22:04:42.234530 kernel: usbhid: USB HID core driver
Sep 12 22:04:42.337794 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Sep 12 22:04:42.475818 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Sep 12 22:04:42.529706 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Sep 12 22:04:43.018967 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 12 22:04:43.020804 disk-uuid[605]: The operation has completed successfully.
Sep 12 22:04:43.084446 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 12 22:04:43.084594 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 12 22:04:43.116804 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 12 22:04:43.144834 sh[629]: Success
Sep 12 22:04:43.162820 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 12 22:04:43.162904 kernel: device-mapper: uevent: version 1.0.3
Sep 12 22:04:43.163600 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 12 22:04:43.173486 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 12 22:04:43.225602 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 12 22:04:43.230823 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 12 22:04:43.239015 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 12 22:04:43.258479 kernel: BTRFS: device fsid 254e43f1-b609-42b8-bcc5-437252095415 devid 1 transid 38 /dev/mapper/usr (254:0) scanned by mount (641)
Sep 12 22:04:43.260746 kernel: BTRFS info (device dm-0): first mount of filesystem 254e43f1-b609-42b8-bcc5-437252095415
Sep 12 22:04:43.260806 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:04:43.268672 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 12 22:04:43.268747 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 12 22:04:43.268759 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 12 22:04:43.270781 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 12 22:04:43.272554 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 22:04:43.274394 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 12 22:04:43.275312 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 12 22:04:43.278969 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 12 22:04:43.312495 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (678)
Sep 12 22:04:43.314754 kernel: BTRFS info (device sda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:04:43.314810 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:04:43.321626 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 22:04:43.321705 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 22:04:43.321730 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 22:04:43.327487 kernel: BTRFS info (device sda6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:04:43.329605 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 12 22:04:43.331918 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 12 22:04:43.438771 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 22:04:43.443001 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 22:04:43.498011 systemd-networkd[815]: lo: Link UP
Sep 12 22:04:43.498021 systemd-networkd[815]: lo: Gained carrier
Sep 12 22:04:43.500102 systemd-networkd[815]: Enumeration completed
Sep 12 22:04:43.500614 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 22:04:43.500681 systemd-networkd[815]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:04:43.500685 systemd-networkd[815]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 22:04:43.502100 systemd-networkd[815]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:04:43.502104 systemd-networkd[815]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 22:04:43.502499 systemd-networkd[815]: eth0: Link UP
Sep 12 22:04:43.502629 systemd[1]: Reached target network.target - Network.
Sep 12 22:04:43.502637 systemd-networkd[815]: eth1: Link UP
Sep 12 22:04:43.502789 systemd-networkd[815]: eth0: Gained carrier
Sep 12 22:04:43.502799 systemd-networkd[815]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:04:43.508347 systemd-networkd[815]: eth1: Gained carrier
Sep 12 22:04:43.508368 systemd-networkd[815]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:04:43.523186 ignition[729]: Ignition 2.22.0
Sep 12 22:04:43.523205 ignition[729]: Stage: fetch-offline
Sep 12 22:04:43.523238 ignition[729]: no configs at "/usr/lib/ignition/base.d"
Sep 12 22:04:43.523245 ignition[729]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 22:04:43.526686 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 22:04:43.523369 ignition[729]: parsed url from cmdline: ""
Sep 12 22:04:43.530004 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 12 22:04:43.523372 ignition[729]: no config URL provided
Sep 12 22:04:43.523377 ignition[729]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 22:04:43.523384 ignition[729]: no config at "/usr/lib/ignition/user.ign"
Sep 12 22:04:43.523390 ignition[729]: failed to fetch config: resource requires networking
Sep 12 22:04:43.523800 ignition[729]: Ignition finished successfully
Sep 12 22:04:43.535533 systemd-networkd[815]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 12 22:04:43.546578 systemd-networkd[815]: eth0: DHCPv4 address 168.119.157.2/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 12 22:04:43.560488 ignition[819]: Ignition 2.22.0
Sep 12 22:04:43.560498 ignition[819]: Stage: fetch
Sep 12 22:04:43.560704 ignition[819]: no configs at "/usr/lib/ignition/base.d"
Sep 12 22:04:43.560762 ignition[819]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 22:04:43.560850 ignition[819]: parsed url from cmdline: ""
Sep 12 22:04:43.560853 ignition[819]: no config URL provided
Sep 12 22:04:43.560858 ignition[819]: reading system config file "/usr/lib/ignition/user.ign"
Sep 12 22:04:43.560865 ignition[819]: no config at "/usr/lib/ignition/user.ign"
Sep 12 22:04:43.560896 ignition[819]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Sep 12 22:04:43.566566 ignition[819]: GET result: OK
Sep 12 22:04:43.567179 ignition[819]: parsing config with SHA512: 0f47ac4765dfcf9a4d0cad50c91cb3ddac744365c557cf04185f4617c180bac242690124d68c8bbde21a404f2f3e751ac9108967467b1bbcdbc72903bb6bba1d
Sep 12 22:04:43.576078 unknown[819]: fetched base config from "system"
Sep 12 22:04:43.576776 unknown[819]: fetched base config from "system"
Sep 12 22:04:43.577223 ignition[819]: fetch: fetch complete
Sep 12 22:04:43.576787 unknown[819]: fetched user config from "hetzner"
Sep 12 22:04:43.577229 ignition[819]: fetch: fetch passed
Sep 12 22:04:43.580852 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 12 22:04:43.577292 ignition[819]: Ignition finished successfully
Sep 12 22:04:43.582569 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 12 22:04:43.617292 ignition[826]: Ignition 2.22.0
Sep 12 22:04:43.617309 ignition[826]: Stage: kargs
Sep 12 22:04:43.619232 ignition[826]: no configs at "/usr/lib/ignition/base.d"
Sep 12 22:04:43.619244 ignition[826]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 22:04:43.620190 ignition[826]: kargs: kargs passed
Sep 12 22:04:43.620243 ignition[826]: Ignition finished successfully
Sep 12 22:04:43.624575 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 12 22:04:43.627680 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 12 22:04:43.661897 ignition[833]: Ignition 2.22.0
Sep 12 22:04:43.661914 ignition[833]: Stage: disks
Sep 12 22:04:43.662080 ignition[833]: no configs at "/usr/lib/ignition/base.d"
Sep 12 22:04:43.662089 ignition[833]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 22:04:43.663185 ignition[833]: disks: disks passed
Sep 12 22:04:43.663241 ignition[833]: Ignition finished successfully
Sep 12 22:04:43.666037 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 12 22:04:43.667404 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 12 22:04:43.668363 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 12 22:04:43.669915 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 22:04:43.671008 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 22:04:43.672085 systemd[1]: Reached target basic.target - Basic System.
Sep 12 22:04:43.673959 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 12 22:04:43.723096 systemd-fsck[842]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Sep 12 22:04:43.728543 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 12 22:04:43.730905 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 12 22:04:43.814477 kernel: EXT4-fs (sda9): mounted filesystem a7b592ec-3c41-4dc2-88a7-056c1f18b418 r/w with ordered data mode. Quota mode: none.
Sep 12 22:04:43.816085 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 12 22:04:43.818640 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 12 22:04:43.821312 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 22:04:43.824179 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 12 22:04:43.838677 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 12 22:04:43.840229 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 12 22:04:43.840272 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 22:04:43.850512 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (850)
Sep 12 22:04:43.844853 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 12 22:04:43.852360 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 12 22:04:43.855610 kernel: BTRFS info (device sda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:04:43.855638 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:04:43.866718 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 22:04:43.868296 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 22:04:43.869875 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 22:04:43.873758 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 22:04:43.919476 initrd-setup-root[877]: cut: /sysroot/etc/passwd: No such file or directory
Sep 12 22:04:43.922558 coreos-metadata[852]: Sep 12 22:04:43.922 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Sep 12 22:04:43.927167 coreos-metadata[852]: Sep 12 22:04:43.925 INFO Fetch successful
Sep 12 22:04:43.927167 coreos-metadata[852]: Sep 12 22:04:43.925 INFO wrote hostname ci-4459-0-0-7-af931fdd93 to /sysroot/etc/hostname
Sep 12 22:04:43.929569 initrd-setup-root[884]: cut: /sysroot/etc/group: No such file or directory
Sep 12 22:04:43.930304 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 22:04:43.936080 initrd-setup-root[892]: cut: /sysroot/etc/shadow: No such file or directory
Sep 12 22:04:43.940910 initrd-setup-root[899]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 12 22:04:44.059915 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 12 22:04:44.061804 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 12 22:04:44.063332 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 12 22:04:44.090484 kernel: BTRFS info (device sda6): last unmount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:04:44.109867 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 12 22:04:44.134482 ignition[968]: INFO : Ignition 2.22.0
Sep 12 22:04:44.137769 ignition[968]: INFO : Stage: mount
Sep 12 22:04:44.137769 ignition[968]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 22:04:44.137769 ignition[968]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 22:04:44.137769 ignition[968]: INFO : mount: mount passed
Sep 12 22:04:44.137769 ignition[968]: INFO : Ignition finished successfully
Sep 12 22:04:44.140218 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 12 22:04:44.145096 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 12 22:04:44.260815 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 12 22:04:44.265686 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 12 22:04:44.300155 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (978)
Sep 12 22:04:44.300223 kernel: BTRFS info (device sda6): first mount of filesystem 5dadbedd-e975-4944-978a-462cb6ec6aa0
Sep 12 22:04:44.300238 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 12 22:04:44.305842 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 12 22:04:44.305911 kernel: BTRFS info (device sda6): turning on async discard
Sep 12 22:04:44.305932 kernel: BTRFS info (device sda6): enabling free space tree
Sep 12 22:04:44.310137 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 12 22:04:44.348406 ignition[995]: INFO : Ignition 2.22.0
Sep 12 22:04:44.350357 ignition[995]: INFO : Stage: files
Sep 12 22:04:44.350357 ignition[995]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 22:04:44.350357 ignition[995]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 22:04:44.350357 ignition[995]: DEBUG : files: compiled without relabeling support, skipping
Sep 12 22:04:44.353250 ignition[995]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 12 22:04:44.354221 ignition[995]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 12 22:04:44.359497 ignition[995]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 12 22:04:44.361496 ignition[995]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 12 22:04:44.363057 unknown[995]: wrote ssh authorized keys file for user: core
Sep 12 22:04:44.364728 ignition[995]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 12 22:04:44.366523 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 12 22:04:44.367919 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Sep 12 22:04:44.425803 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 12 22:04:44.706326 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 12 22:04:44.706326 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 12 22:04:44.711750 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 12 22:04:44.723667 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 12 22:04:44.723667 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 12 22:04:44.723667 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
Sep 12 22:04:44.986038 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 12 22:04:45.110544 systemd-networkd[815]: eth1: Gained IPv6LL
Sep 12 22:04:45.238550 systemd-networkd[815]: eth0: Gained IPv6LL
Sep 12 22:04:45.254548 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 12 22:04:45.254548 ignition[995]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 12 22:04:45.258356 ignition[995]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 22:04:45.258356 ignition[995]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 12 22:04:45.262435 ignition[995]: INFO : files: files passed
Sep 12 22:04:45.262435 ignition[995]: INFO : Ignition finished successfully
Sep 12 22:04:45.261811 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 12 22:04:45.265267 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 12 22:04:45.272269 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 12 22:04:45.283083 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 12 22:04:45.284067 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 12 22:04:45.294490 initrd-setup-root-after-ignition[1029]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:04:45.296049 initrd-setup-root-after-ignition[1025]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:04:45.297794 initrd-setup-root-after-ignition[1025]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 12 22:04:45.299797 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 22:04:45.300727 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 12 22:04:45.303612 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 12 22:04:45.377604 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 12 22:04:45.378901 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 12 22:04:45.381412 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 12 22:04:45.382725 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 12 22:04:45.384185 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 12 22:04:45.385916 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 12 22:04:45.432377 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 22:04:45.435588 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 12 22:04:45.457641 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 12 22:04:45.459305 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 22:04:45.461134 systemd[1]: Stopped target timers.target - Timer Units.
Sep 12 22:04:45.462268 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 12 22:04:45.462568 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 12 22:04:45.466119 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 12 22:04:45.466925 systemd[1]: Stopped target basic.target - Basic System.
Sep 12 22:04:45.467886 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 12 22:04:45.469030 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 12 22:04:45.470052 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 12 22:04:45.471021 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 12 22:04:45.471848 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 12 22:04:45.473032 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 12 22:04:45.474354 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 12 22:04:45.475349 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 12 22:04:45.476492 systemd[1]: Stopped target swap.target - Swaps.
Sep 12 22:04:45.477570 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 12 22:04:45.477765 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 12 22:04:45.479384 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 12 22:04:45.480768 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 22:04:45.481769 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 12 22:04:45.483614 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 22:04:45.485555 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 12 22:04:45.485805 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 12 22:04:45.488055 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 12 22:04:45.488496 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 12 22:04:45.489575 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 12 22:04:45.489796 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 12 22:04:45.490576 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 12 22:04:45.490770 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 12 22:04:45.494766 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 12 22:04:45.496131 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 12 22:04:45.498549 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 12 22:04:45.498793 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 22:04:45.501711 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 12 22:04:45.501894 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 12 22:04:45.510131 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 12 22:04:45.513081 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 12 22:04:45.525297 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 12 22:04:45.535279 ignition[1049]: INFO : Ignition 2.22.0
Sep 12 22:04:45.538604 ignition[1049]: INFO : Stage: umount
Sep 12 22:04:45.538604 ignition[1049]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 12 22:04:45.538604 ignition[1049]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 12 22:04:45.538604 ignition[1049]: INFO : umount: umount passed
Sep 12 22:04:45.542828 ignition[1049]: INFO : Ignition finished successfully
Sep 12 22:04:45.544236 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 12 22:04:45.544442 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 12 22:04:45.547348 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 12 22:04:45.547487 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 12 22:04:45.552229 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 12 22:04:45.552303 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 12 22:04:45.554669 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 12 22:04:45.554878 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 12 22:04:45.556574 systemd[1]: Stopped target network.target - Network.
Sep 12 22:04:45.557343 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 12 22:04:45.558214 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 12 22:04:45.560466 systemd[1]: Stopped target paths.target - Path Units.
Sep 12 22:04:45.564071 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 12 22:04:45.567552 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 22:04:45.572803 systemd[1]: Stopped target slices.target - Slice Units.
Sep 12 22:04:45.573410 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 12 22:04:45.574140 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 12 22:04:45.574184 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 12 22:04:45.575072 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 12 22:04:45.575114 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 12 22:04:45.576118 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 12 22:04:45.576177 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 12 22:04:45.577318 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 12 22:04:45.577362 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 12 22:04:45.578523 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 12 22:04:45.580217 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 12 22:04:45.583310 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 12 22:04:45.583402 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 12 22:04:45.584731 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 12 22:04:45.584828 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 12 22:04:45.590583 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 12 22:04:45.590754 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 12 22:04:45.597085 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 12 22:04:45.597518 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 12 22:04:45.599520 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 12 22:04:45.603963 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 12 22:04:45.604910 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 12 22:04:45.606212 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 12 22:04:45.606269 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 22:04:45.608578 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 12 22:04:45.609074 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 12 22:04:45.609134 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 12 22:04:45.611240 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 12 22:04:45.611297 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 12 22:04:45.615215 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 12 22:04:45.615274 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 12 22:04:45.618173 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 12 22:04:45.618859 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 22:04:45.620522 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 22:04:45.624122 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 12 22:04:45.624203 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 12 22:04:45.639749 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 12 22:04:45.641050 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 12 22:04:45.644098 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 12 22:04:45.644317 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 22:04:45.646605 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 12 22:04:45.646684 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 12 22:04:45.648186 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 12 22:04:45.648223 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 22:04:45.649243 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 12 22:04:45.649291 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 12 22:04:45.652048 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 12 22:04:45.652108 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 12 22:04:45.654742 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 12 22:04:45.654803 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 12 22:04:45.658642 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 12 22:04:45.661605 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 12 22:04:45.661689 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 22:04:45.664624 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 12 22:04:45.664684 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 22:04:45.668627 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 12 22:04:45.668690 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 22:04:45.674543 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 12 22:04:45.674610 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 22:04:45.675629 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 22:04:45.675824 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:04:45.679080 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 12 22:04:45.679160 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 12 22:04:45.679191 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 12 22:04:45.679289 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 12 22:04:45.679857 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 12 22:04:45.679951 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 12 22:04:45.684189 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 12 22:04:45.689329 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 12 22:04:45.729361 systemd[1]: Switching root.
Sep 12 22:04:45.761529 systemd-journald[244]: Journal stopped
Sep 12 22:04:46.766239 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Sep 12 22:04:46.766319 kernel: SELinux: policy capability network_peer_controls=1
Sep 12 22:04:46.766331 kernel: SELinux: policy capability open_perms=1
Sep 12 22:04:46.766344 kernel: SELinux: policy capability extended_socket_class=1
Sep 12 22:04:46.766353 kernel: SELinux: policy capability always_check_network=0
Sep 12 22:04:46.766387 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 12 22:04:46.766398 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 12 22:04:46.766407 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 12 22:04:46.766416 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 12 22:04:46.766425 kernel: SELinux: policy capability userspace_initial_context=0
Sep 12 22:04:46.766434 kernel: audit: type=1403 audit(1757714685.910:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 12 22:04:46.767562 systemd[1]: Successfully loaded SELinux policy in 66.054ms.
Sep 12 22:04:46.767635 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 7.223ms.
Sep 12 22:04:46.767648 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 12 22:04:46.767659 systemd[1]: Detected virtualization kvm.
Sep 12 22:04:46.767668 systemd[1]: Detected architecture arm64.
Sep 12 22:04:46.767678 systemd[1]: Detected first boot.
Sep 12 22:04:46.767687 systemd[1]: Hostname set to .
Sep 12 22:04:46.767709 systemd[1]: Initializing machine ID from VM UUID.
Sep 12 22:04:46.767721 zram_generator::config[1093]: No configuration found.
Sep 12 22:04:46.767735 kernel: NET: Registered PF_VSOCK protocol family
Sep 12 22:04:46.767749 systemd[1]: Populated /etc with preset unit settings.
Sep 12 22:04:46.767761 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 12 22:04:46.767772 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 12 22:04:46.767781 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 12 22:04:46.767793 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 12 22:04:46.767803 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 12 22:04:46.767813 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 12 22:04:46.767822 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 12 22:04:46.767833 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 12 22:04:46.767843 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 12 22:04:46.767854 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 12 22:04:46.767864 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 12 22:04:46.767873 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 12 22:04:46.767884 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 12 22:04:46.767895 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 12 22:04:46.767905 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 12 22:04:46.767915 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 12 22:04:46.767925 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 12 22:04:46.767934 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 12 22:04:46.767944 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 12 22:04:46.767956 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 12 22:04:46.767969 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 12 22:04:46.767979 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 12 22:04:46.767992 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 12 22:04:46.768018 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 12 22:04:46.768031 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 12 22:04:46.768041 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 12 22:04:46.768051 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 12 22:04:46.768063 systemd[1]: Reached target slices.target - Slice Units.
Sep 12 22:04:46.768073 systemd[1]: Reached target swap.target - Swaps.
Sep 12 22:04:46.770170 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 12 22:04:46.770197 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 12 22:04:46.770208 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 12 22:04:46.770219 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 12 22:04:46.770282 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 12 22:04:46.770328 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 12 22:04:46.770342 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 12 22:04:46.770357 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 12 22:04:46.770366 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 12 22:04:46.770376 systemd[1]: Mounting media.mount - External Media Directory...
Sep 12 22:04:46.770386 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 12 22:04:46.770396 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 12 22:04:46.770405 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 12 22:04:46.770416 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 12 22:04:46.770426 systemd[1]: Reached target machines.target - Containers.
Sep 12 22:04:46.770436 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 12 22:04:46.770463 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:04:46.770476 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 12 22:04:46.770487 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 12 22:04:46.770496 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:04:46.770508 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 22:04:46.770518 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:04:46.770528 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 12 22:04:46.770538 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 22:04:46.770551 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 12 22:04:46.770575 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 12 22:04:46.770587 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 12 22:04:46.770598 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 12 22:04:46.770607 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 12 22:04:46.770618 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:04:46.770628 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 12 22:04:46.770641 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 12 22:04:46.770653 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 12 22:04:46.770665 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 12 22:04:46.770676 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 12 22:04:46.770686 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 12 22:04:46.770707 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 12 22:04:46.770719 systemd[1]: Stopped verity-setup.service.
Sep 12 22:04:46.770730 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 12 22:04:46.770740 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 12 22:04:46.770750 systemd[1]: Mounted media.mount - External Media Directory.
Sep 12 22:04:46.770760 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 12 22:04:46.770772 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 12 22:04:46.770783 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 12 22:04:46.770803 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 12 22:04:46.770816 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 12 22:04:46.770826 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 12 22:04:46.770837 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:04:46.770878 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:04:46.770894 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:04:46.770905 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:04:46.770918 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 12 22:04:46.770928 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 12 22:04:46.770938 kernel: loop: module loaded
Sep 12 22:04:46.770954 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 12 22:04:46.770964 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 12 22:04:46.770974 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 12 22:04:46.770984 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 12 22:04:46.770995 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 12 22:04:46.771004 kernel: fuse: init (API version 7.41)
Sep 12 22:04:46.771015 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 12 22:04:46.771025 kernel: ACPI: bus type drm_connector registered
Sep 12 22:04:46.771074 systemd-journald[1161]: Collecting audit messages is disabled.
Sep 12 22:04:46.771097 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 12 22:04:46.771108 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:04:46.771119 systemd-journald[1161]: Journal started
Sep 12 22:04:46.771141 systemd-journald[1161]: Runtime Journal (/run/log/journal/08d9892b751047cbbd8373de6ee40d41) is 8M, max 76.5M, 68.5M free.
Sep 12 22:04:46.442011 systemd[1]: Queued start job for default target multi-user.target.
Sep 12 22:04:46.465525 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 12 22:04:46.466383 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 12 22:04:46.778536 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 12 22:04:46.780488 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 22:04:46.786549 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 12 22:04:46.791519 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 12 22:04:46.794483 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 12 22:04:46.802478 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 12 22:04:46.804524 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 12 22:04:46.827423 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 12 22:04:46.828934 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 22:04:46.832570 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 22:04:46.833990 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 12 22:04:46.834162 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 12 22:04:46.835927 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 22:04:46.836413 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 22:04:46.839755 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 12 22:04:46.840776 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 12 22:04:46.858075 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 12 22:04:46.872235 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 12 22:04:46.876962 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 12 22:04:46.886204 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 12 22:04:46.895475 kernel: loop0: detected capacity change from 0 to 203944
Sep 12 22:04:46.894065 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 12 22:04:46.895291 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 22:04:46.899040 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 12 22:04:46.935495 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 12 22:04:46.943289 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 12 22:04:46.948222 systemd-journald[1161]: Time spent on flushing to /var/log/journal/08d9892b751047cbbd8373de6ee40d41 is 64.926ms for 1180 entries.
Sep 12 22:04:46.948222 systemd-journald[1161]: System Journal (/var/log/journal/08d9892b751047cbbd8373de6ee40d41) is 8M, max 584.8M, 576.8M free.
Sep 12 22:04:47.034840 systemd-journald[1161]: Received client request to flush runtime journal.
Sep 12 22:04:47.036245 kernel: loop1: detected capacity change from 0 to 8
Sep 12 22:04:47.036284 kernel: loop2: detected capacity change from 0 to 100632
Sep 12 22:04:46.959668 systemd-tmpfiles[1193]: ACLs are not supported, ignoring.
Sep 12 22:04:46.959680 systemd-tmpfiles[1193]: ACLs are not supported, ignoring.
Sep 12 22:04:46.977746 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 12 22:04:46.983176 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 12 22:04:46.991413 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 12 22:04:46.997964 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 12 22:04:47.040723 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 12 22:04:47.053704 kernel: loop3: detected capacity change from 0 to 119368
Sep 12 22:04:47.091244 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 12 22:04:47.098549 kernel: loop4: detected capacity change from 0 to 203944
Sep 12 22:04:47.099893 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 12 22:04:47.123063 kernel: loop5: detected capacity change from 0 to 8
Sep 12 22:04:47.129588 kernel: loop6: detected capacity change from 0 to 100632
Sep 12 22:04:47.140751 systemd-tmpfiles[1236]: ACLs are not supported, ignoring.
Sep 12 22:04:47.140771 systemd-tmpfiles[1236]: ACLs are not supported, ignoring.
Sep 12 22:04:47.144376 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 12 22:04:47.162511 kernel: loop7: detected capacity change from 0 to 119368
Sep 12 22:04:47.185821 (sd-merge)[1235]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 12 22:04:47.188241 (sd-merge)[1235]: Merged extensions into '/usr'.
Sep 12 22:04:47.195107 systemd[1]: Reload requested from client PID 1192 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 12 22:04:47.195265 systemd[1]: Reloading...
Sep 12 22:04:47.336484 zram_generator::config[1262]: No configuration found.
Sep 12 22:04:47.365131 ldconfig[1185]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 12 22:04:47.522685 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 12 22:04:47.523177 systemd[1]: Reloading finished in 326 ms.
Sep 12 22:04:47.551341 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 12 22:04:47.552648 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 12 22:04:47.553678 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 12 22:04:47.563805 systemd[1]: Starting ensure-sysext.service...
Sep 12 22:04:47.568678 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 12 22:04:47.571188 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 12 22:04:47.585591 systemd[1]: Reload requested from client PID 1303 ('systemctl') (unit ensure-sysext.service)...
Sep 12 22:04:47.585618 systemd[1]: Reloading...
Sep 12 22:04:47.613112 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 12 22:04:47.613583 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 12 22:04:47.613956 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 12 22:04:47.614233 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 12 22:04:47.615165 systemd-tmpfiles[1304]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 12 22:04:47.615593 systemd-tmpfiles[1304]: ACLs are not supported, ignoring.
Sep 12 22:04:47.617323 systemd-tmpfiles[1304]: ACLs are not supported, ignoring.
Sep 12 22:04:47.624792 systemd-udevd[1305]: Using default interface naming scheme 'v255'.
Sep 12 22:04:47.626904 systemd-tmpfiles[1304]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 22:04:47.627290 systemd-tmpfiles[1304]: Skipping /boot
Sep 12 22:04:47.647811 systemd-tmpfiles[1304]: Detected autofs mount point /boot during canonicalization of boot.
Sep 12 22:04:47.647825 systemd-tmpfiles[1304]: Skipping /boot
Sep 12 22:04:47.750510 zram_generator::config[1354]: No configuration found.
Sep 12 22:04:47.980878 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 12 22:04:47.981105 systemd[1]: Reloading finished in 395 ms.
Sep 12 22:04:47.982483 kernel: mousedev: PS/2 mouse device common for all mice
Sep 12 22:04:47.998841 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 12 22:04:48.009514 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 12 22:04:48.026010 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Sep 12 22:04:48.034949 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 22:04:48.043639 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 12 22:04:48.044640 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:04:48.047743 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:04:48.052061 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:04:48.060526 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 22:04:48.062831 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:04:48.063581 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:04:48.066273 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 12 22:04:48.071783 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 12 22:04:48.099883 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 12 22:04:48.108750 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 12 22:04:48.134671 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 12 22:04:48.138171 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:04:48.138444 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:04:48.140153 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:04:48.140756 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:04:48.143177 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 22:04:48.143661 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 22:04:48.190081 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:04:48.192932 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 12 22:04:48.198878 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 12 22:04:48.203182 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 12 22:04:48.205672 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:04:48.205889 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:04:48.208921 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 12 22:04:48.211931 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 12 22:04:48.223116 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 12 22:04:48.237226 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 12 22:04:48.247889 augenrules[1462]: No rules
Sep 12 22:04:48.253728 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 12 22:04:48.255677 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 12 22:04:48.255739 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 12 22:04:48.261676 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 12 22:04:48.262387 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 12 22:04:48.265545 systemd[1]: Finished ensure-sysext.service.
Sep 12 22:04:48.266919 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 22:04:48.268534 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 22:04:48.270178 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 12 22:04:48.271780 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 12 22:04:48.273932 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 12 22:04:48.276853 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Sep 12 22:04:48.276957 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 12 22:04:48.276973 kernel: [drm] features: -context_init
Sep 12 22:04:48.275537 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 12 22:04:48.277418 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 12 22:04:48.279271 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 12 22:04:48.287297 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 12 22:04:48.287517 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 12 22:04:48.294107 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 12 22:04:48.297580 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 12 22:04:48.312013 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 12 22:04:48.312837 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 12 22:04:48.317219 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 12 22:04:48.324600 kernel: [drm] number of scanouts: 1
Sep 12 22:04:48.324726 kernel: [drm] number of cap sets: 0
Sep 12 22:04:48.349508 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Sep 12 22:04:48.364119 kernel: Console: switching to colour frame buffer device 160x50
Sep 12 22:04:48.376679 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:04:48.386921 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 12 22:04:48.405195 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 12 22:04:48.409475 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 12 22:04:48.422182 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 12 22:04:48.424103 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:04:48.432536 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 12 22:04:48.452378 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 12 22:04:48.570863 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 12 22:04:48.592555 systemd-networkd[1421]: lo: Link UP
Sep 12 22:04:48.592565 systemd-networkd[1421]: lo: Gained carrier
Sep 12 22:04:48.596582 systemd-networkd[1421]: Enumeration completed
Sep 12 22:04:48.596744 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 12 22:04:48.597224 systemd-networkd[1421]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:04:48.597228 systemd-networkd[1421]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 22:04:48.597878 systemd-networkd[1421]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:04:48.597890 systemd-networkd[1421]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 12 22:04:48.598318 systemd-networkd[1421]: eth0: Link UP
Sep 12 22:04:48.598475 systemd-networkd[1421]: eth0: Gained carrier
Sep 12 22:04:48.598493 systemd-networkd[1421]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:04:48.604723 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 12 22:04:48.605022 systemd-networkd[1421]: eth1: Link UP
Sep 12 22:04:48.606660 systemd-networkd[1421]: eth1: Gained carrier
Sep 12 22:04:48.606705 systemd-networkd[1421]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 12 22:04:48.611771 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 12 22:04:48.618381 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 12 22:04:48.619321 systemd[1]: Reached target time-set.target - System Time Set.
Sep 12 22:04:48.636555 systemd-networkd[1421]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 12 22:04:48.637176 systemd-timesyncd[1478]: Network configuration changed, trying to establish connection.
Sep 12 22:04:48.641112 systemd-resolved[1422]: Positive Trust Anchors:
Sep 12 22:04:48.641124 systemd-resolved[1422]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 12 22:04:48.641156 systemd-resolved[1422]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 12 22:04:48.642634 systemd-networkd[1421]: eth0: DHCPv4 address 168.119.157.2/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 12 22:04:48.650201 systemd-resolved[1422]: Using system hostname 'ci-4459-0-0-7-af931fdd93'.
Sep 12 22:04:48.651830 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 12 22:04:48.652785 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 12 22:04:48.653621 systemd[1]: Reached target network.target - Network.
Sep 12 22:04:48.654216 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 12 22:04:48.655148 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 12 22:04:48.655903 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 12 22:04:48.656587 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 12 22:04:48.657611 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 12 22:04:48.658320 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 12 22:04:48.659253 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 12 22:04:48.660012 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 12 22:04:48.660052 systemd[1]: Reached target paths.target - Path Units.
Sep 12 22:04:48.660548 systemd[1]: Reached target timers.target - Timer Units.
Sep 12 22:04:48.662551 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 12 22:04:48.664770 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 12 22:04:48.668002 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 12 22:04:48.669225 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 12 22:04:48.670115 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 12 22:04:48.674025 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 12 22:04:48.675139 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 12 22:04:48.676896 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 12 22:04:48.677892 systemd[1]: Reached target sockets.target - Socket Units.
Sep 12 22:04:48.678518 systemd[1]: Reached target basic.target - Basic System.
Sep 12 22:04:48.679135 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 12 22:04:48.679168 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 12 22:04:48.680713 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 12 22:04:48.684645 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 12 22:04:48.687625 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 12 22:04:48.689696 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 12 22:04:48.694202 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 12 22:04:48.700805 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 12 22:04:48.701900 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 12 22:04:48.704185 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 12 22:04:48.711521 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 12 22:04:48.714850 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Sep 12 22:04:48.718800 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 12 22:04:48.722611 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 12 22:04:48.725435 jq[1515]: false
Sep 12 22:04:48.727923 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 12 22:04:48.730552 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 12 22:04:48.731149 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 12 22:04:48.737802 systemd[1]: Starting update-engine.service - Update Engine...
Sep 12 22:04:48.743288 extend-filesystems[1516]: Found /dev/sda6
Sep 12 22:04:48.743663 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 12 22:04:48.751156 extend-filesystems[1516]: Found /dev/sda9
Sep 12 22:04:48.760064 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 12 22:04:48.761993 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 12 22:04:48.762220 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 12 22:04:48.774596 extend-filesystems[1516]: Checking size of /dev/sda9
Sep 12 22:04:48.790594 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 12 22:04:48.790829 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 12 22:04:48.813841 jq[1529]: true
Sep 12 22:04:48.824655 jq[1550]: true
Sep 12 22:04:48.830243 coreos-metadata[1512]: Sep 12 22:04:48.829 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Sep 12 22:04:48.839046 coreos-metadata[1512]: Sep 12 22:04:48.837 INFO Fetch successful
Sep 12 22:04:48.839046 coreos-metadata[1512]: Sep 12 22:04:48.837 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Sep 12 22:04:48.846328 coreos-metadata[1512]: Sep 12 22:04:48.840 INFO Fetch successful
Sep 12 22:04:48.846829 systemd[1]: motdgen.service: Deactivated successfully.
Sep 12 22:04:48.847063 dbus-daemon[1513]: [system] SELinux support is enabled
Sep 12 22:04:48.848232 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 12 22:04:48.851152 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 12 22:04:48.854125 extend-filesystems[1516]: Resized partition /dev/sda9
Sep 12 22:04:48.860615 extend-filesystems[1562]: resize2fs 1.47.3 (8-Jul-2025)
Sep 12 22:04:48.399797 tar[1541]: linux-arm64/helm
Sep 12 22:04:48.481947 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Sep 12 22:04:48.481971 systemd-journald[1161]: Time jumped backwards, rotating.
Sep 12 22:04:48.858484 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 12 22:04:48.482059 update_engine[1528]: I20250912 22:04:48.398533 1528 main.cc:92] Flatcar Update Engine starting
Sep 12 22:04:48.482059 update_engine[1528]: I20250912 22:04:48.415789 1528 update_check_scheduler.cc:74] Next update check in 10m27s
Sep 12 22:04:48.858510 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 12 22:04:48.862358 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 12 22:04:48.862379 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 12 22:04:48.395072 systemd-resolved[1422]: Clock change detected. Flushing caches.
Sep 12 22:04:48.395212 systemd-timesyncd[1478]: Contacted time server 185.232.69.65:123 (1.flatcar.pool.ntp.org).
Sep 12 22:04:48.395271 systemd-timesyncd[1478]: Initial clock synchronization to Fri 2025-09-12 22:04:48.395022 UTC.
Sep 12 22:04:48.403453 (ntainerd)[1553]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 12 22:04:48.411666 systemd[1]: Started update-engine.service - Update Engine.
Sep 12 22:04:48.421977 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 12 22:04:48.460858 systemd-logind[1527]: New seat seat0.
Sep 12 22:04:48.478202 systemd-logind[1527]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 12 22:04:48.478218 systemd-logind[1527]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Sep 12 22:04:48.478473 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 12 22:04:48.551491 bash[1581]: Updated "/home/core/.ssh/authorized_keys"
Sep 12 22:04:48.554635 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 12 22:04:48.559073 systemd[1]: Starting sshkeys.service...
Sep 12 22:04:48.600663 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Sep 12 22:04:48.628797 extend-filesystems[1562]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Sep 12 22:04:48.628797 extend-filesystems[1562]: old_desc_blocks = 1, new_desc_blocks = 5
Sep 12 22:04:48.628797 extend-filesystems[1562]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Sep 12 22:04:48.649603 extend-filesystems[1516]: Resized filesystem in /dev/sda9
Sep 12 22:04:48.630415 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 12 22:04:48.631317 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 12 22:04:48.635592 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Sep 12 22:04:48.641192 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Sep 12 22:04:48.660383 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 12 22:04:48.662277 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 12 22:04:48.676300 locksmithd[1566]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 12 22:04:48.708113 coreos-metadata[1597]: Sep 12 22:04:48.707 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Sep 12 22:04:48.709315 coreos-metadata[1597]: Sep 12 22:04:48.709 INFO Fetch successful
Sep 12 22:04:48.711156 unknown[1597]: wrote ssh authorized keys file for user: core
Sep 12 22:04:48.768091 update-ssh-keys[1607]: Updated "/home/core/.ssh/authorized_keys"
Sep 12 22:04:48.770620 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Sep 12 22:04:48.774534 systemd[1]: Finished sshkeys.service.
Sep 12 22:04:48.813494 containerd[1553]: time="2025-09-12T22:04:48Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 12 22:04:48.817538 containerd[1553]: time="2025-09-12T22:04:48.816025146Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5
Sep 12 22:04:48.833252 containerd[1553]: time="2025-09-12T22:04:48.833195186Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.6µs"
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.835560306Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.835619346Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.835836946Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.835855746Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.835882986Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.835935666Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.835949146Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.836202666Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.836217226Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.836228186Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.836237146Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836539 containerd[1553]: time="2025-09-12T22:04:48.836311266Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836899 containerd[1553]: time="2025-09-12T22:04:48.836503146Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 22:04:48.836988 containerd[1553]: time="2025-09-12T22:04:48.836959306Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 12 22:04:48.838888 containerd[1553]: time="2025-09-12T22:04:48.838534386Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 12 22:04:48.838888 containerd[1553]: time="2025-09-12T22:04:48.838621986Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 12 22:04:48.839502 containerd[1553]: time="2025-09-12T22:04:48.839082666Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 12 22:04:48.839502 containerd[1553]: time="2025-09-12T22:04:48.839181426Z" level=info msg="metadata content store policy set" policy=shared
Sep 12 22:04:48.843459 containerd[1553]: time="2025-09-12T22:04:48.843405746Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 12 22:04:48.843636 containerd[1553]: time="2025-09-12T22:04:48.843621066Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 12 22:04:48.843689 containerd[1553]: time="2025-09-12T22:04:48.843678746Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 12 22:04:48.844151 containerd[1553]: time="2025-09-12T22:04:48.844129546Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 12 22:04:48.844218 containerd[1553]: time="2025-09-12T22:04:48.844206066Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 12 22:04:48.844265 containerd[1553]: time="2025-09-12T22:04:48.844253626Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 12 22:04:48.844312 containerd[1553]: time="2025-09-12T22:04:48.844301666Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 12 22:04:48.844395 containerd[1553]: time="2025-09-12T22:04:48.844378746Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 12 22:04:48.844442 containerd[1553]: time="2025-09-12T22:04:48.844431546Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 12 22:04:48.844536 containerd[1553]: time="2025-09-12T22:04:48.844500306Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 12 22:04:48.844584 containerd[1553]: time="2025-09-12T22:04:48.844573226Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 12 22:04:48.844636 containerd[1553]: time="2025-09-12T22:04:48.844625226Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 12 22:04:48.844905 containerd[1553]: time="2025-09-12T22:04:48.844882466Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 12 22:04:48.845565 containerd[1553]: time="2025-09-12T22:04:48.845545186Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 12 22:04:48.845651 containerd[1553]: time="2025-09-12T22:04:48.845637386Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 12 22:04:48.845698 containerd[1553]: time="2025-09-12T22:04:48.845688026Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 12 22:04:48.845793 containerd[1553]: time="2025-09-12T22:04:48.845733946Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 12 22:04:48.845852 containerd[1553]: time="2025-09-12T22:04:48.845840306Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 12 22:04:48.845911 containerd[1553]: time="2025-09-12T22:04:48.845899146Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 12 22:04:48.845961 containerd[1553]: time="2025-09-12T22:04:48.845949386Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 12 22:04:48.846007 containerd[1553]: time="2025-09-12T22:04:48.845997146Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 12 22:04:48.846056 containerd[1553]: time="2025-09-12T22:04:48.846044586Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 12 22:04:48.847523 containerd[1553]: time="2025-09-12T22:04:48.846544906Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 12 22:04:48.847523 containerd[1553]: time="2025-09-12T22:04:48.846814786Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 12 22:04:48.847523 containerd[1553]: time="2025-09-12T22:04:48.846836546Z" level=info msg="Start snapshots syncer"
Sep 12 22:04:48.847523 containerd[1553]: time="2025-09-12T22:04:48.846868426Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847126546Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847178226Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847250586Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847394586Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847421146Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847431826Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847441466Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847454266Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847468186Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 12 22:04:48.847628 containerd[1553]: time="2025-09-12T22:04:48.847479706Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 12 22:04:48.848320 containerd[1553]: time="2025-09-12T22:04:48.847505786Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 12 22:04:48.848397 containerd[1553]: time="2025-09-12T22:04:48.848383266Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 12 22:04:48.848445 containerd[1553]: time="2025-09-12T22:04:48.848434866Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 12 22:04:48.848544 containerd[1553]: time="2025-09-12T22:04:48.848520386Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 12 22:04:48.848609 containerd[1553]: time="2025-09-12T22:04:48.848586266Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849686066Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849715746Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849726946Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849757346Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849773306Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849861346Z" level=info msg="runtime interface created"
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849866266Z" level=info msg="created NRI interface"
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849874906Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849891746Z" level=info msg="Connect containerd service"
Sep 12 22:04:48.850535 containerd[1553]: time="2025-09-12T22:04:48.849943986Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 12 22:04:48.852155 containerd[1553]: time="2025-09-12T22:04:48.852128946Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 12 22:04:48.966097 containerd[1553]: time="2025-09-12T22:04:48.965266306Z" level=info msg="Start subscribing containerd event"
Sep 12 22:04:48.966097 containerd[1553]: time="2025-09-12T22:04:48.965465306Z" level=info msg="Start recovering state"
Sep 12 22:04:48.966097 containerd[1553]: time="2025-09-12T22:04:48.965710946Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 12 22:04:48.966097 containerd[1553]: time="2025-09-12T22:04:48.965813066Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 12 22:04:48.966097 containerd[1553]: time="2025-09-12T22:04:48.965976786Z" level=info msg="Start event monitor"
Sep 12 22:04:48.966097 containerd[1553]: time="2025-09-12T22:04:48.965998586Z" level=info msg="Start cni network conf syncer for default"
Sep 12 22:04:48.966097 containerd[1553]: time="2025-09-12T22:04:48.966009426Z" level=info msg="Start streaming server"
Sep 12 22:04:48.967785 containerd[1553]: time="2025-09-12T22:04:48.967728826Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 12 22:04:48.967785 containerd[1553]: time="2025-09-12T22:04:48.967788026Z" level=info msg="runtime interface starting up..."
Sep 12 22:04:48.967870 containerd[1553]: time="2025-09-12T22:04:48.967796386Z" level=info msg="starting plugins..."
Sep 12 22:04:48.967870 containerd[1553]: time="2025-09-12T22:04:48.967824986Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 12 22:04:48.969180 containerd[1553]: time="2025-09-12T22:04:48.969153866Z" level=info msg="containerd successfully booted in 0.156983s"
Sep 12 22:04:48.969277 systemd[1]: Started containerd.service - containerd container runtime.
Sep 12 22:04:49.084268 tar[1541]: linux-arm64/LICENSE
Sep 12 22:04:49.084357 tar[1541]: linux-arm64/README.md
Sep 12 22:04:49.104585 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 12 22:04:49.208059 sshd_keygen[1556]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 12 22:04:49.236980 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 12 22:04:49.241382 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 12 22:04:49.261976 systemd[1]: issuegen.service: Deactivated successfully.
Sep 12 22:04:49.262250 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 12 22:04:49.266264 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 12 22:04:49.288048 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 12 22:04:49.292349 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 12 22:04:49.296866 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 12 22:04:49.297908 systemd[1]: Reached target getty.target - Login Prompts.
Sep 12 22:04:49.887900 systemd-networkd[1421]: eth0: Gained IPv6LL
Sep 12 22:04:49.894287 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 12 22:04:49.896121 systemd[1]: Reached target network-online.target - Network is Online.
Sep 12 22:04:49.902706 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:04:49.909382 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 12 22:04:49.950360 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 12 22:04:50.017119 systemd-networkd[1421]: eth1: Gained IPv6LL
Sep 12 22:04:50.783657 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:04:50.787967 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 12 22:04:50.791701 systemd[1]: Startup finished in 2.402s (kernel) + 5.276s (initrd) + 5.415s (userspace) = 13.095s.
Sep 12 22:04:50.795783 (kubelet)[1661]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:04:51.371137 kubelet[1661]: E0912 22:04:51.371071 1661 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:04:51.374046 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:04:51.374390 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:04:51.375249 systemd[1]: kubelet.service: Consumed 945ms CPU time, 256.1M memory peak.
Sep 12 22:05:01.468909 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 12 22:05:01.472036 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:05:01.689350 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:01.705562 (kubelet)[1680]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:05:01.764868 kubelet[1680]: E0912 22:05:01.764731 1680 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:05:01.768529 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:05:01.768742 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:05:01.769488 systemd[1]: kubelet.service: Consumed 217ms CPU time, 108M memory peak.
Sep 12 22:05:11.968853 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 12 22:05:11.972448 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:05:12.176841 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:12.192395 (kubelet)[1694]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:05:12.240444 kubelet[1694]: E0912 22:05:12.240344 1694 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:05:12.243647 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:05:12.243807 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:05:12.244604 systemd[1]: kubelet.service: Consumed 192ms CPU time, 107.3M memory peak.
Sep 12 22:05:22.468642 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 12 22:05:22.471296 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:05:22.699907 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:22.713161 (kubelet)[1708]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:05:22.760809 kubelet[1708]: E0912 22:05:22.760597 1708 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:05:22.764329 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:05:22.764575 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:05:22.765599 systemd[1]: kubelet.service: Consumed 204ms CPU time, 105.1M memory peak.
Sep 12 22:05:26.483013 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 12 22:05:26.488149 systemd[1]: Started sshd@0-168.119.157.2:22-139.178.68.195:50952.service - OpenSSH per-connection server daemon (139.178.68.195:50952).
Sep 12 22:05:27.520769 sshd[1717]: Accepted publickey for core from 139.178.68.195 port 50952 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0
Sep 12 22:05:27.524812 sshd-session[1717]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:05:27.533001 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 12 22:05:27.534719 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 12 22:05:27.543355 systemd-logind[1527]: New session 1 of user core.
Sep 12 22:05:27.560570 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 12 22:05:27.563976 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 12 22:05:27.582711 (systemd)[1722]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 12 22:05:27.586320 systemd-logind[1527]: New session c1 of user core.
Sep 12 22:05:27.731316 systemd[1722]: Queued start job for default target default.target.
Sep 12 22:05:27.743202 systemd[1722]: Created slice app.slice - User Application Slice.
Sep 12 22:05:27.743258 systemd[1722]: Reached target paths.target - Paths.
Sep 12 22:05:27.743421 systemd[1722]: Reached target timers.target - Timers.
Sep 12 22:05:27.745641 systemd[1722]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 12 22:05:27.781240 systemd[1722]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 12 22:05:27.781429 systemd[1722]: Reached target sockets.target - Sockets.
Sep 12 22:05:27.781493 systemd[1722]: Reached target basic.target - Basic System.
Sep 12 22:05:27.781579 systemd[1722]: Reached target default.target - Main User Target.
Sep 12 22:05:27.781617 systemd[1722]: Startup finished in 186ms.
Sep 12 22:05:27.781940 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 12 22:05:27.788259 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 12 22:05:28.498358 systemd[1]: Started sshd@1-168.119.157.2:22-139.178.68.195:50968.service - OpenSSH per-connection server daemon (139.178.68.195:50968).
Sep 12 22:05:29.551943 sshd[1733]: Accepted publickey for core from 139.178.68.195 port 50968 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0
Sep 12 22:05:29.555254 sshd-session[1733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:05:29.563907 systemd-logind[1527]: New session 2 of user core.
Sep 12 22:05:29.577187 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 12 22:05:30.248959 sshd[1736]: Connection closed by 139.178.68.195 port 50968
Sep 12 22:05:30.249941 sshd-session[1733]: pam_unix(sshd:session): session closed for user core
Sep 12 22:05:30.264876 systemd[1]: sshd@1-168.119.157.2:22-139.178.68.195:50968.service: Deactivated successfully.
Sep 12 22:05:30.266494 systemd-logind[1527]: Session 2 logged out. Waiting for processes to exit.
Sep 12 22:05:30.273994 systemd[1]: session-2.scope: Deactivated successfully.
Sep 12 22:05:30.279072 systemd-logind[1527]: Removed session 2.
Sep 12 22:05:30.424763 systemd[1]: Started sshd@2-168.119.157.2:22-139.178.68.195:52378.service - OpenSSH per-connection server daemon (139.178.68.195:52378).
Sep 12 22:05:31.461390 sshd[1742]: Accepted publickey for core from 139.178.68.195 port 52378 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0
Sep 12 22:05:31.463863 sshd-session[1742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:05:31.472423 systemd-logind[1527]: New session 3 of user core.
Sep 12 22:05:31.477780 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 12 22:05:32.153899 sshd[1745]: Connection closed by 139.178.68.195 port 52378
Sep 12 22:05:32.154726 sshd-session[1742]: pam_unix(sshd:session): session closed for user core
Sep 12 22:05:32.161849 systemd-logind[1527]: Session 3 logged out. Waiting for processes to exit.
Sep 12 22:05:32.161992 systemd[1]: sshd@2-168.119.157.2:22-139.178.68.195:52378.service: Deactivated successfully.
Sep 12 22:05:32.163847 systemd[1]: session-3.scope: Deactivated successfully.
Sep 12 22:05:32.165729 systemd-logind[1527]: Removed session 3.
Sep 12 22:05:32.331701 systemd[1]: Started sshd@3-168.119.157.2:22-139.178.68.195:52384.service - OpenSSH per-connection server daemon (139.178.68.195:52384).
Sep 12 22:05:32.968110 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 12 22:05:32.972781 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:05:33.163827 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:33.179123 (kubelet)[1762]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:05:33.233615 kubelet[1762]: E0912 22:05:33.233434 1762 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:05:33.236681 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:05:33.237032 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:05:33.238701 systemd[1]: kubelet.service: Consumed 196ms CPU time, 105.5M memory peak.
Sep 12 22:05:33.391483 sshd[1751]: Accepted publickey for core from 139.178.68.195 port 52384 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0
Sep 12 22:05:33.394275 sshd-session[1751]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:05:33.401123 systemd-logind[1527]: New session 4 of user core.
Sep 12 22:05:33.403846 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 12 22:05:33.697052 update_engine[1528]: I20250912 22:05:33.696904 1528 update_attempter.cc:509] Updating boot flags...
Sep 12 22:05:34.112972 sshd[1768]: Connection closed by 139.178.68.195 port 52384
Sep 12 22:05:34.112852 sshd-session[1751]: pam_unix(sshd:session): session closed for user core
Sep 12 22:05:34.120709 systemd[1]: sshd@3-168.119.157.2:22-139.178.68.195:52384.service: Deactivated successfully.
Sep 12 22:05:34.122805 systemd[1]: session-4.scope: Deactivated successfully.
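The restart cadence in the entries above (counter 2 at 22:05:11, 3 at 22:05:22, 4 at 22:05:32, i.e. roughly every ten seconds under this unit's restart delay) can be tallied out of a journal dump with a short script. A sketch, assuming journald's short-precise timestamp format as shown in this log; the sample lines are copied verbatim from it:

```python
import re

# Matches journald lines like:
#   Sep 12 22:05:11.968853 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
RESTART = re.compile(
    r"^(?P<ts>\w+ \d+ [\d:.]+) systemd\[1\]: kubelet\.service: "
    r"Scheduled restart job, restart counter is at (?P<n>\d+)\."
)

def restart_counters(lines):
    """Return (timestamp, counter) for each kubelet restart announcement."""
    return [(m.group("ts"), int(m.group("n"))) for m in map(RESTART.match, lines) if m]

# Sample lines copied verbatim from this log.
sample = [
    "Sep 12 22:05:11.968853 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.",
    "Sep 12 22:05:12.176841 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.",
    "Sep 12 22:05:22.468642 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.",
    "Sep 12 22:05:32.968110 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.",
]

print(restart_counters(sample))
# → [('Sep 12 22:05:11.968853', 2), ('Sep 12 22:05:22.468642', 3), ('Sep 12 22:05:32.968110', 4)]
```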
Sep 12 22:05:34.123696 systemd-logind[1527]: Session 4 logged out. Waiting for processes to exit.
Sep 12 22:05:34.125887 systemd-logind[1527]: Removed session 4.
Sep 12 22:05:34.288840 systemd[1]: Started sshd@4-168.119.157.2:22-139.178.68.195:52390.service - OpenSSH per-connection server daemon (139.178.68.195:52390).
Sep 12 22:05:35.306423 sshd[1790]: Accepted publickey for core from 139.178.68.195 port 52390 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0
Sep 12 22:05:35.310583 sshd-session[1790]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:05:35.317467 systemd-logind[1527]: New session 5 of user core.
Sep 12 22:05:35.320761 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 12 22:05:35.845054 sudo[1794]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 12 22:05:35.845751 sudo[1794]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:05:35.861101 sudo[1794]: pam_unix(sudo:session): session closed for user root
Sep 12 22:05:36.023846 sshd[1793]: Connection closed by 139.178.68.195 port 52390
Sep 12 22:05:36.023440 sshd-session[1790]: pam_unix(sshd:session): session closed for user core
Sep 12 22:05:36.032828 systemd-logind[1527]: Session 5 logged out. Waiting for processes to exit.
Sep 12 22:05:36.033977 systemd[1]: sshd@4-168.119.157.2:22-139.178.68.195:52390.service: Deactivated successfully.
Sep 12 22:05:36.036410 systemd[1]: session-5.scope: Deactivated successfully.
Sep 12 22:05:36.039260 systemd-logind[1527]: Removed session 5.
Sep 12 22:05:36.196214 systemd[1]: Started sshd@5-168.119.157.2:22-139.178.68.195:52400.service - OpenSSH per-connection server daemon (139.178.68.195:52400).
Sep 12 22:05:37.200731 sshd[1800]: Accepted publickey for core from 139.178.68.195 port 52400 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0
Sep 12 22:05:37.203376 sshd-session[1800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:05:37.212881 systemd-logind[1527]: New session 6 of user core.
Sep 12 22:05:37.221987 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 12 22:05:37.729244 sudo[1805]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 12 22:05:37.729632 sudo[1805]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:05:37.738116 sudo[1805]: pam_unix(sudo:session): session closed for user root
Sep 12 22:05:37.744906 sudo[1804]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 12 22:05:37.745218 sudo[1804]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:05:37.760301 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 12 22:05:37.806614 augenrules[1827]: No rules
Sep 12 22:05:37.808701 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 12 22:05:37.809024 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 12 22:05:37.811008 sudo[1804]: pam_unix(sudo:session): session closed for user root
Sep 12 22:05:37.973553 sshd[1803]: Connection closed by 139.178.68.195 port 52400
Sep 12 22:05:37.975207 sshd-session[1800]: pam_unix(sshd:session): session closed for user core
Sep 12 22:05:37.980992 systemd[1]: sshd@5-168.119.157.2:22-139.178.68.195:52400.service: Deactivated successfully.
Sep 12 22:05:37.981214 systemd-logind[1527]: Session 6 logged out. Waiting for processes to exit.
Sep 12 22:05:37.983536 systemd[1]: session-6.scope: Deactivated successfully.
Sep 12 22:05:37.988718 systemd-logind[1527]: Removed session 6.
Sep 12 22:05:38.149118 systemd[1]: Started sshd@6-168.119.157.2:22-139.178.68.195:52412.service - OpenSSH per-connection server daemon (139.178.68.195:52412).
Sep 12 22:05:39.162950 sshd[1836]: Accepted publickey for core from 139.178.68.195 port 52412 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0
Sep 12 22:05:39.167259 sshd-session[1836]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 12 22:05:39.177267 systemd-logind[1527]: New session 7 of user core.
Sep 12 22:05:39.186828 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 12 22:05:39.684530 sudo[1840]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 12 22:05:39.684880 sudo[1840]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 12 22:05:40.037233 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 12 22:05:40.052154 (dockerd)[1858]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 12 22:05:40.317204 dockerd[1858]: time="2025-09-12T22:05:40.316288321Z" level=info msg="Starting up"
Sep 12 22:05:40.319485 dockerd[1858]: time="2025-09-12T22:05:40.319378509Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 12 22:05:40.335617 dockerd[1858]: time="2025-09-12T22:05:40.335554731Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s
Sep 12 22:05:40.355695 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1908088578-merged.mount: Deactivated successfully.
Sep 12 22:05:40.367573 systemd[1]: var-lib-docker-metacopy\x2dcheck3442595181-merged.mount: Deactivated successfully.
Sep 12 22:05:40.385058 dockerd[1858]: time="2025-09-12T22:05:40.384780726Z" level=info msg="Loading containers: start."
Sep 12 22:05:40.396547 kernel: Initializing XFRM netlink socket
Sep 12 22:05:40.670304 systemd-networkd[1421]: docker0: Link UP
Sep 12 22:05:40.676091 dockerd[1858]: time="2025-09-12T22:05:40.675961336Z" level=info msg="Loading containers: done."
Sep 12 22:05:40.701439 dockerd[1858]: time="2025-09-12T22:05:40.701350160Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 12 22:05:40.702005 dockerd[1858]: time="2025-09-12T22:05:40.701902965Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4
Sep 12 22:05:40.702357 dockerd[1858]: time="2025-09-12T22:05:40.702236528Z" level=info msg="Initializing buildkit"
Sep 12 22:05:40.737499 dockerd[1858]: time="2025-09-12T22:05:40.737118316Z" level=info msg="Completed buildkit initialization"
Sep 12 22:05:40.748102 dockerd[1858]: time="2025-09-12T22:05:40.748056813Z" level=info msg="Daemon has completed initialization"
Sep 12 22:05:40.748346 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 12 22:05:40.748477 dockerd[1858]: time="2025-09-12T22:05:40.748276174Z" level=info msg="API listen on /run/docker.sock"
Sep 12 22:05:41.352139 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck810843376-merged.mount: Deactivated successfully.
Sep 12 22:05:42.155472 containerd[1553]: time="2025-09-12T22:05:42.155030239Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\""
Sep 12 22:05:42.738065 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3023118511.mount: Deactivated successfully.
Sep 12 22:05:43.468165 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 12 22:05:43.471129 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:05:43.629345 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:43.642297 (kubelet)[2133]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:05:43.712901 kubelet[2133]: E0912 22:05:43.712800 2133 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:05:43.718386 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:05:43.718886 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:05:43.719416 systemd[1]: kubelet.service: Consumed 171ms CPU time, 106.8M memory peak.
Sep 12 22:05:43.729259 containerd[1553]: time="2025-09-12T22:05:43.729160498Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:43.732031 containerd[1553]: time="2025-09-12T22:05:43.730965991Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=25687423"
Sep 12 22:05:43.733604 containerd[1553]: time="2025-09-12T22:05:43.733562490Z" level=info msg="ImageCreate event name:\"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:43.739368 containerd[1553]: time="2025-09-12T22:05:43.739251851Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:43.740411 containerd[1553]: time="2025-09-12T22:05:43.740370899Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"25683924\" in 1.58529574s"
Sep 12 22:05:43.740592 containerd[1553]: time="2025-09-12T22:05:43.740572861Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\""
Sep 12 22:05:43.742805 containerd[1553]: time="2025-09-12T22:05:43.742722436Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 12 22:05:44.842351 containerd[1553]: time="2025-09-12T22:05:44.842250736Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:44.844258 containerd[1553]: time="2025-09-12T22:05:44.843695186Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=22459787"
Sep 12 22:05:44.845408 containerd[1553]: time="2025-09-12T22:05:44.845367318Z" level=info msg="ImageCreate event name:\"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:44.850378 containerd[1553]: time="2025-09-12T22:05:44.850308071Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:44.852077 containerd[1553]: time="2025-09-12T22:05:44.852013843Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"24028542\" in 1.109241927s"
Sep 12 22:05:44.852277 containerd[1553]: time="2025-09-12T22:05:44.852249484Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\""
Sep 12 22:05:44.852940 containerd[1553]: time="2025-09-12T22:05:44.852897849Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 12 22:05:45.850534 containerd[1553]: time="2025-09-12T22:05:45.849598167Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:45.853211 containerd[1553]: time="2025-09-12T22:05:45.853173750Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=17127526"
Sep 12 22:05:45.854300 containerd[1553]: time="2025-09-12T22:05:45.854263517Z" level=info msg="ImageCreate event name:\"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:45.859079 containerd[1553]: time="2025-09-12T22:05:45.859035907Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:45.861131 containerd[1553]: time="2025-09-12T22:05:45.861087080Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"18696299\" in 1.008138751s"
Sep 12 22:05:45.861253 containerd[1553]: time="2025-09-12T22:05:45.861239321Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\""
Sep 12 22:05:45.862387 containerd[1553]: time="2025-09-12T22:05:45.862270408Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 12 22:05:46.843773 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3517463804.mount: Deactivated successfully.
Sep 12 22:05:47.196796 containerd[1553]: time="2025-09-12T22:05:47.196276711Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:47.197876 containerd[1553]: time="2025-09-12T22:05:47.197726879Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=26954933"
Sep 12 22:05:47.199687 containerd[1553]: time="2025-09-12T22:05:47.199633010Z" level=info msg="ImageCreate event name:\"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:47.203684 containerd[1553]: time="2025-09-12T22:05:47.203634233Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:47.204419 containerd[1553]: time="2025-09-12T22:05:47.204388877Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"26953926\" in 1.342079589s"
Sep 12 22:05:47.204475 containerd[1553]: time="2025-09-12T22:05:47.204427477Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\""
Sep 12 22:05:47.205076 containerd[1553]: time="2025-09-12T22:05:47.205045601Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 12 22:05:47.781442 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4226582035.mount: Deactivated successfully.
Sep 12 22:05:48.480953 containerd[1553]: time="2025-09-12T22:05:48.480877760Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:48.484461 containerd[1553]: time="2025-09-12T22:05:48.482907011Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714"
Sep 12 22:05:48.484461 containerd[1553]: time="2025-09-12T22:05:48.483115332Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:48.486239 containerd[1553]: time="2025-09-12T22:05:48.486194788Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:48.487402 containerd[1553]: time="2025-09-12T22:05:48.487352674Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.282274153s"
Sep 12 22:05:48.487402 containerd[1553]: time="2025-09-12T22:05:48.487395354Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 12 22:05:48.488383 containerd[1553]: time="2025-09-12T22:05:48.488356799Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 12 22:05:49.027021 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount904905948.mount: Deactivated successfully.
Sep 12 22:05:49.042861 containerd[1553]: time="2025-09-12T22:05:49.041706142Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:05:49.043942 containerd[1553]: time="2025-09-12T22:05:49.043888713Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Sep 12 22:05:49.049490 containerd[1553]: time="2025-09-12T22:05:49.047060608Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:05:49.051290 containerd[1553]: time="2025-09-12T22:05:49.051246469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 12 22:05:49.052744 containerd[1553]: time="2025-09-12T22:05:49.052673756Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 564.208356ms"
Sep 12 22:05:49.053009 containerd[1553]: time="2025-09-12T22:05:49.052955997Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 12 22:05:49.054254 containerd[1553]: time="2025-09-12T22:05:49.054140923Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 12 22:05:49.650896 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3762901519.mount: Deactivated successfully.
Sep 12 22:05:51.152545 containerd[1553]: time="2025-09-12T22:05:51.151164199Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:51.153713 containerd[1553]: time="2025-09-12T22:05:51.153623330Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537235"
Sep 12 22:05:51.154100 containerd[1553]: time="2025-09-12T22:05:51.154052212Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:51.159199 containerd[1553]: time="2025-09-12T22:05:51.159148634Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 12 22:05:51.160455 containerd[1553]: time="2025-09-12T22:05:51.160363839Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.105996075s"
Sep 12 22:05:51.160455 containerd[1553]: time="2025-09-12T22:05:51.160446600Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Sep 12 22:05:53.968265 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Sep 12 22:05:53.971791 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:05:54.150778 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:54.160586 (kubelet)[2293]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 12 22:05:54.212999 kubelet[2293]: E0912 22:05:54.212940 2293 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 12 22:05:54.216055 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 12 22:05:54.216184 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 12 22:05:54.218584 systemd[1]: kubelet.service: Consumed 183ms CPU time, 107.1M memory peak.
Sep 12 22:05:56.755600 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:56.755837 systemd[1]: kubelet.service: Consumed 183ms CPU time, 107.1M memory peak.
Sep 12 22:05:56.758847 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:05:56.790135 systemd[1]: Reload requested from client PID 2307 ('systemctl') (unit session-7.scope)...
Sep 12 22:05:56.790151 systemd[1]: Reloading...
Sep 12 22:05:56.942586 zram_generator::config[2369]: No configuration found.
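The containerd entries above record each image pull with its size and wall time. As an illustrative (not production) parser for these "Pulled image ... in <duration>" messages, assuming the escaped-quote form in which they appear in this log; the sample lines are shortened copies from it, with `...` standing in for elided digests:

```python
import re

def pull_times(lines):
    """Extract (image, seconds) pairs from containerd 'Pulled image' log lines."""
    out = []
    for line in lines:
        # The journal shows the msg payload with escaped quotes (\"); unescape first.
        clean = line.replace('\\"', '"')
        m = re.search(r'Pulled image "([^"]+)" .* in ([\d.]+)(ms|s)"', clean)
        if m:
            value, unit = float(m.group(2)), m.group(3)
            out.append((m.group(1), value / 1000 if unit == "ms" else value))
    return out

# Sample lines shortened from this log; "..." marks elided digest fields.
sample = [
    'time="2025-09-12T22:05:49.052673756Z" level=info msg="Pulled image \\"registry.k8s.io/pause:3.10\\" '
    'with image id \\"sha256:afb61768...\\", repo tag \\"registry.k8s.io/pause:3.10\\", '
    'repo digest \\"...\\", size \\"267933\\" in 564.208356ms"',
    'time="2025-09-12T22:05:51.160363839Z" level=info msg="Pulled image \\"registry.k8s.io/etcd:3.5.15-0\\" '
    'with image id \\"sha256:27e3830e...\\", size \\"66535646\\" in 2.105996075s"',
]

print(pull_times(sample))
```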
Sep 12 22:05:57.114565 systemd[1]: Reloading finished in 323 ms.
Sep 12 22:05:57.182284 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 12 22:05:57.182376 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 12 22:05:57.182675 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:57.182950 systemd[1]: kubelet.service: Consumed 119ms CPU time, 94.9M memory peak.
Sep 12 22:05:57.185311 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 12 22:05:57.354951 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 12 22:05:57.367075 (kubelet)[2399]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 12 22:05:57.416543 kubelet[2399]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 12 22:05:57.416543 kubelet[2399]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 12 22:05:57.416543 kubelet[2399]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
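The deprecation warnings above all point at the kubelet config file. Two of the flags have direct KubeletConfiguration equivalents; `--pod-infra-container-image` has none because, per the warning, the image garbage collector will take the sandbox image from the CRI instead. A sketch of the config-file form with example values; the endpoint is an assumed containerd default, not read from this host, while the plugin dir matches the Flexvolume path the kubelet reports later in this log:

```yaml
# Config-file equivalents of the deprecated flags -- illustrative values only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: unix:///run/containerd/containerd.sock  # assumed default socket
volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/
```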
Sep 12 22:05:57.416543 kubelet[2399]: I0912 22:05:57.415435 2399 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 12 22:05:58.723782 kubelet[2399]: I0912 22:05:58.723719 2399 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 12 22:05:58.723782 kubelet[2399]: I0912 22:05:58.723758 2399 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 12 22:05:58.724232 kubelet[2399]: I0912 22:05:58.724045 2399 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 12 22:05:58.757946 kubelet[2399]: E0912 22:05:58.757889 2399 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://168.119.157.2:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 168.119.157.2:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:05:58.760681 kubelet[2399]: I0912 22:05:58.760395 2399 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 12 22:05:58.776565 kubelet[2399]: I0912 22:05:58.776490 2399 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 12 22:05:58.780984 kubelet[2399]: I0912 22:05:58.780954 2399 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 12 22:05:58.781271 kubelet[2399]: I0912 22:05:58.781259 2399 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 12 22:05:58.781406 kubelet[2399]: I0912 22:05:58.781378 2399 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 12 22:05:58.781614 kubelet[2399]: I0912 22:05:58.781409 2399 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-0-0-7-af931fdd93","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 12 22:05:58.781710 kubelet[2399]: I0912 22:05:58.781683 2399 topology_manager.go:138] "Creating topology manager with none policy"
Sep 12 22:05:58.781710 kubelet[2399]: I0912 22:05:58.781693 2399 container_manager_linux.go:300] "Creating device plugin manager"
Sep 12 22:05:58.781928 kubelet[2399]: I0912 22:05:58.781904 2399 state_mem.go:36] "Initialized new in-memory state store"
Sep 12 22:05:58.786537 kubelet[2399]: I0912 22:05:58.786094 2399 kubelet.go:408] "Attempting to sync node with API server"
Sep 12 22:05:58.786537 kubelet[2399]: I0912 22:05:58.786147 2399 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 12 22:05:58.786537 kubelet[2399]: I0912 22:05:58.786177 2399 kubelet.go:314] "Adding apiserver pod source"
Sep 12 22:05:58.786537 kubelet[2399]: I0912 22:05:58.786197 2399 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 12 22:05:58.792990 kubelet[2399]: I0912 22:05:58.792961 2399 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1"
Sep 12 22:05:58.794100 kubelet[2399]: I0912 22:05:58.794074 2399 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 12 22:05:58.794410 kubelet[2399]: W0912 22:05:58.794396 2399 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 12 22:05:58.795696 kubelet[2399]: I0912 22:05:58.795674 2399 server.go:1274] "Started kubelet"
Sep 12 22:05:58.796075 kubelet[2399]: W0912 22:05:58.796022 2399 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://168.119.157.2:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-7-af931fdd93&limit=500&resourceVersion=0": dial tcp 168.119.157.2:6443: connect: connection refused
Sep 12 22:05:58.796193 kubelet[2399]: E0912 22:05:58.796175 2399 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://168.119.157.2:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4459-0-0-7-af931fdd93&limit=500&resourceVersion=0\": dial tcp 168.119.157.2:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:05:58.800484 kubelet[2399]: W0912 22:05:58.800424 2399 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://168.119.157.2:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 168.119.157.2:6443: connect: connection refused
Sep 12 22:05:58.800634 kubelet[2399]: E0912 22:05:58.800545 2399 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://168.119.157.2:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 168.119.157.2:6443: connect: connection refused" logger="UnhandledError"
Sep 12 22:05:58.800659 kubelet[2399]: I0912 22:05:58.800624 2399 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 12 22:05:58.801851 kubelet[2399]: I0912 22:05:58.801769 2399 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 12 22:05:58.802308 kubelet[2399]: I0912 22:05:58.802287 2399 server.go:236] "Starting to serve the podresources
API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:05:58.804596 kubelet[2399]: E0912 22:05:58.802987 2399 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://168.119.157.2:6443/api/v1/namespaces/default/events\": dial tcp 168.119.157.2:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-0-0-7-af931fdd93.1864a842a679ce30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-0-0-7-af931fdd93,UID:ci-4459-0-0-7-af931fdd93,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-7-af931fdd93,},FirstTimestamp:2025-09-12 22:05:58.795644464 +0000 UTC m=+1.423351209,LastTimestamp:2025-09-12 22:05:58.795644464 +0000 UTC m=+1.423351209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-7-af931fdd93,}" Sep 12 22:05:58.804730 kubelet[2399]: I0912 22:05:58.804661 2399 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:05:58.805239 kubelet[2399]: I0912 22:05:58.805222 2399 server.go:449] "Adding debug handlers to kubelet server" Sep 12 22:05:58.807459 kubelet[2399]: I0912 22:05:58.807427 2399 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:05:58.812429 kubelet[2399]: E0912 22:05:58.812404 2399 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:05:58.812871 kubelet[2399]: I0912 22:05:58.812853 2399 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 22:05:58.813103 kubelet[2399]: I0912 22:05:58.813087 2399 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 22:05:58.813205 kubelet[2399]: I0912 22:05:58.813196 2399 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:05:58.813927 kubelet[2399]: W0912 22:05:58.813887 2399 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://168.119.157.2:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 168.119.157.2:6443: connect: connection refused Sep 12 22:05:58.814058 kubelet[2399]: E0912 22:05:58.814039 2399 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://168.119.157.2:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 168.119.157.2:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:05:58.814655 kubelet[2399]: I0912 22:05:58.814630 2399 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:05:58.814905 kubelet[2399]: I0912 22:05:58.814880 2399 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:05:58.816155 kubelet[2399]: E0912 22:05:58.816121 2399 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-7-af931fdd93\" not found" Sep 12 22:05:58.816392 kubelet[2399]: E0912 22:05:58.816366 2399 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://168.119.157.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-7-af931fdd93?timeout=10s\": dial tcp 168.119.157.2:6443: connect: connection refused" interval="200ms" Sep 12 22:05:58.817162 kubelet[2399]: I0912 22:05:58.817139 2399 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:05:58.832910 kubelet[2399]: I0912 22:05:58.832815 2399 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 22:05:58.834602 kubelet[2399]: I0912 22:05:58.834556 2399 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 12 22:05:58.834602 kubelet[2399]: I0912 22:05:58.834595 2399 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 22:05:58.834738 kubelet[2399]: I0912 22:05:58.834619 2399 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 22:05:58.834738 kubelet[2399]: E0912 22:05:58.834681 2399 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:05:58.846468 kubelet[2399]: I0912 22:05:58.846098 2399 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 22:05:58.846468 kubelet[2399]: I0912 22:05:58.846122 2399 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 22:05:58.846468 kubelet[2399]: I0912 22:05:58.846149 2399 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:05:58.849269 kubelet[2399]: I0912 22:05:58.849051 2399 policy_none.go:49] "None policy: Start" Sep 12 22:05:58.849568 kubelet[2399]: W0912 22:05:58.849534 2399 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://168.119.157.2:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 168.119.157.2:6443: connect: connection refused Sep 12 22:05:58.849714 kubelet[2399]: E0912 22:05:58.849688 2399 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://168.119.157.2:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 168.119.157.2:6443: connect: connection refused" logger="UnhandledError" Sep 12 22:05:58.850392 kubelet[2399]: I0912 22:05:58.850373 2399 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 22:05:58.850542 kubelet[2399]: I0912 22:05:58.850504 2399 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:05:58.858297 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 12 22:05:58.872709 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Sep 12 22:05:58.888871 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 12 22:05:58.893601 kubelet[2399]: I0912 22:05:58.893501 2399 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:05:58.894536 kubelet[2399]: I0912 22:05:58.894183 2399 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:05:58.894536 kubelet[2399]: I0912 22:05:58.894201 2399 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:05:58.896207 kubelet[2399]: I0912 22:05:58.896185 2399 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:05:58.897714 kubelet[2399]: E0912 22:05:58.897627 2399 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4459-0-0-7-af931fdd93\" not found" Sep 12 22:05:58.951465 systemd[1]: Created slice kubepods-burstable-pod2a23d7d786da800d9f209be849dd96ab.slice - libcontainer container kubepods-burstable-pod2a23d7d786da800d9f209be849dd96ab.slice. 
Sep 12 22:05:58.972581 systemd[1]: Created slice kubepods-burstable-pod45e885bf8c6495be86a54078ab8820fd.slice - libcontainer container kubepods-burstable-pod45e885bf8c6495be86a54078ab8820fd.slice. Sep 12 22:05:58.988553 systemd[1]: Created slice kubepods-burstable-pod6c1293a01faba4bf41d0b8716a666214.slice - libcontainer container kubepods-burstable-pod6c1293a01faba4bf41d0b8716a666214.slice. Sep 12 22:05:58.998128 kubelet[2399]: I0912 22:05:58.997254 2399 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:05:58.998128 kubelet[2399]: E0912 22:05:58.997956 2399 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://168.119.157.2:6443/api/v1/nodes\": dial tcp 168.119.157.2:6443: connect: connection refused" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.017184 kubelet[2399]: E0912 22:05:59.017115 2399 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://168.119.157.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-7-af931fdd93?timeout=10s\": dial tcp 168.119.157.2:6443: connect: connection refused" interval="400ms" Sep 12 22:05:59.114552 kubelet[2399]: I0912 22:05:59.114360 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-ca-certs\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.114552 kubelet[2399]: I0912 22:05:59.114402 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " 
pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.114552 kubelet[2399]: I0912 22:05:59.114419 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-k8s-certs\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.114552 kubelet[2399]: I0912 22:05:59.114435 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-kubeconfig\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.114552 kubelet[2399]: I0912 22:05:59.114452 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.115019 kubelet[2399]: I0912 22:05:59.114468 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2a23d7d786da800d9f209be849dd96ab-ca-certs\") pod \"kube-apiserver-ci-4459-0-0-7-af931fdd93\" (UID: \"2a23d7d786da800d9f209be849dd96ab\") " pod="kube-system/kube-apiserver-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.115019 kubelet[2399]: I0912 22:05:59.114486 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/2a23d7d786da800d9f209be849dd96ab-k8s-certs\") pod \"kube-apiserver-ci-4459-0-0-7-af931fdd93\" (UID: \"2a23d7d786da800d9f209be849dd96ab\") " pod="kube-system/kube-apiserver-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.115019 kubelet[2399]: I0912 22:05:59.114527 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2a23d7d786da800d9f209be849dd96ab-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-0-0-7-af931fdd93\" (UID: \"2a23d7d786da800d9f209be849dd96ab\") " pod="kube-system/kube-apiserver-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.115019 kubelet[2399]: I0912 22:05:59.114556 2399 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6c1293a01faba4bf41d0b8716a666214-kubeconfig\") pod \"kube-scheduler-ci-4459-0-0-7-af931fdd93\" (UID: \"6c1293a01faba4bf41d0b8716a666214\") " pod="kube-system/kube-scheduler-ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.201108 kubelet[2399]: I0912 22:05:59.200657 2399 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.201108 kubelet[2399]: E0912 22:05:59.201013 2399 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://168.119.157.2:6443/api/v1/nodes\": dial tcp 168.119.157.2:6443: connect: connection refused" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.270740 containerd[1553]: time="2025-09-12T22:05:59.270491009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-0-0-7-af931fdd93,Uid:2a23d7d786da800d9f209be849dd96ab,Namespace:kube-system,Attempt:0,}" Sep 12 22:05:59.286008 containerd[1553]: time="2025-09-12T22:05:59.285960849Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4459-0-0-7-af931fdd93,Uid:45e885bf8c6495be86a54078ab8820fd,Namespace:kube-system,Attempt:0,}" Sep 12 22:05:59.301265 containerd[1553]: time="2025-09-12T22:05:59.301156968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-0-0-7-af931fdd93,Uid:6c1293a01faba4bf41d0b8716a666214,Namespace:kube-system,Attempt:0,}" Sep 12 22:05:59.303410 containerd[1553]: time="2025-09-12T22:05:59.303353814Z" level=info msg="connecting to shim 29216e72e17f6331668e5f7e75d668c62ef706655c9bb150afae17445da877e6" address="unix:///run/containerd/s/acda6ed3643d9d674c889f135f0fd2bb3245b2259203ee5860e07c1376fa47f2" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:05:59.340732 systemd[1]: Started cri-containerd-29216e72e17f6331668e5f7e75d668c62ef706655c9bb150afae17445da877e6.scope - libcontainer container 29216e72e17f6331668e5f7e75d668c62ef706655c9bb150afae17445da877e6. Sep 12 22:05:59.352536 containerd[1553]: time="2025-09-12T22:05:59.352212940Z" level=info msg="connecting to shim c590d2e71705951df7be19fb7678be84569b81181adc4c8ea25a40103b43ec1e" address="unix:///run/containerd/s/574a7bd51590baa8bcbd1a795c73648b2404a1612fbfccb09a9a326b75f75f65" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:05:59.373363 containerd[1553]: time="2025-09-12T22:05:59.373078755Z" level=info msg="connecting to shim 313042e47ae9d45945dedb1c9c6ceb3a70efe3d579bc33085e0cd7886a94bf47" address="unix:///run/containerd/s/8340d9bef414d8a9885ea70ea26f8eff83f8b69db18285d5addbe4d4210db666" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:05:59.411732 systemd[1]: Started cri-containerd-c590d2e71705951df7be19fb7678be84569b81181adc4c8ea25a40103b43ec1e.scope - libcontainer container c590d2e71705951df7be19fb7678be84569b81181adc4c8ea25a40103b43ec1e. 
Sep 12 22:05:59.418213 systemd[1]: Started cri-containerd-313042e47ae9d45945dedb1c9c6ceb3a70efe3d579bc33085e0cd7886a94bf47.scope - libcontainer container 313042e47ae9d45945dedb1c9c6ceb3a70efe3d579bc33085e0cd7886a94bf47. Sep 12 22:05:59.418709 kubelet[2399]: E0912 22:05:59.418578 2399 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://168.119.157.2:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4459-0-0-7-af931fdd93?timeout=10s\": dial tcp 168.119.157.2:6443: connect: connection refused" interval="800ms" Sep 12 22:05:59.436155 containerd[1553]: time="2025-09-12T22:05:59.436071998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4459-0-0-7-af931fdd93,Uid:2a23d7d786da800d9f209be849dd96ab,Namespace:kube-system,Attempt:0,} returns sandbox id \"29216e72e17f6331668e5f7e75d668c62ef706655c9bb150afae17445da877e6\"" Sep 12 22:05:59.440938 containerd[1553]: time="2025-09-12T22:05:59.440876210Z" level=info msg="CreateContainer within sandbox \"29216e72e17f6331668e5f7e75d668c62ef706655c9bb150afae17445da877e6\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 12 22:05:59.459589 containerd[1553]: time="2025-09-12T22:05:59.457871054Z" level=info msg="Container 42de8383750bc0e8c70c2593ee2146f94ca3710b100632413365621ade74624f: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:05:59.468715 containerd[1553]: time="2025-09-12T22:05:59.468661562Z" level=info msg="CreateContainer within sandbox \"29216e72e17f6331668e5f7e75d668c62ef706655c9bb150afae17445da877e6\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"42de8383750bc0e8c70c2593ee2146f94ca3710b100632413365621ade74624f\"" Sep 12 22:05:59.470359 containerd[1553]: time="2025-09-12T22:05:59.470312806Z" level=info msg="StartContainer for \"42de8383750bc0e8c70c2593ee2146f94ca3710b100632413365621ade74624f\"" Sep 12 22:05:59.473640 containerd[1553]: time="2025-09-12T22:05:59.473604255Z" level=info 
msg="connecting to shim 42de8383750bc0e8c70c2593ee2146f94ca3710b100632413365621ade74624f" address="unix:///run/containerd/s/acda6ed3643d9d674c889f135f0fd2bb3245b2259203ee5860e07c1376fa47f2" protocol=ttrpc version=3 Sep 12 22:05:59.500357 containerd[1553]: time="2025-09-12T22:05:59.500306524Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4459-0-0-7-af931fdd93,Uid:45e885bf8c6495be86a54078ab8820fd,Namespace:kube-system,Attempt:0,} returns sandbox id \"c590d2e71705951df7be19fb7678be84569b81181adc4c8ea25a40103b43ec1e\"" Sep 12 22:05:59.506037 containerd[1553]: time="2025-09-12T22:05:59.505924139Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4459-0-0-7-af931fdd93,Uid:6c1293a01faba4bf41d0b8716a666214,Namespace:kube-system,Attempt:0,} returns sandbox id \"313042e47ae9d45945dedb1c9c6ceb3a70efe3d579bc33085e0cd7886a94bf47\"" Sep 12 22:05:59.508110 systemd[1]: Started cri-containerd-42de8383750bc0e8c70c2593ee2146f94ca3710b100632413365621ade74624f.scope - libcontainer container 42de8383750bc0e8c70c2593ee2146f94ca3710b100632413365621ade74624f. 
Sep 12 22:05:59.509120 kubelet[2399]: E0912 22:05:59.508953 2399 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://168.119.157.2:6443/api/v1/namespaces/default/events\": dial tcp 168.119.157.2:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4459-0-0-7-af931fdd93.1864a842a679ce30 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4459-0-0-7-af931fdd93,UID:ci-4459-0-0-7-af931fdd93,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-7-af931fdd93,},FirstTimestamp:2025-09-12 22:05:58.795644464 +0000 UTC m=+1.423351209,LastTimestamp:2025-09-12 22:05:58.795644464 +0000 UTC m=+1.423351209,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-7-af931fdd93,}" Sep 12 22:05:59.511923 containerd[1553]: time="2025-09-12T22:05:59.511252112Z" level=info msg="CreateContainer within sandbox \"c590d2e71705951df7be19fb7678be84569b81181adc4c8ea25a40103b43ec1e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 12 22:05:59.515190 containerd[1553]: time="2025-09-12T22:05:59.515123962Z" level=info msg="CreateContainer within sandbox \"313042e47ae9d45945dedb1c9c6ceb3a70efe3d579bc33085e0cd7886a94bf47\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 12 22:05:59.538034 containerd[1553]: time="2025-09-12T22:05:59.536779978Z" level=info msg="Container 229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:05:59.538034 containerd[1553]: time="2025-09-12T22:05:59.536968619Z" level=info msg="Container 8ea42c9e4f539d8482062c982db11611daacca6f11aefff02b6a89f0bf41aebd: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:05:59.547393 containerd[1553]: 
time="2025-09-12T22:05:59.547333046Z" level=info msg="CreateContainer within sandbox \"c590d2e71705951df7be19fb7678be84569b81181adc4c8ea25a40103b43ec1e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c\"" Sep 12 22:05:59.548914 containerd[1553]: time="2025-09-12T22:05:59.548881290Z" level=info msg="StartContainer for \"229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c\"" Sep 12 22:05:59.551424 containerd[1553]: time="2025-09-12T22:05:59.551380296Z" level=info msg="CreateContainer within sandbox \"313042e47ae9d45945dedb1c9c6ceb3a70efe3d579bc33085e0cd7886a94bf47\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8ea42c9e4f539d8482062c982db11611daacca6f11aefff02b6a89f0bf41aebd\"" Sep 12 22:05:59.552350 containerd[1553]: time="2025-09-12T22:05:59.552323099Z" level=info msg="connecting to shim 229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c" address="unix:///run/containerd/s/574a7bd51590baa8bcbd1a795c73648b2404a1612fbfccb09a9a326b75f75f65" protocol=ttrpc version=3 Sep 12 22:05:59.553073 containerd[1553]: time="2025-09-12T22:05:59.553042541Z" level=info msg="StartContainer for \"8ea42c9e4f539d8482062c982db11611daacca6f11aefff02b6a89f0bf41aebd\"" Sep 12 22:05:59.557068 containerd[1553]: time="2025-09-12T22:05:59.557016311Z" level=info msg="connecting to shim 8ea42c9e4f539d8482062c982db11611daacca6f11aefff02b6a89f0bf41aebd" address="unix:///run/containerd/s/8340d9bef414d8a9885ea70ea26f8eff83f8b69db18285d5addbe4d4210db666" protocol=ttrpc version=3 Sep 12 22:05:59.581006 systemd[1]: Started cri-containerd-229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c.scope - libcontainer container 229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c. 
Sep 12 22:05:59.601944 containerd[1553]: time="2025-09-12T22:05:59.599733582Z" level=info msg="StartContainer for \"42de8383750bc0e8c70c2593ee2146f94ca3710b100632413365621ade74624f\" returns successfully" Sep 12 22:05:59.604556 kubelet[2399]: I0912 22:05:59.604419 2399 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.605754 kubelet[2399]: E0912 22:05:59.605710 2399 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://168.119.157.2:6443/api/v1/nodes\": dial tcp 168.119.157.2:6443: connect: connection refused" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:05:59.608001 systemd[1]: Started cri-containerd-8ea42c9e4f539d8482062c982db11611daacca6f11aefff02b6a89f0bf41aebd.scope - libcontainer container 8ea42c9e4f539d8482062c982db11611daacca6f11aefff02b6a89f0bf41aebd. Sep 12 22:05:59.658049 containerd[1553]: time="2025-09-12T22:05:59.657944812Z" level=info msg="StartContainer for \"229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c\" returns successfully" Sep 12 22:05:59.705341 containerd[1553]: time="2025-09-12T22:05:59.705282895Z" level=info msg="StartContainer for \"8ea42c9e4f539d8482062c982db11611daacca6f11aefff02b6a89f0bf41aebd\" returns successfully" Sep 12 22:06:00.410619 kubelet[2399]: I0912 22:06:00.408305 2399 kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:01.925375 kubelet[2399]: I0912 22:06:01.925326 2399 kubelet_node_status.go:75] "Successfully registered node" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:02.025822 kubelet[2399]: E0912 22:06:02.025751 2399 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-node-lease\" not found" interval="1.6s" Sep 12 22:06:02.805547 kubelet[2399]: I0912 22:06:02.803082 2399 apiserver.go:52] "Watching apiserver" Sep 12 22:06:02.813852 kubelet[2399]: I0912 22:06:02.813815 2399 desired_state_of_world_populator.go:155] "Finished populating 
initial desired state of world" Sep 12 22:06:04.340385 systemd[1]: Reload requested from client PID 2673 ('systemctl') (unit session-7.scope)... Sep 12 22:06:04.340551 systemd[1]: Reloading... Sep 12 22:06:04.443546 zram_generator::config[2720]: No configuration found. Sep 12 22:06:04.669967 systemd[1]: Reloading finished in 328 ms. Sep 12 22:06:04.702682 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:06:04.716093 systemd[1]: kubelet.service: Deactivated successfully. Sep 12 22:06:04.716690 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:06:04.716809 systemd[1]: kubelet.service: Consumed 1.880s CPU time, 124.6M memory peak. Sep 12 22:06:04.720199 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 12 22:06:04.916969 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Sep 12 22:06:04.929163 (kubelet)[2762]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 12 22:06:04.998089 kubelet[2762]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 12 22:06:04.999388 kubelet[2762]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 12 22:06:04.999388 kubelet[2762]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 12 22:06:04.999636 kubelet[2762]: I0912 22:06:04.999576 2762 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 12 22:06:05.015132 kubelet[2762]: I0912 22:06:05.015096 2762 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 12 22:06:05.015387 kubelet[2762]: I0912 22:06:05.015373 2762 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 12 22:06:05.015814 kubelet[2762]: I0912 22:06:05.015793 2762 server.go:934] "Client rotation is on, will bootstrap in background" Sep 12 22:06:05.018263 kubelet[2762]: I0912 22:06:05.018065 2762 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 12 22:06:05.021172 kubelet[2762]: I0912 22:06:05.021093 2762 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Sep 12 22:06:05.032585 kubelet[2762]: I0912 22:06:05.032541 2762 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 12 22:06:05.036745 kubelet[2762]: I0912 22:06:05.036710 2762 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Sep 12 22:06:05.036977 kubelet[2762]: I0912 22:06:05.036959 2762 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 12 22:06:05.037119 kubelet[2762]: I0912 22:06:05.037086 2762 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 12 22:06:05.037419 kubelet[2762]: I0912 22:06:05.037119 2762 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4459-0-0-7-af931fdd93","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Sep 12 22:06:05.037419 kubelet[2762]: I0912 22:06:05.037346 2762 topology_manager.go:138] "Creating topology manager with none policy" Sep 12 22:06:05.038425 kubelet[2762]: I0912 22:06:05.038390 2762 container_manager_linux.go:300] "Creating device plugin manager" Sep 12 22:06:05.038486 kubelet[2762]: I0912 22:06:05.038481 2762 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:06:05.038664 kubelet[2762]: I0912 22:06:05.038633 2762 kubelet.go:408] "Attempting to sync node with API server" Sep 12 22:06:05.038664 kubelet[2762]: I0912 22:06:05.038651 2762 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 12 22:06:05.039088 kubelet[2762]: I0912 22:06:05.038675 2762 kubelet.go:314] "Adding apiserver pod source" Sep 12 22:06:05.039088 kubelet[2762]: I0912 22:06:05.038690 2762 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 12 22:06:05.047199 kubelet[2762]: I0912 22:06:05.047156 2762 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Sep 12 22:06:05.048003 kubelet[2762]: I0912 22:06:05.047976 2762 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 12 22:06:05.049474 kubelet[2762]: I0912 22:06:05.049449 2762 server.go:1274] "Started kubelet" Sep 12 22:06:05.053375 kubelet[2762]: I0912 22:06:05.053245 2762 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 12 22:06:05.057961 kubelet[2762]: I0912 22:06:05.057933 2762 server.go:449] "Adding debug handlers to kubelet server" Sep 12 22:06:05.060805 kubelet[2762]: I0912 22:06:05.060419 2762 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 12 22:06:05.080708 kubelet[2762]: I0912 22:06:05.080635 2762 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 12 22:06:05.081118 kubelet[2762]: I0912 22:06:05.081099 2762 
server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 12 22:06:05.081699 kubelet[2762]: I0912 22:06:05.081680 2762 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 12 22:06:05.085274 kubelet[2762]: I0912 22:06:05.084499 2762 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 12 22:06:05.085274 kubelet[2762]: E0912 22:06:05.085017 2762 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4459-0-0-7-af931fdd93\" not found" Sep 12 22:06:05.090936 kubelet[2762]: I0912 22:06:05.090891 2762 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 12 22:06:05.091522 kubelet[2762]: I0912 22:06:05.091062 2762 reconciler.go:26] "Reconciler: start to sync state" Sep 12 22:06:05.096894 kubelet[2762]: I0912 22:06:05.096836 2762 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 12 22:06:05.101726 kubelet[2762]: I0912 22:06:05.101664 2762 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Sep 12 22:06:05.101726 kubelet[2762]: I0912 22:06:05.101721 2762 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 12 22:06:05.101894 kubelet[2762]: I0912 22:06:05.101749 2762 kubelet.go:2321] "Starting kubelet main sync loop" Sep 12 22:06:05.101894 kubelet[2762]: E0912 22:06:05.101820 2762 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 12 22:06:05.103046 kubelet[2762]: I0912 22:06:05.102822 2762 factory.go:221] Registration of the systemd container factory successfully Sep 12 22:06:05.103046 kubelet[2762]: I0912 22:06:05.102953 2762 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 12 22:06:05.110104 kubelet[2762]: E0912 22:06:05.110062 2762 kubelet.go:1478] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 12 22:06:05.110737 kubelet[2762]: I0912 22:06:05.110636 2762 factory.go:221] Registration of the containerd container factory successfully Sep 12 22:06:05.189251 kubelet[2762]: I0912 22:06:05.189000 2762 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 12 22:06:05.189251 kubelet[2762]: I0912 22:06:05.189020 2762 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 12 22:06:05.189251 kubelet[2762]: I0912 22:06:05.189045 2762 state_mem.go:36] "Initialized new in-memory state store" Sep 12 22:06:05.189251 kubelet[2762]: I0912 22:06:05.189205 2762 state_mem.go:88] "Updated default CPUSet" cpuSet="" Sep 12 22:06:05.189251 kubelet[2762]: I0912 22:06:05.189215 2762 state_mem.go:96] "Updated CPUSet assignments" assignments={} Sep 12 22:06:05.189251 kubelet[2762]: I0912 22:06:05.189234 2762 policy_none.go:49] "None policy: Start" Sep 12 22:06:05.193278 kubelet[2762]: I0912 22:06:05.193154 2762 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 12 22:06:05.193278 kubelet[2762]: I0912 22:06:05.193198 2762 state_mem.go:35] "Initializing new in-memory state store" Sep 12 22:06:05.193484 kubelet[2762]: I0912 22:06:05.193394 2762 state_mem.go:75] "Updated machine memory state" Sep 12 22:06:05.199222 kubelet[2762]: I0912 22:06:05.199159 2762 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 12 22:06:05.199389 kubelet[2762]: I0912 22:06:05.199358 2762 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 12 22:06:05.199438 kubelet[2762]: I0912 22:06:05.199378 2762 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 12 22:06:05.199918 kubelet[2762]: I0912 22:06:05.199841 2762 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 12 22:06:05.321691 kubelet[2762]: I0912 22:06:05.321638 2762 
kubelet_node_status.go:72] "Attempting to register node" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.337736 kubelet[2762]: I0912 22:06:05.337692 2762 kubelet_node_status.go:111] "Node was previously registered" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.337968 kubelet[2762]: I0912 22:06:05.337831 2762 kubelet_node_status.go:75] "Successfully registered node" node="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.392869 kubelet[2762]: I0912 22:06:05.392103 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-kubeconfig\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.392869 kubelet[2762]: I0912 22:06:05.392207 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2a23d7d786da800d9f209be849dd96ab-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4459-0-0-7-af931fdd93\" (UID: \"2a23d7d786da800d9f209be849dd96ab\") " pod="kube-system/kube-apiserver-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.392869 kubelet[2762]: I0912 22:06:05.392231 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-ca-certs\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.392869 kubelet[2762]: I0912 22:06:05.392250 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2a23d7d786da800d9f209be849dd96ab-k8s-certs\") pod 
\"kube-apiserver-ci-4459-0-0-7-af931fdd93\" (UID: \"2a23d7d786da800d9f209be849dd96ab\") " pod="kube-system/kube-apiserver-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.392869 kubelet[2762]: I0912 22:06:05.392269 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-flexvolume-dir\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.393205 kubelet[2762]: I0912 22:06:05.392302 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-k8s-certs\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.393205 kubelet[2762]: I0912 22:06:05.392379 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/45e885bf8c6495be86a54078ab8820fd-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4459-0-0-7-af931fdd93\" (UID: \"45e885bf8c6495be86a54078ab8820fd\") " pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.393205 kubelet[2762]: I0912 22:06:05.392462 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/6c1293a01faba4bf41d0b8716a666214-kubeconfig\") pod \"kube-scheduler-ci-4459-0-0-7-af931fdd93\" (UID: \"6c1293a01faba4bf41d0b8716a666214\") " pod="kube-system/kube-scheduler-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:05.393205 kubelet[2762]: I0912 22:06:05.392503 2762 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2a23d7d786da800d9f209be849dd96ab-ca-certs\") pod \"kube-apiserver-ci-4459-0-0-7-af931fdd93\" (UID: \"2a23d7d786da800d9f209be849dd96ab\") " pod="kube-system/kube-apiserver-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:06.039922 kubelet[2762]: I0912 22:06:06.039841 2762 apiserver.go:52] "Watching apiserver" Sep 12 22:06:06.091953 kubelet[2762]: I0912 22:06:06.091901 2762 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 12 22:06:06.188628 kubelet[2762]: E0912 22:06:06.187881 2762 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4459-0-0-7-af931fdd93\" already exists" pod="kube-system/kube-apiserver-ci-4459-0-0-7-af931fdd93" Sep 12 22:06:06.231583 kubelet[2762]: I0912 22:06:06.230615 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4459-0-0-7-af931fdd93" podStartSLOduration=1.230588431 podStartE2EDuration="1.230588431s" podCreationTimestamp="2025-09-12 22:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:06:06.215987927 +0000 UTC m=+1.281182718" watchObservedRunningTime="2025-09-12 22:06:06.230588431 +0000 UTC m=+1.295783022" Sep 12 22:06:06.231583 kubelet[2762]: I0912 22:06:06.230974 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4459-0-0-7-af931fdd93" podStartSLOduration=1.230960952 podStartE2EDuration="1.230960952s" podCreationTimestamp="2025-09-12 22:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:06:06.22971355 +0000 UTC m=+1.294908141" watchObservedRunningTime="2025-09-12 22:06:06.230960952 +0000 UTC m=+1.296155543" Sep 12 
22:06:06.256708 kubelet[2762]: I0912 22:06:06.256569 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4459-0-0-7-af931fdd93" podStartSLOduration=1.256549274 podStartE2EDuration="1.256549274s" podCreationTimestamp="2025-09-12 22:06:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:06:06.243231572 +0000 UTC m=+1.308426243" watchObservedRunningTime="2025-09-12 22:06:06.256549274 +0000 UTC m=+1.321743785" Sep 12 22:06:09.448978 kubelet[2762]: I0912 22:06:09.448817 2762 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Sep 12 22:06:09.449311 containerd[1553]: time="2025-09-12T22:06:09.449268024Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Sep 12 22:06:09.449718 kubelet[2762]: I0912 22:06:09.449690 2762 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Sep 12 22:06:10.221266 systemd[1]: Created slice kubepods-besteffort-podf551f19c_0d8f_4784_84ad_34549e379f6d.slice - libcontainer container kubepods-besteffort-podf551f19c_0d8f_4784_84ad_34549e379f6d.slice. 
Sep 12 22:06:10.328244 kubelet[2762]: I0912 22:06:10.328055 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/f551f19c-0d8f-4784-84ad-34549e379f6d-kube-proxy\") pod \"kube-proxy-4cmck\" (UID: \"f551f19c-0d8f-4784-84ad-34549e379f6d\") " pod="kube-system/kube-proxy-4cmck" Sep 12 22:06:10.328244 kubelet[2762]: I0912 22:06:10.328119 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f551f19c-0d8f-4784-84ad-34549e379f6d-lib-modules\") pod \"kube-proxy-4cmck\" (UID: \"f551f19c-0d8f-4784-84ad-34549e379f6d\") " pod="kube-system/kube-proxy-4cmck" Sep 12 22:06:10.328244 kubelet[2762]: I0912 22:06:10.328166 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f551f19c-0d8f-4784-84ad-34549e379f6d-xtables-lock\") pod \"kube-proxy-4cmck\" (UID: \"f551f19c-0d8f-4784-84ad-34549e379f6d\") " pod="kube-system/kube-proxy-4cmck" Sep 12 22:06:10.328244 kubelet[2762]: I0912 22:06:10.328215 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2467s\" (UniqueName: \"kubernetes.io/projected/f551f19c-0d8f-4784-84ad-34549e379f6d-kube-api-access-2467s\") pod \"kube-proxy-4cmck\" (UID: \"f551f19c-0d8f-4784-84ad-34549e379f6d\") " pod="kube-system/kube-proxy-4cmck" Sep 12 22:06:10.536306 containerd[1553]: time="2025-09-12T22:06:10.535851774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4cmck,Uid:f551f19c-0d8f-4784-84ad-34549e379f6d,Namespace:kube-system,Attempt:0,}" Sep 12 22:06:10.562825 containerd[1553]: time="2025-09-12T22:06:10.562735689Z" level=info msg="connecting to shim 476992831a148122a8d52ead1ac1c7ad80919efde55f241476c727991779c80c" 
address="unix:///run/containerd/s/e21d35b2e9ef967210ce2186b71efc4b0e23a4afde65b3f4688fd53fd678815d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:10.625149 systemd[1]: Started cri-containerd-476992831a148122a8d52ead1ac1c7ad80919efde55f241476c727991779c80c.scope - libcontainer container 476992831a148122a8d52ead1ac1c7ad80919efde55f241476c727991779c80c. Sep 12 22:06:10.665040 systemd[1]: Created slice kubepods-besteffort-poddeaf3cbf_f0a6_4c15_8592_599383b9970b.slice - libcontainer container kubepods-besteffort-poddeaf3cbf_f0a6_4c15_8592_599383b9970b.slice. Sep 12 22:06:10.719157 containerd[1553]: time="2025-09-12T22:06:10.718949688Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4cmck,Uid:f551f19c-0d8f-4784-84ad-34549e379f6d,Namespace:kube-system,Attempt:0,} returns sandbox id \"476992831a148122a8d52ead1ac1c7ad80919efde55f241476c727991779c80c\"" Sep 12 22:06:10.724261 containerd[1553]: time="2025-09-12T22:06:10.724214454Z" level=info msg="CreateContainer within sandbox \"476992831a148122a8d52ead1ac1c7ad80919efde55f241476c727991779c80c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Sep 12 22:06:10.742547 containerd[1553]: time="2025-09-12T22:06:10.739563794Z" level=info msg="Container 521817728fa5f423d33b64077b86d30a25516c74b5dcfed543eb511e986478b6: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:10.754540 containerd[1553]: time="2025-09-12T22:06:10.754464253Z" level=info msg="CreateContainer within sandbox \"476992831a148122a8d52ead1ac1c7ad80919efde55f241476c727991779c80c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"521817728fa5f423d33b64077b86d30a25516c74b5dcfed543eb511e986478b6\"" Sep 12 22:06:10.757540 containerd[1553]: time="2025-09-12T22:06:10.757242496Z" level=info msg="StartContainer for \"521817728fa5f423d33b64077b86d30a25516c74b5dcfed543eb511e986478b6\"" Sep 12 22:06:10.760297 containerd[1553]: time="2025-09-12T22:06:10.760213020Z" level=info msg="connecting to shim 
521817728fa5f423d33b64077b86d30a25516c74b5dcfed543eb511e986478b6" address="unix:///run/containerd/s/e21d35b2e9ef967210ce2186b71efc4b0e23a4afde65b3f4688fd53fd678815d" protocol=ttrpc version=3 Sep 12 22:06:10.785879 systemd[1]: Started cri-containerd-521817728fa5f423d33b64077b86d30a25516c74b5dcfed543eb511e986478b6.scope - libcontainer container 521817728fa5f423d33b64077b86d30a25516c74b5dcfed543eb511e986478b6. Sep 12 22:06:10.832373 kubelet[2762]: I0912 22:06:10.831924 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/deaf3cbf-f0a6-4c15-8592-599383b9970b-var-lib-calico\") pod \"tigera-operator-58fc44c59b-c6hbt\" (UID: \"deaf3cbf-f0a6-4c15-8592-599383b9970b\") " pod="tigera-operator/tigera-operator-58fc44c59b-c6hbt" Sep 12 22:06:10.832373 kubelet[2762]: I0912 22:06:10.831974 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hkn7\" (UniqueName: \"kubernetes.io/projected/deaf3cbf-f0a6-4c15-8592-599383b9970b-kube-api-access-8hkn7\") pod \"tigera-operator-58fc44c59b-c6hbt\" (UID: \"deaf3cbf-f0a6-4c15-8592-599383b9970b\") " pod="tigera-operator/tigera-operator-58fc44c59b-c6hbt" Sep 12 22:06:10.840496 containerd[1553]: time="2025-09-12T22:06:10.840424802Z" level=info msg="StartContainer for \"521817728fa5f423d33b64077b86d30a25516c74b5dcfed543eb511e986478b6\" returns successfully" Sep 12 22:06:10.969878 containerd[1553]: time="2025-09-12T22:06:10.969749207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-c6hbt,Uid:deaf3cbf-f0a6-4c15-8592-599383b9970b,Namespace:tigera-operator,Attempt:0,}" Sep 12 22:06:10.998471 containerd[1553]: time="2025-09-12T22:06:10.998425004Z" level=info msg="connecting to shim 1f137cc6a3be46ad2ee82ab8a28094c071d82374834461be2eab8e9c2b7517dc" address="unix:///run/containerd/s/2a961ffc7e899f20d0eec4483630e17387a1621a472a8c1b4ad1d79aa2ffe269" 
namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:11.044015 systemd[1]: Started cri-containerd-1f137cc6a3be46ad2ee82ab8a28094c071d82374834461be2eab8e9c2b7517dc.scope - libcontainer container 1f137cc6a3be46ad2ee82ab8a28094c071d82374834461be2eab8e9c2b7517dc. Sep 12 22:06:11.092384 containerd[1553]: time="2025-09-12T22:06:11.092206516Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-c6hbt,Uid:deaf3cbf-f0a6-4c15-8592-599383b9970b,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"1f137cc6a3be46ad2ee82ab8a28094c071d82374834461be2eab8e9c2b7517dc\"" Sep 12 22:06:11.096702 containerd[1553]: time="2025-09-12T22:06:11.096554761Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\"" Sep 12 22:06:11.763671 kubelet[2762]: I0912 22:06:11.763562 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-4cmck" podStartSLOduration=1.762826556 podStartE2EDuration="1.762826556s" podCreationTimestamp="2025-09-12 22:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:06:11.215977023 +0000 UTC m=+6.281171574" watchObservedRunningTime="2025-09-12 22:06:11.762826556 +0000 UTC m=+6.828021107" Sep 12 22:06:13.038897 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2785339197.mount: Deactivated successfully. 
Sep 12 22:06:13.541821 containerd[1553]: time="2025-09-12T22:06:13.541725167Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:13.543241 containerd[1553]: time="2025-09-12T22:06:13.543200689Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365" Sep 12 22:06:13.543858 containerd[1553]: time="2025-09-12T22:06:13.543816249Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:13.547563 containerd[1553]: time="2025-09-12T22:06:13.547489573Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:13.548389 containerd[1553]: time="2025-09-12T22:06:13.548322374Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.451724373s" Sep 12 22:06:13.548389 containerd[1553]: time="2025-09-12T22:06:13.548367174Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\"" Sep 12 22:06:13.554457 containerd[1553]: time="2025-09-12T22:06:13.554380500Z" level=info msg="CreateContainer within sandbox \"1f137cc6a3be46ad2ee82ab8a28094c071d82374834461be2eab8e9c2b7517dc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Sep 12 22:06:13.569042 containerd[1553]: time="2025-09-12T22:06:13.568415635Z" level=info msg="Container 
a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:13.583700 containerd[1553]: time="2025-09-12T22:06:13.583654211Z" level=info msg="CreateContainer within sandbox \"1f137cc6a3be46ad2ee82ab8a28094c071d82374834461be2eab8e9c2b7517dc\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17\"" Sep 12 22:06:13.585021 containerd[1553]: time="2025-09-12T22:06:13.584871492Z" level=info msg="StartContainer for \"a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17\"" Sep 12 22:06:13.586739 containerd[1553]: time="2025-09-12T22:06:13.586658214Z" level=info msg="connecting to shim a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17" address="unix:///run/containerd/s/2a961ffc7e899f20d0eec4483630e17387a1621a472a8c1b4ad1d79aa2ffe269" protocol=ttrpc version=3 Sep 12 22:06:13.605712 systemd[1]: Started cri-containerd-a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17.scope - libcontainer container a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17. 
Sep 12 22:06:13.639769 containerd[1553]: time="2025-09-12T22:06:13.639694630Z" level=info msg="StartContainer for \"a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17\" returns successfully" Sep 12 22:06:14.230484 kubelet[2762]: I0912 22:06:14.230350 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-c6hbt" podStartSLOduration=1.7754502570000001 podStartE2EDuration="4.230333434s" podCreationTimestamp="2025-09-12 22:06:10 +0000 UTC" firstStartedPulling="2025-09-12 22:06:11.094544158 +0000 UTC m=+6.159738709" lastFinishedPulling="2025-09-12 22:06:13.549427295 +0000 UTC m=+8.614621886" observedRunningTime="2025-09-12 22:06:14.229628594 +0000 UTC m=+9.294823105" watchObservedRunningTime="2025-09-12 22:06:14.230333434 +0000 UTC m=+9.295527985" Sep 12 22:06:18.291700 sudo[1840]: pam_unix(sudo:session): session closed for user root Sep 12 22:06:18.450632 sshd[1839]: Connection closed by 139.178.68.195 port 52412 Sep 12 22:06:18.452258 sshd-session[1836]: pam_unix(sshd:session): session closed for user core Sep 12 22:06:18.463928 systemd[1]: sshd@6-168.119.157.2:22-139.178.68.195:52412.service: Deactivated successfully. Sep 12 22:06:18.466978 systemd[1]: session-7.scope: Deactivated successfully. Sep 12 22:06:18.467384 systemd[1]: session-7.scope: Consumed 7.401s CPU time, 217.9M memory peak. Sep 12 22:06:18.469040 systemd-logind[1527]: Session 7 logged out. Waiting for processes to exit. Sep 12 22:06:18.472483 systemd-logind[1527]: Removed session 7. Sep 12 22:06:26.717338 systemd[1]: Created slice kubepods-besteffort-pod2ae554ef_9fb0_4756_ae14_945cfd4fb7d6.slice - libcontainer container kubepods-besteffort-pod2ae554ef_9fb0_4756_ae14_945cfd4fb7d6.slice. 
Sep 12 22:06:26.840952 kubelet[2762]: I0912 22:06:26.840828 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/2ae554ef-9fb0-4756-ae14-945cfd4fb7d6-typha-certs\") pod \"calico-typha-6869b6b474-lmq6x\" (UID: \"2ae554ef-9fb0-4756-ae14-945cfd4fb7d6\") " pod="calico-system/calico-typha-6869b6b474-lmq6x" Sep 12 22:06:26.840952 kubelet[2762]: I0912 22:06:26.840878 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2ae554ef-9fb0-4756-ae14-945cfd4fb7d6-tigera-ca-bundle\") pod \"calico-typha-6869b6b474-lmq6x\" (UID: \"2ae554ef-9fb0-4756-ae14-945cfd4fb7d6\") " pod="calico-system/calico-typha-6869b6b474-lmq6x" Sep 12 22:06:26.840952 kubelet[2762]: I0912 22:06:26.840902 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfhz9\" (UniqueName: \"kubernetes.io/projected/2ae554ef-9fb0-4756-ae14-945cfd4fb7d6-kube-api-access-bfhz9\") pod \"calico-typha-6869b6b474-lmq6x\" (UID: \"2ae554ef-9fb0-4756-ae14-945cfd4fb7d6\") " pod="calico-system/calico-typha-6869b6b474-lmq6x" Sep 12 22:06:26.942051 systemd[1]: Created slice kubepods-besteffort-pod3ed15555_cc83_4988_826c_acf1ebdd2e6f.slice - libcontainer container kubepods-besteffort-pod3ed15555_cc83_4988_826c_acf1ebdd2e6f.slice. 
Sep 12 22:06:27.024472 containerd[1553]: time="2025-09-12T22:06:27.024198532Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6869b6b474-lmq6x,Uid:2ae554ef-9fb0-4756-ae14-945cfd4fb7d6,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:27.045540 kubelet[2762]: I0912 22:06:27.042773 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3ed15555-cc83-4988-826c-acf1ebdd2e6f-tigera-ca-bundle\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045540 kubelet[2762]: I0912 22:06:27.042842 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-var-lib-calico\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045540 kubelet[2762]: I0912 22:06:27.042863 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-cni-bin-dir\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045540 kubelet[2762]: I0912 22:06:27.042884 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-policysync\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045540 kubelet[2762]: I0912 22:06:27.042933 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: 
\"kubernetes.io/secret/3ed15555-cc83-4988-826c-acf1ebdd2e6f-node-certs\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045765 kubelet[2762]: I0912 22:06:27.042966 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-cni-log-dir\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045765 kubelet[2762]: I0912 22:06:27.042984 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-lib-modules\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045765 kubelet[2762]: I0912 22:06:27.043003 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-cni-net-dir\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045765 kubelet[2762]: I0912 22:06:27.043034 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9zmf\" (UniqueName: \"kubernetes.io/projected/3ed15555-cc83-4988-826c-acf1ebdd2e6f-kube-api-access-w9zmf\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045765 kubelet[2762]: I0912 22:06:27.043055 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-flexvol-driver-host\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045872 kubelet[2762]: I0912 22:06:27.043071 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-var-run-calico\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.045872 kubelet[2762]: I0912 22:06:27.043095 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3ed15555-cc83-4988-826c-acf1ebdd2e6f-xtables-lock\") pod \"calico-node-l74dx\" (UID: \"3ed15555-cc83-4988-826c-acf1ebdd2e6f\") " pod="calico-system/calico-node-l74dx" Sep 12 22:06:27.060673 containerd[1553]: time="2025-09-12T22:06:27.060536486Z" level=info msg="connecting to shim f48ef1fa9a14f6822bc88c6f6b204072f861ffd55e201817a3b4632707f2c9ed" address="unix:///run/containerd/s/3a16165c6f7f2de385ba9c36e854dc5419d8cccf0f5322f1c4fe707ffb14f263" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:27.096267 kubelet[2762]: E0912 22:06:27.096046 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mr7s6" podUID="9c8a5f44-6d52-4bd2-b382-1b9dd40eab85" Sep 12 22:06:27.123778 systemd[1]: Started cri-containerd-f48ef1fa9a14f6822bc88c6f6b204072f861ffd55e201817a3b4632707f2c9ed.scope - libcontainer container f48ef1fa9a14f6822bc88c6f6b204072f861ffd55e201817a3b4632707f2c9ed. 
Sep 12 22:06:27.147553 kubelet[2762]: E0912 22:06:27.146840 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.147817 kubelet[2762]: W0912 22:06:27.147646 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.147939 kubelet[2762]: E0912 22:06:27.147688 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.152676 kubelet[2762]: E0912 22:06:27.152645 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.152937 kubelet[2762]: W0912 22:06:27.152766 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.153273 kubelet[2762]: E0912 22:06:27.153244 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.153680 kubelet[2762]: E0912 22:06:27.153354 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.153680 kubelet[2762]: W0912 22:06:27.153366 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.153680 kubelet[2762]: E0912 22:06:27.153399 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.154852 kubelet[2762]: E0912 22:06:27.154785 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.154852 kubelet[2762]: W0912 22:06:27.154805 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.154957 kubelet[2762]: E0912 22:06:27.154853 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.157055 kubelet[2762]: E0912 22:06:27.156734 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.157444 kubelet[2762]: W0912 22:06:27.157229 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.158370 kubelet[2762]: E0912 22:06:27.158281 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.158781 kubelet[2762]: E0912 22:06:27.158672 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.159156 kubelet[2762]: W0912 22:06:27.159131 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.159776 kubelet[2762]: E0912 22:06:27.159485 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.160980 kubelet[2762]: E0912 22:06:27.160910 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.161140 kubelet[2762]: W0912 22:06:27.161070 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.161140 kubelet[2762]: E0912 22:06:27.161116 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.162852 kubelet[2762]: E0912 22:06:27.162825 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.163034 kubelet[2762]: W0912 22:06:27.162963 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.163034 kubelet[2762]: E0912 22:06:27.163004 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.163380 kubelet[2762]: E0912 22:06:27.163350 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.163380 kubelet[2762]: W0912 22:06:27.163373 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.163464 kubelet[2762]: E0912 22:06:27.163402 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.191632 kubelet[2762]: E0912 22:06:27.191582 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.191632 kubelet[2762]: W0912 22:06:27.191611 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.191632 kubelet[2762]: E0912 22:06:27.191631 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.242055 containerd[1553]: time="2025-09-12T22:06:27.240890441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6869b6b474-lmq6x,Uid:2ae554ef-9fb0-4756-ae14-945cfd4fb7d6,Namespace:calico-system,Attempt:0,} returns sandbox id \"f48ef1fa9a14f6822bc88c6f6b204072f861ffd55e201817a3b4632707f2c9ed\"" Sep 12 22:06:27.245014 kubelet[2762]: E0912 22:06:27.244990 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.245286 kubelet[2762]: W0912 22:06:27.245126 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.245286 kubelet[2762]: E0912 22:06:27.245165 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.245286 kubelet[2762]: I0912 22:06:27.245199 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c8a5f44-6d52-4bd2-b382-1b9dd40eab85-kubelet-dir\") pod \"csi-node-driver-mr7s6\" (UID: \"9c8a5f44-6d52-4bd2-b382-1b9dd40eab85\") " pod="calico-system/csi-node-driver-mr7s6" Sep 12 22:06:27.245453 containerd[1553]: time="2025-09-12T22:06:27.245393975Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 12 22:06:27.246078 kubelet[2762]: E0912 22:06:27.246057 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.246078 kubelet[2762]: W0912 22:06:27.246103 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.246078 kubelet[2762]: E0912 22:06:27.246133 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.246078 kubelet[2762]: I0912 22:06:27.246155 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9c8a5f44-6d52-4bd2-b382-1b9dd40eab85-socket-dir\") pod \"csi-node-driver-mr7s6\" (UID: \"9c8a5f44-6d52-4bd2-b382-1b9dd40eab85\") " pod="calico-system/csi-node-driver-mr7s6" Sep 12 22:06:27.246489 kubelet[2762]: E0912 22:06:27.246440 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.246489 kubelet[2762]: W0912 22:06:27.246481 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.246604 kubelet[2762]: E0912 22:06:27.246502 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.246771 kubelet[2762]: E0912 22:06:27.246751 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.246771 kubelet[2762]: W0912 22:06:27.246769 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.246874 kubelet[2762]: E0912 22:06:27.246803 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.247114 kubelet[2762]: E0912 22:06:27.247088 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.247162 kubelet[2762]: W0912 22:06:27.247109 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.247162 kubelet[2762]: E0912 22:06:27.247138 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.247209 kubelet[2762]: I0912 22:06:27.247167 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/9c8a5f44-6d52-4bd2-b382-1b9dd40eab85-varrun\") pod \"csi-node-driver-mr7s6\" (UID: \"9c8a5f44-6d52-4bd2-b382-1b9dd40eab85\") " pod="calico-system/csi-node-driver-mr7s6" Sep 12 22:06:27.249002 kubelet[2762]: E0912 22:06:27.248965 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.249204 kubelet[2762]: W0912 22:06:27.248995 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.249204 kubelet[2762]: E0912 22:06:27.249054 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.249204 kubelet[2762]: I0912 22:06:27.249097 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wckdk\" (UniqueName: \"kubernetes.io/projected/9c8a5f44-6d52-4bd2-b382-1b9dd40eab85-kube-api-access-wckdk\") pod \"csi-node-driver-mr7s6\" (UID: \"9c8a5f44-6d52-4bd2-b382-1b9dd40eab85\") " pod="calico-system/csi-node-driver-mr7s6" Sep 12 22:06:27.249285 kubelet[2762]: E0912 22:06:27.249235 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.249285 kubelet[2762]: W0912 22:06:27.249243 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.249285 kubelet[2762]: E0912 22:06:27.249277 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.249661 kubelet[2762]: E0912 22:06:27.249408 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.249661 kubelet[2762]: W0912 22:06:27.249416 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.249661 kubelet[2762]: E0912 22:06:27.249462 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.250043 kubelet[2762]: E0912 22:06:27.249752 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.250043 kubelet[2762]: W0912 22:06:27.250015 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.250043 kubelet[2762]: E0912 22:06:27.250044 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.250132 kubelet[2762]: I0912 22:06:27.250070 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9c8a5f44-6d52-4bd2-b382-1b9dd40eab85-registration-dir\") pod \"csi-node-driver-mr7s6\" (UID: \"9c8a5f44-6d52-4bd2-b382-1b9dd40eab85\") " pod="calico-system/csi-node-driver-mr7s6" Sep 12 22:06:27.250383 kubelet[2762]: E0912 22:06:27.250337 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.250383 kubelet[2762]: W0912 22:06:27.250356 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.250383 kubelet[2762]: E0912 22:06:27.250376 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.250667 kubelet[2762]: E0912 22:06:27.250593 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.250667 kubelet[2762]: W0912 22:06:27.250607 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.250667 kubelet[2762]: E0912 22:06:27.250619 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.251797 kubelet[2762]: E0912 22:06:27.251767 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.251797 kubelet[2762]: W0912 22:06:27.251795 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.251856 kubelet[2762]: E0912 22:06:27.251818 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.252130 kubelet[2762]: E0912 22:06:27.252065 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.252130 kubelet[2762]: W0912 22:06:27.252093 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.252130 kubelet[2762]: E0912 22:06:27.252110 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.252453 kubelet[2762]: E0912 22:06:27.252384 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.252453 kubelet[2762]: W0912 22:06:27.252411 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.252453 kubelet[2762]: E0912 22:06:27.252421 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.252914 kubelet[2762]: E0912 22:06:27.252884 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.252914 kubelet[2762]: W0912 22:06:27.252906 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.253026 kubelet[2762]: E0912 22:06:27.252929 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.258376 containerd[1553]: time="2025-09-12T22:06:27.258331850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-l74dx,Uid:3ed15555-cc83-4988-826c-acf1ebdd2e6f,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:27.294951 containerd[1553]: time="2025-09-12T22:06:27.292757021Z" level=info msg="connecting to shim 63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f" address="unix:///run/containerd/s/88dffdc194bb573f8c9de89ba67c9ad2c2f61a0f46fecb9d97eb61215578417d" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:27.332067 systemd[1]: Started cri-containerd-63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f.scope - libcontainer container 63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f. 
Sep 12 22:06:27.354008 kubelet[2762]: E0912 22:06:27.353953 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.354008 kubelet[2762]: W0912 22:06:27.353996 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.354008 kubelet[2762]: E0912 22:06:27.354017 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.354580 kubelet[2762]: E0912 22:06:27.354551 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.354580 kubelet[2762]: W0912 22:06:27.354578 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.354873 kubelet[2762]: E0912 22:06:27.354848 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.354873 kubelet[2762]: W0912 22:06:27.354867 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.354937 kubelet[2762]: E0912 22:06:27.354888 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.355617 kubelet[2762]: E0912 22:06:27.355589 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.355689 kubelet[2762]: E0912 22:06:27.355620 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.355689 kubelet[2762]: W0912 22:06:27.355666 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.355907 kubelet[2762]: E0912 22:06:27.355835 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.355907 kubelet[2762]: W0912 22:06:27.355850 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.355907 kubelet[2762]: E0912 22:06:27.355860 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.356027 kubelet[2762]: E0912 22:06:27.356011 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.356027 kubelet[2762]: W0912 22:06:27.356023 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.356073 kubelet[2762]: E0912 22:06:27.356031 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.356073 kubelet[2762]: E0912 22:06:27.356069 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.357655 kubelet[2762]: E0912 22:06:27.357600 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.357655 kubelet[2762]: W0912 22:06:27.357646 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.357819 kubelet[2762]: E0912 22:06:27.357671 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.357962 kubelet[2762]: E0912 22:06:27.357938 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.357962 kubelet[2762]: W0912 22:06:27.357955 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.358015 kubelet[2762]: E0912 22:06:27.357986 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.358261 kubelet[2762]: E0912 22:06:27.358234 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.358261 kubelet[2762]: W0912 22:06:27.358250 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.358261 kubelet[2762]: E0912 22:06:27.358263 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.358582 kubelet[2762]: E0912 22:06:27.358495 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.358582 kubelet[2762]: W0912 22:06:27.358561 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.358582 kubelet[2762]: E0912 22:06:27.358577 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.358768 kubelet[2762]: E0912 22:06:27.358749 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.358768 kubelet[2762]: W0912 22:06:27.358762 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.358826 kubelet[2762]: E0912 22:06:27.358802 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.359154 kubelet[2762]: E0912 22:06:27.359126 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.359154 kubelet[2762]: W0912 22:06:27.359143 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.359272 kubelet[2762]: E0912 22:06:27.359253 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.359797 kubelet[2762]: E0912 22:06:27.359767 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.359797 kubelet[2762]: W0912 22:06:27.359792 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.359933 kubelet[2762]: E0912 22:06:27.359924 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.360817 kubelet[2762]: E0912 22:06:27.360786 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.360817 kubelet[2762]: W0912 22:06:27.360807 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.360931 kubelet[2762]: E0912 22:06:27.360893 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.361001 kubelet[2762]: E0912 22:06:27.360984 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.361001 kubelet[2762]: W0912 22:06:27.360995 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.361066 kubelet[2762]: E0912 22:06:27.361042 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.361166 kubelet[2762]: E0912 22:06:27.361149 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.361166 kubelet[2762]: W0912 22:06:27.361160 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.361450 kubelet[2762]: E0912 22:06:27.361422 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.361560 kubelet[2762]: E0912 22:06:27.361538 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.361560 kubelet[2762]: W0912 22:06:27.361556 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.361760 kubelet[2762]: E0912 22:06:27.361735 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.362101 kubelet[2762]: E0912 22:06:27.362073 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.362101 kubelet[2762]: W0912 22:06:27.362090 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.362228 kubelet[2762]: E0912 22:06:27.362209 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.363075 kubelet[2762]: E0912 22:06:27.363008 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.363075 kubelet[2762]: W0912 22:06:27.363026 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.363174 kubelet[2762]: E0912 22:06:27.363086 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.363475 kubelet[2762]: E0912 22:06:27.363434 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.363475 kubelet[2762]: W0912 22:06:27.363451 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.364533 kubelet[2762]: E0912 22:06:27.363578 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.364533 kubelet[2762]: E0912 22:06:27.363933 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.364533 kubelet[2762]: W0912 22:06:27.363948 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.364533 kubelet[2762]: E0912 22:06:27.363964 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.364533 kubelet[2762]: E0912 22:06:27.364322 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.364533 kubelet[2762]: W0912 22:06:27.364334 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.364533 kubelet[2762]: E0912 22:06:27.364352 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.365455 kubelet[2762]: E0912 22:06:27.365416 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.365455 kubelet[2762]: W0912 22:06:27.365439 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.365455 kubelet[2762]: E0912 22:06:27.365461 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.365696 kubelet[2762]: E0912 22:06:27.365677 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.365696 kubelet[2762]: W0912 22:06:27.365691 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.365791 kubelet[2762]: E0912 22:06:27.365765 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.366279 kubelet[2762]: E0912 22:06:27.366248 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.366279 kubelet[2762]: W0912 22:06:27.366266 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.366279 kubelet[2762]: E0912 22:06:27.366278 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:27.387481 kubelet[2762]: E0912 22:06:27.387439 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:27.387481 kubelet[2762]: W0912 22:06:27.387469 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:27.387481 kubelet[2762]: E0912 22:06:27.387490 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:27.484538 containerd[1553]: time="2025-09-12T22:06:27.484140228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-l74dx,Uid:3ed15555-cc83-4988-826c-acf1ebdd2e6f,Namespace:calico-system,Attempt:0,} returns sandbox id \"63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f\"" Sep 12 22:06:28.624274 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount513720626.mount: Deactivated successfully. 
Sep 12 22:06:29.103304 kubelet[2762]: E0912 22:06:29.103066 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mr7s6" podUID="9c8a5f44-6d52-4bd2-b382-1b9dd40eab85" Sep 12 22:06:29.123787 containerd[1553]: time="2025-09-12T22:06:29.123727930Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:29.125132 containerd[1553]: time="2025-09-12T22:06:29.125093225Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 12 22:06:29.126250 containerd[1553]: time="2025-09-12T22:06:29.126211638Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:29.129339 containerd[1553]: time="2025-09-12T22:06:29.129296833Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:29.131549 containerd[1553]: time="2025-09-12T22:06:29.130317044Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 1.884876708s" Sep 12 22:06:29.131549 containerd[1553]: time="2025-09-12T22:06:29.130352885Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference 
\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 12 22:06:29.133923 containerd[1553]: time="2025-09-12T22:06:29.133601961Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 12 22:06:29.160957 containerd[1553]: time="2025-09-12T22:06:29.160884350Z" level=info msg="CreateContainer within sandbox \"f48ef1fa9a14f6822bc88c6f6b204072f861ffd55e201817a3b4632707f2c9ed\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 12 22:06:29.171541 containerd[1553]: time="2025-09-12T22:06:29.170727221Z" level=info msg="Container 3e9992f76e01cfe55a6cea4432defa4403ff3571eea1ae44d7ea40b8b9e1e987: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:29.183885 containerd[1553]: time="2025-09-12T22:06:29.183805529Z" level=info msg="CreateContainer within sandbox \"f48ef1fa9a14f6822bc88c6f6b204072f861ffd55e201817a3b4632707f2c9ed\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"3e9992f76e01cfe55a6cea4432defa4403ff3571eea1ae44d7ea40b8b9e1e987\"" Sep 12 22:06:29.185039 containerd[1553]: time="2025-09-12T22:06:29.184898941Z" level=info msg="StartContainer for \"3e9992f76e01cfe55a6cea4432defa4403ff3571eea1ae44d7ea40b8b9e1e987\"" Sep 12 22:06:29.188319 containerd[1553]: time="2025-09-12T22:06:29.187824894Z" level=info msg="connecting to shim 3e9992f76e01cfe55a6cea4432defa4403ff3571eea1ae44d7ea40b8b9e1e987" address="unix:///run/containerd/s/3a16165c6f7f2de385ba9c36e854dc5419d8cccf0f5322f1c4fe707ffb14f263" protocol=ttrpc version=3 Sep 12 22:06:29.216779 systemd[1]: Started cri-containerd-3e9992f76e01cfe55a6cea4432defa4403ff3571eea1ae44d7ea40b8b9e1e987.scope - libcontainer container 3e9992f76e01cfe55a6cea4432defa4403ff3571eea1ae44d7ea40b8b9e1e987. 
Sep 12 22:06:29.279683 containerd[1553]: time="2025-09-12T22:06:29.279638571Z" level=info msg="StartContainer for \"3e9992f76e01cfe55a6cea4432defa4403ff3571eea1ae44d7ea40b8b9e1e987\" returns successfully" Sep 12 22:06:30.151778 systemd[1]: Started sshd@7-168.119.157.2:22-196.251.114.29:51824.service - OpenSSH per-connection server daemon (196.251.114.29:51824). Sep 12 22:06:30.202701 sshd[3352]: Connection closed by 196.251.114.29 port 51824 Sep 12 22:06:30.204853 systemd[1]: sshd@7-168.119.157.2:22-196.251.114.29:51824.service: Deactivated successfully. Sep 12 22:06:30.269035 kubelet[2762]: E0912 22:06:30.268950 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.269035 kubelet[2762]: W0912 22:06:30.268982 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.269035 kubelet[2762]: E0912 22:06:30.269004 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.270239 kubelet[2762]: E0912 22:06:30.269793 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.270239 kubelet[2762]: W0912 22:06:30.269822 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.270239 kubelet[2762]: E0912 22:06:30.269853 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.270826 kubelet[2762]: E0912 22:06:30.270766 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.270826 kubelet[2762]: W0912 22:06:30.270786 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.271004 kubelet[2762]: E0912 22:06:30.270812 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.271295 kubelet[2762]: E0912 22:06:30.271281 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.271444 kubelet[2762]: W0912 22:06:30.271337 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.271444 kubelet[2762]: E0912 22:06:30.271353 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.271685 kubelet[2762]: E0912 22:06:30.271672 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.272041 kubelet[2762]: W0912 22:06:30.271789 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.272041 kubelet[2762]: E0912 22:06:30.271811 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.273417 kubelet[2762]: E0912 22:06:30.273246 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.274123 kubelet[2762]: W0912 22:06:30.274103 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.274502 kubelet[2762]: E0912 22:06:30.274171 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.274607 kubelet[2762]: E0912 22:06:30.274594 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.274787 kubelet[2762]: W0912 22:06:30.274667 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.274787 kubelet[2762]: E0912 22:06:30.274684 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.275044 kubelet[2762]: E0912 22:06:30.275029 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.275126 kubelet[2762]: W0912 22:06:30.275115 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.275197 kubelet[2762]: E0912 22:06:30.275177 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.275419 kubelet[2762]: E0912 22:06:30.275399 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.275522 kubelet[2762]: W0912 22:06:30.275411 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.275522 kubelet[2762]: E0912 22:06:30.275493 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.275841 kubelet[2762]: E0912 22:06:30.275780 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.275841 kubelet[2762]: W0912 22:06:30.275794 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.275841 kubelet[2762]: E0912 22:06:30.275805 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.276102 kubelet[2762]: E0912 22:06:30.276067 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.276102 kubelet[2762]: W0912 22:06:30.276079 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.276237 kubelet[2762]: E0912 22:06:30.276089 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.276381 kubelet[2762]: E0912 22:06:30.276358 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.276448 kubelet[2762]: W0912 22:06:30.276431 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.276634 kubelet[2762]: E0912 22:06:30.276503 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.276822 kubelet[2762]: E0912 22:06:30.276810 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.276984 kubelet[2762]: W0912 22:06:30.276854 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.276984 kubelet[2762]: E0912 22:06:30.276868 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.277170 kubelet[2762]: E0912 22:06:30.277155 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.277308 kubelet[2762]: W0912 22:06:30.277208 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.277308 kubelet[2762]: E0912 22:06:30.277221 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.277497 kubelet[2762]: E0912 22:06:30.277485 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.277634 kubelet[2762]: W0912 22:06:30.277529 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.277634 kubelet[2762]: E0912 22:06:30.277541 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.280048 kubelet[2762]: E0912 22:06:30.279904 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.280048 kubelet[2762]: W0912 22:06:30.279928 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.280048 kubelet[2762]: E0912 22:06:30.280002 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.280328 kubelet[2762]: E0912 22:06:30.280232 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.280328 kubelet[2762]: W0912 22:06:30.280242 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.280328 kubelet[2762]: E0912 22:06:30.280252 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.282780 kubelet[2762]: E0912 22:06:30.280428 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.282780 kubelet[2762]: W0912 22:06:30.280446 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.282780 kubelet[2762]: E0912 22:06:30.280456 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.282780 kubelet[2762]: E0912 22:06:30.280637 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.282780 kubelet[2762]: W0912 22:06:30.280645 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.282780 kubelet[2762]: E0912 22:06:30.280654 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.282780 kubelet[2762]: E0912 22:06:30.280837 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.282780 kubelet[2762]: W0912 22:06:30.280845 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.282780 kubelet[2762]: E0912 22:06:30.280853 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.282780 kubelet[2762]: E0912 22:06:30.281128 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.283590 kubelet[2762]: W0912 22:06:30.281139 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.283590 kubelet[2762]: E0912 22:06:30.281149 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.283590 kubelet[2762]: E0912 22:06:30.281354 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.283590 kubelet[2762]: W0912 22:06:30.281363 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.283590 kubelet[2762]: E0912 22:06:30.281377 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.283590 kubelet[2762]: E0912 22:06:30.282164 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.283590 kubelet[2762]: W0912 22:06:30.282193 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.283590 kubelet[2762]: E0912 22:06:30.282207 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.283590 kubelet[2762]: E0912 22:06:30.282400 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.283590 kubelet[2762]: W0912 22:06:30.282409 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.284007 kubelet[2762]: E0912 22:06:30.282438 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.284007 kubelet[2762]: E0912 22:06:30.282587 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.284007 kubelet[2762]: W0912 22:06:30.282595 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.284007 kubelet[2762]: E0912 22:06:30.282633 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.284007 kubelet[2762]: E0912 22:06:30.282835 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.284007 kubelet[2762]: W0912 22:06:30.282859 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.284007 kubelet[2762]: E0912 22:06:30.282880 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.284007 kubelet[2762]: E0912 22:06:30.283060 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.284007 kubelet[2762]: W0912 22:06:30.283068 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.284007 kubelet[2762]: E0912 22:06:30.283098 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.285602 kubelet[2762]: E0912 22:06:30.284187 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.285602 kubelet[2762]: W0912 22:06:30.284237 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.285602 kubelet[2762]: E0912 22:06:30.284273 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.285602 kubelet[2762]: E0912 22:06:30.284878 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.285602 kubelet[2762]: W0912 22:06:30.284895 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.285602 kubelet[2762]: E0912 22:06:30.284919 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.286281 kubelet[2762]: E0912 22:06:30.286098 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.286281 kubelet[2762]: W0912 22:06:30.286124 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.286608 kubelet[2762]: E0912 22:06:30.286172 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.286854 kubelet[2762]: E0912 22:06:30.286803 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.286854 kubelet[2762]: W0912 22:06:30.286825 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.287564 kubelet[2762]: E0912 22:06:30.286957 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.287654 kubelet[2762]: E0912 22:06:30.287596 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.287654 kubelet[2762]: W0912 22:06:30.287616 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.287654 kubelet[2762]: E0912 22:06:30.287634 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 12 22:06:30.288203 kubelet[2762]: E0912 22:06:30.288177 2762 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 12 22:06:30.288203 kubelet[2762]: W0912 22:06:30.288195 2762 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 12 22:06:30.288285 kubelet[2762]: E0912 22:06:30.288210 2762 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 12 22:06:30.584364 containerd[1553]: time="2025-09-12T22:06:30.583758849Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:30.590461 containerd[1553]: time="2025-09-12T22:06:30.590369881Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 12 22:06:30.592844 containerd[1553]: time="2025-09-12T22:06:30.592761027Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:30.595892 containerd[1553]: time="2025-09-12T22:06:30.595835661Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:30.597679 containerd[1553]: time="2025-09-12T22:06:30.597597961Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag 
\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.463923838s" Sep 12 22:06:30.597803 containerd[1553]: time="2025-09-12T22:06:30.597754442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 12 22:06:30.602929 containerd[1553]: time="2025-09-12T22:06:30.602821218Z" level=info msg="CreateContainer within sandbox \"63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 12 22:06:30.616547 containerd[1553]: time="2025-09-12T22:06:30.614397665Z" level=info msg="Container 7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:30.628782 containerd[1553]: time="2025-09-12T22:06:30.628708623Z" level=info msg="CreateContainer within sandbox \"63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3\"" Sep 12 22:06:30.629549 containerd[1553]: time="2025-09-12T22:06:30.629399310Z" level=info msg="StartContainer for \"7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3\"" Sep 12 22:06:30.631790 containerd[1553]: time="2025-09-12T22:06:30.631747536Z" level=info msg="connecting to shim 7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3" address="unix:///run/containerd/s/88dffdc194bb573f8c9de89ba67c9ad2c2f61a0f46fecb9d97eb61215578417d" protocol=ttrpc version=3 Sep 12 22:06:30.661801 systemd[1]: Started cri-containerd-7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3.scope - libcontainer container 
7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3. Sep 12 22:06:30.717157 containerd[1553]: time="2025-09-12T22:06:30.717098674Z" level=info msg="StartContainer for \"7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3\" returns successfully" Sep 12 22:06:30.734152 systemd[1]: cri-containerd-7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3.scope: Deactivated successfully. Sep 12 22:06:30.740197 containerd[1553]: time="2025-09-12T22:06:30.740131767Z" level=info msg="received exit event container_id:\"7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3\" id:\"7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3\" pid:3409 exited_at:{seconds:1757714790 nanos:739253317}" Sep 12 22:06:30.740549 containerd[1553]: time="2025-09-12T22:06:30.740483011Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3\" id:\"7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3\" pid:3409 exited_at:{seconds:1757714790 nanos:739253317}" Sep 12 22:06:30.770461 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7e7eadfe45072bed289aa4c319a2598a496b8ecf0f40967a8e0573c7aa309aa3-rootfs.mount: Deactivated successfully. 
Sep 12 22:06:31.103397 kubelet[2762]: E0912 22:06:31.103322 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mr7s6" podUID="9c8a5f44-6d52-4bd2-b382-1b9dd40eab85" Sep 12 22:06:31.276622 kubelet[2762]: I0912 22:06:31.276562 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:06:31.277990 containerd[1553]: time="2025-09-12T22:06:31.277468630Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 12 22:06:31.307888 kubelet[2762]: I0912 22:06:31.307818 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6869b6b474-lmq6x" podStartSLOduration=3.419684568 podStartE2EDuration="5.307797434s" podCreationTimestamp="2025-09-12 22:06:26 +0000 UTC" firstStartedPulling="2025-09-12 22:06:27.244745287 +0000 UTC m=+22.309939838" lastFinishedPulling="2025-09-12 22:06:29.132858153 +0000 UTC m=+24.198052704" observedRunningTime="2025-09-12 22:06:30.29367562 +0000 UTC m=+25.358870171" watchObservedRunningTime="2025-09-12 22:06:31.307797434 +0000 UTC m=+26.372991985" Sep 12 22:06:33.102997 kubelet[2762]: E0912 22:06:33.102927 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-mr7s6" podUID="9c8a5f44-6d52-4bd2-b382-1b9dd40eab85" Sep 12 22:06:33.746551 containerd[1553]: time="2025-09-12T22:06:33.746415030Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:33.747609 containerd[1553]: time="2025-09-12T22:06:33.747353959Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 12 22:06:33.748604 containerd[1553]: time="2025-09-12T22:06:33.748560451Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:33.751079 containerd[1553]: time="2025-09-12T22:06:33.751021276Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:33.752552 containerd[1553]: time="2025-09-12T22:06:33.752080767Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.474514256s" Sep 12 22:06:33.752552 containerd[1553]: time="2025-09-12T22:06:33.752117927Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 12 22:06:33.756594 containerd[1553]: time="2025-09-12T22:06:33.756349370Z" level=info msg="CreateContainer within sandbox \"63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 12 22:06:33.770537 containerd[1553]: time="2025-09-12T22:06:33.769679865Z" level=info msg="Container 729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:33.783203 containerd[1553]: time="2025-09-12T22:06:33.783123001Z" level=info msg="CreateContainer within sandbox \"63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f\" for 
&ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c\"" Sep 12 22:06:33.784650 containerd[1553]: time="2025-09-12T22:06:33.784301453Z" level=info msg="StartContainer for \"729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c\"" Sep 12 22:06:33.789094 containerd[1553]: time="2025-09-12T22:06:33.789046861Z" level=info msg="connecting to shim 729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c" address="unix:///run/containerd/s/88dffdc194bb573f8c9de89ba67c9ad2c2f61a0f46fecb9d97eb61215578417d" protocol=ttrpc version=3 Sep 12 22:06:33.818849 systemd[1]: Started cri-containerd-729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c.scope - libcontainer container 729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c. Sep 12 22:06:33.871089 containerd[1553]: time="2025-09-12T22:06:33.871033571Z" level=info msg="StartContainer for \"729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c\" returns successfully" Sep 12 22:06:34.442905 containerd[1553]: time="2025-09-12T22:06:34.442837637Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Sep 12 22:06:34.445446 systemd[1]: cri-containerd-729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c.scope: Deactivated successfully. Sep 12 22:06:34.446641 systemd[1]: cri-containerd-729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c.scope: Consumed 540ms CPU time, 184.2M memory peak, 165.8M written to disk. 
Sep 12 22:06:34.448407 containerd[1553]: time="2025-09-12T22:06:34.448291611Z" level=info msg="TaskExit event in podsandbox handler container_id:\"729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c\" id:\"729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c\" pid:3466 exited_at:{seconds:1757714794 nanos:447651204}" Sep 12 22:06:34.448665 containerd[1553]: time="2025-09-12T22:06:34.448624374Z" level=info msg="received exit event container_id:\"729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c\" id:\"729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c\" pid:3466 exited_at:{seconds:1757714794 nanos:447651204}" Sep 12 22:06:34.476183 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-729dbe206eff9c2eecac17b301090370346c75fc41f97c8566ef75981dc7c02c-rootfs.mount: Deactivated successfully. Sep 12 22:06:34.482676 kubelet[2762]: I0912 22:06:34.482390 2762 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 12 22:06:34.539995 systemd[1]: Created slice kubepods-burstable-pod1fd9bc8e_8eaf_4b7f_8777_056108d2728d.slice - libcontainer container kubepods-burstable-pod1fd9bc8e_8eaf_4b7f_8777_056108d2728d.slice. Sep 12 22:06:34.568944 systemd[1]: Created slice kubepods-besteffort-pod1760c214_51ea_4d0b_970c_4a27ed78a891.slice - libcontainer container kubepods-besteffort-pod1760c214_51ea_4d0b_970c_4a27ed78a891.slice. 
Sep 12 22:06:34.584819 kubelet[2762]: W0912 22:06:34.584296 2762 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4459-0-0-7-af931fdd93" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4459-0-0-7-af931fdd93' and this object Sep 12 22:06:34.584819 kubelet[2762]: E0912 22:06:34.584349 2762 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4459-0-0-7-af931fdd93\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4459-0-0-7-af931fdd93' and this object" logger="UnhandledError" Sep 12 22:06:34.584819 kubelet[2762]: W0912 22:06:34.584441 2762 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4459-0-0-7-af931fdd93" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4459-0-0-7-af931fdd93' and this object Sep 12 22:06:34.584819 kubelet[2762]: E0912 22:06:34.584453 2762 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4459-0-0-7-af931fdd93\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4459-0-0-7-af931fdd93' and this object" logger="UnhandledError" Sep 12 22:06:34.588639 systemd[1]: Created slice kubepods-besteffort-pod6252003c_5f62_4a5b_ae66_4eb96a9effa8.slice - libcontainer container 
kubepods-besteffort-pod6252003c_5f62_4a5b_ae66_4eb96a9effa8.slice. Sep 12 22:06:34.606487 systemd[1]: Created slice kubepods-burstable-pod7742b50f_d16e_46b2_aaad_1853990171fc.slice - libcontainer container kubepods-burstable-pod7742b50f_d16e_46b2_aaad_1853990171fc.slice. Sep 12 22:06:34.629292 systemd[1]: Created slice kubepods-besteffort-pod111ee55a_9453_48a3_8be1_100851a1a1ea.slice - libcontainer container kubepods-besteffort-pod111ee55a_9453_48a3_8be1_100851a1a1ea.slice. Sep 12 22:06:34.642402 systemd[1]: Created slice kubepods-besteffort-pode600ded0_81cb_44c9_ba7c_7a86580bc3d1.slice - libcontainer container kubepods-besteffort-pode600ded0_81cb_44c9_ba7c_7a86580bc3d1.slice. Sep 12 22:06:34.653301 systemd[1]: Created slice kubepods-besteffort-podb6d05037_1881_4fc1_bcbd_569865ed54b9.slice - libcontainer container kubepods-besteffort-podb6d05037_1881_4fc1_bcbd_569865ed54b9.slice. Sep 12 22:06:34.716043 kubelet[2762]: I0912 22:06:34.715633 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/b6d05037-1881-4fc1-bcbd-569865ed54b9-goldmane-key-pair\") pod \"goldmane-7988f88666-5qdsb\" (UID: \"b6d05037-1881-4fc1-bcbd-569865ed54b9\") " pod="calico-system/goldmane-7988f88666-5qdsb" Sep 12 22:06:34.716043 kubelet[2762]: I0912 22:06:34.715810 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsv5k\" (UniqueName: \"kubernetes.io/projected/b6d05037-1881-4fc1-bcbd-569865ed54b9-kube-api-access-wsv5k\") pod \"goldmane-7988f88666-5qdsb\" (UID: \"b6d05037-1881-4fc1-bcbd-569865ed54b9\") " pod="calico-system/goldmane-7988f88666-5qdsb" Sep 12 22:06:34.716043 kubelet[2762]: I0912 22:06:34.715835 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6252003c-5f62-4a5b-ae66-4eb96a9effa8-tigera-ca-bundle\") pod 
\"calico-kube-controllers-8696478589-r2m8w\" (UID: \"6252003c-5f62-4a5b-ae66-4eb96a9effa8\") " pod="calico-system/calico-kube-controllers-8696478589-r2m8w" Sep 12 22:06:34.716043 kubelet[2762]: I0912 22:06:34.715991 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1fd9bc8e-8eaf-4b7f-8777-056108d2728d-config-volume\") pod \"coredns-7c65d6cfc9-xl5lq\" (UID: \"1fd9bc8e-8eaf-4b7f-8777-056108d2728d\") " pod="kube-system/coredns-7c65d6cfc9-xl5lq" Sep 12 22:06:34.716211 kubelet[2762]: I0912 22:06:34.716125 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6d05037-1881-4fc1-bcbd-569865ed54b9-config\") pod \"goldmane-7988f88666-5qdsb\" (UID: \"b6d05037-1881-4fc1-bcbd-569865ed54b9\") " pod="calico-system/goldmane-7988f88666-5qdsb" Sep 12 22:06:34.716946 kubelet[2762]: I0912 22:06:34.716232 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8kpw\" (UniqueName: \"kubernetes.io/projected/1760c214-51ea-4d0b-970c-4a27ed78a891-kube-api-access-q8kpw\") pod \"calico-apiserver-567b5c8b5-jlsm6\" (UID: \"1760c214-51ea-4d0b-970c-4a27ed78a891\") " pod="calico-apiserver/calico-apiserver-567b5c8b5-jlsm6" Sep 12 22:06:34.716946 kubelet[2762]: I0912 22:06:34.716263 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-whisker-ca-bundle\") pod \"whisker-98795495b-7bmqd\" (UID: \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\") " pod="calico-system/whisker-98795495b-7bmqd" Sep 12 22:06:34.716946 kubelet[2762]: I0912 22:06:34.716394 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7742b50f-d16e-46b2-aaad-1853990171fc-config-volume\") pod \"coredns-7c65d6cfc9-jbgw9\" (UID: \"7742b50f-d16e-46b2-aaad-1853990171fc\") " pod="kube-system/coredns-7c65d6cfc9-jbgw9" Sep 12 22:06:34.716946 kubelet[2762]: I0912 22:06:34.716416 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6d05037-1881-4fc1-bcbd-569865ed54b9-goldmane-ca-bundle\") pod \"goldmane-7988f88666-5qdsb\" (UID: \"b6d05037-1881-4fc1-bcbd-569865ed54b9\") " pod="calico-system/goldmane-7988f88666-5qdsb" Sep 12 22:06:34.716946 kubelet[2762]: I0912 22:06:34.716433 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bcrh\" (UniqueName: \"kubernetes.io/projected/1fd9bc8e-8eaf-4b7f-8777-056108d2728d-kube-api-access-6bcrh\") pod \"coredns-7c65d6cfc9-xl5lq\" (UID: \"1fd9bc8e-8eaf-4b7f-8777-056108d2728d\") " pod="kube-system/coredns-7c65d6cfc9-xl5lq" Sep 12 22:06:34.717125 kubelet[2762]: I0912 22:06:34.716686 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/111ee55a-9453-48a3-8be1-100851a1a1ea-calico-apiserver-certs\") pod \"calico-apiserver-567b5c8b5-9gng8\" (UID: \"111ee55a-9453-48a3-8be1-100851a1a1ea\") " pod="calico-apiserver/calico-apiserver-567b5c8b5-9gng8" Sep 12 22:06:34.717125 kubelet[2762]: I0912 22:06:34.716731 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnl5k\" (UniqueName: \"kubernetes.io/projected/6252003c-5f62-4a5b-ae66-4eb96a9effa8-kube-api-access-jnl5k\") pod \"calico-kube-controllers-8696478589-r2m8w\" (UID: \"6252003c-5f62-4a5b-ae66-4eb96a9effa8\") " pod="calico-system/calico-kube-controllers-8696478589-r2m8w" Sep 12 22:06:34.717125 kubelet[2762]: I0912 22:06:34.716913 2762 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-whisker-backend-key-pair\") pod \"whisker-98795495b-7bmqd\" (UID: \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\") " pod="calico-system/whisker-98795495b-7bmqd" Sep 12 22:06:34.717125 kubelet[2762]: I0912 22:06:34.716943 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cvn8\" (UniqueName: \"kubernetes.io/projected/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-kube-api-access-7cvn8\") pod \"whisker-98795495b-7bmqd\" (UID: \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\") " pod="calico-system/whisker-98795495b-7bmqd" Sep 12 22:06:34.717125 kubelet[2762]: I0912 22:06:34.717079 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8twz\" (UniqueName: \"kubernetes.io/projected/7742b50f-d16e-46b2-aaad-1853990171fc-kube-api-access-p8twz\") pod \"coredns-7c65d6cfc9-jbgw9\" (UID: \"7742b50f-d16e-46b2-aaad-1853990171fc\") " pod="kube-system/coredns-7c65d6cfc9-jbgw9" Sep 12 22:06:34.717226 kubelet[2762]: I0912 22:06:34.717102 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1760c214-51ea-4d0b-970c-4a27ed78a891-calico-apiserver-certs\") pod \"calico-apiserver-567b5c8b5-jlsm6\" (UID: \"1760c214-51ea-4d0b-970c-4a27ed78a891\") " pod="calico-apiserver/calico-apiserver-567b5c8b5-jlsm6" Sep 12 22:06:34.717574 kubelet[2762]: I0912 22:06:34.717260 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsvzm\" (UniqueName: \"kubernetes.io/projected/111ee55a-9453-48a3-8be1-100851a1a1ea-kube-api-access-rsvzm\") pod \"calico-apiserver-567b5c8b5-9gng8\" (UID: \"111ee55a-9453-48a3-8be1-100851a1a1ea\") " 
pod="calico-apiserver/calico-apiserver-567b5c8b5-9gng8" Sep 12 22:06:34.903603 containerd[1553]: time="2025-09-12T22:06:34.903543614Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8696478589-r2m8w,Uid:6252003c-5f62-4a5b-ae66-4eb96a9effa8,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:34.923315 containerd[1553]: time="2025-09-12T22:06:34.922993125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-jbgw9,Uid:7742b50f-d16e-46b2-aaad-1853990171fc,Namespace:kube-system,Attempt:0,}" Sep 12 22:06:34.952178 containerd[1553]: time="2025-09-12T22:06:34.952017491Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-98795495b-7bmqd,Uid:e600ded0-81cb-44c9-ba7c-7a86580bc3d1,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:34.961209 containerd[1553]: time="2025-09-12T22:06:34.961164461Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-5qdsb,Uid:b6d05037-1881-4fc1-bcbd-569865ed54b9,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:35.055082 containerd[1553]: time="2025-09-12T22:06:35.055022811Z" level=error msg="Failed to destroy network for sandbox \"550e5636f1de7a4133689c8e5c8f720c3953f7cc5e66e90c0c1ea91c597c9741\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.057877 containerd[1553]: time="2025-09-12T22:06:35.057814878Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-jbgw9,Uid:7742b50f-d16e-46b2-aaad-1853990171fc,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"550e5636f1de7a4133689c8e5c8f720c3953f7cc5e66e90c0c1ea91c597c9741\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Sep 12 22:06:35.060644 kubelet[2762]: E0912 22:06:35.058212 2762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"550e5636f1de7a4133689c8e5c8f720c3953f7cc5e66e90c0c1ea91c597c9741\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.060644 kubelet[2762]: E0912 22:06:35.058293 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"550e5636f1de7a4133689c8e5c8f720c3953f7cc5e66e90c0c1ea91c597c9741\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-jbgw9" Sep 12 22:06:35.060644 kubelet[2762]: E0912 22:06:35.058314 2762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"550e5636f1de7a4133689c8e5c8f720c3953f7cc5e66e90c0c1ea91c597c9741\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-jbgw9" Sep 12 22:06:35.060832 kubelet[2762]: E0912 22:06:35.058418 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-jbgw9_kube-system(7742b50f-d16e-46b2-aaad-1853990171fc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-jbgw9_kube-system(7742b50f-d16e-46b2-aaad-1853990171fc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"550e5636f1de7a4133689c8e5c8f720c3953f7cc5e66e90c0c1ea91c597c9741\\\": plugin type=\\\"calico\\\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-jbgw9" podUID="7742b50f-d16e-46b2-aaad-1853990171fc" Sep 12 22:06:35.063637 containerd[1553]: time="2025-09-12T22:06:35.063479772Z" level=error msg="Failed to destroy network for sandbox \"178961fb9ddbd3b9306bd47bda2d66b53dfbf848a1d2a4d309a45b18f3a3b8b4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.066629 containerd[1553]: time="2025-09-12T22:06:35.066035837Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8696478589-r2m8w,Uid:6252003c-5f62-4a5b-ae66-4eb96a9effa8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"178961fb9ddbd3b9306bd47bda2d66b53dfbf848a1d2a4d309a45b18f3a3b8b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.066868 kubelet[2762]: E0912 22:06:35.066803 2762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"178961fb9ddbd3b9306bd47bda2d66b53dfbf848a1d2a4d309a45b18f3a3b8b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.066868 kubelet[2762]: E0912 22:06:35.066858 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"178961fb9ddbd3b9306bd47bda2d66b53dfbf848a1d2a4d309a45b18f3a3b8b4\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8696478589-r2m8w" Sep 12 22:06:35.067030 kubelet[2762]: E0912 22:06:35.066877 2762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"178961fb9ddbd3b9306bd47bda2d66b53dfbf848a1d2a4d309a45b18f3a3b8b4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8696478589-r2m8w" Sep 12 22:06:35.067030 kubelet[2762]: E0912 22:06:35.066941 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8696478589-r2m8w_calico-system(6252003c-5f62-4a5b-ae66-4eb96a9effa8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8696478589-r2m8w_calico-system(6252003c-5f62-4a5b-ae66-4eb96a9effa8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"178961fb9ddbd3b9306bd47bda2d66b53dfbf848a1d2a4d309a45b18f3a3b8b4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8696478589-r2m8w" podUID="6252003c-5f62-4a5b-ae66-4eb96a9effa8" Sep 12 22:06:35.086308 containerd[1553]: time="2025-09-12T22:06:35.086251871Z" level=error msg="Failed to destroy network for sandbox \"b1ad9f078f1769ed97d0fa4959ad6ad67210ee44876efc9ba5c95b5bda2ea04e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.089026 containerd[1553]: 
time="2025-09-12T22:06:35.088928616Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-5qdsb,Uid:b6d05037-1881-4fc1-bcbd-569865ed54b9,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1ad9f078f1769ed97d0fa4959ad6ad67210ee44876efc9ba5c95b5bda2ea04e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.089651 kubelet[2762]: E0912 22:06:35.089394 2762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1ad9f078f1769ed97d0fa4959ad6ad67210ee44876efc9ba5c95b5bda2ea04e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.089651 kubelet[2762]: E0912 22:06:35.089474 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1ad9f078f1769ed97d0fa4959ad6ad67210ee44876efc9ba5c95b5bda2ea04e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-5qdsb" Sep 12 22:06:35.089651 kubelet[2762]: E0912 22:06:35.089496 2762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b1ad9f078f1769ed97d0fa4959ad6ad67210ee44876efc9ba5c95b5bda2ea04e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-5qdsb" Sep 12 22:06:35.089881 
kubelet[2762]: E0912 22:06:35.089576 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-5qdsb_calico-system(b6d05037-1881-4fc1-bcbd-569865ed54b9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-5qdsb_calico-system(b6d05037-1881-4fc1-bcbd-569865ed54b9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b1ad9f078f1769ed97d0fa4959ad6ad67210ee44876efc9ba5c95b5bda2ea04e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-5qdsb" podUID="b6d05037-1881-4fc1-bcbd-569865ed54b9" Sep 12 22:06:35.095321 containerd[1553]: time="2025-09-12T22:06:35.095255517Z" level=error msg="Failed to destroy network for sandbox \"49f1d14800e1c10e8134d8434225cd558f585adf5098a8c91902f507714e04f2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.096976 containerd[1553]: time="2025-09-12T22:06:35.096849132Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-98795495b-7bmqd,Uid:e600ded0-81cb-44c9-ba7c-7a86580bc3d1,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"49f1d14800e1c10e8134d8434225cd558f585adf5098a8c91902f507714e04f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.097364 kubelet[2762]: E0912 22:06:35.097325 2762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"49f1d14800e1c10e8134d8434225cd558f585adf5098a8c91902f507714e04f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.097444 kubelet[2762]: E0912 22:06:35.097386 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49f1d14800e1c10e8134d8434225cd558f585adf5098a8c91902f507714e04f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-98795495b-7bmqd" Sep 12 22:06:35.097444 kubelet[2762]: E0912 22:06:35.097404 2762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49f1d14800e1c10e8134d8434225cd558f585adf5098a8c91902f507714e04f2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-98795495b-7bmqd" Sep 12 22:06:35.097706 kubelet[2762]: E0912 22:06:35.097630 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-98795495b-7bmqd_calico-system(e600ded0-81cb-44c9-ba7c-7a86580bc3d1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-98795495b-7bmqd_calico-system(e600ded0-81cb-44c9-ba7c-7a86580bc3d1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"49f1d14800e1c10e8134d8434225cd558f585adf5098a8c91902f507714e04f2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-98795495b-7bmqd" 
podUID="e600ded0-81cb-44c9-ba7c-7a86580bc3d1" Sep 12 22:06:35.112189 systemd[1]: Created slice kubepods-besteffort-pod9c8a5f44_6d52_4bd2_b382_1b9dd40eab85.slice - libcontainer container kubepods-besteffort-pod9c8a5f44_6d52_4bd2_b382_1b9dd40eab85.slice. Sep 12 22:06:35.115383 containerd[1553]: time="2025-09-12T22:06:35.115326469Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mr7s6,Uid:9c8a5f44-6d52-4bd2-b382-1b9dd40eab85,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:35.156437 containerd[1553]: time="2025-09-12T22:06:35.156401463Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xl5lq,Uid:1fd9bc8e-8eaf-4b7f-8777-056108d2728d,Namespace:kube-system,Attempt:0,}" Sep 12 22:06:35.190349 containerd[1553]: time="2025-09-12T22:06:35.190261267Z" level=error msg="Failed to destroy network for sandbox \"7dad37b2fe2b4573edb5f3657cd51fe8c59966dcb93785cadea959bc4ee85860\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.193364 containerd[1553]: time="2025-09-12T22:06:35.193314817Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mr7s6,Uid:9c8a5f44-6d52-4bd2-b382-1b9dd40eab85,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dad37b2fe2b4573edb5f3657cd51fe8c59966dcb93785cadea959bc4ee85860\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.194088 kubelet[2762]: E0912 22:06:35.194026 2762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dad37b2fe2b4573edb5f3657cd51fe8c59966dcb93785cadea959bc4ee85860\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.194186 kubelet[2762]: E0912 22:06:35.194140 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dad37b2fe2b4573edb5f3657cd51fe8c59966dcb93785cadea959bc4ee85860\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mr7s6" Sep 12 22:06:35.194186 kubelet[2762]: E0912 22:06:35.194168 2762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7dad37b2fe2b4573edb5f3657cd51fe8c59966dcb93785cadea959bc4ee85860\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-mr7s6" Sep 12 22:06:35.194283 kubelet[2762]: E0912 22:06:35.194214 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-mr7s6_calico-system(9c8a5f44-6d52-4bd2-b382-1b9dd40eab85)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-mr7s6_calico-system(9c8a5f44-6d52-4bd2-b382-1b9dd40eab85)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7dad37b2fe2b4573edb5f3657cd51fe8c59966dcb93785cadea959bc4ee85860\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-mr7s6" podUID="9c8a5f44-6d52-4bd2-b382-1b9dd40eab85" Sep 12 22:06:35.242059 containerd[1553]: time="2025-09-12T22:06:35.241937483Z" 
level=error msg="Failed to destroy network for sandbox \"9567730c4a3e3a360d0795e94bc53b5951e2652069e919730b3de3063d360063\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.244438 containerd[1553]: time="2025-09-12T22:06:35.244376586Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xl5lq,Uid:1fd9bc8e-8eaf-4b7f-8777-056108d2728d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9567730c4a3e3a360d0795e94bc53b5951e2652069e919730b3de3063d360063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.246183 kubelet[2762]: E0912 22:06:35.244718 2762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9567730c4a3e3a360d0795e94bc53b5951e2652069e919730b3de3063d360063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:35.246183 kubelet[2762]: E0912 22:06:35.244821 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9567730c4a3e3a360d0795e94bc53b5951e2652069e919730b3de3063d360063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xl5lq" Sep 12 22:06:35.246183 kubelet[2762]: E0912 22:06:35.244842 2762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to 
setup network for sandbox \"9567730c4a3e3a360d0795e94bc53b5951e2652069e919730b3de3063d360063\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-xl5lq" Sep 12 22:06:35.246311 kubelet[2762]: E0912 22:06:35.244908 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-xl5lq_kube-system(1fd9bc8e-8eaf-4b7f-8777-056108d2728d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-xl5lq_kube-system(1fd9bc8e-8eaf-4b7f-8777-056108d2728d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9567730c4a3e3a360d0795e94bc53b5951e2652069e919730b3de3063d360063\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-xl5lq" podUID="1fd9bc8e-8eaf-4b7f-8777-056108d2728d" Sep 12 22:06:35.316058 containerd[1553]: time="2025-09-12T22:06:35.315029943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 12 22:06:35.822660 kubelet[2762]: E0912 22:06:35.822178 2762 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Sep 12 22:06:35.822660 kubelet[2762]: E0912 22:06:35.822282 2762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/111ee55a-9453-48a3-8be1-100851a1a1ea-calico-apiserver-certs podName:111ee55a-9453-48a3-8be1-100851a1a1ea nodeName:}" failed. No retries permitted until 2025-09-12 22:06:36.322257044 +0000 UTC m=+31.387451595 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/111ee55a-9453-48a3-8be1-100851a1a1ea-calico-apiserver-certs") pod "calico-apiserver-567b5c8b5-9gng8" (UID: "111ee55a-9453-48a3-8be1-100851a1a1ea") : failed to sync secret cache: timed out waiting for the condition Sep 12 22:06:35.831575 kubelet[2762]: E0912 22:06:35.831317 2762 secret.go:189] Couldn't get secret calico-apiserver/calico-apiserver-certs: failed to sync secret cache: timed out waiting for the condition Sep 12 22:06:35.831575 kubelet[2762]: E0912 22:06:35.831410 2762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1760c214-51ea-4d0b-970c-4a27ed78a891-calico-apiserver-certs podName:1760c214-51ea-4d0b-970c-4a27ed78a891 nodeName:}" failed. No retries permitted until 2025-09-12 22:06:36.331390532 +0000 UTC m=+31.396585083 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "calico-apiserver-certs" (UniqueName: "kubernetes.io/secret/1760c214-51ea-4d0b-970c-4a27ed78a891-calico-apiserver-certs") pod "calico-apiserver-567b5c8b5-jlsm6" (UID: "1760c214-51ea-4d0b-970c-4a27ed78a891") : failed to sync secret cache: timed out waiting for the condition Sep 12 22:06:35.845730 kubelet[2762]: E0912 22:06:35.844337 2762 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 12 22:06:35.845730 kubelet[2762]: E0912 22:06:35.844382 2762 projected.go:194] Error preparing data for projected volume kube-api-access-rsvzm for pod calico-apiserver/calico-apiserver-567b5c8b5-9gng8: failed to sync configmap cache: timed out waiting for the condition Sep 12 22:06:35.845730 kubelet[2762]: E0912 22:06:35.844453 2762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/111ee55a-9453-48a3-8be1-100851a1a1ea-kube-api-access-rsvzm podName:111ee55a-9453-48a3-8be1-100851a1a1ea nodeName:}" failed. 
No retries permitted until 2025-09-12 22:06:36.344432017 +0000 UTC m=+31.409626568 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rsvzm" (UniqueName: "kubernetes.io/projected/111ee55a-9453-48a3-8be1-100851a1a1ea-kube-api-access-rsvzm") pod "calico-apiserver-567b5c8b5-9gng8" (UID: "111ee55a-9453-48a3-8be1-100851a1a1ea") : failed to sync configmap cache: timed out waiting for the condition Sep 12 22:06:35.867980 kubelet[2762]: E0912 22:06:35.867835 2762 projected.go:288] Couldn't get configMap calico-apiserver/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Sep 12 22:06:35.867980 kubelet[2762]: E0912 22:06:35.867886 2762 projected.go:194] Error preparing data for projected volume kube-api-access-q8kpw for pod calico-apiserver/calico-apiserver-567b5c8b5-jlsm6: failed to sync configmap cache: timed out waiting for the condition Sep 12 22:06:35.867980 kubelet[2762]: E0912 22:06:35.867953 2762 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/1760c214-51ea-4d0b-970c-4a27ed78a891-kube-api-access-q8kpw podName:1760c214-51ea-4d0b-970c-4a27ed78a891 nodeName:}" failed. No retries permitted until 2025-09-12 22:06:36.367934562 +0000 UTC m=+31.433129113 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q8kpw" (UniqueName: "kubernetes.io/projected/1760c214-51ea-4d0b-970c-4a27ed78a891-kube-api-access-q8kpw") pod "calico-apiserver-567b5c8b5-jlsm6" (UID: "1760c214-51ea-4d0b-970c-4a27ed78a891") : failed to sync configmap cache: timed out waiting for the condition Sep 12 22:06:36.679103 containerd[1553]: time="2025-09-12T22:06:36.678808920Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567b5c8b5-jlsm6,Uid:1760c214-51ea-4d0b-970c-4a27ed78a891,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:06:36.740532 containerd[1553]: time="2025-09-12T22:06:36.739458446Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567b5c8b5-9gng8,Uid:111ee55a-9453-48a3-8be1-100851a1a1ea,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:06:36.793804 containerd[1553]: time="2025-09-12T22:06:36.793743432Z" level=error msg="Failed to destroy network for sandbox \"f0d5fece696a3fd07ccdaa804474babcb024398ac4f65e60d868cd7640cf965f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:36.800936 containerd[1553]: time="2025-09-12T22:06:36.800882379Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567b5c8b5-jlsm6,Uid:1760c214-51ea-4d0b-970c-4a27ed78a891,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0d5fece696a3fd07ccdaa804474babcb024398ac4f65e60d868cd7640cf965f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:36.801450 kubelet[2762]: E0912 22:06:36.801322 2762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"f0d5fece696a3fd07ccdaa804474babcb024398ac4f65e60d868cd7640cf965f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:36.801602 kubelet[2762]: E0912 22:06:36.801477 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0d5fece696a3fd07ccdaa804474babcb024398ac4f65e60d868cd7640cf965f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-567b5c8b5-jlsm6" Sep 12 22:06:36.801602 kubelet[2762]: E0912 22:06:36.801498 2762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f0d5fece696a3fd07ccdaa804474babcb024398ac4f65e60d868cd7640cf965f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-567b5c8b5-jlsm6" Sep 12 22:06:36.803448 kubelet[2762]: E0912 22:06:36.801598 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-567b5c8b5-jlsm6_calico-apiserver(1760c214-51ea-4d0b-970c-4a27ed78a891)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-567b5c8b5-jlsm6_calico-apiserver(1760c214-51ea-4d0b-970c-4a27ed78a891)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f0d5fece696a3fd07ccdaa804474babcb024398ac4f65e60d868cd7640cf965f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-567b5c8b5-jlsm6" podUID="1760c214-51ea-4d0b-970c-4a27ed78a891" Sep 12 22:06:36.830828 containerd[1553]: time="2025-09-12T22:06:36.830687977Z" level=error msg="Failed to destroy network for sandbox \"6f52cb858d22da8c2d8b97b8cfff02455fda84c9f25f73c30520b902a45d04c9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:36.833707 containerd[1553]: time="2025-09-12T22:06:36.833594684Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567b5c8b5-9gng8,Uid:111ee55a-9453-48a3-8be1-100851a1a1ea,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f52cb858d22da8c2d8b97b8cfff02455fda84c9f25f73c30520b902a45d04c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:36.835698 kubelet[2762]: E0912 22:06:36.835121 2762 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f52cb858d22da8c2d8b97b8cfff02455fda84c9f25f73c30520b902a45d04c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 12 22:06:36.835698 kubelet[2762]: E0912 22:06:36.835193 2762 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f52cb858d22da8c2d8b97b8cfff02455fda84c9f25f73c30520b902a45d04c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-567b5c8b5-9gng8" Sep 12 22:06:36.835698 kubelet[2762]: E0912 22:06:36.835214 2762 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f52cb858d22da8c2d8b97b8cfff02455fda84c9f25f73c30520b902a45d04c9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-567b5c8b5-9gng8" Sep 12 22:06:36.837567 kubelet[2762]: E0912 22:06:36.835260 2762 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-567b5c8b5-9gng8_calico-apiserver(111ee55a-9453-48a3-8be1-100851a1a1ea)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-567b5c8b5-9gng8_calico-apiserver(111ee55a-9453-48a3-8be1-100851a1a1ea)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f52cb858d22da8c2d8b97b8cfff02455fda84c9f25f73c30520b902a45d04c9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-567b5c8b5-9gng8" podUID="111ee55a-9453-48a3-8be1-100851a1a1ea" Sep 12 22:06:39.331659 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1112740222.mount: Deactivated successfully. 
Sep 12 22:06:39.365932 containerd[1553]: time="2025-09-12T22:06:39.365872348Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:39.367593 containerd[1553]: time="2025-09-12T22:06:39.367539802Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 12 22:06:39.368436 containerd[1553]: time="2025-09-12T22:06:39.368406929Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:39.371346 containerd[1553]: time="2025-09-12T22:06:39.371304714Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:39.372564 containerd[1553]: time="2025-09-12T22:06:39.372483484Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.057410101s" Sep 12 22:06:39.372631 containerd[1553]: time="2025-09-12T22:06:39.372574565Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 12 22:06:39.390128 containerd[1553]: time="2025-09-12T22:06:39.390087916Z" level=info msg="CreateContainer within sandbox \"63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 12 22:06:39.402727 containerd[1553]: time="2025-09-12T22:06:39.402671584Z" level=info msg="Container 
a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:39.414537 containerd[1553]: time="2025-09-12T22:06:39.414438566Z" level=info msg="CreateContainer within sandbox \"63c63a200646bb5a00278e0c3649352a1544c2a3b49605a4c0ccecec419ef41f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\"" Sep 12 22:06:39.417249 containerd[1553]: time="2025-09-12T22:06:39.416404102Z" level=info msg="StartContainer for \"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\"" Sep 12 22:06:39.419598 containerd[1553]: time="2025-09-12T22:06:39.419533649Z" level=info msg="connecting to shim a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb" address="unix:///run/containerd/s/88dffdc194bb573f8c9de89ba67c9ad2c2f61a0f46fecb9d97eb61215578417d" protocol=ttrpc version=3 Sep 12 22:06:39.448035 systemd[1]: Started cri-containerd-a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb.scope - libcontainer container a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb. Sep 12 22:06:39.502384 containerd[1553]: time="2025-09-12T22:06:39.502294762Z" level=info msg="StartContainer for \"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" returns successfully" Sep 12 22:06:39.651544 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 12 22:06:39.651661 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 12 22:06:39.858728 kubelet[2762]: I0912 22:06:39.858684 2762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cvn8\" (UniqueName: \"kubernetes.io/projected/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-kube-api-access-7cvn8\") pod \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\" (UID: \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\") " Sep 12 22:06:39.859178 kubelet[2762]: I0912 22:06:39.858742 2762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-whisker-backend-key-pair\") pod \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\" (UID: \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\") " Sep 12 22:06:39.859178 kubelet[2762]: I0912 22:06:39.858786 2762 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-whisker-ca-bundle\") pod \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\" (UID: \"e600ded0-81cb-44c9-ba7c-7a86580bc3d1\") " Sep 12 22:06:39.859178 kubelet[2762]: I0912 22:06:39.859158 2762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "e600ded0-81cb-44c9-ba7c-7a86580bc3d1" (UID: "e600ded0-81cb-44c9-ba7c-7a86580bc3d1"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 12 22:06:39.867563 kubelet[2762]: I0912 22:06:39.867493 2762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-kube-api-access-7cvn8" (OuterVolumeSpecName: "kube-api-access-7cvn8") pod "e600ded0-81cb-44c9-ba7c-7a86580bc3d1" (UID: "e600ded0-81cb-44c9-ba7c-7a86580bc3d1"). InnerVolumeSpecName "kube-api-access-7cvn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 12 22:06:39.868967 kubelet[2762]: I0912 22:06:39.868914 2762 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "e600ded0-81cb-44c9-ba7c-7a86580bc3d1" (UID: "e600ded0-81cb-44c9-ba7c-7a86580bc3d1"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 12 22:06:39.959737 kubelet[2762]: I0912 22:06:39.959689 2762 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-whisker-ca-bundle\") on node \"ci-4459-0-0-7-af931fdd93\" DevicePath \"\"" Sep 12 22:06:39.960031 kubelet[2762]: I0912 22:06:39.959760 2762 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-whisker-backend-key-pair\") on node \"ci-4459-0-0-7-af931fdd93\" DevicePath \"\"" Sep 12 22:06:39.960031 kubelet[2762]: I0912 22:06:39.959783 2762 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cvn8\" (UniqueName: \"kubernetes.io/projected/e600ded0-81cb-44c9-ba7c-7a86580bc3d1-kube-api-access-7cvn8\") on node \"ci-4459-0-0-7-af931fdd93\" DevicePath \"\"" Sep 12 22:06:40.330667 systemd[1]: var-lib-kubelet-pods-e600ded0\x2d81cb\x2d44c9\x2dba7c\x2d7a86580bc3d1-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d7cvn8.mount: Deactivated successfully. Sep 12 22:06:40.330790 systemd[1]: var-lib-kubelet-pods-e600ded0\x2d81cb\x2d44c9\x2dba7c\x2d7a86580bc3d1-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Sep 12 22:06:40.356144 systemd[1]: Removed slice kubepods-besteffort-pode600ded0_81cb_44c9_ba7c_7a86580bc3d1.slice - libcontainer container kubepods-besteffort-pode600ded0_81cb_44c9_ba7c_7a86580bc3d1.slice. Sep 12 22:06:40.397635 kubelet[2762]: I0912 22:06:40.397111 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-l74dx" podStartSLOduration=2.51026466 podStartE2EDuration="14.397081215s" podCreationTimestamp="2025-09-12 22:06:26 +0000 UTC" firstStartedPulling="2025-09-12 22:06:27.486678618 +0000 UTC m=+22.551873169" lastFinishedPulling="2025-09-12 22:06:39.373495173 +0000 UTC m=+34.438689724" observedRunningTime="2025-09-12 22:06:40.381334203 +0000 UTC m=+35.446528754" watchObservedRunningTime="2025-09-12 22:06:40.397081215 +0000 UTC m=+35.462275806" Sep 12 22:06:40.458371 systemd[1]: Created slice kubepods-besteffort-podd8838147_8f72_4768_9bc8_ce693b71d129.slice - libcontainer container kubepods-besteffort-podd8838147_8f72_4768_9bc8_ce693b71d129.slice. 
Sep 12 22:06:40.464322 kubelet[2762]: I0912 22:06:40.464276 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/d8838147-8f72-4768-9bc8-ce693b71d129-whisker-backend-key-pair\") pod \"whisker-7c6c9f44dc-kjhbs\" (UID: \"d8838147-8f72-4768-9bc8-ce693b71d129\") " pod="calico-system/whisker-7c6c9f44dc-kjhbs" Sep 12 22:06:40.464322 kubelet[2762]: I0912 22:06:40.464327 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d8838147-8f72-4768-9bc8-ce693b71d129-whisker-ca-bundle\") pod \"whisker-7c6c9f44dc-kjhbs\" (UID: \"d8838147-8f72-4768-9bc8-ce693b71d129\") " pod="calico-system/whisker-7c6c9f44dc-kjhbs" Sep 12 22:06:40.464621 kubelet[2762]: I0912 22:06:40.464348 2762 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dbm4\" (UniqueName: \"kubernetes.io/projected/d8838147-8f72-4768-9bc8-ce693b71d129-kube-api-access-5dbm4\") pod \"whisker-7c6c9f44dc-kjhbs\" (UID: \"d8838147-8f72-4768-9bc8-ce693b71d129\") " pod="calico-system/whisker-7c6c9f44dc-kjhbs" Sep 12 22:06:40.765835 containerd[1553]: time="2025-09-12T22:06:40.765212621Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c6c9f44dc-kjhbs,Uid:d8838147-8f72-4768-9bc8-ce693b71d129,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:40.958796 systemd-networkd[1421]: caliae5fbabe0d6: Link UP Sep 12 22:06:40.961019 systemd-networkd[1421]: caliae5fbabe0d6: Gained carrier Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.798 [INFO][3787] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.836 [INFO][3787] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0 whisker-7c6c9f44dc- calico-system d8838147-8f72-4768-9bc8-ce693b71d129 859 0 2025-09-12 22:06:40 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:7c6c9f44dc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4459-0-0-7-af931fdd93 whisker-7c6c9f44dc-kjhbs eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] caliae5fbabe0d6 [] [] }} ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Namespace="calico-system" Pod="whisker-7c6c9f44dc-kjhbs" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.836 [INFO][3787] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Namespace="calico-system" Pod="whisker-7c6c9f44dc-kjhbs" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.891 [INFO][3798] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" HandleID="k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Workload="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.891 [INFO][3798] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" HandleID="k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Workload="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b770), 
Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-7-af931fdd93", "pod":"whisker-7c6c9f44dc-kjhbs", "timestamp":"2025-09-12 22:06:40.891268478 +0000 UTC"}, Hostname:"ci-4459-0-0-7-af931fdd93", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.891 [INFO][3798] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.891 [INFO][3798] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.891 [INFO][3798] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-7-af931fdd93' Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.909 [INFO][3798] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.919 [INFO][3798] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.925 [INFO][3798] ipam/ipam.go 511: Trying affinity for 192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.928 [INFO][3798] ipam/ipam.go 158: Attempting to load block cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.931 [INFO][3798] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.931 [INFO][3798] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.1.64/26 
handle="k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.933 [INFO][3798] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628 Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.938 [INFO][3798] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.1.64/26 handle="k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.947 [INFO][3798] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.1.65/26] block=192.168.1.64/26 handle="k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.947 [INFO][3798] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.1.65/26] handle="k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:40.985545 containerd[1553]: 2025-09-12 22:06:40.947 [INFO][3798] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
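The IPAM sequence the trace above records — acquire the host-wide lock, look up the host's block affinity, load the affine block (192.168.1.64/26), claim an address, write the block back, release the lock — can be sketched as follows. This is an illustrative simplification, not Calico's actual code (the real logic lives in ipam/ipam.go); every name in it is invented:

```python
import ipaddress
import threading

# Illustrative sketch of the IPAM flow logged above; all names are invented
# and Calico's real implementation (ipam/ipam.go) differs substantially.
_host_wide_lock = threading.Lock()  # the "host-wide IPAM lock"
_affinity = {"ci-4459-0-0-7-af931fdd93": ipaddress.ip_network("192.168.1.64/26")}
_assigned = {}  # block -> set of IPs already claimed from it

def auto_assign(host):
    """Claim one IPv4 address from the block affine to `host`."""
    with _host_wide_lock:              # "Acquired host-wide IPAM lock."
        block = _affinity[host]        # "Trying affinity for 192.168.1.64/26"
        used = _assigned.setdefault(block, set())
        for ip in block.hosts():       # first usable host in the /26 is .65
            if ip not in used:
                used.add(ip)           # "Writing block in order to claim IPs"
                return f"{ip}/{block.prefixlen}"
        raise RuntimeError("affine block exhausted")
                                       # lock released on scope exit:
                                       # "Released host-wide IPAM lock."
```

Run twice for the same host, this hands out 192.168.1.65/26 and then 192.168.1.66/26, matching the addresses the whisker and calico-kube-controllers pods receive later in this log.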
Sep 12 22:06:40.986194 containerd[1553]: 2025-09-12 22:06:40.947 [INFO][3798] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.1.65/26] IPv6=[] ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" HandleID="k8s-pod-network.7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Workload="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" Sep 12 22:06:40.986194 containerd[1553]: 2025-09-12 22:06:40.950 [INFO][3787] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Namespace="calico-system" Pod="whisker-7c6c9f44dc-kjhbs" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0", GenerateName:"whisker-7c6c9f44dc-", Namespace:"calico-system", SelfLink:"", UID:"d8838147-8f72-4768-9bc8-ce693b71d129", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7c6c9f44dc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"", Pod:"whisker-7c6c9f44dc-kjhbs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.1.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"caliae5fbabe0d6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:40.986194 containerd[1553]: 2025-09-12 22:06:40.950 [INFO][3787] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.1.65/32] ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Namespace="calico-system" Pod="whisker-7c6c9f44dc-kjhbs" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" Sep 12 22:06:40.986194 containerd[1553]: 2025-09-12 22:06:40.950 [INFO][3787] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliae5fbabe0d6 ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Namespace="calico-system" Pod="whisker-7c6c9f44dc-kjhbs" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" Sep 12 22:06:40.986194 containerd[1553]: 2025-09-12 22:06:40.960 [INFO][3787] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Namespace="calico-system" Pod="whisker-7c6c9f44dc-kjhbs" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" Sep 12 22:06:40.986194 containerd[1553]: 2025-09-12 22:06:40.961 [INFO][3787] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Namespace="calico-system" Pod="whisker-7c6c9f44dc-kjhbs" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0", GenerateName:"whisker-7c6c9f44dc-", Namespace:"calico-system", SelfLink:"", 
UID:"d8838147-8f72-4768-9bc8-ce693b71d129", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"7c6c9f44dc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628", Pod:"whisker-7c6c9f44dc-kjhbs", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.1.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"caliae5fbabe0d6", MAC:"12:1e:b1:6a:64:f7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:40.986799 containerd[1553]: 2025-09-12 22:06:40.981 [INFO][3787] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" Namespace="calico-system" Pod="whisker-7c6c9f44dc-kjhbs" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-whisker--7c6c9f44dc--kjhbs-eth0" Sep 12 22:06:41.020794 containerd[1553]: time="2025-09-12T22:06:41.020630758Z" level=info msg="connecting to shim 7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628" address="unix:///run/containerd/s/b6effe1e3efda6fbe8829840f2b1f9060250b6454904ea369dfa02e1c5410efc" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:41.053739 systemd[1]: Started 
cri-containerd-7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628.scope - libcontainer container 7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628. Sep 12 22:06:41.101988 containerd[1553]: time="2025-09-12T22:06:41.101944782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c6c9f44dc-kjhbs,Uid:d8838147-8f72-4768-9bc8-ce693b71d129,Namespace:calico-system,Attempt:0,} returns sandbox id \"7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628\"" Sep 12 22:06:41.103849 containerd[1553]: time="2025-09-12T22:06:41.103757557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 12 22:06:41.109080 kubelet[2762]: I0912 22:06:41.109029 2762 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e600ded0-81cb-44c9-ba7c-7a86580bc3d1" path="/var/lib/kubelet/pods/e600ded0-81cb-44c9-ba7c-7a86580bc3d1/volumes" Sep 12 22:06:41.349737 kubelet[2762]: I0912 22:06:41.349622 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:06:42.529487 systemd-networkd[1421]: caliae5fbabe0d6: Gained IPv6LL Sep 12 22:06:42.599848 containerd[1553]: time="2025-09-12T22:06:42.599798565Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:42.602422 containerd[1553]: time="2025-09-12T22:06:42.601750141Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 12 22:06:42.604480 containerd[1553]: time="2025-09-12T22:06:42.604432042Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:42.607622 containerd[1553]: time="2025-09-12T22:06:42.607370466Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:42.608365 containerd[1553]: time="2025-09-12T22:06:42.608306313Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.503021384s" Sep 12 22:06:42.608365 containerd[1553]: time="2025-09-12T22:06:42.608341393Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 12 22:06:42.613474 containerd[1553]: time="2025-09-12T22:06:42.613380713Z" level=info msg="CreateContainer within sandbox \"7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 12 22:06:42.625539 containerd[1553]: time="2025-09-12T22:06:42.623732276Z" level=info msg="Container 91021150164e7b07c61017b5c1bd8b7d981bef5b5c3335b467e4a23db3b67024: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:42.642059 containerd[1553]: time="2025-09-12T22:06:42.642004541Z" level=info msg="CreateContainer within sandbox \"7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"91021150164e7b07c61017b5c1bd8b7d981bef5b5c3335b467e4a23db3b67024\"" Sep 12 22:06:42.643068 containerd[1553]: time="2025-09-12T22:06:42.643038629Z" level=info msg="StartContainer for \"91021150164e7b07c61017b5c1bd8b7d981bef5b5c3335b467e4a23db3b67024\"" Sep 12 22:06:42.644912 containerd[1553]: time="2025-09-12T22:06:42.644871844Z" level=info msg="connecting to shim 
91021150164e7b07c61017b5c1bd8b7d981bef5b5c3335b467e4a23db3b67024" address="unix:///run/containerd/s/b6effe1e3efda6fbe8829840f2b1f9060250b6454904ea369dfa02e1c5410efc" protocol=ttrpc version=3 Sep 12 22:06:42.679734 systemd[1]: Started cri-containerd-91021150164e7b07c61017b5c1bd8b7d981bef5b5c3335b467e4a23db3b67024.scope - libcontainer container 91021150164e7b07c61017b5c1bd8b7d981bef5b5c3335b467e4a23db3b67024. Sep 12 22:06:42.729556 containerd[1553]: time="2025-09-12T22:06:42.729519437Z" level=info msg="StartContainer for \"91021150164e7b07c61017b5c1bd8b7d981bef5b5c3335b467e4a23db3b67024\" returns successfully" Sep 12 22:06:42.733614 containerd[1553]: time="2025-09-12T22:06:42.733052665Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 12 22:06:44.561474 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3954972556.mount: Deactivated successfully. Sep 12 22:06:44.591268 containerd[1553]: time="2025-09-12T22:06:44.591207802Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:44.592789 containerd[1553]: time="2025-09-12T22:06:44.592036169Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 12 22:06:44.593592 containerd[1553]: time="2025-09-12T22:06:44.593558820Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:44.598358 containerd[1553]: time="2025-09-12T22:06:44.598309016Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:44.600285 containerd[1553]: time="2025-09-12T22:06:44.600202070Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 1.866347598s" Sep 12 22:06:44.600285 containerd[1553]: time="2025-09-12T22:06:44.600286831Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 12 22:06:44.604344 containerd[1553]: time="2025-09-12T22:06:44.604291141Z" level=info msg="CreateContainer within sandbox \"7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 12 22:06:44.616816 containerd[1553]: time="2025-09-12T22:06:44.616708595Z" level=info msg="Container e15c89c83a987425855ceb807062d08dd87fb440120e3e07d94f651696707df7: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:44.636964 containerd[1553]: time="2025-09-12T22:06:44.636860267Z" level=info msg="CreateContainer within sandbox \"7f681d6347ef93efe0717a3a5deed6d15f19bfcc3a6c1d12d6362fcc46c2b628\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"e15c89c83a987425855ceb807062d08dd87fb440120e3e07d94f651696707df7\"" Sep 12 22:06:44.638683 containerd[1553]: time="2025-09-12T22:06:44.637853235Z" level=info msg="StartContainer for \"e15c89c83a987425855ceb807062d08dd87fb440120e3e07d94f651696707df7\"" Sep 12 22:06:44.647097 containerd[1553]: time="2025-09-12T22:06:44.646653541Z" level=info msg="connecting to shim e15c89c83a987425855ceb807062d08dd87fb440120e3e07d94f651696707df7" address="unix:///run/containerd/s/b6effe1e3efda6fbe8829840f2b1f9060250b6454904ea369dfa02e1c5410efc" protocol=ttrpc version=3 Sep 12 22:06:44.681791 systemd[1]: Started 
cri-containerd-e15c89c83a987425855ceb807062d08dd87fb440120e3e07d94f651696707df7.scope - libcontainer container e15c89c83a987425855ceb807062d08dd87fb440120e3e07d94f651696707df7. Sep 12 22:06:44.754409 containerd[1553]: time="2025-09-12T22:06:44.754310594Z" level=info msg="StartContainer for \"e15c89c83a987425855ceb807062d08dd87fb440120e3e07d94f651696707df7\" returns successfully" Sep 12 22:06:44.890233 kubelet[2762]: I0912 22:06:44.889856 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:06:45.001262 containerd[1553]: time="2025-09-12T22:06:45.001217858Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" id:\"2f58ae19aefe3b8b2fc148989c6a0b6abf0bc867b537af9af961b90e0ce9f5df\" pid:4101 exit_status:1 exited_at:{seconds:1757714805 nanos:812855}" Sep 12 22:06:45.116878 containerd[1553]: time="2025-09-12T22:06:45.116765509Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" id:\"148037d21c5c684772030f03c060dad44c29c02442ac34f0f5735a944dd2ec00\" pid:4123 exit_status:1 exited_at:{seconds:1757714805 nanos:116250745}" Sep 12 22:06:45.386062 kubelet[2762]: I0912 22:06:45.385922 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-7c6c9f44dc-kjhbs" podStartSLOduration=1.8875764419999999 podStartE2EDuration="5.385890089s" podCreationTimestamp="2025-09-12 22:06:40 +0000 UTC" firstStartedPulling="2025-09-12 22:06:41.103484675 +0000 UTC m=+36.168679226" lastFinishedPulling="2025-09-12 22:06:44.601798322 +0000 UTC m=+39.666992873" observedRunningTime="2025-09-12 22:06:45.383938755 +0000 UTC m=+40.449133306" watchObservedRunningTime="2025-09-12 22:06:45.385890089 +0000 UTC m=+40.451084600" Sep 12 22:06:47.104532 containerd[1553]: time="2025-09-12T22:06:47.104077789Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-8696478589-r2m8w,Uid:6252003c-5f62-4a5b-ae66-4eb96a9effa8,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:47.263773 systemd-networkd[1421]: cali6e69a643b52: Link UP Sep 12 22:06:47.264339 systemd-networkd[1421]: cali6e69a643b52: Gained carrier Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.141 [INFO][4179] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.162 [INFO][4179] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0 calico-kube-controllers-8696478589- calico-system 6252003c-5f62-4a5b-ae66-4eb96a9effa8 794 0 2025-09-12 22:06:27 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8696478589 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4459-0-0-7-af931fdd93 calico-kube-controllers-8696478589-r2m8w eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6e69a643b52 [] [] }} ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Namespace="calico-system" Pod="calico-kube-controllers-8696478589-r2m8w" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.162 [INFO][4179] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Namespace="calico-system" Pod="calico-kube-controllers-8696478589-r2m8w" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.194 
[INFO][4191] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" HandleID="k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.196 [INFO][4191] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" HandleID="k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-7-af931fdd93", "pod":"calico-kube-controllers-8696478589-r2m8w", "timestamp":"2025-09-12 22:06:47.194743423 +0000 UTC"}, Hostname:"ci-4459-0-0-7-af931fdd93", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.196 [INFO][4191] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.196 [INFO][4191] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
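The kubelet pod_startup_latency_tracker entry earlier (podStartSLOduration=1.8875764419999999, podStartE2EDuration=5.385890089s) is consistent with the SLO duration being the end-to-end duration minus the image-pull time. A quick check with the timestamps from that log line — the decomposition is inferred from these numbers, not quoted from kubelet's source:

```python
# Offsets in seconds within minute 22:06, copied from the
# pod_startup_latency_tracker log entry above. The relationship
# (SLO = E2E - pull time) is inferred from the numbers, not from kubelet code.
created            = 40.000000000   # podCreationTimestamp 22:06:40
first_started_pull = 41.103484675   # firstStartedPulling
last_finished_pull = 44.601798322   # lastFinishedPulling
observed_running   = 45.385890089   # watchObservedRunningTime

e2e = observed_running - created                         # podStartE2EDuration
slo = e2e - (last_finished_pull - first_started_pull)    # podStartSLOduration
```

The result reproduces both reported durations to within floating-point noise, which is also why the log prints 1.8875764419999999 rather than a round 1.887576442.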
Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.196 [INFO][4191] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-7-af931fdd93' Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.210 [INFO][4191] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.217 [INFO][4191] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.227 [INFO][4191] ipam/ipam.go 511: Trying affinity for 192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.233 [INFO][4191] ipam/ipam.go 158: Attempting to load block cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.237 [INFO][4191] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.237 [INFO][4191] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.1.64/26 handle="k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.240 [INFO][4191] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017 Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.246 [INFO][4191] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.1.64/26 handle="k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.256 [INFO][4191] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.1.66/26] block=192.168.1.64/26 handle="k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.256 [INFO][4191] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.1.66/26] handle="k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:47.293185 containerd[1553]: 2025-09-12 22:06:47.256 [INFO][4191] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:06:47.293942 containerd[1553]: 2025-09-12 22:06:47.256 [INFO][4191] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.1.66/26] IPv6=[] ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" HandleID="k8s-pod-network.afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" Sep 12 22:06:47.293942 containerd[1553]: 2025-09-12 22:06:47.260 [INFO][4179] cni-plugin/k8s.go 418: Populated endpoint ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Namespace="calico-system" Pod="calico-kube-controllers-8696478589-r2m8w" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0", GenerateName:"calico-kube-controllers-8696478589-", Namespace:"calico-system", SelfLink:"", UID:"6252003c-5f62-4a5b-ae66-4eb96a9effa8", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8696478589", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"", Pod:"calico-kube-controllers-8696478589-r2m8w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.1.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6e69a643b52", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:47.293942 containerd[1553]: 2025-09-12 22:06:47.260 [INFO][4179] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.1.66/32] ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Namespace="calico-system" Pod="calico-kube-controllers-8696478589-r2m8w" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" Sep 12 22:06:47.293942 containerd[1553]: 2025-09-12 22:06:47.260 [INFO][4179] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6e69a643b52 ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Namespace="calico-system" Pod="calico-kube-controllers-8696478589-r2m8w" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" Sep 12 22:06:47.293942 containerd[1553]: 2025-09-12 22:06:47.262 [INFO][4179] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Namespace="calico-system" Pod="calico-kube-controllers-8696478589-r2m8w" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" Sep 12 22:06:47.294433 containerd[1553]: 2025-09-12 22:06:47.266 [INFO][4179] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Namespace="calico-system" Pod="calico-kube-controllers-8696478589-r2m8w" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0", GenerateName:"calico-kube-controllers-8696478589-", Namespace:"calico-system", SelfLink:"", UID:"6252003c-5f62-4a5b-ae66-4eb96a9effa8", ResourceVersion:"794", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8696478589", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017", Pod:"calico-kube-controllers-8696478589-r2m8w", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.1.66/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6e69a643b52", MAC:"1a:cf:20:04:c2:b5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:47.294433 containerd[1553]: 2025-09-12 22:06:47.290 [INFO][4179] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" Namespace="calico-system" Pod="calico-kube-controllers-8696478589-r2m8w" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--kube--controllers--8696478589--r2m8w-eth0" Sep 12 22:06:47.325179 containerd[1553]: time="2025-09-12T22:06:47.325041734Z" level=info msg="connecting to shim afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017" address="unix:///run/containerd/s/a68c8e7fe7da6731f68aab3d7b41312988649ef16eb7ad90ef4edb29b40756f1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:47.360849 systemd[1]: Started cri-containerd-afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017.scope - libcontainer container afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017. 
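containerd's RunPodSandbox messages in this log carry the pod name, UID, and namespace inside a single &PodSandboxMetadata{...} literal. A small sketch for pulling those fields out of a journal line; the regex is written only against the message shape seen in this log, not against any documented containerd format:

```python
import re

# Extract pod identity from containerd "RunPodSandbox" journal lines.
# The pattern targets the &PodSandboxMetadata{...} shape seen in this log.
SANDBOX_RE = re.compile(
    r"RunPodSandbox for &PodSandboxMetadata\{"
    r"Name:(?P<name>[^,]+),Uid:(?P<uid>[^,]+),Namespace:(?P<ns>[^,]+)"
)

def parse_sandbox(line):
    """Return (namespace, pod name, uid) for a RunPodSandbox line, else None."""
    m = SANDBOX_RE.search(line)
    return (m.group("ns"), m.group("name"), m.group("uid")) if m else None

# One of the whisker pod's sandbox lines from this log, quotes unescaped:
line = ('Sep 12 22:06:40.765835 containerd[1553]: time="2025-09-12T22:06:40.765212621Z" '
        'level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7c6c9f44dc-kjhbs,'
        'Uid:d8838147-8f72-4768-9bc8-ce693b71d129,Namespace:calico-system,Attempt:0,}"')
```

Applied to the sample line, this yields ("calico-system", "whisker-7c6c9f44dc-kjhbs", "d8838147-8f72-4768-9bc8-ce693b71d129"); non-matching journal lines return None.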
Sep 12 22:06:47.408757 containerd[1553]: time="2025-09-12T22:06:47.408647718Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8696478589-r2m8w,Uid:6252003c-5f62-4a5b-ae66-4eb96a9effa8,Namespace:calico-system,Attempt:0,} returns sandbox id \"afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017\"" Sep 12 22:06:47.414615 containerd[1553]: time="2025-09-12T22:06:47.414500559Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 12 22:06:48.415816 systemd-networkd[1421]: cali6e69a643b52: Gained IPv6LL Sep 12 22:06:49.110875 containerd[1553]: time="2025-09-12T22:06:49.110581887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mr7s6,Uid:9c8a5f44-6d52-4bd2-b382-1b9dd40eab85,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:49.128013 containerd[1553]: time="2025-09-12T22:06:49.127932522Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-5qdsb,Uid:b6d05037-1881-4fc1-bcbd-569865ed54b9,Namespace:calico-system,Attempt:0,}" Sep 12 22:06:49.128343 containerd[1553]: time="2025-09-12T22:06:49.128289085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-jbgw9,Uid:7742b50f-d16e-46b2-aaad-1853990171fc,Namespace:kube-system,Attempt:0,}" Sep 12 22:06:49.128655 containerd[1553]: time="2025-09-12T22:06:49.128628247Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xl5lq,Uid:1fd9bc8e-8eaf-4b7f-8777-056108d2728d,Namespace:kube-system,Attempt:0,}" Sep 12 22:06:49.560915 systemd-networkd[1421]: cali19cf70b8272: Link UP Sep 12 22:06:49.563481 systemd-networkd[1421]: cali19cf70b8272: Gained carrier Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.240 [INFO][4281] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.313 [INFO][4281] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0 csi-node-driver- calico-system 9c8a5f44-6d52-4bd2-b382-1b9dd40eab85 670 0 2025-09-12 22:06:27 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4459-0-0-7-af931fdd93 csi-node-driver-mr7s6 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali19cf70b8272 [] [] }} ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Namespace="calico-system" Pod="csi-node-driver-mr7s6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.313 [INFO][4281] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Namespace="calico-system" Pod="csi-node-driver-mr7s6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.437 [INFO][4349] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" HandleID="k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Workload="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.440 [INFO][4349] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" HandleID="k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Workload="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000330770), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-7-af931fdd93", "pod":"csi-node-driver-mr7s6", "timestamp":"2025-09-12 22:06:49.437532061 +0000 UTC"}, Hostname:"ci-4459-0-0-7-af931fdd93", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.440 [INFO][4349] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.440 [INFO][4349] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.440 [INFO][4349] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-7-af931fdd93' Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.463 [INFO][4349] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.482 [INFO][4349] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.507 [INFO][4349] ipam/ipam.go 511: Trying affinity for 192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.514 [INFO][4349] ipam/ipam.go 158: Attempting to load block cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.524 [INFO][4349] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.524 [INFO][4349] ipam/ipam.go 1220: 
Attempting to assign 1 addresses from block block=192.168.1.64/26 handle="k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.530 [INFO][4349] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065 Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.540 [INFO][4349] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.1.64/26 handle="k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.551 [INFO][4349] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.1.67/26] block=192.168.1.64/26 handle="k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.551 [INFO][4349] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.1.67/26] handle="k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.600372 containerd[1553]: 2025-09-12 22:06:49.551 [INFO][4349] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:06:49.602166 containerd[1553]: 2025-09-12 22:06:49.552 [INFO][4349] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.1.67/26] IPv6=[] ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" HandleID="k8s-pod-network.1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Workload="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" Sep 12 22:06:49.602166 containerd[1553]: 2025-09-12 22:06:49.555 [INFO][4281] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Namespace="calico-system" Pod="csi-node-driver-mr7s6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9c8a5f44-6d52-4bd2-b382-1b9dd40eab85", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"", Pod:"csi-node-driver-mr7s6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.1.67/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali19cf70b8272", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:49.602166 containerd[1553]: 2025-09-12 22:06:49.556 [INFO][4281] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.1.67/32] ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Namespace="calico-system" Pod="csi-node-driver-mr7s6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" Sep 12 22:06:49.602166 containerd[1553]: 2025-09-12 22:06:49.556 [INFO][4281] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali19cf70b8272 ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Namespace="calico-system" Pod="csi-node-driver-mr7s6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" Sep 12 22:06:49.602166 containerd[1553]: 2025-09-12 22:06:49.563 [INFO][4281] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Namespace="calico-system" Pod="csi-node-driver-mr7s6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" Sep 12 22:06:49.602166 containerd[1553]: 2025-09-12 22:06:49.564 [INFO][4281] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Namespace="calico-system" Pod="csi-node-driver-mr7s6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"9c8a5f44-6d52-4bd2-b382-1b9dd40eab85", ResourceVersion:"670", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 27, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065", Pod:"csi-node-driver-mr7s6", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.1.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali19cf70b8272", MAC:"a6:2e:46:48:f2:11", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:49.602408 containerd[1553]: 2025-09-12 22:06:49.598 [INFO][4281] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" Namespace="calico-system" Pod="csi-node-driver-mr7s6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-csi--node--driver--mr7s6-eth0" Sep 12 22:06:49.646879 containerd[1553]: time="2025-09-12T22:06:49.646346529Z" level=info msg="connecting to shim 1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065" 
address="unix:///run/containerd/s/24ceaac3ae3d378bffe2e7c6a57ab345599917d5a51a09f1be73c4cd4b2ed2be" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:49.669674 systemd-networkd[1421]: cali7c0626cfe37: Link UP Sep 12 22:06:49.671744 systemd-networkd[1421]: cali7c0626cfe37: Gained carrier Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.283 [INFO][4317] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.346 [INFO][4317] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0 coredns-7c65d6cfc9- kube-system 1fd9bc8e-8eaf-4b7f-8777-056108d2728d 787 0 2025-09-12 22:06:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-0-0-7-af931fdd93 coredns-7c65d6cfc9-xl5lq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7c0626cfe37 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xl5lq" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.347 [INFO][4317] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xl5lq" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.472 [INFO][4357] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" 
HandleID="k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Workload="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.473 [INFO][4357] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" HandleID="k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Workload="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000264d00), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-0-0-7-af931fdd93", "pod":"coredns-7c65d6cfc9-xl5lq", "timestamp":"2025-09-12 22:06:49.472538654 +0000 UTC"}, Hostname:"ci-4459-0-0-7-af931fdd93", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.473 [INFO][4357] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.551 [INFO][4357] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.551 [INFO][4357] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-7-af931fdd93' Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.578 [INFO][4357] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.587 [INFO][4357] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.613 [INFO][4357] ipam/ipam.go 511: Trying affinity for 192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.619 [INFO][4357] ipam/ipam.go 158: Attempting to load block cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.624 [INFO][4357] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.624 [INFO][4357] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.1.64/26 handle="k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.628 [INFO][4357] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555 Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.641 [INFO][4357] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.1.64/26 handle="k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.651 [INFO][4357] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.1.68/26] block=192.168.1.64/26 handle="k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.651 [INFO][4357] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.1.68/26] handle="k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.703081 containerd[1553]: 2025-09-12 22:06:49.651 [INFO][4357] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:06:49.703828 containerd[1553]: 2025-09-12 22:06:49.651 [INFO][4357] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.1.68/26] IPv6=[] ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" HandleID="k8s-pod-network.0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Workload="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" Sep 12 22:06:49.703828 containerd[1553]: 2025-09-12 22:06:49.658 [INFO][4317] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xl5lq" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1fd9bc8e-8eaf-4b7f-8777-056108d2728d", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"", Pod:"coredns-7c65d6cfc9-xl5lq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.1.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7c0626cfe37", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:49.703828 containerd[1553]: 2025-09-12 22:06:49.658 [INFO][4317] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.1.68/32] ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xl5lq" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" Sep 12 22:06:49.703828 containerd[1553]: 2025-09-12 22:06:49.658 [INFO][4317] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7c0626cfe37 ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xl5lq" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" Sep 12 22:06:49.703828 containerd[1553]: 2025-09-12 22:06:49.672 [INFO][4317] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xl5lq" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" Sep 12 22:06:49.704028 containerd[1553]: 2025-09-12 22:06:49.677 [INFO][4317] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xl5lq" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"1fd9bc8e-8eaf-4b7f-8777-056108d2728d", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555", Pod:"coredns-7c65d6cfc9-xl5lq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.1.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7c0626cfe37", 
MAC:"3e:f4:d3:4a:38:1e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:49.704028 containerd[1553]: 2025-09-12 22:06:49.691 [INFO][4317] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" Namespace="kube-system" Pod="coredns-7c65d6cfc9-xl5lq" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--xl5lq-eth0" Sep 12 22:06:49.703864 systemd[1]: Started cri-containerd-1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065.scope - libcontainer container 1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065. 
Sep 12 22:06:49.740464 containerd[1553]: time="2025-09-12T22:06:49.740353714Z" level=info msg="connecting to shim 0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555" address="unix:///run/containerd/s/3e14581d813f1b6354962565e083b5e8cc14c92447e1588415af8518d2c005b1" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:49.767490 systemd-networkd[1421]: cali9fa1271b1df: Link UP Sep 12 22:06:49.772161 systemd-networkd[1421]: cali9fa1271b1df: Gained carrier Sep 12 22:06:49.797782 containerd[1553]: time="2025-09-12T22:06:49.797724296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-mr7s6,Uid:9c8a5f44-6d52-4bd2-b382-1b9dd40eab85,Namespace:calico-system,Attempt:0,} returns sandbox id \"1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065\"" Sep 12 22:06:49.801768 systemd[1]: Started cri-containerd-0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555.scope - libcontainer container 0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555. 
Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.314 [INFO][4299] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.361 [INFO][4299] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0 goldmane-7988f88666- calico-system b6d05037-1881-4fc1-bcbd-569865ed54b9 796 0 2025-09-12 22:06:26 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4459-0-0-7-af931fdd93 goldmane-7988f88666-5qdsb eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali9fa1271b1df [] [] }} ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Namespace="calico-system" Pod="goldmane-7988f88666-5qdsb" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.362 [INFO][4299] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Namespace="calico-system" Pod="goldmane-7988f88666-5qdsb" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.492 [INFO][4359] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" HandleID="k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Workload="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.493 [INFO][4359] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" HandleID="k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Workload="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400038d7a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4459-0-0-7-af931fdd93", "pod":"goldmane-7988f88666-5qdsb", "timestamp":"2025-09-12 22:06:49.492242545 +0000 UTC"}, Hostname:"ci-4459-0-0-7-af931fdd93", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.493 [INFO][4359] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.651 [INFO][4359] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.653 [INFO][4359] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-7-af931fdd93' Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.683 [INFO][4359] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.702 [INFO][4359] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.713 [INFO][4359] ipam/ipam.go 511: Trying affinity for 192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.716 [INFO][4359] ipam/ipam.go 158: Attempting to load block cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.720 [INFO][4359] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.720 [INFO][4359] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.1.64/26 handle="k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.725 [INFO][4359] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0 Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.734 [INFO][4359] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.1.64/26 handle="k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.755 [INFO][4359] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.1.69/26] block=192.168.1.64/26 handle="k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.755 [INFO][4359] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.1.69/26] handle="k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.813499 containerd[1553]: 2025-09-12 22:06:49.755 [INFO][4359] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:06:49.814052 containerd[1553]: 2025-09-12 22:06:49.755 [INFO][4359] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.1.69/26] IPv6=[] ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" HandleID="k8s-pod-network.c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Workload="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" Sep 12 22:06:49.814052 containerd[1553]: 2025-09-12 22:06:49.763 [INFO][4299] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Namespace="calico-system" Pod="goldmane-7988f88666-5qdsb" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"b6d05037-1881-4fc1-bcbd-569865ed54b9", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"", Pod:"goldmane-7988f88666-5qdsb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.1.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali9fa1271b1df", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:49.814052 containerd[1553]: 2025-09-12 22:06:49.763 [INFO][4299] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.1.69/32] ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Namespace="calico-system" Pod="goldmane-7988f88666-5qdsb" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" Sep 12 22:06:49.814052 containerd[1553]: 2025-09-12 22:06:49.763 [INFO][4299] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9fa1271b1df ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Namespace="calico-system" Pod="goldmane-7988f88666-5qdsb" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" Sep 12 22:06:49.814052 containerd[1553]: 2025-09-12 22:06:49.775 [INFO][4299] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Namespace="calico-system" Pod="goldmane-7988f88666-5qdsb" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" Sep 12 22:06:49.814052 containerd[1553]: 2025-09-12 22:06:49.776 [INFO][4299] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Namespace="calico-system" Pod="goldmane-7988f88666-5qdsb" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"b6d05037-1881-4fc1-bcbd-569865ed54b9", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0", Pod:"goldmane-7988f88666-5qdsb", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.1.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali9fa1271b1df", MAC:"fa:0c:ab:7f:c1:ef", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:49.814228 containerd[1553]: 2025-09-12 22:06:49.807 [INFO][4299] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" Namespace="calico-system" Pod="goldmane-7988f88666-5qdsb" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-goldmane--7988f88666--5qdsb-eth0" Sep 12 22:06:49.873958 containerd[1553]: time="2025-09-12T22:06:49.873720441Z" level=info msg="connecting to shim c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0" address="unix:///run/containerd/s/24f20cd12ec48286365a2cf5d1a09ca767abc44296eb23f5736ab6270f905a77" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:49.884985 systemd-networkd[1421]: cali4f011a6ff3f: Link UP Sep 12 22:06:49.887099 systemd-networkd[1421]: cali4f011a6ff3f: Gained carrier Sep 12 22:06:49.903697 containerd[1553]: time="2025-09-12T22:06:49.903630640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-xl5lq,Uid:1fd9bc8e-8eaf-4b7f-8777-056108d2728d,Namespace:kube-system,Attempt:0,} returns sandbox id \"0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555\"" Sep 12 22:06:49.912545 containerd[1553]: time="2025-09-12T22:06:49.912343938Z" level=info msg="CreateContainer within sandbox \"0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.329 [INFO][4306] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.387 [INFO][4306] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0 coredns-7c65d6cfc9- kube-system 7742b50f-d16e-46b2-aaad-1853990171fc 795 0 2025-09-12 22:06:10 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4459-0-0-7-af931fdd93 
coredns-7c65d6cfc9-jbgw9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4f011a6ff3f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-jbgw9" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.390 [INFO][4306] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-jbgw9" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.534 [INFO][4368] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" HandleID="k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Workload="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.535 [INFO][4368] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" HandleID="k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Workload="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000332450), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4459-0-0-7-af931fdd93", "pod":"coredns-7c65d6cfc9-jbgw9", "timestamp":"2025-09-12 22:06:49.534927588 +0000 UTC"}, Hostname:"ci-4459-0-0-7-af931fdd93", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 
12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.535 [INFO][4368] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.755 [INFO][4368] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.755 [INFO][4368] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-7-af931fdd93' Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.786 [INFO][4368] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.806 [INFO][4368] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.824 [INFO][4368] ipam/ipam.go 511: Trying affinity for 192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.829 [INFO][4368] ipam/ipam.go 158: Attempting to load block cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.836 [INFO][4368] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.836 [INFO][4368] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.1.64/26 handle="k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.840 [INFO][4368] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0 Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.857 [INFO][4368] ipam/ipam.go 1243: Writing block in order 
to claim IPs block=192.168.1.64/26 handle="k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.872 [INFO][4368] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.1.70/26] block=192.168.1.64/26 handle="k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.872 [INFO][4368] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.1.70/26] handle="k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:49.923542 containerd[1553]: 2025-09-12 22:06:49.873 [INFO][4368] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:06:49.924080 containerd[1553]: 2025-09-12 22:06:49.873 [INFO][4368] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.1.70/26] IPv6=[] ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" HandleID="k8s-pod-network.a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Workload="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" Sep 12 22:06:49.924080 containerd[1553]: 2025-09-12 22:06:49.879 [INFO][4306] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-jbgw9" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7742b50f-d16e-46b2-aaad-1853990171fc", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, 
time.September, 12, 22, 6, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"", Pod:"coredns-7c65d6cfc9-jbgw9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.1.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4f011a6ff3f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:49.924080 containerd[1553]: 2025-09-12 22:06:49.880 [INFO][4306] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.1.70/32] ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-jbgw9" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" Sep 12 22:06:49.924080 containerd[1553]: 2025-09-12 22:06:49.880 [INFO][4306] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4f011a6ff3f 
ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-jbgw9" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" Sep 12 22:06:49.924080 containerd[1553]: 2025-09-12 22:06:49.888 [INFO][4306] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-jbgw9" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" Sep 12 22:06:49.924261 containerd[1553]: 2025-09-12 22:06:49.890 [INFO][4306] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-jbgw9" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"7742b50f-d16e-46b2-aaad-1853990171fc", ResourceVersion:"795", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", 
ContainerID:"a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0", Pod:"coredns-7c65d6cfc9-jbgw9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.1.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4f011a6ff3f", MAC:"4a:35:24:f2:97:d8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:49.924261 containerd[1553]: 2025-09-12 22:06:49.917 [INFO][4306] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" Namespace="kube-system" Pod="coredns-7c65d6cfc9-jbgw9" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-coredns--7c65d6cfc9--jbgw9-eth0" Sep 12 22:06:49.932857 systemd[1]: Started cri-containerd-c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0.scope - libcontainer container c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0. 
Sep 12 22:06:49.940673 containerd[1553]: time="2025-09-12T22:06:49.940622086Z" level=info msg="Container 397312a3bd1195a46582bb12ab5393a5dbab5b5510b5a5683526d201d2d1711c: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:49.955711 containerd[1553]: time="2025-09-12T22:06:49.955665626Z" level=info msg="CreateContainer within sandbox \"0e13698e56c7675fb7471d330d7edeef225918fd96c868632cb2fe579ac8e555\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"397312a3bd1195a46582bb12ab5393a5dbab5b5510b5a5683526d201d2d1711c\"" Sep 12 22:06:49.962536 containerd[1553]: time="2025-09-12T22:06:49.962007668Z" level=info msg="StartContainer for \"397312a3bd1195a46582bb12ab5393a5dbab5b5510b5a5683526d201d2d1711c\"" Sep 12 22:06:49.964307 containerd[1553]: time="2025-09-12T22:06:49.964224802Z" level=info msg="connecting to shim 397312a3bd1195a46582bb12ab5393a5dbab5b5510b5a5683526d201d2d1711c" address="unix:///run/containerd/s/3e14581d813f1b6354962565e083b5e8cc14c92447e1588415af8518d2c005b1" protocol=ttrpc version=3 Sep 12 22:06:49.995076 containerd[1553]: time="2025-09-12T22:06:49.994856766Z" level=info msg="connecting to shim a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0" address="unix:///run/containerd/s/2332c5d6555afc0dcc9419aa1c78c63fa12287bbd626e375702b05f32e44e229" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:50.008931 systemd[1]: Started cri-containerd-397312a3bd1195a46582bb12ab5393a5dbab5b5510b5a5683526d201d2d1711c.scope - libcontainer container 397312a3bd1195a46582bb12ab5393a5dbab5b5510b5a5683526d201d2d1711c. 
Sep 12 22:06:50.043535 containerd[1553]: time="2025-09-12T22:06:50.043453882Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-5qdsb,Uid:b6d05037-1881-4fc1-bcbd-569865ed54b9,Namespace:calico-system,Attempt:0,} returns sandbox id \"c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0\"" Sep 12 22:06:50.067927 systemd[1]: Started cri-containerd-a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0.scope - libcontainer container a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0. Sep 12 22:06:50.096896 containerd[1553]: time="2025-09-12T22:06:50.096763188Z" level=info msg="StartContainer for \"397312a3bd1195a46582bb12ab5393a5dbab5b5510b5a5683526d201d2d1711c\" returns successfully" Sep 12 22:06:50.106731 containerd[1553]: time="2025-09-12T22:06:50.106654132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567b5c8b5-jlsm6,Uid:1760c214-51ea-4d0b-970c-4a27ed78a891,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:06:50.171600 containerd[1553]: time="2025-09-12T22:06:50.171475433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-jbgw9,Uid:7742b50f-d16e-46b2-aaad-1853990171fc,Namespace:kube-system,Attempt:0,} returns sandbox id \"a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0\"" Sep 12 22:06:50.179015 containerd[1553]: time="2025-09-12T22:06:50.178961721Z" level=info msg="CreateContainer within sandbox \"a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 12 22:06:50.235020 containerd[1553]: time="2025-09-12T22:06:50.234973124Z" level=info msg="Container 85cd790d19a9804254b76bbef6a0f4ba3ab8303337098372d8d7dd4f5b7a9e21: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:50.241146 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3694826899.mount: Deactivated successfully. 
Sep 12 22:06:50.247262 containerd[1553]: time="2025-09-12T22:06:50.247189724Z" level=info msg="CreateContainer within sandbox \"a4efceac9469e7794edc9b141cae78a769c9e38981bb50576e6ac3e099c61cc0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"85cd790d19a9804254b76bbef6a0f4ba3ab8303337098372d8d7dd4f5b7a9e21\"" Sep 12 22:06:50.252058 containerd[1553]: time="2025-09-12T22:06:50.251574912Z" level=info msg="StartContainer for \"85cd790d19a9804254b76bbef6a0f4ba3ab8303337098372d8d7dd4f5b7a9e21\"" Sep 12 22:06:50.253815 containerd[1553]: time="2025-09-12T22:06:50.253757086Z" level=info msg="connecting to shim 85cd790d19a9804254b76bbef6a0f4ba3ab8303337098372d8d7dd4f5b7a9e21" address="unix:///run/containerd/s/2332c5d6555afc0dcc9419aa1c78c63fa12287bbd626e375702b05f32e44e229" protocol=ttrpc version=3 Sep 12 22:06:50.300756 systemd[1]: Started cri-containerd-85cd790d19a9804254b76bbef6a0f4ba3ab8303337098372d8d7dd4f5b7a9e21.scope - libcontainer container 85cd790d19a9804254b76bbef6a0f4ba3ab8303337098372d8d7dd4f5b7a9e21. 
Sep 12 22:06:50.410491 containerd[1553]: time="2025-09-12T22:06:50.409597977Z" level=info msg="StartContainer for \"85cd790d19a9804254b76bbef6a0f4ba3ab8303337098372d8d7dd4f5b7a9e21\" returns successfully" Sep 12 22:06:50.460909 kubelet[2762]: I0912 22:06:50.459547 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-xl5lq" podStartSLOduration=40.4595295 podStartE2EDuration="40.4595295s" podCreationTimestamp="2025-09-12 22:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:06:50.458941017 +0000 UTC m=+45.524135568" watchObservedRunningTime="2025-09-12 22:06:50.4595295 +0000 UTC m=+45.524724011" Sep 12 22:06:50.471958 kubelet[2762]: I0912 22:06:50.471918 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:06:50.531230 systemd-networkd[1421]: cali457b15ccbf8: Link UP Sep 12 22:06:50.534123 systemd-networkd[1421]: cali457b15ccbf8: Gained carrier Sep 12 22:06:50.582523 kubelet[2762]: I0912 22:06:50.581240 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-jbgw9" podStartSLOduration=40.58121789 podStartE2EDuration="40.58121789s" podCreationTimestamp="2025-09-12 22:06:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-12 22:06:50.571379626 +0000 UTC m=+45.636574177" watchObservedRunningTime="2025-09-12 22:06:50.58121789 +0000 UTC m=+45.646412441" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.200 [INFO][4610] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.288 [INFO][4610] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} 
{ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0 calico-apiserver-567b5c8b5- calico-apiserver 1760c214-51ea-4d0b-970c-4a27ed78a891 793 0 2025-09-12 22:06:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:567b5c8b5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-7-af931fdd93 calico-apiserver-567b5c8b5-jlsm6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali457b15ccbf8 [] [] }} ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-jlsm6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.288 [INFO][4610] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-jlsm6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.387 [INFO][4653] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" HandleID="k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.387 [INFO][4653] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" HandleID="k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" 
Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000103910), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-7-af931fdd93", "pod":"calico-apiserver-567b5c8b5-jlsm6", "timestamp":"2025-09-12 22:06:50.387127471 +0000 UTC"}, Hostname:"ci-4459-0-0-7-af931fdd93", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.387 [INFO][4653] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.387 [INFO][4653] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.387 [INFO][4653] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-7-af931fdd93' Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.412 [INFO][4653] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.431 [INFO][4653] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.443 [INFO][4653] ipam/ipam.go 511: Trying affinity for 192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.449 [INFO][4653] ipam/ipam.go 158: Attempting to load block cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.456 [INFO][4653] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 
22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.456 [INFO][4653] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.1.64/26 handle="k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.478 [INFO][4653] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4 Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.488 [INFO][4653] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.1.64/26 handle="k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.510 [INFO][4653] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.1.71/26] block=192.168.1.64/26 handle="k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.510 [INFO][4653] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.1.71/26] handle="k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:50.593934 containerd[1553]: 2025-09-12 22:06:50.510 [INFO][4653] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 12 22:06:50.595039 containerd[1553]: 2025-09-12 22:06:50.510 [INFO][4653] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.1.71/26] IPv6=[] ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" HandleID="k8s-pod-network.b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" Sep 12 22:06:50.595039 containerd[1553]: 2025-09-12 22:06:50.521 [INFO][4610] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-jlsm6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0", GenerateName:"calico-apiserver-567b5c8b5-", Namespace:"calico-apiserver", SelfLink:"", UID:"1760c214-51ea-4d0b-970c-4a27ed78a891", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"567b5c8b5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"", Pod:"calico-apiserver-567b5c8b5-jlsm6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.1.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali457b15ccbf8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:50.595039 containerd[1553]: 2025-09-12 22:06:50.522 [INFO][4610] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.1.71/32] ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-jlsm6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" Sep 12 22:06:50.595039 containerd[1553]: 2025-09-12 22:06:50.523 [INFO][4610] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali457b15ccbf8 ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-jlsm6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" Sep 12 22:06:50.595039 containerd[1553]: 2025-09-12 22:06:50.537 [INFO][4610] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-jlsm6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" Sep 12 22:06:50.595183 containerd[1553]: 2025-09-12 22:06:50.543 [INFO][4610] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-jlsm6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0", GenerateName:"calico-apiserver-567b5c8b5-", Namespace:"calico-apiserver", SelfLink:"", UID:"1760c214-51ea-4d0b-970c-4a27ed78a891", ResourceVersion:"793", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"567b5c8b5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4", Pod:"calico-apiserver-567b5c8b5-jlsm6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.1.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali457b15ccbf8", MAC:"0a:a4:d5:e5:28:01", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:50.595183 containerd[1553]: 2025-09-12 22:06:50.585 [INFO][4610] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-jlsm6" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--jlsm6-eth0" Sep 12 22:06:50.682591 containerd[1553]: time="2025-09-12T22:06:50.681627461Z" level=info 
msg="connecting to shim b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4" address="unix:///run/containerd/s/370df7475240e3f40a38a2758c9ca2d81e54bf5fc3611f5ed0c385e1700587c3" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:50.734076 systemd[1]: Started cri-containerd-b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4.scope - libcontainer container b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4. Sep 12 22:06:50.881410 containerd[1553]: time="2025-09-12T22:06:50.880690112Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567b5c8b5-jlsm6,Uid:1760c214-51ea-4d0b-970c-4a27ed78a891,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4\"" Sep 12 22:06:50.912235 systemd-networkd[1421]: cali7c0626cfe37: Gained IPv6LL Sep 12 22:06:51.039783 systemd-networkd[1421]: cali9fa1271b1df: Gained IPv6LL Sep 12 22:06:51.167832 systemd-networkd[1421]: cali19cf70b8272: Gained IPv6LL Sep 12 22:06:51.680821 systemd-networkd[1421]: cali4f011a6ff3f: Gained IPv6LL Sep 12 22:06:51.760183 containerd[1553]: time="2025-09-12T22:06:51.758768805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:51.762939 containerd[1553]: time="2025-09-12T22:06:51.762039746Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 12 22:06:51.766183 containerd[1553]: time="2025-09-12T22:06:51.766135692Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:51.770381 containerd[1553]: time="2025-09-12T22:06:51.770216038Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:51.773861 systemd-networkd[1421]: vxlan.calico: Link UP Sep 12 22:06:51.773870 systemd-networkd[1421]: vxlan.calico: Gained carrier Sep 12 22:06:51.775481 containerd[1553]: time="2025-09-12T22:06:51.775437551Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 4.360855711s" Sep 12 22:06:51.775641 containerd[1553]: time="2025-09-12T22:06:51.775625712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 12 22:06:51.788570 containerd[1553]: time="2025-09-12T22:06:51.788204032Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\"" Sep 12 22:06:51.819997 containerd[1553]: time="2025-09-12T22:06:51.819963392Z" level=info msg="CreateContainer within sandbox \"afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 12 22:06:51.839888 containerd[1553]: time="2025-09-12T22:06:51.839830078Z" level=info msg="Container 2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:51.844909 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount204409015.mount: Deactivated successfully. 
Sep 12 22:06:51.873388 containerd[1553]: time="2025-09-12T22:06:51.870761794Z" level=info msg="CreateContainer within sandbox \"afdd4d7dff96f93d29a89d0786da6a5f9c0b4146bd5a0b1b99d374529bccf017\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\"" Sep 12 22:06:51.877214 containerd[1553]: time="2025-09-12T22:06:51.875487944Z" level=info msg="StartContainer for \"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\"" Sep 12 22:06:51.886056 containerd[1553]: time="2025-09-12T22:06:51.886001490Z" level=info msg="connecting to shim 2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197" address="unix:///run/containerd/s/a68c8e7fe7da6731f68aab3d7b41312988649ef16eb7ad90ef4edb29b40756f1" protocol=ttrpc version=3 Sep 12 22:06:51.917314 systemd[1]: Started cri-containerd-2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197.scope - libcontainer container 2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197. 
Sep 12 22:06:51.990365 containerd[1553]: time="2025-09-12T22:06:51.990313910Z" level=info msg="StartContainer for \"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" returns successfully" Sep 12 22:06:52.063921 systemd-networkd[1421]: cali457b15ccbf8: Gained IPv6LL Sep 12 22:06:52.106043 containerd[1553]: time="2025-09-12T22:06:52.105995426Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567b5c8b5-9gng8,Uid:111ee55a-9453-48a3-8be1-100851a1a1ea,Namespace:calico-apiserver,Attempt:0,}" Sep 12 22:06:52.377044 systemd-networkd[1421]: cali045bb48fb8d: Link UP Sep 12 22:06:52.377649 systemd-networkd[1421]: cali045bb48fb8d: Gained carrier Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.206 [INFO][4888] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0 calico-apiserver-567b5c8b5- calico-apiserver 111ee55a-9453-48a3-8be1-100851a1a1ea 792 0 2025-09-12 22:06:20 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:567b5c8b5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4459-0-0-7-af931fdd93 calico-apiserver-567b5c8b5-9gng8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali045bb48fb8d [] [] }} ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-9gng8" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.206 [INFO][4888] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Namespace="calico-apiserver" 
Pod="calico-apiserver-567b5c8b5-9gng8" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.274 [INFO][4914] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" HandleID="k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.276 [INFO][4914] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" HandleID="k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004dd60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4459-0-0-7-af931fdd93", "pod":"calico-apiserver-567b5c8b5-9gng8", "timestamp":"2025-09-12 22:06:52.274636867 +0000 UTC"}, Hostname:"ci-4459-0-0-7-af931fdd93", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.276 [INFO][4914] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.276 [INFO][4914] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.276 [INFO][4914] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4459-0-0-7-af931fdd93' Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.292 [INFO][4914] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.301 [INFO][4914] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.315 [INFO][4914] ipam/ipam.go 511: Trying affinity for 192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.321 [INFO][4914] ipam/ipam.go 158: Attempting to load block cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.329 [INFO][4914] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.1.64/26 host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.330 [INFO][4914] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.1.64/26 handle="k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.334 [INFO][4914] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00 Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.343 [INFO][4914] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.1.64/26 handle="k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.357 [INFO][4914] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.1.72/26] block=192.168.1.64/26 handle="k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.358 [INFO][4914] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.1.72/26] handle="k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" host="ci-4459-0-0-7-af931fdd93" Sep 12 22:06:52.411372 containerd[1553]: 2025-09-12 22:06:52.358 [INFO][4914] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 12 22:06:52.412196 containerd[1553]: 2025-09-12 22:06:52.358 [INFO][4914] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.1.72/26] IPv6=[] ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" HandleID="k8s-pod-network.f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Workload="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" Sep 12 22:06:52.412196 containerd[1553]: 2025-09-12 22:06:52.362 [INFO][4888] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-9gng8" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0", GenerateName:"calico-apiserver-567b5c8b5-", Namespace:"calico-apiserver", SelfLink:"", UID:"111ee55a-9453-48a3-8be1-100851a1a1ea", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"567b5c8b5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"", Pod:"calico-apiserver-567b5c8b5-9gng8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.1.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali045bb48fb8d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:52.412196 containerd[1553]: 2025-09-12 22:06:52.362 [INFO][4888] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.1.72/32] ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-9gng8" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" Sep 12 22:06:52.412196 containerd[1553]: 2025-09-12 22:06:52.362 [INFO][4888] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali045bb48fb8d ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-9gng8" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" Sep 12 22:06:52.412196 containerd[1553]: 2025-09-12 22:06:52.378 [INFO][4888] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-9gng8" 
WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" Sep 12 22:06:52.412455 containerd[1553]: 2025-09-12 22:06:52.381 [INFO][4888] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-9gng8" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0", GenerateName:"calico-apiserver-567b5c8b5-", Namespace:"calico-apiserver", SelfLink:"", UID:"111ee55a-9453-48a3-8be1-100851a1a1ea", ResourceVersion:"792", Generation:0, CreationTimestamp:time.Date(2025, time.September, 12, 22, 6, 20, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"567b5c8b5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4459-0-0-7-af931fdd93", ContainerID:"f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00", Pod:"calico-apiserver-567b5c8b5-9gng8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.1.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali045bb48fb8d", MAC:"52:86:e1:0b:7d:a2", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 12 22:06:52.412455 containerd[1553]: 2025-09-12 22:06:52.406 [INFO][4888] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" Namespace="calico-apiserver" Pod="calico-apiserver-567b5c8b5-9gng8" WorkloadEndpoint="ci--4459--0--0--7--af931fdd93-k8s-calico--apiserver--567b5c8b5--9gng8-eth0" Sep 12 22:06:52.460982 containerd[1553]: time="2025-09-12T22:06:52.460856576Z" level=info msg="connecting to shim f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00" address="unix:///run/containerd/s/0c973eb3652d9f697ceffeb72724a0a7e3f2268fb01bf45299eacf0662e4bded" namespace=k8s.io protocol=ttrpc version=3 Sep 12 22:06:52.512103 kubelet[2762]: I0912 22:06:52.511950 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8696478589-r2m8w" podStartSLOduration=21.143144808 podStartE2EDuration="25.511864491s" podCreationTimestamp="2025-09-12 22:06:27 +0000 UTC" firstStartedPulling="2025-09-12 22:06:47.410933854 +0000 UTC m=+42.476128405" lastFinishedPulling="2025-09-12 22:06:51.779653537 +0000 UTC m=+46.844848088" observedRunningTime="2025-09-12 22:06:52.507731305 +0000 UTC m=+47.572925856" watchObservedRunningTime="2025-09-12 22:06:52.511864491 +0000 UTC m=+47.577059042" Sep 12 22:06:52.546285 systemd[1]: Started cri-containerd-f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00.scope - libcontainer container f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00. 
Sep 12 22:06:52.642627 containerd[1553]: time="2025-09-12T22:06:52.641651892Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-567b5c8b5-9gng8,Uid:111ee55a-9453-48a3-8be1-100851a1a1ea,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00\"" Sep 12 22:06:52.833245 systemd-networkd[1421]: vxlan.calico: Gained IPv6LL Sep 12 22:06:53.181058 containerd[1553]: time="2025-09-12T22:06:53.181006114Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:53.182270 containerd[1553]: time="2025-09-12T22:06:53.182001480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489" Sep 12 22:06:53.183863 containerd[1553]: time="2025-09-12T22:06:53.183750891Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:53.186893 containerd[1553]: time="2025-09-12T22:06:53.186825669Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:53.187567 containerd[1553]: time="2025-09-12T22:06:53.187372192Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.39911468s" Sep 12 22:06:53.187567 containerd[1553]: time="2025-09-12T22:06:53.187417753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference 
\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\"" Sep 12 22:06:53.190332 containerd[1553]: time="2025-09-12T22:06:53.190036608Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\"" Sep 12 22:06:53.191486 containerd[1553]: time="2025-09-12T22:06:53.191449577Z" level=info msg="CreateContainer within sandbox \"1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Sep 12 22:06:53.205209 containerd[1553]: time="2025-09-12T22:06:53.203973252Z" level=info msg="Container c92d5e40512f68e08471b1135603c8af1a9858d1bda9ef751cbbad9a0c0570ed: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:53.213611 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount889307811.mount: Deactivated successfully. Sep 12 22:06:53.223339 containerd[1553]: time="2025-09-12T22:06:53.223221408Z" level=info msg="CreateContainer within sandbox \"1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"c92d5e40512f68e08471b1135603c8af1a9858d1bda9ef751cbbad9a0c0570ed\"" Sep 12 22:06:53.224801 containerd[1553]: time="2025-09-12T22:06:53.224362895Z" level=info msg="StartContainer for \"c92d5e40512f68e08471b1135603c8af1a9858d1bda9ef751cbbad9a0c0570ed\"" Sep 12 22:06:53.228851 containerd[1553]: time="2025-09-12T22:06:53.228690881Z" level=info msg="connecting to shim c92d5e40512f68e08471b1135603c8af1a9858d1bda9ef751cbbad9a0c0570ed" address="unix:///run/containerd/s/24ceaac3ae3d378bffe2e7c6a57ab345599917d5a51a09f1be73c4cd4b2ed2be" protocol=ttrpc version=3 Sep 12 22:06:53.258783 systemd[1]: Started cri-containerd-c92d5e40512f68e08471b1135603c8af1a9858d1bda9ef751cbbad9a0c0570ed.scope - libcontainer container c92d5e40512f68e08471b1135603c8af1a9858d1bda9ef751cbbad9a0c0570ed. 
Sep 12 22:06:53.308923 containerd[1553]: time="2025-09-12T22:06:53.308706403Z" level=info msg="StartContainer for \"c92d5e40512f68e08471b1135603c8af1a9858d1bda9ef751cbbad9a0c0570ed\" returns successfully" Sep 12 22:06:53.473814 kubelet[2762]: I0912 22:06:53.473716 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:06:53.792050 systemd-networkd[1421]: cali045bb48fb8d: Gained IPv6LL Sep 12 22:06:55.128260 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2754503525.mount: Deactivated successfully. Sep 12 22:06:55.589976 containerd[1553]: time="2025-09-12T22:06:55.589811831Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:55.592493 containerd[1553]: time="2025-09-12T22:06:55.592383566Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332" Sep 12 22:06:55.594660 containerd[1553]: time="2025-09-12T22:06:55.594476538Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:55.597100 containerd[1553]: time="2025-09-12T22:06:55.597028392Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:55.597875 containerd[1553]: time="2025-09-12T22:06:55.597640356Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 2.407543867s" Sep 12 22:06:55.597875 containerd[1553]: 
time="2025-09-12T22:06:55.597666556Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\"" Sep 12 22:06:55.600645 containerd[1553]: time="2025-09-12T22:06:55.600536772Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:06:55.602002 containerd[1553]: time="2025-09-12T22:06:55.601746099Z" level=info msg="CreateContainer within sandbox \"c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Sep 12 22:06:55.618247 containerd[1553]: time="2025-09-12T22:06:55.617281189Z" level=info msg="Container fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:55.620620 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1566143577.mount: Deactivated successfully. Sep 12 22:06:55.633659 containerd[1553]: time="2025-09-12T22:06:55.633617642Z" level=info msg="CreateContainer within sandbox \"c5bce689db931db02829073d163d397ba920c0ab74d7a98df48b5f2876b491b0\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\"" Sep 12 22:06:55.636675 containerd[1553]: time="2025-09-12T22:06:55.636635900Z" level=info msg="StartContainer for \"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\"" Sep 12 22:06:55.638313 containerd[1553]: time="2025-09-12T22:06:55.638272189Z" level=info msg="connecting to shim fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc" address="unix:///run/containerd/s/24f20cd12ec48286365a2cf5d1a09ca767abc44296eb23f5736ab6270f905a77" protocol=ttrpc version=3 Sep 12 22:06:55.673836 systemd[1]: Started cri-containerd-fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc.scope - libcontainer container 
fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc. Sep 12 22:06:55.724680 containerd[1553]: time="2025-09-12T22:06:55.723857680Z" level=info msg="StartContainer for \"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" returns successfully" Sep 12 22:06:56.518552 kubelet[2762]: I0912 22:06:56.518234 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-5qdsb" podStartSLOduration=24.966862314 podStartE2EDuration="30.518213689s" podCreationTimestamp="2025-09-12 22:06:26 +0000 UTC" firstStartedPulling="2025-09-12 22:06:50.047317907 +0000 UTC m=+45.112512458" lastFinishedPulling="2025-09-12 22:06:55.598669282 +0000 UTC m=+50.663863833" observedRunningTime="2025-09-12 22:06:56.517534845 +0000 UTC m=+51.582729396" watchObservedRunningTime="2025-09-12 22:06:56.518213689 +0000 UTC m=+51.583408240" Sep 12 22:06:56.598867 containerd[1553]: time="2025-09-12T22:06:56.598783700Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"420545be97a3647cb3951890eef3d1813eb02e68c9623b08fad95f715d03201a\" pid:5090 exit_status:1 exited_at:{seconds:1757714816 nanos:598087336}" Sep 12 22:06:57.648450 containerd[1553]: time="2025-09-12T22:06:57.648391537Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"536432a452ef37dc05bd1f8b689ec77c1491137c6983844bc9b1193f7753c701\" pid:5120 exit_status:1 exited_at:{seconds:1757714817 nanos:647225531}" Sep 12 22:06:58.672725 containerd[1553]: time="2025-09-12T22:06:58.672607336Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"dc97cf994c3d8bee2db2451f0c6ea3f0b917f7946c0bf34170beeda916dae1fe\" pid:5151 exit_status:1 exited_at:{seconds:1757714818 nanos:671858332}" Sep 12 22:06:58.725110 
containerd[1553]: time="2025-09-12T22:06:58.724994256Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:58.727420 containerd[1553]: time="2025-09-12T22:06:58.727349789Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807" Sep 12 22:06:58.729202 containerd[1553]: time="2025-09-12T22:06:58.729145079Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:58.732353 containerd[1553]: time="2025-09-12T22:06:58.732295655Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:58.733925 containerd[1553]: time="2025-09-12T22:06:58.733861704Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 3.133277611s" Sep 12 22:06:58.733925 containerd[1553]: time="2025-09-12T22:06:58.733910464Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:06:58.736055 containerd[1553]: time="2025-09-12T22:06:58.735883795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 12 22:06:58.739623 containerd[1553]: time="2025-09-12T22:06:58.739501054Z" level=info msg="CreateContainer within sandbox \"b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4\" for 
container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:06:58.755852 containerd[1553]: time="2025-09-12T22:06:58.755708421Z" level=info msg="Container 5109fdb33697b78b37ba08f41e2e9680efb88aad7e0fe61c29cfacd64ef9b87d: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:58.772239 containerd[1553]: time="2025-09-12T22:06:58.772015348Z" level=info msg="CreateContainer within sandbox \"b34f482145a42126bb4824ab87f8775c30e4a891e67b09f3d3a0ed5c68ef83b4\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5109fdb33697b78b37ba08f41e2e9680efb88aad7e0fe61c29cfacd64ef9b87d\"" Sep 12 22:06:58.772931 containerd[1553]: time="2025-09-12T22:06:58.772891352Z" level=info msg="StartContainer for \"5109fdb33697b78b37ba08f41e2e9680efb88aad7e0fe61c29cfacd64ef9b87d\"" Sep 12 22:06:58.774295 containerd[1553]: time="2025-09-12T22:06:58.774254880Z" level=info msg="connecting to shim 5109fdb33697b78b37ba08f41e2e9680efb88aad7e0fe61c29cfacd64ef9b87d" address="unix:///run/containerd/s/370df7475240e3f40a38a2758c9ca2d81e54bf5fc3611f5ed0c385e1700587c3" protocol=ttrpc version=3 Sep 12 22:06:58.805779 systemd[1]: Started cri-containerd-5109fdb33697b78b37ba08f41e2e9680efb88aad7e0fe61c29cfacd64ef9b87d.scope - libcontainer container 5109fdb33697b78b37ba08f41e2e9680efb88aad7e0fe61c29cfacd64ef9b87d. 
Sep 12 22:06:58.874623 containerd[1553]: time="2025-09-12T22:06:58.874569336Z" level=info msg="StartContainer for \"5109fdb33697b78b37ba08f41e2e9680efb88aad7e0fe61c29cfacd64ef9b87d\" returns successfully" Sep 12 22:06:59.131575 containerd[1553]: time="2025-09-12T22:06:59.130993651Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:06:59.132781 containerd[1553]: time="2025-09-12T22:06:59.132736860Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77" Sep 12 22:06:59.135533 containerd[1553]: time="2025-09-12T22:06:59.135444554Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 399.506319ms" Sep 12 22:06:59.135749 containerd[1553]: time="2025-09-12T22:06:59.135716555Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\"" Sep 12 22:06:59.139834 containerd[1553]: time="2025-09-12T22:06:59.139767056Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\"" Sep 12 22:06:59.140517 containerd[1553]: time="2025-09-12T22:06:59.140480700Z" level=info msg="CreateContainer within sandbox \"f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Sep 12 22:06:59.157690 containerd[1553]: time="2025-09-12T22:06:59.157608550Z" level=info msg="Container c2290200245b87781fb953d4e7c0c3555245572e99a831a15605c1389660da7e: CDI devices from CRI Config.CDIDevices: []" Sep 12 22:06:59.172850 
containerd[1553]: time="2025-09-12T22:06:59.172759749Z" level=info msg="CreateContainer within sandbox \"f53473dee702fcb2dd93c831cffc99b4ee46648f491df559fca7e44355733e00\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c2290200245b87781fb953d4e7c0c3555245572e99a831a15605c1389660da7e\"" Sep 12 22:06:59.175158 containerd[1553]: time="2025-09-12T22:06:59.175113241Z" level=info msg="StartContainer for \"c2290200245b87781fb953d4e7c0c3555245572e99a831a15605c1389660da7e\"" Sep 12 22:06:59.177043 containerd[1553]: time="2025-09-12T22:06:59.176940651Z" level=info msg="connecting to shim c2290200245b87781fb953d4e7c0c3555245572e99a831a15605c1389660da7e" address="unix:///run/containerd/s/0c973eb3652d9f697ceffeb72724a0a7e3f2268fb01bf45299eacf0662e4bded" protocol=ttrpc version=3 Sep 12 22:06:59.206824 systemd[1]: Started cri-containerd-c2290200245b87781fb953d4e7c0c3555245572e99a831a15605c1389660da7e.scope - libcontainer container c2290200245b87781fb953d4e7c0c3555245572e99a831a15605c1389660da7e. 
Sep 12 22:06:59.264679 containerd[1553]: time="2025-09-12T22:06:59.264448268Z" level=info msg="StartContainer for \"c2290200245b87781fb953d4e7c0c3555245572e99a831a15605c1389660da7e\" returns successfully" Sep 12 22:06:59.554391 kubelet[2762]: I0912 22:06:59.554283 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-567b5c8b5-9gng8" podStartSLOduration=33.060679163 podStartE2EDuration="39.554260021s" podCreationTimestamp="2025-09-12 22:06:20 +0000 UTC" firstStartedPulling="2025-09-12 22:06:52.645138513 +0000 UTC m=+47.710333064" lastFinishedPulling="2025-09-12 22:06:59.138719371 +0000 UTC m=+54.203913922" observedRunningTime="2025-09-12 22:06:59.553391816 +0000 UTC m=+54.618586447" watchObservedRunningTime="2025-09-12 22:06:59.554260021 +0000 UTC m=+54.619454572" Sep 12 22:06:59.556409 kubelet[2762]: I0912 22:06:59.555894 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-567b5c8b5-jlsm6" podStartSLOduration=31.706733784 podStartE2EDuration="39.555873669s" podCreationTimestamp="2025-09-12 22:06:20 +0000 UTC" firstStartedPulling="2025-09-12 22:06:50.886039466 +0000 UTC m=+45.951233977" lastFinishedPulling="2025-09-12 22:06:58.735179311 +0000 UTC m=+53.800373862" observedRunningTime="2025-09-12 22:06:59.531364941 +0000 UTC m=+54.596559492" watchObservedRunningTime="2025-09-12 22:06:59.555873669 +0000 UTC m=+54.621068220" Sep 12 22:07:00.519910 kubelet[2762]: I0912 22:07:00.519753 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:07:00.520587 kubelet[2762]: I0912 22:07:00.520150 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:07:00.829677 containerd[1553]: time="2025-09-12T22:07:00.828910018Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:00.831540 
containerd[1553]: time="2025-09-12T22:07:00.831034709Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208" Sep 12 22:07:00.832422 containerd[1553]: time="2025-09-12T22:07:00.832011114Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:00.836874 containerd[1553]: time="2025-09-12T22:07:00.836160015Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 12 22:07:00.838394 containerd[1553]: time="2025-09-12T22:07:00.838105105Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.698080447s" Sep 12 22:07:00.838852 containerd[1553]: time="2025-09-12T22:07:00.838715508Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\"" Sep 12 22:07:00.844840 containerd[1553]: time="2025-09-12T22:07:00.844754099Z" level=info msg="CreateContainer within sandbox \"1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Sep 12 22:07:00.861900 containerd[1553]: time="2025-09-12T22:07:00.861618065Z" level=info msg="Container 1e322a2547fc47bbf52a3de8085e354a70fd14e0f593d84931299c2c543a2b19: CDI devices from CRI Config.CDIDevices: []" Sep 12 
22:07:00.872495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1431987598.mount: Deactivated successfully. Sep 12 22:07:00.883899 containerd[1553]: time="2025-09-12T22:07:00.883834659Z" level=info msg="CreateContainer within sandbox \"1ff3691f489dfeb852cad9bf4ed3b9de1f23adebbae6bc73d7ab647a9d05b065\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"1e322a2547fc47bbf52a3de8085e354a70fd14e0f593d84931299c2c543a2b19\"" Sep 12 22:07:00.884984 containerd[1553]: time="2025-09-12T22:07:00.884930104Z" level=info msg="StartContainer for \"1e322a2547fc47bbf52a3de8085e354a70fd14e0f593d84931299c2c543a2b19\"" Sep 12 22:07:00.887280 containerd[1553]: time="2025-09-12T22:07:00.887086515Z" level=info msg="connecting to shim 1e322a2547fc47bbf52a3de8085e354a70fd14e0f593d84931299c2c543a2b19" address="unix:///run/containerd/s/24ceaac3ae3d378bffe2e7c6a57ab345599917d5a51a09f1be73c4cd4b2ed2be" protocol=ttrpc version=3 Sep 12 22:07:00.922025 systemd[1]: Started cri-containerd-1e322a2547fc47bbf52a3de8085e354a70fd14e0f593d84931299c2c543a2b19.scope - libcontainer container 1e322a2547fc47bbf52a3de8085e354a70fd14e0f593d84931299c2c543a2b19. 
Sep 12 22:07:00.995477 containerd[1553]: time="2025-09-12T22:07:00.995433508Z" level=info msg="StartContainer for \"1e322a2547fc47bbf52a3de8085e354a70fd14e0f593d84931299c2c543a2b19\" returns successfully" Sep 12 22:07:01.255479 kubelet[2762]: I0912 22:07:01.254609 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:07:01.266640 kubelet[2762]: I0912 22:07:01.266486 2762 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Sep 12 22:07:01.272442 kubelet[2762]: I0912 22:07:01.272347 2762 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Sep 12 22:07:01.294991 containerd[1553]: time="2025-09-12T22:07:01.294869882Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"4742b8f584e7f55d7e26ea01617355868605030335a29ef9f0d49d9515e1c198\" pid:5286 exited_at:{seconds:1757714821 nanos:294413520}" Sep 12 22:07:01.383432 containerd[1553]: time="2025-09-12T22:07:01.383375124Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"bd19e26aa274a056d1db8e58b883ffc8fe9ecfcd06382086923fe3da2a362a81\" pid:5307 exited_at:{seconds:1757714821 nanos:382712520}" Sep 12 22:07:01.549162 kubelet[2762]: I0912 22:07:01.547115 2762 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-mr7s6" podStartSLOduration=23.506750214 podStartE2EDuration="34.54708954s" podCreationTimestamp="2025-09-12 22:06:27 +0000 UTC" firstStartedPulling="2025-09-12 22:06:49.800217992 +0000 UTC m=+44.865412503" lastFinishedPulling="2025-09-12 22:07:00.840557278 +0000 UTC m=+55.905751829" observedRunningTime="2025-09-12 22:07:01.546857539 +0000 UTC 
m=+56.612052130" watchObservedRunningTime="2025-09-12 22:07:01.54708954 +0000 UTC m=+56.612284131" Sep 12 22:07:14.899463 kernel: hrtimer: interrupt took 2548730 ns Sep 12 22:07:15.001114 containerd[1553]: time="2025-09-12T22:07:15.001055939Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" id:\"7604da96977013464b0317d9d770ce14803b73f70c91d95cf2272efa25e3472f\" pid:5350 exited_at:{seconds:1757714835 nanos:557898}" Sep 12 22:07:19.269544 containerd[1553]: time="2025-09-12T22:07:19.269466884Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"c72f883cab609fee53ca1329bd9d5a9f2d85e3d44b825d8d00a2593b403481b8\" pid:5374 exited_at:{seconds:1757714839 nanos:268934122}" Sep 12 22:07:21.134319 kubelet[2762]: I0912 22:07:21.134276 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:07:31.197552 containerd[1553]: time="2025-09-12T22:07:31.197372555Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"931270c84c754b3aef258c56ff4c44713f95eaa9ef505690f54f6660237a7267\" pid:5406 exited_at:{seconds:1757714851 nanos:197016394}" Sep 12 22:07:31.297017 containerd[1553]: time="2025-09-12T22:07:31.296774392Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"c07374b1b86bf9697a90f866b0461798f4e5789e4bf2233a0340c91d3d8dce51\" pid:5428 exited_at:{seconds:1757714851 nanos:296339670}" Sep 12 22:07:33.922505 containerd[1553]: time="2025-09-12T22:07:33.922402427Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"42a658083d044d5855d1240eb362d0f3ca1756b367c2bd499a7f01ad2a8efb9e\" pid:5457 exited_at:{seconds:1757714853 
nanos:922077906}" Sep 12 22:07:35.378537 kubelet[2762]: I0912 22:07:35.378080 2762 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 12 22:07:45.006213 containerd[1553]: time="2025-09-12T22:07:45.006152913Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" id:\"03133546138ebabc2dade7dc278e7df9dedd65fcf35a631870b2cb035715b9cd\" pid:5483 exited_at:{seconds:1757714865 nanos:4928950}" Sep 12 22:07:49.232008 containerd[1553]: time="2025-09-12T22:07:49.231438565Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"14c3b96e708ef4ee015b81557c20d8e70c424e87f40e782ef4703d03977b9391\" pid:5509 exited_at:{seconds:1757714869 nanos:230594923}" Sep 12 22:08:01.296001 containerd[1553]: time="2025-09-12T22:08:01.295744478Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"de1bf7773eca497d9b274505369d80b8fcceb6ea00ddef560f48b53a38cfcd33\" pid:5534 exited_at:{seconds:1757714881 nanos:295319557}" Sep 12 22:08:14.980373 containerd[1553]: time="2025-09-12T22:08:14.980305136Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" id:\"e1d8070ff66fd52dc3c02857bfa772106045387e72e5b2ff8fe769c1ba9c82c8\" pid:5566 exited_at:{seconds:1757714894 nanos:979848735}" Sep 12 22:08:19.202767 containerd[1553]: time="2025-09-12T22:08:19.202721119Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"4afdb63bf4e3b3d511b74a83e383873acc3a9822b32959c4c5be18147f36835c\" pid:5592 exited_at:{seconds:1757714899 nanos:202093798}" Sep 12 22:08:31.187930 containerd[1553]: time="2025-09-12T22:08:31.187853101Z" level=info msg="TaskExit event in podsandbox 
handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"622da9edf137bb33c491f5470bc09fd205c52ddb175b6e502241dee648f6161c\" pid:5636 exited_at:{seconds:1757714911 nanos:187167380}" Sep 12 22:08:31.292909 containerd[1553]: time="2025-09-12T22:08:31.292867031Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"07d97cd69b13f76d1bc17e14b084d2833805551f47bc2ac0f88003525c3ab409\" pid:5657 exited_at:{seconds:1757714911 nanos:289647105}" Sep 12 22:08:33.898909 containerd[1553]: time="2025-09-12T22:08:33.898858663Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"781021f9b504335ff13e6e932e2976bacaa59096bb2959f386fbc595e8eaaa3d\" pid:5682 exited_at:{seconds:1757714913 nanos:898411779}" Sep 12 22:08:40.079675 systemd[1]: Started sshd@8-168.119.157.2:22-139.178.68.195:50624.service - OpenSSH per-connection server daemon (139.178.68.195:50624). Sep 12 22:08:41.119219 sshd[5695]: Accepted publickey for core from 139.178.68.195 port 50624 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:08:41.123805 sshd-session[5695]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:08:41.130827 systemd-logind[1527]: New session 8 of user core. Sep 12 22:08:41.139945 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 12 22:08:41.909815 sshd[5700]: Connection closed by 139.178.68.195 port 50624 Sep 12 22:08:41.910588 sshd-session[5695]: pam_unix(sshd:session): session closed for user core Sep 12 22:08:41.916463 systemd[1]: sshd@8-168.119.157.2:22-139.178.68.195:50624.service: Deactivated successfully. Sep 12 22:08:41.919357 systemd[1]: session-8.scope: Deactivated successfully. Sep 12 22:08:41.921034 systemd-logind[1527]: Session 8 logged out. Waiting for processes to exit. 
Sep 12 22:08:41.923985 systemd-logind[1527]: Removed session 8. Sep 12 22:08:44.987352 containerd[1553]: time="2025-09-12T22:08:44.987253886Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" id:\"3e6fc2d014defb966d720f82c5501a34a6c217534a9368c83353fe11c4cd9a89\" pid:5725 exited_at:{seconds:1757714924 nanos:986799043}" Sep 12 22:08:47.084918 systemd[1]: Started sshd@9-168.119.157.2:22-139.178.68.195:37728.service - OpenSSH per-connection server daemon (139.178.68.195:37728). Sep 12 22:08:48.094633 sshd[5737]: Accepted publickey for core from 139.178.68.195 port 37728 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:08:48.097985 sshd-session[5737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:08:48.107395 systemd-logind[1527]: New session 9 of user core. Sep 12 22:08:48.110779 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 12 22:08:48.867023 sshd[5740]: Connection closed by 139.178.68.195 port 37728 Sep 12 22:08:48.867875 sshd-session[5737]: pam_unix(sshd:session): session closed for user core Sep 12 22:08:48.875245 systemd[1]: sshd@9-168.119.157.2:22-139.178.68.195:37728.service: Deactivated successfully. Sep 12 22:08:48.881975 systemd[1]: session-9.scope: Deactivated successfully. Sep 12 22:08:48.884825 systemd-logind[1527]: Session 9 logged out. Waiting for processes to exit. Sep 12 22:08:48.887455 systemd-logind[1527]: Removed session 9. Sep 12 22:08:49.037256 systemd[1]: Started sshd@10-168.119.157.2:22-139.178.68.195:37734.service - OpenSSH per-connection server daemon (139.178.68.195:37734). 
Sep 12 22:08:49.198248 containerd[1553]: time="2025-09-12T22:08:49.198128561Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"cf10888c376e59889d09d04adf376575e2e49abc2e6182ebd27e54f2eb83bf5d\" pid:5770 exited_at:{seconds:1757714929 nanos:197290674}" Sep 12 22:08:50.050489 sshd[5754]: Accepted publickey for core from 139.178.68.195 port 37734 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:08:50.053502 sshd-session[5754]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:08:50.060855 systemd-logind[1527]: New session 10 of user core. Sep 12 22:08:50.068881 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 12 22:08:50.865281 sshd[5780]: Connection closed by 139.178.68.195 port 37734 Sep 12 22:08:50.866406 sshd-session[5754]: pam_unix(sshd:session): session closed for user core Sep 12 22:08:50.872648 systemd[1]: sshd@10-168.119.157.2:22-139.178.68.195:37734.service: Deactivated successfully. Sep 12 22:08:50.872865 systemd-logind[1527]: Session 10 logged out. Waiting for processes to exit. Sep 12 22:08:50.875901 systemd[1]: session-10.scope: Deactivated successfully. Sep 12 22:08:50.878064 systemd-logind[1527]: Removed session 10. Sep 12 22:08:51.035334 systemd[1]: Started sshd@11-168.119.157.2:22-139.178.68.195:59440.service - OpenSSH per-connection server daemon (139.178.68.195:59440). Sep 12 22:08:52.029234 sshd[5789]: Accepted publickey for core from 139.178.68.195 port 59440 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:08:52.033157 sshd-session[5789]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:08:52.039882 systemd-logind[1527]: New session 11 of user core. Sep 12 22:08:52.045803 systemd[1]: Started session-11.scope - Session 11 of User core. 
Sep 12 22:08:52.791362 sshd[5792]: Connection closed by 139.178.68.195 port 59440 Sep 12 22:08:52.792176 sshd-session[5789]: pam_unix(sshd:session): session closed for user core Sep 12 22:08:52.800064 systemd-logind[1527]: Session 11 logged out. Waiting for processes to exit. Sep 12 22:08:52.800672 systemd[1]: sshd@11-168.119.157.2:22-139.178.68.195:59440.service: Deactivated successfully. Sep 12 22:08:52.804386 systemd[1]: session-11.scope: Deactivated successfully. Sep 12 22:08:52.808068 systemd-logind[1527]: Removed session 11. Sep 12 22:08:57.975036 systemd[1]: Started sshd@12-168.119.157.2:22-139.178.68.195:59444.service - OpenSSH per-connection server daemon (139.178.68.195:59444). Sep 12 22:08:59.048335 sshd[5808]: Accepted publickey for core from 139.178.68.195 port 59444 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:08:59.050877 sshd-session[5808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:08:59.056986 systemd-logind[1527]: New session 12 of user core. Sep 12 22:08:59.060793 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 12 22:08:59.859667 sshd[5811]: Connection closed by 139.178.68.195 port 59444 Sep 12 22:08:59.859475 sshd-session[5808]: pam_unix(sshd:session): session closed for user core Sep 12 22:08:59.866106 systemd-logind[1527]: Session 12 logged out. Waiting for processes to exit. Sep 12 22:08:59.866815 systemd[1]: sshd@12-168.119.157.2:22-139.178.68.195:59444.service: Deactivated successfully. Sep 12 22:08:59.870994 systemd[1]: session-12.scope: Deactivated successfully. Sep 12 22:08:59.873968 systemd-logind[1527]: Removed session 12. 
Sep 12 22:09:01.299098 containerd[1553]: time="2025-09-12T22:09:01.299049936Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"ec10ee118590d95e230326827e6d7cef09e34956059fd591e0b61b02cf37a96d\" pid:5835 exited_at:{seconds:1757714941 nanos:298671454}" Sep 12 22:09:05.032006 systemd[1]: Started sshd@13-168.119.157.2:22-139.178.68.195:33828.service - OpenSSH per-connection server daemon (139.178.68.195:33828). Sep 12 22:09:06.043854 sshd[5844]: Accepted publickey for core from 139.178.68.195 port 33828 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:09:06.046362 sshd-session[5844]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:06.051944 systemd-logind[1527]: New session 13 of user core. Sep 12 22:09:06.058534 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 12 22:09:06.835832 sshd[5849]: Connection closed by 139.178.68.195 port 33828 Sep 12 22:09:06.836902 sshd-session[5844]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:06.844905 systemd[1]: sshd@13-168.119.157.2:22-139.178.68.195:33828.service: Deactivated successfully. Sep 12 22:09:06.850943 systemd[1]: session-13.scope: Deactivated successfully. Sep 12 22:09:06.853550 systemd-logind[1527]: Session 13 logged out. Waiting for processes to exit. Sep 12 22:09:06.856398 systemd-logind[1527]: Removed session 13. Sep 12 22:09:12.007327 systemd[1]: Started sshd@14-168.119.157.2:22-139.178.68.195:48382.service - OpenSSH per-connection server daemon (139.178.68.195:48382). Sep 12 22:09:13.022022 sshd[5864]: Accepted publickey for core from 139.178.68.195 port 48382 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:09:13.025352 sshd-session[5864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:13.033148 systemd-logind[1527]: New session 14 of user core. 
Sep 12 22:09:13.038806 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 12 22:09:13.819098 sshd[5867]: Connection closed by 139.178.68.195 port 48382 Sep 12 22:09:13.818223 sshd-session[5864]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:13.824287 systemd[1]: sshd@14-168.119.157.2:22-139.178.68.195:48382.service: Deactivated successfully. Sep 12 22:09:13.827365 systemd[1]: session-14.scope: Deactivated successfully. Sep 12 22:09:13.829131 systemd-logind[1527]: Session 14 logged out. Waiting for processes to exit. Sep 12 22:09:13.831242 systemd-logind[1527]: Removed session 14. Sep 12 22:09:13.993827 systemd[1]: Started sshd@15-168.119.157.2:22-139.178.68.195:48390.service - OpenSSH per-connection server daemon (139.178.68.195:48390). Sep 12 22:09:15.006804 containerd[1553]: time="2025-09-12T22:09:15.006703161Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" id:\"545e6fc70c4150ca937e4c925c73fddac807459a4d8bb85ef96f5770a300ca35\" pid:5895 exited_at:{seconds:1757714955 nanos:5624555}" Sep 12 22:09:15.018926 sshd[5879]: Accepted publickey for core from 139.178.68.195 port 48390 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:09:15.020468 sshd-session[5879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:15.030946 systemd-logind[1527]: New session 15 of user core. Sep 12 22:09:15.036852 systemd[1]: Started session-15.scope - Session 15 of User core. Sep 12 22:09:15.996406 sshd[5907]: Connection closed by 139.178.68.195 port 48390 Sep 12 22:09:15.997890 sshd-session[5879]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:16.007917 systemd[1]: sshd@15-168.119.157.2:22-139.178.68.195:48390.service: Deactivated successfully. Sep 12 22:09:16.012860 systemd[1]: session-15.scope: Deactivated successfully. 
Sep 12 22:09:16.015006 systemd-logind[1527]: Session 15 logged out. Waiting for processes to exit. Sep 12 22:09:16.017408 systemd-logind[1527]: Removed session 15. Sep 12 22:09:16.173606 systemd[1]: Started sshd@16-168.119.157.2:22-139.178.68.195:48402.service - OpenSSH per-connection server daemon (139.178.68.195:48402). Sep 12 22:09:17.196570 sshd[5918]: Accepted publickey for core from 139.178.68.195 port 48402 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:09:17.198969 sshd-session[5918]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:17.204566 systemd-logind[1527]: New session 16 of user core. Sep 12 22:09:17.212110 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 12 22:09:19.761943 containerd[1553]: time="2025-09-12T22:09:19.761851282Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"246ad10c6b394d0eab6d57ae1e51d36038c610644cc8dc767fd4b129fce80dc6\" pid:5944 exited_at:{seconds:1757714959 nanos:760984637}" Sep 12 22:09:20.410277 sshd[5921]: Connection closed by 139.178.68.195 port 48402 Sep 12 22:09:20.411165 sshd-session[5918]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:20.417016 systemd[1]: sshd@16-168.119.157.2:22-139.178.68.195:48402.service: Deactivated successfully. Sep 12 22:09:20.417201 systemd-logind[1527]: Session 16 logged out. Waiting for processes to exit. Sep 12 22:09:20.421306 systemd[1]: session-16.scope: Deactivated successfully. Sep 12 22:09:20.421908 systemd[1]: session-16.scope: Consumed 635ms CPU time, 73.6M memory peak. Sep 12 22:09:20.425241 systemd-logind[1527]: Removed session 16. Sep 12 22:09:20.593839 systemd[1]: Started sshd@17-168.119.157.2:22-139.178.68.195:40974.service - OpenSSH per-connection server daemon (139.178.68.195:40974). 
Sep 12 22:09:21.627411 sshd[5963]: Accepted publickey for core from 139.178.68.195 port 40974 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:09:21.629982 sshd-session[5963]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:21.636153 systemd-logind[1527]: New session 17 of user core. Sep 12 22:09:21.641746 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 12 22:09:22.542241 sshd[5966]: Connection closed by 139.178.68.195 port 40974 Sep 12 22:09:22.543142 sshd-session[5963]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:22.550073 systemd[1]: sshd@17-168.119.157.2:22-139.178.68.195:40974.service: Deactivated successfully. Sep 12 22:09:22.551761 systemd-logind[1527]: Session 17 logged out. Waiting for processes to exit. Sep 12 22:09:22.554268 systemd[1]: session-17.scope: Deactivated successfully. Sep 12 22:09:22.557567 systemd-logind[1527]: Removed session 17. Sep 12 22:09:22.716770 systemd[1]: Started sshd@18-168.119.157.2:22-139.178.68.195:40978.service - OpenSSH per-connection server daemon (139.178.68.195:40978). Sep 12 22:09:23.731841 sshd[5977]: Accepted publickey for core from 139.178.68.195 port 40978 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:09:23.734387 sshd-session[5977]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:23.741094 systemd-logind[1527]: New session 18 of user core. Sep 12 22:09:23.745752 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 12 22:09:24.495487 sshd[5980]: Connection closed by 139.178.68.195 port 40978 Sep 12 22:09:24.497351 sshd-session[5977]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:24.504482 systemd[1]: sshd@18-168.119.157.2:22-139.178.68.195:40978.service: Deactivated successfully. Sep 12 22:09:24.508479 systemd[1]: session-18.scope: Deactivated successfully. 
Sep 12 22:09:24.511087 systemd-logind[1527]: Session 18 logged out. Waiting for processes to exit. Sep 12 22:09:24.514243 systemd-logind[1527]: Removed session 18. Sep 12 22:09:29.673734 systemd[1]: Started sshd@19-168.119.157.2:22-139.178.68.195:40994.service - OpenSSH per-connection server daemon (139.178.68.195:40994). Sep 12 22:09:30.705439 sshd[5994]: Accepted publickey for core from 139.178.68.195 port 40994 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:09:30.707388 sshd-session[5994]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:30.713483 systemd-logind[1527]: New session 19 of user core. Sep 12 22:09:30.720846 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 12 22:09:31.214086 containerd[1553]: time="2025-09-12T22:09:31.214039304Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"68cf7032d69f3e8575e5d3cf667a408f8a5acc01136b0926cbb64bf842931ba7\" pid:6010 exited_at:{seconds:1757714971 nanos:213721862}" Sep 12 22:09:31.293299 containerd[1553]: time="2025-09-12T22:09:31.293167592Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"9e4095ca0e660d41896a37352dfd27673b0517050b87a7f46876bb5749e94376\" pid:6040 exited_at:{seconds:1757714971 nanos:292837830}" Sep 12 22:09:31.475736 sshd[5997]: Connection closed by 139.178.68.195 port 40994 Sep 12 22:09:31.476313 sshd-session[5994]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:31.483777 systemd[1]: sshd@19-168.119.157.2:22-139.178.68.195:40994.service: Deactivated successfully. Sep 12 22:09:31.487528 systemd[1]: session-19.scope: Deactivated successfully. Sep 12 22:09:31.491042 systemd-logind[1527]: Session 19 logged out. Waiting for processes to exit. Sep 12 22:09:31.492504 systemd-logind[1527]: Removed session 19. 
Sep 12 22:09:33.911426 containerd[1553]: time="2025-09-12T22:09:33.911277235Z" level=info msg="TaskExit event in podsandbox handler container_id:\"2aa5620ccb3a9416a6479017356ae03e9044c61bd67a5b5ec10df3b91e120197\" id:\"e1d694a71ff61d26f2eb7e15be5d61d8ab32fe4ffb2aa67c9940d3f235172768\" pid:6072 exited_at:{seconds:1757714973 nanos:910881553}" Sep 12 22:09:36.651085 systemd[1]: Started sshd@20-168.119.157.2:22-139.178.68.195:50344.service - OpenSSH per-connection server daemon (139.178.68.195:50344). Sep 12 22:09:37.666892 sshd[6081]: Accepted publickey for core from 139.178.68.195 port 50344 ssh2: RSA SHA256:fkNOO6LYLT5WIi2mCSq4FAK1DpB2w5SXOx3BU2RCgh0 Sep 12 22:09:37.669487 sshd-session[6081]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 12 22:09:37.676002 systemd-logind[1527]: New session 20 of user core. Sep 12 22:09:37.689897 systemd[1]: Started session-20.scope - Session 20 of User core. Sep 12 22:09:38.438376 sshd[6084]: Connection closed by 139.178.68.195 port 50344 Sep 12 22:09:38.437505 sshd-session[6081]: pam_unix(sshd:session): session closed for user core Sep 12 22:09:38.443915 systemd-logind[1527]: Session 20 logged out. Waiting for processes to exit. Sep 12 22:09:38.445189 systemd[1]: sshd@20-168.119.157.2:22-139.178.68.195:50344.service: Deactivated successfully. Sep 12 22:09:38.447425 systemd[1]: session-20.scope: Deactivated successfully. Sep 12 22:09:38.451625 systemd-logind[1527]: Removed session 20. 
Sep 12 22:09:44.994588 containerd[1553]: time="2025-09-12T22:09:44.994541686Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a4e608c211de058001bdf0d0e6503e393e4578d41f7d556cebac9cac0633dafb\" id:\"48612e182684192c2f82f9b500f56ec5ab42401528beaf20165a357d6568ca23\" pid:6108 exited_at:{seconds:1757714984 nanos:994191685}" Sep 12 22:09:49.188857 containerd[1553]: time="2025-09-12T22:09:49.188739285Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fadc5f8b8f840e84e34dc81e9a6729ec46c7833a637f767058f111ffc8f64bdc\" id:\"fd8c671d6835f897e9e6c1acdf4a5893a1acded7e83187fcd67cae61bb6642a2\" pid:6132 exited_at:{seconds:1757714989 nanos:188313443}" Sep 12 22:09:52.995133 systemd[1]: cri-containerd-a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17.scope: Deactivated successfully. Sep 12 22:09:52.997205 systemd[1]: cri-containerd-a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17.scope: Consumed 23.518s CPU time, 110.2M memory peak, 4.6M read from disk. Sep 12 22:09:53.002774 containerd[1553]: time="2025-09-12T22:09:53.002711725Z" level=info msg="received exit event container_id:\"a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17\" id:\"a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17\" pid:3078 exit_status:1 exited_at:{seconds:1757714993 nanos:1910202}" Sep 12 22:09:53.003864 containerd[1553]: time="2025-09-12T22:09:53.003057287Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17\" id:\"a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17\" pid:3078 exit_status:1 exited_at:{seconds:1757714993 nanos:1910202}" Sep 12 22:09:53.039235 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17-rootfs.mount: Deactivated successfully. 
Sep 12 22:09:53.132214 kubelet[2762]: I0912 22:09:53.131961 2762 scope.go:117] "RemoveContainer" containerID="a0f4a35904d993bc942ae1af9dc66fb8d991f2d7aa4618d8c22879de3edb8b17"
Sep 12 22:09:53.142970 containerd[1553]: time="2025-09-12T22:09:53.142833450Z" level=info msg="CreateContainer within sandbox \"1f137cc6a3be46ad2ee82ab8a28094c071d82374834461be2eab8e9c2b7517dc\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 12 22:09:53.152044 containerd[1553]: time="2025-09-12T22:09:53.151996090Z" level=info msg="Container 52940e03bdb83df6abd01bf32176f995be4486634b39eb6876478fb5cb1abf22: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:09:53.163170 containerd[1553]: time="2025-09-12T22:09:53.163126978Z" level=info msg="CreateContainer within sandbox \"1f137cc6a3be46ad2ee82ab8a28094c071d82374834461be2eab8e9c2b7517dc\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"52940e03bdb83df6abd01bf32176f995be4486634b39eb6876478fb5cb1abf22\""
Sep 12 22:09:53.163725 containerd[1553]: time="2025-09-12T22:09:53.163704740Z" level=info msg="StartContainer for \"52940e03bdb83df6abd01bf32176f995be4486634b39eb6876478fb5cb1abf22\""
Sep 12 22:09:53.164872 containerd[1553]: time="2025-09-12T22:09:53.164835385Z" level=info msg="connecting to shim 52940e03bdb83df6abd01bf32176f995be4486634b39eb6876478fb5cb1abf22" address="unix:///run/containerd/s/2a961ffc7e899f20d0eec4483630e17387a1621a472a8c1b4ad1d79aa2ffe269" protocol=ttrpc version=3
Sep 12 22:09:53.192933 systemd[1]: Started cri-containerd-52940e03bdb83df6abd01bf32176f995be4486634b39eb6876478fb5cb1abf22.scope - libcontainer container 52940e03bdb83df6abd01bf32176f995be4486634b39eb6876478fb5cb1abf22.
Sep 12 22:09:53.228975 containerd[1553]: time="2025-09-12T22:09:53.228935182Z" level=info msg="StartContainer for \"52940e03bdb83df6abd01bf32176f995be4486634b39eb6876478fb5cb1abf22\" returns successfully"
Sep 12 22:09:53.444876 kubelet[2762]: E0912 22:09:53.443028 2762 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:53018->10.0.0.2:2379: read: connection timed out"
Sep 12 22:09:53.813211 systemd[1]: cri-containerd-229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c.scope: Deactivated successfully.
Sep 12 22:09:53.815085 systemd[1]: cri-containerd-229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c.scope: Consumed 5.218s CPU time, 61.3M memory peak, 3.3M read from disk.
Sep 12 22:09:53.819504 containerd[1553]: time="2025-09-12T22:09:53.819450410Z" level=info msg="TaskExit event in podsandbox handler container_id:\"229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c\" id:\"229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c\" pid:2626 exit_status:1 exited_at:{seconds:1757714993 nanos:819072288}"
Sep 12 22:09:53.819504 containerd[1553]: time="2025-09-12T22:09:53.820642935Z" level=info msg="received exit event container_id:\"229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c\" id:\"229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c\" pid:2626 exit_status:1 exited_at:{seconds:1757714993 nanos:819072288}"
Sep 12 22:09:53.853035 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c-rootfs.mount: Deactivated successfully.
Sep 12 22:09:54.140622 kubelet[2762]: I0912 22:09:54.140467 2762 scope.go:117] "RemoveContainer" containerID="229f6f7730297a1efc3d9215405c98f5f8527db3123bd76a141bc62d4d90061c"
Sep 12 22:09:54.144708 containerd[1553]: time="2025-09-12T22:09:54.144330767Z" level=info msg="CreateContainer within sandbox \"c590d2e71705951df7be19fb7678be84569b81181adc4c8ea25a40103b43ec1e\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 12 22:09:54.157559 containerd[1553]: time="2025-09-12T22:09:54.155523615Z" level=info msg="Container b8e757d12b15096dcf0673f5b0cbab4db09ae4e089039d63820ef37fe7b07309: CDI devices from CRI Config.CDIDevices: []"
Sep 12 22:09:54.167925 containerd[1553]: time="2025-09-12T22:09:54.167863868Z" level=info msg="CreateContainer within sandbox \"c590d2e71705951df7be19fb7678be84569b81181adc4c8ea25a40103b43ec1e\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"b8e757d12b15096dcf0673f5b0cbab4db09ae4e089039d63820ef37fe7b07309\""
Sep 12 22:09:54.168903 containerd[1553]: time="2025-09-12T22:09:54.168872953Z" level=info msg="StartContainer for \"b8e757d12b15096dcf0673f5b0cbab4db09ae4e089039d63820ef37fe7b07309\""
Sep 12 22:09:54.170329 containerd[1553]: time="2025-09-12T22:09:54.170301519Z" level=info msg="connecting to shim b8e757d12b15096dcf0673f5b0cbab4db09ae4e089039d63820ef37fe7b07309" address="unix:///run/containerd/s/574a7bd51590baa8bcbd1a795c73648b2404a1612fbfccb09a9a326b75f75f65" protocol=ttrpc version=3
Sep 12 22:09:54.201857 systemd[1]: Started cri-containerd-b8e757d12b15096dcf0673f5b0cbab4db09ae4e089039d63820ef37fe7b07309.scope - libcontainer container b8e757d12b15096dcf0673f5b0cbab4db09ae4e089039d63820ef37fe7b07309.
Sep 12 22:09:54.252476 containerd[1553]: time="2025-09-12T22:09:54.252441991Z" level=info msg="StartContainer for \"b8e757d12b15096dcf0673f5b0cbab4db09ae4e089039d63820ef37fe7b07309\" returns successfully"
Sep 12 22:09:57.483542 kubelet[2762]: E0912 22:09:57.476467 2762 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:52842->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4459-0-0-7-af931fdd93.1864a877c9e24439 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4459-0-0-7-af931fdd93,UID:2a23d7d786da800d9f209be849dd96ab,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4459-0-0-7-af931fdd93,},FirstTimestamp:2025-09-12 22:09:47.022959673 +0000 UTC m=+222.088154264,LastTimestamp:2025-09-12 22:09:47.022959673 +0000 UTC m=+222.088154264,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4459-0-0-7-af931fdd93,}"