Sep 10 23:52:23.789302 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Sep 10 23:52:23.789325 kernel: Linux version 6.12.46-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Wed Sep 10 22:24:03 -00 2025
Sep 10 23:52:23.789335 kernel: KASLR enabled
Sep 10 23:52:23.789341 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Sep 10 23:52:23.789347 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Sep 10 23:52:23.789352 kernel: random: crng init done
Sep 10 23:52:23.789359 kernel: secureboot: Secure boot disabled
Sep 10 23:52:23.789364 kernel: ACPI: Early table checksum verification disabled
Sep 10 23:52:23.789370 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Sep 10 23:52:23.789376 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Sep 10 23:52:23.789383 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789389 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789394 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789413 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789421 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789429 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789435 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789441 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789447 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Sep 10 23:52:23.789453 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Sep 10 23:52:23.789459 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Sep 10 23:52:23.789465 kernel: ACPI: Use ACPI SPCR as default console: No
Sep 10 23:52:23.789471 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Sep 10 23:52:23.789477 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Sep 10 23:52:23.789483 kernel: Zone ranges:
Sep 10 23:52:23.789490 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Sep 10 23:52:23.789496 kernel: DMA32 empty
Sep 10 23:52:23.789502 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Sep 10 23:52:23.789508 kernel: Device empty
Sep 10 23:52:23.789513 kernel: Movable zone start for each node
Sep 10 23:52:23.789519 kernel: Early memory node ranges
Sep 10 23:52:23.789525 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Sep 10 23:52:23.789531 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Sep 10 23:52:23.789537 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Sep 10 23:52:23.789543 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Sep 10 23:52:23.789548 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Sep 10 23:52:23.789554 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Sep 10 23:52:23.789560 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Sep 10 23:52:23.789567 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Sep 10 23:52:23.789573 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Sep 10 23:52:23.789582 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Sep 10 23:52:23.789588 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Sep 10 23:52:23.789595 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Sep 10 23:52:23.789602 kernel: psci: probing for conduit method from ACPI.
Sep 10 23:52:23.789608 kernel: psci: PSCIv1.1 detected in firmware.
Sep 10 23:52:23.789615 kernel: psci: Using standard PSCI v0.2 function IDs
Sep 10 23:52:23.789621 kernel: psci: Trusted OS migration not required
Sep 10 23:52:23.789627 kernel: psci: SMC Calling Convention v1.1
Sep 10 23:52:23.789634 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Sep 10 23:52:23.789640 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Sep 10 23:52:23.789646 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Sep 10 23:52:23.789653 kernel: pcpu-alloc: [0] 0 [0] 1
Sep 10 23:52:23.789659 kernel: Detected PIPT I-cache on CPU0
Sep 10 23:52:23.789665 kernel: CPU features: detected: GIC system register CPU interface
Sep 10 23:52:23.789673 kernel: CPU features: detected: Spectre-v4
Sep 10 23:52:23.789679 kernel: CPU features: detected: Spectre-BHB
Sep 10 23:52:23.789686 kernel: CPU features: kernel page table isolation forced ON by KASLR
Sep 10 23:52:23.789692 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Sep 10 23:52:23.789698 kernel: CPU features: detected: ARM erratum 1418040
Sep 10 23:52:23.789704 kernel: CPU features: detected: SSBS not fully self-synchronizing
Sep 10 23:52:23.789711 kernel: alternatives: applying boot alternatives
Sep 10 23:52:23.789718 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=dd9c14cce645c634e06a91b09405eea80057f02909b9267c482dc457df1cddec
Sep 10 23:52:23.789725 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Sep 10 23:52:23.789731 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Sep 10 23:52:23.789739 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Sep 10 23:52:23.789745 kernel: Fallback order for Node 0: 0
Sep 10 23:52:23.789751 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Sep 10 23:52:23.789758 kernel: Policy zone: Normal
Sep 10 23:52:23.789764 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Sep 10 23:52:23.789770 kernel: software IO TLB: area num 2.
Sep 10 23:52:23.789777 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Sep 10 23:52:23.789783 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Sep 10 23:52:23.789789 kernel: rcu: Preemptible hierarchical RCU implementation.
Sep 10 23:52:23.789796 kernel: rcu: RCU event tracing is enabled.
Sep 10 23:52:23.789803 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Sep 10 23:52:23.789809 kernel: Trampoline variant of Tasks RCU enabled.
Sep 10 23:52:23.789817 kernel: Tracing variant of Tasks RCU enabled.
Sep 10 23:52:23.789823 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Sep 10 23:52:23.789829 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Sep 10 23:52:23.789836 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 10 23:52:23.789843 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Sep 10 23:52:23.789849 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Sep 10 23:52:23.789855 kernel: GICv3: 256 SPIs implemented
Sep 10 23:52:23.789862 kernel: GICv3: 0 Extended SPIs implemented
Sep 10 23:52:23.789868 kernel: Root IRQ handler: gic_handle_irq
Sep 10 23:52:23.789874 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Sep 10 23:52:23.789880 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Sep 10 23:52:23.789887 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Sep 10 23:52:23.789894 kernel: ITS [mem 0x08080000-0x0809ffff]
Sep 10 23:52:23.789901 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Sep 10 23:52:23.789907 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Sep 10 23:52:23.789914 kernel: GICv3: using LPI property table @0x0000000100120000
Sep 10 23:52:23.789920 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Sep 10 23:52:23.789926 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Sep 10 23:52:23.789933 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 10 23:52:23.789939 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Sep 10 23:52:23.789946 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Sep 10 23:52:23.789952 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Sep 10 23:52:23.789958 kernel: Console: colour dummy device 80x25
Sep 10 23:52:23.789966 kernel: ACPI: Core revision 20240827
Sep 10 23:52:23.789973 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Sep 10 23:52:23.789980 kernel: pid_max: default: 32768 minimum: 301
Sep 10 23:52:23.789986 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Sep 10 23:52:23.789993 kernel: landlock: Up and running.
Sep 10 23:52:23.789999 kernel: SELinux: Initializing.
Sep 10 23:52:23.790006 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 10 23:52:23.790012 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Sep 10 23:52:23.790019 kernel: rcu: Hierarchical SRCU implementation.
Sep 10 23:52:23.790027 kernel: rcu: Max phase no-delay instances is 400.
Sep 10 23:52:23.790033 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Sep 10 23:52:23.790040 kernel: Remapping and enabling EFI services.
Sep 10 23:52:23.790046 kernel: smp: Bringing up secondary CPUs ...
Sep 10 23:52:23.790053 kernel: Detected PIPT I-cache on CPU1
Sep 10 23:52:23.790059 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Sep 10 23:52:23.790066 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Sep 10 23:52:23.790073 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Sep 10 23:52:23.790079 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Sep 10 23:52:23.790087 kernel: smp: Brought up 1 node, 2 CPUs
Sep 10 23:52:23.790098 kernel: SMP: Total of 2 processors activated.
Sep 10 23:52:23.790105 kernel: CPU: All CPU(s) started at EL1
Sep 10 23:52:23.790113 kernel: CPU features: detected: 32-bit EL0 Support
Sep 10 23:52:23.790120 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Sep 10 23:52:23.790127 kernel: CPU features: detected: Common not Private translations
Sep 10 23:52:23.790134 kernel: CPU features: detected: CRC32 instructions
Sep 10 23:52:23.790140 kernel: CPU features: detected: Enhanced Virtualization Traps
Sep 10 23:52:23.790149 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Sep 10 23:52:23.790156 kernel: CPU features: detected: LSE atomic instructions
Sep 10 23:52:23.790162 kernel: CPU features: detected: Privileged Access Never
Sep 10 23:52:23.790169 kernel: CPU features: detected: RAS Extension Support
Sep 10 23:52:23.790176 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Sep 10 23:52:23.790183 kernel: alternatives: applying system-wide alternatives
Sep 10 23:52:23.790231 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Sep 10 23:52:23.790240 kernel: Memory: 3859556K/4096000K available (11136K kernel code, 2436K rwdata, 9084K rodata, 38976K init, 1038K bss, 214964K reserved, 16384K cma-reserved)
Sep 10 23:52:23.790247 kernel: devtmpfs: initialized
Sep 10 23:52:23.790256 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Sep 10 23:52:23.790263 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Sep 10 23:52:23.790270 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Sep 10 23:52:23.790277 kernel: 0 pages in range for non-PLT usage
Sep 10 23:52:23.790284 kernel: 508560 pages in range for PLT usage
Sep 10 23:52:23.790291 kernel: pinctrl core: initialized pinctrl subsystem
Sep 10 23:52:23.790298 kernel: SMBIOS 3.0.0 present.
Sep 10 23:52:23.790305 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Sep 10 23:52:23.790311 kernel: DMI: Memory slots populated: 1/1
Sep 10 23:52:23.790320 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Sep 10 23:52:23.790327 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Sep 10 23:52:23.790334 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Sep 10 23:52:23.790341 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Sep 10 23:52:23.790348 kernel: audit: initializing netlink subsys (disabled)
Sep 10 23:52:23.790355 kernel: audit: type=2000 audit(0.014:1): state=initialized audit_enabled=0 res=1
Sep 10 23:52:23.790362 kernel: thermal_sys: Registered thermal governor 'step_wise'
Sep 10 23:52:23.790368 kernel: cpuidle: using governor menu
Sep 10 23:52:23.790375 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Sep 10 23:52:23.790383 kernel: ASID allocator initialised with 32768 entries
Sep 10 23:52:23.790390 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Sep 10 23:52:23.790429 kernel: Serial: AMBA PL011 UART driver
Sep 10 23:52:23.790440 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Sep 10 23:52:23.790447 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Sep 10 23:52:23.790454 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Sep 10 23:52:23.790461 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Sep 10 23:52:23.790468 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Sep 10 23:52:23.790475 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Sep 10 23:52:23.790485 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Sep 10 23:52:23.790492 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Sep 10 23:52:23.790498 kernel: ACPI: Added _OSI(Module Device)
Sep 10 23:52:23.790505 kernel: ACPI: Added _OSI(Processor Device)
Sep 10 23:52:23.790512 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Sep 10 23:52:23.790519 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Sep 10 23:52:23.790526 kernel: ACPI: Interpreter enabled
Sep 10 23:52:23.790533 kernel: ACPI: Using GIC for interrupt routing
Sep 10 23:52:23.790540 kernel: ACPI: MCFG table detected, 1 entries
Sep 10 23:52:23.790548 kernel: ACPI: CPU0 has been hot-added
Sep 10 23:52:23.790555 kernel: ACPI: CPU1 has been hot-added
Sep 10 23:52:23.790562 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Sep 10 23:52:23.790569 kernel: printk: legacy console [ttyAMA0] enabled
Sep 10 23:52:23.790575 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Sep 10 23:52:23.790712 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Sep 10 23:52:23.790784 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Sep 10 23:52:23.790852 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Sep 10 23:52:23.790910 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Sep 10 23:52:23.790967 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Sep 10 23:52:23.790976 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Sep 10 23:52:23.790983 kernel: PCI host bridge to bus 0000:00
Sep 10 23:52:23.791047 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Sep 10 23:52:23.791100 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Sep 10 23:52:23.791153 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Sep 10 23:52:23.791272 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Sep 10 23:52:23.792317 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Sep 10 23:52:23.792455 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Sep 10 23:52:23.792529 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Sep 10 23:52:23.792591 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Sep 10 23:52:23.792659 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.792725 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Sep 10 23:52:23.792790 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 10 23:52:23.792849 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Sep 10 23:52:23.792908 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Sep 10 23:52:23.792976 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.793100 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Sep 10 23:52:23.793166 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 10 23:52:23.794842 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Sep 10 23:52:23.794925 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.794986 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Sep 10 23:52:23.795064 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 10 23:52:23.795125 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Sep 10 23:52:23.795183 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Sep 10 23:52:23.796162 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.796274 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Sep 10 23:52:23.796338 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 10 23:52:23.796407 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Sep 10 23:52:23.796473 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Sep 10 23:52:23.796540 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.796600 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Sep 10 23:52:23.796658 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 10 23:52:23.796719 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 10 23:52:23.796777 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Sep 10 23:52:23.796844 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.796902 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Sep 10 23:52:23.796960 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 10 23:52:23.797017 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Sep 10 23:52:23.797074 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Sep 10 23:52:23.797140 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.798289 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Sep 10 23:52:23.798396 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 10 23:52:23.798484 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Sep 10 23:52:23.798553 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Sep 10 23:52:23.798620 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.798680 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Sep 10 23:52:23.798744 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 10 23:52:23.798803 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Sep 10 23:52:23.798873 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Sep 10 23:52:23.798932 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Sep 10 23:52:23.798991 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 10 23:52:23.799049 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Sep 10 23:52:23.799118 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Sep 10 23:52:23.799182 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Sep 10 23:52:23.799274 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 10 23:52:23.799338 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Sep 10 23:52:23.799406 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Sep 10 23:52:23.799477 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Sep 10 23:52:23.799548 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Sep 10 23:52:23.799613 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Sep 10 23:52:23.799685 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Sep 10 23:52:23.799747 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Sep 10 23:52:23.799807 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Sep 10 23:52:23.799875 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Sep 10 23:52:23.799935 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Sep 10 23:52:23.800005 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Sep 10 23:52:23.800067 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Sep 10 23:52:23.800134 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Sep 10 23:52:23.802992 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Sep 10 23:52:23.803088 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Sep 10 23:52:23.803165 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Sep 10 23:52:23.803285 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Sep 10 23:52:23.803360 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Sep 10 23:52:23.803447 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Sep 10 23:52:23.803516 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Sep 10 23:52:23.803577 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Sep 10 23:52:23.803639 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Sep 10 23:52:23.803706 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Sep 10 23:52:23.803766 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Sep 10 23:52:23.803829 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Sep 10 23:52:23.803891 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Sep 10 23:52:23.803950 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Sep 10 23:52:23.804008 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Sep 10 23:52:23.804070 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Sep 10 23:52:23.804130 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Sep 10 23:52:23.804213 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Sep 10 23:52:23.804284 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Sep 10 23:52:23.804344 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Sep 10 23:52:23.804434 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Sep 10 23:52:23.804506 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Sep 10 23:52:23.804567 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Sep 10 23:52:23.804652 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Sep 10 23:52:23.804723 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Sep 10 23:52:23.804782 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Sep 10 23:52:23.804840 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Sep 10 23:52:23.804902 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Sep 10 23:52:23.804960 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Sep 10 23:52:23.805018 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Sep 10 23:52:23.805080 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Sep 10 23:52:23.805140 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Sep 10 23:52:23.805220 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Sep 10 23:52:23.805285 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Sep 10 23:52:23.805344 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Sep 10 23:52:23.805443 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Sep 10 23:52:23.805513 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Sep 10 23:52:23.805573 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Sep 10 23:52:23.805638 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Sep 10 23:52:23.805699 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Sep 10 23:52:23.805757 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Sep 10 23:52:23.805815 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Sep 10 23:52:23.805873 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Sep 10 23:52:23.805931 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Sep 10 23:52:23.805991 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Sep 10 23:52:23.806050 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Sep 10 23:52:23.806111 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Sep 10 23:52:23.806169 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Sep 10 23:52:23.806308 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Sep 10 23:52:23.806374 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Sep 10 23:52:23.806448 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Sep 10 23:52:23.806516 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Sep 10 23:52:23.806575 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Sep 10 23:52:23.806635 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Sep 10 23:52:23.806697 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Sep 10 23:52:23.806756 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Sep 10 23:52:23.806823 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Sep 10 23:52:23.806885 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Sep 10 23:52:23.806945 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Sep 10 23:52:23.807004 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Sep 10 23:52:23.807061 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Sep 10 23:52:23.807119 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Sep 10 23:52:23.807177 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Sep 10 23:52:23.807660 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Sep 10 23:52:23.807726 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Sep 10 23:52:23.807787 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Sep 10 23:52:23.807851 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Sep 10 23:52:23.807910 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Sep 10 23:52:23.807969 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Sep 10 23:52:23.808028 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Sep 10 23:52:23.808086 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Sep 10 23:52:23.808151 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Sep 10 23:52:23.808252 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Sep 10 23:52:23.808318 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Sep 10 23:52:23.808382 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Sep 10 23:52:23.808461 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Sep 10 23:52:23.808522 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Sep 10 23:52:23.808581 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Sep 10 23:52:23.808639 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 10 23:52:23.808704 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Sep 10 23:52:23.808763 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Sep 10 23:52:23.808826 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Sep 10 23:52:23.808884 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Sep 10 23:52:23.808942 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 10 23:52:23.809008 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Sep 10 23:52:23.809071 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Sep 10 23:52:23.809130 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Sep 10 23:52:23.809200 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Sep 10 23:52:23.809266 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Sep 10 23:52:23.809324 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 10 23:52:23.809390 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Sep 10 23:52:23.809466 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Sep 10 23:52:23.809525 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Sep 10 23:52:23.809584 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Sep 10 23:52:23.809642 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 10 23:52:23.809710 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Sep 10 23:52:23.809770 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Sep 10 23:52:23.809828 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Sep 10 23:52:23.809886 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Sep 10 23:52:23.809943 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 10 23:52:23.810007 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Sep 10 23:52:23.810067 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Sep 10 23:52:23.810127 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Sep 10 23:52:23.810206 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Sep 10 23:52:23.810288 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Sep 10 23:52:23.810349 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 10 23:52:23.810456 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Sep 10 23:52:23.810570 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Sep 10 23:52:23.810637 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Sep 10 23:52:23.810701 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Sep 10 23:52:23.810786 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Sep 10 23:52:23.810849 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Sep 10 23:52:23.810908 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Sep 10 23:52:23.810970 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Sep 10 23:52:23.811028 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Sep 10 23:52:23.811085 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Sep 10 23:52:23.811143 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Sep 10 23:52:23.813267 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Sep 10 23:52:23.813366 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Sep 10 23:52:23.813457 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Sep 10 23:52:23.813530 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Sep 10 23:52:23.813593 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Sep 10 23:52:23.813646 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Sep 10 23:52:23.813698 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Sep 10 23:52:23.813822 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Sep 10 23:52:23.813898 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Sep 10 23:52:23.813959 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Sep 10 23:52:23.814022 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Sep 10 23:52:23.814077 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Sep 10 23:52:23.814130 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Sep 10 23:52:23.814227 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Sep 10 23:52:23.814292 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Sep 10 23:52:23.814347 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Sep 10 23:52:23.814429 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Sep 10 23:52:23.814490 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Sep 10 23:52:23.814544 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Sep 10 23:52:23.814608 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Sep 10 23:52:23.814669 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Sep 10 23:52:23.814724 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Sep 10 23:52:23.814791 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Sep 10 23:52:23.814849 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Sep 10 23:52:23.814903 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Sep 10 23:52:23.814964 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Sep 
10 23:52:23.815018 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Sep 10 23:52:23.815072 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Sep 10 23:52:23.815133 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Sep 10 23:52:23.815205 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Sep 10 23:52:23.815263 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Sep 10 23:52:23.815325 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Sep 10 23:52:23.815380 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Sep 10 23:52:23.815479 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Sep 10 23:52:23.815490 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Sep 10 23:52:23.815498 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Sep 10 23:52:23.815506 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Sep 10 23:52:23.815517 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Sep 10 23:52:23.815524 kernel: iommu: Default domain type: Translated Sep 10 23:52:23.815531 kernel: iommu: DMA domain TLB invalidation policy: strict mode Sep 10 23:52:23.815539 kernel: efivars: Registered efivars operations Sep 10 23:52:23.815547 kernel: vgaarb: loaded Sep 10 23:52:23.815554 kernel: clocksource: Switched to clocksource arch_sys_counter Sep 10 23:52:23.815561 kernel: VFS: Disk quotas dquot_6.6.0 Sep 10 23:52:23.815569 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Sep 10 23:52:23.815576 kernel: pnp: PnP ACPI init Sep 10 23:52:23.815651 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Sep 10 23:52:23.815663 kernel: pnp: PnP ACPI: found 1 devices Sep 10 23:52:23.815671 kernel: NET: Registered PF_INET protocol family Sep 10 23:52:23.815678 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Sep 10 23:52:23.815686 kernel: 
tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Sep 10 23:52:23.815693 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Sep 10 23:52:23.815701 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Sep 10 23:52:23.815708 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Sep 10 23:52:23.815718 kernel: TCP: Hash tables configured (established 32768 bind 32768) Sep 10 23:52:23.815725 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 10 23:52:23.815732 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Sep 10 23:52:23.815740 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Sep 10 23:52:23.815807 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Sep 10 23:52:23.815818 kernel: PCI: CLS 0 bytes, default 64 Sep 10 23:52:23.815825 kernel: kvm [1]: HYP mode not available Sep 10 23:52:23.815833 kernel: Initialise system trusted keyrings Sep 10 23:52:23.815840 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Sep 10 23:52:23.815849 kernel: Key type asymmetric registered Sep 10 23:52:23.815856 kernel: Asymmetric key parser 'x509' registered Sep 10 23:52:23.815863 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Sep 10 23:52:23.815871 kernel: io scheduler mq-deadline registered Sep 10 23:52:23.815878 kernel: io scheduler kyber registered Sep 10 23:52:23.815885 kernel: io scheduler bfq registered Sep 10 23:52:23.815893 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Sep 10 23:52:23.815954 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Sep 10 23:52:23.816014 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Sep 10 23:52:23.816076 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.816136 kernel: pcieport 0000:00:02.1: PME: Signaling 
with IRQ 51 Sep 10 23:52:23.818250 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Sep 10 23:52:23.818340 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.818419 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Sep 10 23:52:23.818488 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Sep 10 23:52:23.818548 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.818611 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Sep 10 23:52:23.818678 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Sep 10 23:52:23.818736 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.818797 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Sep 10 23:52:23.818856 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Sep 10 23:52:23.818915 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.818975 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Sep 10 23:52:23.819037 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Sep 10 23:52:23.819099 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.819162 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Sep 10 23:52:23.819780 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Sep 10 23:52:23.819866 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.819932 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Sep 10 
23:52:23.819994 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Sep 10 23:52:23.820052 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.820063 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Sep 10 23:52:23.820131 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Sep 10 23:52:23.820215 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Sep 10 23:52:23.820303 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Sep 10 23:52:23.820314 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Sep 10 23:52:23.820322 kernel: ACPI: button: Power Button [PWRB] Sep 10 23:52:23.820330 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Sep 10 23:52:23.820408 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Sep 10 23:52:23.820487 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Sep 10 23:52:23.820502 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Sep 10 23:52:23.820510 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Sep 10 23:52:23.820573 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Sep 10 23:52:23.820584 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Sep 10 23:52:23.820591 kernel: thunder_xcv, ver 1.0 Sep 10 23:52:23.820598 kernel: thunder_bgx, ver 1.0 Sep 10 23:52:23.820606 kernel: nicpf, ver 1.0 Sep 10 23:52:23.820613 kernel: nicvf, ver 1.0 Sep 10 23:52:23.820682 kernel: rtc-efi rtc-efi.0: registered as rtc0 Sep 10 23:52:23.820742 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-09-10T23:52:23 UTC (1757548343) Sep 10 23:52:23.820751 kernel: hid: raw HID events driver (C) Jiri Kosina Sep 10 23:52:23.820759 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Sep 10 
23:52:23.820766 kernel: watchdog: NMI not fully supported Sep 10 23:52:23.820774 kernel: watchdog: Hard watchdog permanently disabled Sep 10 23:52:23.820781 kernel: NET: Registered PF_INET6 protocol family Sep 10 23:52:23.820788 kernel: Segment Routing with IPv6 Sep 10 23:52:23.820796 kernel: In-situ OAM (IOAM) with IPv6 Sep 10 23:52:23.820803 kernel: NET: Registered PF_PACKET protocol family Sep 10 23:52:23.820812 kernel: Key type dns_resolver registered Sep 10 23:52:23.820819 kernel: registered taskstats version 1 Sep 10 23:52:23.820827 kernel: Loading compiled-in X.509 certificates Sep 10 23:52:23.820834 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.46-flatcar: 3c20aab1105575c84ea94c1a59a27813fcebdea7' Sep 10 23:52:23.820842 kernel: Demotion targets for Node 0: null Sep 10 23:52:23.820849 kernel: Key type .fscrypt registered Sep 10 23:52:23.820856 kernel: Key type fscrypt-provisioning registered Sep 10 23:52:23.820864 kernel: ima: No TPM chip found, activating TPM-bypass! Sep 10 23:52:23.820872 kernel: ima: Allocated hash algorithm: sha1 Sep 10 23:52:23.820880 kernel: ima: No architecture policies found Sep 10 23:52:23.820887 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Sep 10 23:52:23.820895 kernel: clk: Disabling unused clocks Sep 10 23:52:23.820902 kernel: PM: genpd: Disabling unused power domains Sep 10 23:52:23.820909 kernel: Warning: unable to open an initial console. Sep 10 23:52:23.820917 kernel: Freeing unused kernel memory: 38976K Sep 10 23:52:23.820924 kernel: Run /init as init process Sep 10 23:52:23.820931 kernel: with arguments: Sep 10 23:52:23.820940 kernel: /init Sep 10 23:52:23.820947 kernel: with environment: Sep 10 23:52:23.820954 kernel: HOME=/ Sep 10 23:52:23.820962 kernel: TERM=linux Sep 10 23:52:23.820969 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Sep 10 23:52:23.820977 systemd[1]: Successfully made /usr/ read-only. 
Sep 10 23:52:23.820988 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 10 23:52:23.820996 systemd[1]: Detected virtualization kvm.
Sep 10 23:52:23.821005 systemd[1]: Detected architecture arm64.
Sep 10 23:52:23.821013 systemd[1]: Running in initrd.
Sep 10 23:52:23.821020 systemd[1]: No hostname configured, using default hostname.
Sep 10 23:52:23.821028 systemd[1]: Hostname set to .
Sep 10 23:52:23.821036 systemd[1]: Initializing machine ID from VM UUID.
Sep 10 23:52:23.821043 systemd[1]: Queued start job for default target initrd.target.
Sep 10 23:52:23.821051 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 10 23:52:23.821060 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 10 23:52:23.821070 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Sep 10 23:52:23.821078 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 10 23:52:23.821086 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Sep 10 23:52:23.821094 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Sep 10 23:52:23.821103 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Sep 10 23:52:23.821111 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Sep 10 23:52:23.821119 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 10 23:52:23.821128 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 10 23:52:23.821136 systemd[1]: Reached target paths.target - Path Units.
Sep 10 23:52:23.821144 systemd[1]: Reached target slices.target - Slice Units.
Sep 10 23:52:23.821152 systemd[1]: Reached target swap.target - Swaps.
Sep 10 23:52:23.821160 systemd[1]: Reached target timers.target - Timer Units.
Sep 10 23:52:23.821168 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Sep 10 23:52:23.821176 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 10 23:52:23.821207 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Sep 10 23:52:23.821220 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Sep 10 23:52:23.821228 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 10 23:52:23.821236 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 10 23:52:23.821243 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 10 23:52:23.821251 systemd[1]: Reached target sockets.target - Socket Units.
Sep 10 23:52:23.821259 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Sep 10 23:52:23.821267 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 10 23:52:23.821275 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Sep 10 23:52:23.821283 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Sep 10 23:52:23.821292 systemd[1]: Starting systemd-fsck-usr.service...
Sep 10 23:52:23.821300 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 10 23:52:23.821308 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 10 23:52:23.821316 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 23:52:23.821323 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Sep 10 23:52:23.821333 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 10 23:52:23.821365 systemd-journald[244]: Collecting audit messages is disabled.
Sep 10 23:52:23.821386 systemd[1]: Finished systemd-fsck-usr.service.
Sep 10 23:52:23.821396 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 10 23:52:23.821416 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Sep 10 23:52:23.821424 kernel: Bridge firewalling registered
Sep 10 23:52:23.821432 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 10 23:52:23.821440 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 10 23:52:23.821447 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 23:52:23.821455 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Sep 10 23:52:23.821464 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 10 23:52:23.821475 systemd-journald[244]: Journal started
Sep 10 23:52:23.821494 systemd-journald[244]: Runtime Journal (/run/log/journal/f4d97b8bb2b54e04a99a13517a9bca8e) is 8M, max 76.5M, 68.5M free.
Sep 10 23:52:23.779278 systemd-modules-load[245]: Inserted module 'overlay'
Sep 10 23:52:23.825651 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 10 23:52:23.799092 systemd-modules-load[245]: Inserted module 'br_netfilter'
Sep 10 23:52:23.826845 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 10 23:52:23.828858 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 10 23:52:23.835233 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 10 23:52:23.841895 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 10 23:52:23.844353 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Sep 10 23:52:23.850228 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 10 23:52:23.851658 systemd-tmpfiles[265]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Sep 10 23:52:23.855850 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 10 23:52:23.858460 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 10 23:52:23.866727 dracut-cmdline[283]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=dd9c14cce645c634e06a91b09405eea80057f02909b9267c482dc457df1cddec
Sep 10 23:52:23.905281 systemd-resolved[292]: Positive Trust Anchors:
Sep 10 23:52:23.905295 systemd-resolved[292]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 10 23:52:23.905326 systemd-resolved[292]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 10 23:52:23.915540 systemd-resolved[292]: Defaulting to hostname 'linux'.
Sep 10 23:52:23.917047 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 10 23:52:23.918115 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 10 23:52:23.972214 kernel: SCSI subsystem initialized
Sep 10 23:52:23.977232 kernel: Loading iSCSI transport class v2.0-870.
Sep 10 23:52:23.985225 kernel: iscsi: registered transport (tcp)
Sep 10 23:52:23.998263 kernel: iscsi: registered transport (qla4xxx)
Sep 10 23:52:23.998320 kernel: QLogic iSCSI HBA Driver
Sep 10 23:52:24.018566 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 10 23:52:24.035716 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 10 23:52:24.039412 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 10 23:52:24.090109 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Sep 10 23:52:24.092264 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Sep 10 23:52:24.156235 kernel: raid6: neonx8 gen() 15602 MB/s
Sep 10 23:52:24.173242 kernel: raid6: neonx4 gen() 14206 MB/s
Sep 10 23:52:24.190237 kernel: raid6: neonx2 gen() 13107 MB/s
Sep 10 23:52:24.207249 kernel: raid6: neonx1 gen() 10366 MB/s
Sep 10 23:52:24.224233 kernel: raid6: int64x8 gen() 6842 MB/s
Sep 10 23:52:24.241249 kernel: raid6: int64x4 gen() 7308 MB/s
Sep 10 23:52:24.258265 kernel: raid6: int64x2 gen() 6070 MB/s
Sep 10 23:52:24.275235 kernel: raid6: int64x1 gen() 5000 MB/s
Sep 10 23:52:24.275320 kernel: raid6: using algorithm neonx8 gen() 15602 MB/s
Sep 10 23:52:24.292256 kernel: raid6: .... xor() 11898 MB/s, rmw enabled
Sep 10 23:52:24.292339 kernel: raid6: using neon recovery algorithm
Sep 10 23:52:24.297354 kernel: xor: measuring software checksum speed
Sep 10 23:52:24.297448 kernel: 8regs : 21630 MB/sec
Sep 10 23:52:24.297464 kernel: 32regs : 21710 MB/sec
Sep 10 23:52:24.297476 kernel: arm64_neon : 26196 MB/sec
Sep 10 23:52:24.298221 kernel: xor: using function: arm64_neon (26196 MB/sec)
Sep 10 23:52:24.351256 kernel: Btrfs loaded, zoned=no, fsverity=no
Sep 10 23:52:24.361231 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Sep 10 23:52:24.365721 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 10 23:52:24.391654 systemd-udevd[494]: Using default interface naming scheme 'v255'.
Sep 10 23:52:24.396268 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 10 23:52:24.400120 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Sep 10 23:52:24.433089 dracut-pre-trigger[501]: rd.md=0: removing MD RAID activation
Sep 10 23:52:24.464052 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 10 23:52:24.467479 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 10 23:52:24.538445 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 10 23:52:24.541417 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Sep 10 23:52:24.653047 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Sep 10 23:52:24.653653 kernel: scsi host0: Virtio SCSI HBA
Sep 10 23:52:24.658941 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Sep 10 23:52:24.659007 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Sep 10 23:52:24.666216 kernel: ACPI: bus type USB registered
Sep 10 23:52:24.666269 kernel: usbcore: registered new interface driver usbfs
Sep 10 23:52:24.667451 kernel: usbcore: registered new interface driver hub
Sep 10 23:52:24.670313 kernel: usbcore: registered new device driver usb
Sep 10 23:52:24.688559 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 10 23:52:24.688676 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 23:52:24.691505 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 23:52:24.694166 kernel: sr 0:0:0:0: Power-on or device reset occurred
Sep 10 23:52:24.694359 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
Sep 10 23:52:24.694466 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Sep 10 23:52:24.696007 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 10 23:52:24.696179 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Sep 10 23:52:24.694731 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 23:52:24.700519 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Sep 10 23:52:24.703418 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Sep 10 23:52:24.703544 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Sep 10 23:52:24.703630 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Sep 10 23:52:24.703709 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Sep 10 23:52:24.707207 kernel: hub 1-0:1.0: USB hub found
Sep 10 23:52:24.707381 kernel: hub 1-0:1.0: 4 ports detected
Sep 10 23:52:24.707514 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Sep 10 23:52:24.709211 kernel: hub 2-0:1.0: USB hub found
Sep 10 23:52:24.709382 kernel: hub 2-0:1.0: 4 ports detected
Sep 10 23:52:24.710214 kernel: sd 0:0:0:1: Power-on or device reset occurred
Sep 10 23:52:24.712813 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Sep 10 23:52:24.712951 kernel: sd 0:0:0:1: [sda] Write Protect is off
Sep 10 23:52:24.713136 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Sep 10 23:52:24.714339 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Sep 10 23:52:24.721390 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Sep 10 23:52:24.721446 kernel: GPT:17805311 != 80003071
Sep 10 23:52:24.721456 kernel: GPT:Alternate GPT header not at the end of the disk.
Sep 10 23:52:24.721471 kernel: GPT:17805311 != 80003071
Sep 10 23:52:24.721480 kernel: GPT: Use GNU Parted to correct GPT errors.
Sep 10 23:52:24.722220 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 10 23:52:24.724239 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
Sep 10 23:52:24.737377 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 23:52:24.788127 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
Sep 10 23:52:24.804743 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
Sep 10 23:52:24.819935 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
Sep 10 23:52:24.820652 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
Sep 10 23:52:24.824229 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Sep 10 23:52:24.833073 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 10 23:52:24.834651 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 10 23:52:24.835954 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 10 23:52:24.837166 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 10 23:52:24.840337 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Sep 10 23:52:24.842952 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Sep 10 23:52:24.872987 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Sep 10 23:52:24.879032 disk-uuid[602]: Primary Header is updated.
Sep 10 23:52:24.879032 disk-uuid[602]: Secondary Entries is updated.
Sep 10 23:52:24.879032 disk-uuid[602]: Secondary Header is updated.
Sep 10 23:52:24.883378 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 10 23:52:24.950222 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Sep 10 23:52:25.082763 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Sep 10 23:52:25.082844 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Sep 10 23:52:25.083099 kernel: usbcore: registered new interface driver usbhid
Sep 10 23:52:25.083120 kernel: usbhid: USB HID core driver
Sep 10 23:52:25.187259 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Sep 10 23:52:25.314223 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Sep 10 23:52:25.367227 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Sep 10 23:52:25.907124 disk-uuid[610]: The operation has completed successfully.
Sep 10 23:52:25.908425 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
Sep 10 23:52:25.965573 systemd[1]: disk-uuid.service: Deactivated successfully.
Sep 10 23:52:25.966274 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Sep 10 23:52:25.989918 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
Sep 10 23:52:26.017091 sh[626]: Success
Sep 10 23:52:26.032609 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Sep 10 23:52:26.032665 kernel: device-mapper: uevent: version 1.0.3
Sep 10 23:52:26.032677 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Sep 10 23:52:26.042326 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Sep 10 23:52:26.097981 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
Sep 10 23:52:26.106238 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
Sep 10 23:52:26.117754 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
Sep 10 23:52:26.127236 kernel: BTRFS: device fsid 3b17f37f-d395-4116-a46d-e07f86112ade devid 1 transid 39 /dev/mapper/usr (254:0) scanned by mount (638)
Sep 10 23:52:26.128547 kernel: BTRFS info (device dm-0): first mount of filesystem 3b17f37f-d395-4116-a46d-e07f86112ade
Sep 10 23:52:26.128574 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Sep 10 23:52:26.136179 kernel: BTRFS info (device dm-0): enabling ssd optimizations
Sep 10 23:52:26.136260 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Sep 10 23:52:26.136275 kernel: BTRFS info (device dm-0): enabling free space tree
Sep 10 23:52:26.139183 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
Sep 10 23:52:26.140780 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Sep 10 23:52:26.141947 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Sep 10 23:52:26.142727 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Sep 10 23:52:26.146298 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Sep 10 23:52:26.172677 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (668)
Sep 10 23:52:26.172742 kernel: BTRFS info (device sda6): first mount of filesystem 538ffae8-60fb-4c82-9100-efc4d2404f73
Sep 10 23:52:26.173268 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 10 23:52:26.177238 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 10 23:52:26.177290 kernel: BTRFS info (device sda6): turning on async discard
Sep 10 23:52:26.178211 kernel: BTRFS info (device sda6): enabling free space tree
Sep 10 23:52:26.183201 kernel: BTRFS info (device sda6): last unmount of filesystem 538ffae8-60fb-4c82-9100-efc4d2404f73
Sep 10 23:52:26.184683 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Sep 10 23:52:26.187311 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Sep 10 23:52:26.268579 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 10 23:52:26.272324 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 10 23:52:26.309996 systemd-networkd[808]: lo: Link UP
Sep 10 23:52:26.310006 systemd-networkd[808]: lo: Gained carrier
Sep 10 23:52:26.311470 systemd-networkd[808]: Enumeration completed
Sep 10 23:52:26.311569 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 10 23:52:26.312079 systemd-networkd[808]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 23:52:26.312083 systemd-networkd[808]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 10 23:52:26.313147 systemd[1]: Reached target network.target - Network.
Sep 10 23:52:26.314139 systemd-networkd[808]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 23:52:26.314143 systemd-networkd[808]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 10 23:52:26.314688 systemd-networkd[808]: eth0: Link UP
Sep 10 23:52:26.314817 systemd-networkd[808]: eth1: Link UP
Sep 10 23:52:26.314940 systemd-networkd[808]: eth0: Gained carrier
Sep 10 23:52:26.314949 systemd-networkd[808]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 23:52:26.322272 systemd-networkd[808]: eth1: Gained carrier
Sep 10 23:52:26.323217 systemd-networkd[808]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 23:52:26.336437 ignition[715]: Ignition 2.21.0
Sep 10 23:52:26.336448 ignition[715]: Stage: fetch-offline
Sep 10 23:52:26.336488 ignition[715]: no configs at "/usr/lib/ignition/base.d"
Sep 10 23:52:26.336495 ignition[715]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 10 23:52:26.339138 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 10 23:52:26.336659 ignition[715]: parsed url from cmdline: ""
Sep 10 23:52:26.336662 ignition[715]: no config URL provided
Sep 10 23:52:26.336668 ignition[715]: reading system config file "/usr/lib/ignition/user.ign"
Sep 10 23:52:26.336675 ignition[715]: no config at "/usr/lib/ignition/user.ign"
Sep 10 23:52:26.342263 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Sep 10 23:52:26.336679 ignition[715]: failed to fetch config: resource requires networking
Sep 10 23:52:26.336962 ignition[715]: Ignition finished successfully
Sep 10 23:52:26.347319 systemd-networkd[808]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 10 23:52:26.374727 ignition[818]: Ignition 2.21.0
Sep 10 23:52:26.375368 ignition[818]: Stage: fetch
Sep 10 23:52:26.375814 ignition[818]: no configs at "/usr/lib/ignition/base.d"
Sep 10 23:52:26.375823 ignition[818]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 10 23:52:26.376436 ignition[818]: parsed url from cmdline: ""
Sep 10 23:52:26.376440 ignition[818]: no config URL provided
Sep 10 23:52:26.376445 ignition[818]: reading system config file "/usr/lib/ignition/user.ign"
Sep 10 23:52:26.376454 ignition[818]: no config at "/usr/lib/ignition/user.ign"
Sep 10 23:52:26.376554 ignition[818]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Sep 10 23:52:26.379336 ignition[818]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Sep 10 23:52:26.380265 systemd-networkd[808]: eth0: DHCPv4 address 91.107.201.216/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 10 23:52:26.580376 ignition[818]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Sep 10 23:52:26.586778 ignition[818]: GET result: OK
Sep 10 23:52:26.588103 ignition[818]: parsing config with SHA512: 6d1d741420dab6af53935dca6d4ea5883bba010d98d8ee210a9bdfbdeaaf4a7cbe8c48682aaa6875aa6d34a38cba0f830b621925cd8846e95984f08d3a6f01fc
Sep 10 23:52:26.596609 unknown[818]: fetched base config from "system"
Sep 10 23:52:26.597632 unknown[818]: fetched base config from "system"
Sep 10 23:52:26.598207 ignition[818]: fetch: fetch complete
Sep 10 23:52:26.597646 unknown[818]: fetched user config from "hetzner"
Sep 10 23:52:26.598214 ignition[818]: fetch: fetch passed
Sep 10 23:52:26.598287 ignition[818]: Ignition finished successfully
Sep 10 23:52:26.602180 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Sep 10 23:52:26.604949 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Sep 10 23:52:26.640054 ignition[826]: Ignition 2.21.0
Sep 10 23:52:26.640080 ignition[826]: Stage: kargs
Sep 10 23:52:26.640347 ignition[826]: no configs at "/usr/lib/ignition/base.d"
Sep 10 23:52:26.640361 ignition[826]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 10 23:52:26.641631 ignition[826]: kargs: kargs passed
Sep 10 23:52:26.641700 ignition[826]: Ignition finished successfully
Sep 10 23:52:26.646285 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Sep 10 23:52:26.650068 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Sep 10 23:52:26.677601 ignition[833]: Ignition 2.21.0
Sep 10 23:52:26.677620 ignition[833]: Stage: disks
Sep 10 23:52:26.677834 ignition[833]: no configs at "/usr/lib/ignition/base.d"
Sep 10 23:52:26.677848 ignition[833]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 10 23:52:26.683305 ignition[833]: disks: disks passed
Sep 10 23:52:26.683415 ignition[833]: Ignition finished successfully
Sep 10 23:52:26.687486 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Sep 10 23:52:26.690134 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Sep 10 23:52:26.691684 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Sep 10 23:52:26.693440 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 10 23:52:26.694326 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 10 23:52:26.695157 systemd[1]: Reached target basic.target - Basic System.
Sep 10 23:52:26.697050 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Sep 10 23:52:26.738398 systemd-fsck[842]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Sep 10 23:52:26.743255 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Sep 10 23:52:26.747264 systemd[1]: Mounting sysroot.mount - /sysroot...
Sep 10 23:52:26.820254 kernel: EXT4-fs (sda9): mounted filesystem fcae628f-5f9a-4539-a638-93fb1399b5d7 r/w with ordered data mode. Quota mode: none.
Sep 10 23:52:26.822033 systemd[1]: Mounted sysroot.mount - /sysroot.
Sep 10 23:52:26.824527 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Sep 10 23:52:26.827617 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 10 23:52:26.829649 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Sep 10 23:52:26.835340 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Sep 10 23:52:26.835928 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Sep 10 23:52:26.835958 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 10 23:52:26.842242 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Sep 10 23:52:26.843991 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Sep 10 23:52:26.858218 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (850)
Sep 10 23:52:26.860243 kernel: BTRFS info (device sda6): first mount of filesystem 538ffae8-60fb-4c82-9100-efc4d2404f73
Sep 10 23:52:26.860285 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 10 23:52:26.873968 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 10 23:52:26.874035 kernel: BTRFS info (device sda6): turning on async discard
Sep 10 23:52:26.875109 kernel: BTRFS info (device sda6): enabling free space tree
Sep 10 23:52:26.879583 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 10 23:52:26.907474 initrd-setup-root[877]: cut: /sysroot/etc/passwd: No such file or directory
Sep 10 23:52:26.911829 coreos-metadata[852]: Sep 10 23:52:26.911 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Sep 10 23:52:26.915206 coreos-metadata[852]: Sep 10 23:52:26.914 INFO Fetch successful
Sep 10 23:52:26.915206 coreos-metadata[852]: Sep 10 23:52:26.914 INFO wrote hostname ci-4372-1-0-n-c06092ab73 to /sysroot/etc/hostname
Sep 10 23:52:26.917749 initrd-setup-root[884]: cut: /sysroot/etc/group: No such file or directory
Sep 10 23:52:26.917437 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 10 23:52:26.922062 initrd-setup-root[892]: cut: /sysroot/etc/shadow: No such file or directory
Sep 10 23:52:26.926880 initrd-setup-root[899]: cut: /sysroot/etc/gshadow: No such file or directory
Sep 10 23:52:27.024849 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Sep 10 23:52:27.026687 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Sep 10 23:52:27.027857 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Sep 10 23:52:27.050217 kernel: BTRFS info (device sda6): last unmount of filesystem 538ffae8-60fb-4c82-9100-efc4d2404f73
Sep 10 23:52:27.065369 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Sep 10 23:52:27.078887 ignition[968]: INFO : Ignition 2.21.0
Sep 10 23:52:27.078887 ignition[968]: INFO : Stage: mount
Sep 10 23:52:27.080136 ignition[968]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 10 23:52:27.080136 ignition[968]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 10 23:52:27.080136 ignition[968]: INFO : mount: mount passed
Sep 10 23:52:27.080136 ignition[968]: INFO : Ignition finished successfully
Sep 10 23:52:27.084367 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Sep 10 23:52:27.086517 systemd[1]: Starting ignition-files.service - Ignition (files)...
Sep 10 23:52:27.128123 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Sep 10 23:52:27.132486 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Sep 10 23:52:27.155236 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (978)
Sep 10 23:52:27.155312 kernel: BTRFS info (device sda6): first mount of filesystem 538ffae8-60fb-4c82-9100-efc4d2404f73
Sep 10 23:52:27.156371 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Sep 10 23:52:27.162015 kernel: BTRFS info (device sda6): enabling ssd optimizations
Sep 10 23:52:27.162077 kernel: BTRFS info (device sda6): turning on async discard
Sep 10 23:52:27.162092 kernel: BTRFS info (device sda6): enabling free space tree
Sep 10 23:52:27.165312 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Sep 10 23:52:27.195582 ignition[995]: INFO : Ignition 2.21.0
Sep 10 23:52:27.195582 ignition[995]: INFO : Stage: files
Sep 10 23:52:27.196627 ignition[995]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 10 23:52:27.196627 ignition[995]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 10 23:52:27.196627 ignition[995]: DEBUG : files: compiled without relabeling support, skipping
Sep 10 23:52:27.200286 ignition[995]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Sep 10 23:52:27.200286 ignition[995]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Sep 10 23:52:27.200286 ignition[995]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Sep 10 23:52:27.202945 ignition[995]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Sep 10 23:52:27.202945 ignition[995]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Sep 10 23:52:27.202044 unknown[995]: wrote ssh authorized keys file for user: core
Sep 10 23:52:27.205306 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 10 23:52:27.205306 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Sep 10 23:52:27.297222 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Sep 10 23:52:27.620148 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Sep 10 23:52:27.622248 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Sep 10 23:52:27.622248 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Sep 10 23:52:27.622248 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Sep 10 23:52:27.622248 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Sep 10 23:52:27.622248 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 10 23:52:27.622248 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Sep 10 23:52:27.622248 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 10 23:52:27.622248 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Sep 10 23:52:27.635459 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Sep 10 23:52:27.635459 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Sep 10 23:52:27.635459 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 10 23:52:27.635459 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 10 23:52:27.635459 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 10 23:52:27.635459 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
Sep 10 23:52:27.754493 systemd-networkd[808]: eth1: Gained IPv6LL
Sep 10 23:52:27.956121 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Sep 10 23:52:28.013257 systemd-networkd[808]: eth0: Gained IPv6LL
Sep 10 23:52:28.142182 ignition[995]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Sep 10 23:52:28.144131 ignition[995]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Sep 10 23:52:28.144131 ignition[995]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 10 23:52:28.147259 ignition[995]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Sep 10 23:52:28.147259 ignition[995]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Sep 10 23:52:28.147259 ignition[995]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Sep 10 23:52:28.151034 ignition[995]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 10 23:52:28.151034 ignition[995]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Sep 10 23:52:28.151034 ignition[995]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Sep 10 23:52:28.151034 ignition[995]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Sep 10 23:52:28.151034 ignition[995]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Sep 10 23:52:28.151034 ignition[995]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Sep 10 23:52:28.151034 ignition[995]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Sep 10 23:52:28.151034 ignition[995]: INFO : files: files passed
Sep 10 23:52:28.151034 ignition[995]: INFO : Ignition finished successfully
Sep 10 23:52:28.150861 systemd[1]: Finished ignition-files.service - Ignition (files).
Sep 10 23:52:28.154944 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Sep 10 23:52:28.160824 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Sep 10 23:52:28.169312 systemd[1]: ignition-quench.service: Deactivated successfully.
Sep 10 23:52:28.170076 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Sep 10 23:52:28.176497 initrd-setup-root-after-ignition[1025]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 10 23:52:28.176497 initrd-setup-root-after-ignition[1025]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Sep 10 23:52:28.179644 initrd-setup-root-after-ignition[1029]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Sep 10 23:52:28.181971 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 10 23:52:28.183071 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Sep 10 23:52:28.185498 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Sep 10 23:52:28.255607 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Sep 10 23:52:28.255789 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Sep 10 23:52:28.257511 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Sep 10 23:52:28.258336 systemd[1]: Reached target initrd.target - Initrd Default Target.
Sep 10 23:52:28.259314 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Sep 10 23:52:28.260218 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Sep 10 23:52:28.288052 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 10 23:52:28.290603 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Sep 10 23:52:28.316224 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Sep 10 23:52:28.316889 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 10 23:52:28.318509 systemd[1]: Stopped target timers.target - Timer Units.
Sep 10 23:52:28.319439 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Sep 10 23:52:28.319557 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Sep 10 23:52:28.320841 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Sep 10 23:52:28.321443 systemd[1]: Stopped target basic.target - Basic System.
Sep 10 23:52:28.322423 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Sep 10 23:52:28.323384 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Sep 10 23:52:28.324282 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Sep 10 23:52:28.325266 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Sep 10 23:52:28.326375 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Sep 10 23:52:28.327324 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Sep 10 23:52:28.328386 systemd[1]: Stopped target sysinit.target - System Initialization.
Sep 10 23:52:28.329277 systemd[1]: Stopped target local-fs.target - Local File Systems.
Sep 10 23:52:28.330412 systemd[1]: Stopped target swap.target - Swaps.
Sep 10 23:52:28.331266 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Sep 10 23:52:28.331422 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Sep 10 23:52:28.332636 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Sep 10 23:52:28.333213 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 10 23:52:28.334231 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Sep 10 23:52:28.334729 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 10 23:52:28.335412 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Sep 10 23:52:28.335522 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Sep 10 23:52:28.336940 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Sep 10 23:52:28.337044 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Sep 10 23:52:28.338097 systemd[1]: ignition-files.service: Deactivated successfully.
Sep 10 23:52:28.338209 systemd[1]: Stopped ignition-files.service - Ignition (files).
Sep 10 23:52:28.339282 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Sep 10 23:52:28.339407 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Sep 10 23:52:28.341086 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Sep 10 23:52:28.345496 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Sep 10 23:52:28.345927 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Sep 10 23:52:28.346034 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 10 23:52:28.348348 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Sep 10 23:52:28.348452 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Sep 10 23:52:28.354575 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Sep 10 23:52:28.356339 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Sep 10 23:52:28.378491 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Sep 10 23:52:28.383781 ignition[1049]: INFO : Ignition 2.21.0
Sep 10 23:52:28.385762 ignition[1049]: INFO : Stage: umount
Sep 10 23:52:28.386312 ignition[1049]: INFO : no configs at "/usr/lib/ignition/base.d"
Sep 10 23:52:28.386901 ignition[1049]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Sep 10 23:52:28.392427 ignition[1049]: INFO : umount: umount passed
Sep 10 23:52:28.393151 ignition[1049]: INFO : Ignition finished successfully
Sep 10 23:52:28.394221 systemd[1]: ignition-mount.service: Deactivated successfully.
Sep 10 23:52:28.394323 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Sep 10 23:52:28.401224 systemd[1]: ignition-disks.service: Deactivated successfully.
Sep 10 23:52:28.401296 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Sep 10 23:52:28.402912 systemd[1]: ignition-kargs.service: Deactivated successfully.
Sep 10 23:52:28.402962 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Sep 10 23:52:28.404096 systemd[1]: ignition-fetch.service: Deactivated successfully.
Sep 10 23:52:28.404146 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Sep 10 23:52:28.405404 systemd[1]: Stopped target network.target - Network.
Sep 10 23:52:28.406421 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Sep 10 23:52:28.406469 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Sep 10 23:52:28.407084 systemd[1]: Stopped target paths.target - Path Units.
Sep 10 23:52:28.407962 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Sep 10 23:52:28.411630 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 10 23:52:28.412921 systemd[1]: Stopped target slices.target - Slice Units.
Sep 10 23:52:28.414031 systemd[1]: Stopped target sockets.target - Socket Units.
Sep 10 23:52:28.415255 systemd[1]: iscsid.socket: Deactivated successfully.
Sep 10 23:52:28.415296 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Sep 10 23:52:28.416605 systemd[1]: iscsiuio.socket: Deactivated successfully.
Sep 10 23:52:28.416638 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Sep 10 23:52:28.417549 systemd[1]: ignition-setup.service: Deactivated successfully.
Sep 10 23:52:28.417611 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Sep 10 23:52:28.419287 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Sep 10 23:52:28.419327 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Sep 10 23:52:28.424072 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Sep 10 23:52:28.428269 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Sep 10 23:52:28.434459 systemd[1]: sysroot-boot.service: Deactivated successfully.
Sep 10 23:52:28.435139 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Sep 10 23:52:28.436752 systemd[1]: systemd-resolved.service: Deactivated successfully.
Sep 10 23:52:28.436852 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Sep 10 23:52:28.440111 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Sep 10 23:52:28.442351 systemd[1]: systemd-networkd.service: Deactivated successfully.
Sep 10 23:52:28.442507 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Sep 10 23:52:28.446088 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Sep 10 23:52:28.447025 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Sep 10 23:52:28.448651 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Sep 10 23:52:28.448699 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Sep 10 23:52:28.449303 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Sep 10 23:52:28.449357 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Sep 10 23:52:28.451513 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Sep 10 23:52:28.452781 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Sep 10 23:52:28.452839 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Sep 10 23:52:28.455467 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Sep 10 23:52:28.455527 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Sep 10 23:52:28.458930 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Sep 10 23:52:28.458986 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Sep 10 23:52:28.460910 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Sep 10 23:52:28.460953 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 10 23:52:28.464811 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 10 23:52:28.467826 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Sep 10 23:52:28.467894 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Sep 10 23:52:28.479200 systemd[1]: systemd-udevd.service: Deactivated successfully.
Sep 10 23:52:28.482876 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 10 23:52:28.485947 systemd[1]: network-cleanup.service: Deactivated successfully.
Sep 10 23:52:28.487325 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Sep 10 23:52:28.490155 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Sep 10 23:52:28.491525 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Sep 10 23:52:28.492301 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Sep 10 23:52:28.492332 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 10 23:52:28.492845 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Sep 10 23:52:28.492889 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Sep 10 23:52:28.494858 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Sep 10 23:52:28.494905 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Sep 10 23:52:28.496978 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Sep 10 23:52:28.497032 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Sep 10 23:52:28.499568 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Sep 10 23:52:28.501028 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Sep 10 23:52:28.501083 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Sep 10 23:52:28.504340 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Sep 10 23:52:28.504412 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 10 23:52:28.505812 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Sep 10 23:52:28.505850 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 10 23:52:28.509073 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Sep 10 23:52:28.509120 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 10 23:52:28.509904 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 10 23:52:28.509946 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 23:52:28.512739 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Sep 10 23:52:28.512795 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Sep 10 23:52:28.512824 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Sep 10 23:52:28.512855 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Sep 10 23:52:28.523270 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Sep 10 23:52:28.523526 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Sep 10 23:52:28.524972 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Sep 10 23:52:28.527076 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Sep 10 23:52:28.547950 systemd[1]: Switching root.
Sep 10 23:52:28.598207 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Sep 10 23:52:28.598290 systemd-journald[244]: Journal stopped
Sep 10 23:52:29.466988 kernel: SELinux: policy capability network_peer_controls=1
Sep 10 23:52:29.467060 kernel: SELinux: policy capability open_perms=1
Sep 10 23:52:29.467075 kernel: SELinux: policy capability extended_socket_class=1
Sep 10 23:52:29.467084 kernel: SELinux: policy capability always_check_network=0
Sep 10 23:52:29.467096 kernel: SELinux: policy capability cgroup_seclabel=1
Sep 10 23:52:29.467105 kernel: SELinux: policy capability nnp_nosuid_transition=1
Sep 10 23:52:29.467114 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Sep 10 23:52:29.467124 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Sep 10 23:52:29.467132 kernel: SELinux: policy capability userspace_initial_context=0
Sep 10 23:52:29.467143 kernel: audit: type=1403 audit(1757548348.728:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Sep 10 23:52:29.467158 systemd[1]: Successfully loaded SELinux policy in 46.262ms.
Sep 10 23:52:29.467178 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.356ms.
Sep 10 23:52:29.467202 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Sep 10 23:52:29.467213 systemd[1]: Detected virtualization kvm.
Sep 10 23:52:29.467224 systemd[1]: Detected architecture arm64.
Sep 10 23:52:29.467240 systemd[1]: Detected first boot.
Sep 10 23:52:29.467250 systemd[1]: Hostname set to .
Sep 10 23:52:29.467262 systemd[1]: Initializing machine ID from VM UUID.
Sep 10 23:52:29.467274 zram_generator::config[1093]: No configuration found.
Sep 10 23:52:29.467290 kernel: NET: Registered PF_VSOCK protocol family
Sep 10 23:52:29.467301 systemd[1]: Populated /etc with preset unit settings.
Sep 10 23:52:29.467312 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Sep 10 23:52:29.467323 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Sep 10 23:52:29.467333 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Sep 10 23:52:29.467343 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Sep 10 23:52:29.467353 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Sep 10 23:52:29.467373 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Sep 10 23:52:29.467385 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Sep 10 23:52:29.467396 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Sep 10 23:52:29.467406 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Sep 10 23:52:29.467416 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Sep 10 23:52:29.467429 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Sep 10 23:52:29.467439 systemd[1]: Created slice user.slice - User and Session Slice.
Sep 10 23:52:29.467449 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Sep 10 23:52:29.467464 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Sep 10 23:52:29.467474 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Sep 10 23:52:29.467484 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Sep 10 23:52:29.467494 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Sep 10 23:52:29.467504 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Sep 10 23:52:29.467516 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Sep 10 23:52:29.467526 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Sep 10 23:52:29.467536 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Sep 10 23:52:29.467546 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Sep 10 23:52:29.467556 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Sep 10 23:52:29.467566 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Sep 10 23:52:29.467576 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Sep 10 23:52:29.467587 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Sep 10 23:52:29.467601 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Sep 10 23:52:29.467611 systemd[1]: Reached target slices.target - Slice Units.
Sep 10 23:52:29.467621 systemd[1]: Reached target swap.target - Swaps.
Sep 10 23:52:29.467631 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Sep 10 23:52:29.467640 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Sep 10 23:52:29.467650 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Sep 10 23:52:29.467660 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Sep 10 23:52:29.467670 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Sep 10 23:52:29.467681 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Sep 10 23:52:29.467691 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Sep 10 23:52:29.467700 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Sep 10 23:52:29.467710 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Sep 10 23:52:29.467720 systemd[1]: Mounting media.mount - External Media Directory...
Sep 10 23:52:29.467729 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Sep 10 23:52:29.467739 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Sep 10 23:52:29.467749 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Sep 10 23:52:29.467760 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Sep 10 23:52:29.467771 systemd[1]: Reached target machines.target - Containers.
Sep 10 23:52:29.467781 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Sep 10 23:52:29.467791 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 10 23:52:29.467801 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Sep 10 23:52:29.467811 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Sep 10 23:52:29.467821 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 10 23:52:29.467831 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 10 23:52:29.467844 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 10 23:52:29.467856 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Sep 10 23:52:29.467866 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 10 23:52:29.467878 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Sep 10 23:52:29.467887 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Sep 10 23:52:29.467897 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Sep 10 23:52:29.467908 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Sep 10 23:52:29.467918 systemd[1]: Stopped systemd-fsck-usr.service.
Sep 10 23:52:29.467928 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 10 23:52:29.467941 systemd[1]: Starting systemd-journald.service - Journal Service...
Sep 10 23:52:29.467951 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Sep 10 23:52:29.467961 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Sep 10 23:52:29.467973 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Sep 10 23:52:29.467984 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Sep 10 23:52:29.467994 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Sep 10 23:52:29.468005 systemd[1]: verity-setup.service: Deactivated successfully.
Sep 10 23:52:29.468015 systemd[1]: Stopped verity-setup.service.
Sep 10 23:52:29.468025 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Sep 10 23:52:29.468035 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Sep 10 23:52:29.468047 systemd[1]: Mounted media.mount - External Media Directory.
Sep 10 23:52:29.468057 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Sep 10 23:52:29.468067 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Sep 10 23:52:29.468077 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Sep 10 23:52:29.468087 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Sep 10 23:52:29.468097 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Sep 10 23:52:29.468107 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Sep 10 23:52:29.468146 systemd-journald[1157]: Collecting audit messages is disabled.
Sep 10 23:52:29.468172 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 10 23:52:29.468182 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 10 23:52:29.468208 systemd-journald[1157]: Journal started
Sep 10 23:52:29.468230 systemd-journald[1157]: Runtime Journal (/run/log/journal/f4d97b8bb2b54e04a99a13517a9bca8e) is 8M, max 76.5M, 68.5M free.
Sep 10 23:52:29.246336 systemd[1]: Queued start job for default target multi-user.target.
Sep 10 23:52:29.253157 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Sep 10 23:52:29.253946 systemd[1]: systemd-journald.service: Deactivated successfully.
Sep 10 23:52:29.470208 systemd[1]: Started systemd-journald.service - Journal Service.
Sep 10 23:52:29.473417 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 10 23:52:29.476257 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 10 23:52:29.478803 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Sep 10 23:52:29.500195 kernel: loop: module loaded
Sep 10 23:52:29.500266 kernel: ACPI: bus type drm_connector registered
Sep 10 23:52:29.500386 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Sep 10 23:52:29.501638 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 10 23:52:29.501806 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 10 23:52:29.505294 kernel: fuse: init (API version 7.41)
Sep 10 23:52:29.506296 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Sep 10 23:52:29.508790 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 10 23:52:29.508949 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 10 23:52:29.510945 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Sep 10 23:52:29.512291 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Sep 10 23:52:29.525494 systemd[1]: Reached target network-pre.target - Preparation for Network.
Sep 10 23:52:29.529015 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Sep 10 23:52:29.535312 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Sep 10 23:52:29.535934 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Sep 10 23:52:29.535974 systemd[1]: Reached target local-fs.target - Local File Systems.
Sep 10 23:52:29.539412 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Sep 10 23:52:29.544019 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Sep 10 23:52:29.544825 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 10 23:52:29.546685 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Sep 10 23:52:29.552784 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Sep 10 23:52:29.554296 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 10 23:52:29.555812 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Sep 10 23:52:29.558456 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 10 23:52:29.561829 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Sep 10 23:52:29.567091 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Sep 10 23:52:29.575124 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Sep 10 23:52:29.581620 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Sep 10 23:52:29.584328 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Sep 10 23:52:29.588979 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Sep 10 23:52:29.593270 systemd-journald[1157]: Time spent on flushing to /var/log/journal/f4d97b8bb2b54e04a99a13517a9bca8e is 38.891ms for 1174 entries.
Sep 10 23:52:29.593270 systemd-journald[1157]: System Journal (/var/log/journal/f4d97b8bb2b54e04a99a13517a9bca8e) is 8M, max 584.8M, 576.8M free.
Sep 10 23:52:29.646809 systemd-journald[1157]: Received client request to flush runtime journal.
Sep 10 23:52:29.646881 kernel: loop0: detected capacity change from 0 to 138376
Sep 10 23:52:29.592494 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Sep 10 23:52:29.595655 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Sep 10 23:52:29.603574 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Sep 10 23:52:29.612890 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Sep 10 23:52:29.656112 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Sep 10 23:52:29.676276 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Sep 10 23:52:29.673567 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Sep 10 23:52:29.688301 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Sep 10 23:52:29.691865 systemd-tmpfiles[1211]: ACLs are not supported, ignoring.
Sep 10 23:52:29.691880 systemd-tmpfiles[1211]: ACLs are not supported, ignoring.
Sep 10 23:52:29.702222 kernel: loop1: detected capacity change from 0 to 203944
Sep 10 23:52:29.703034 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Sep 10 23:52:29.706151 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Sep 10 23:52:29.720050 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Sep 10 23:52:29.754225 kernel: loop2: detected capacity change from 0 to 8
Sep 10 23:52:29.767777 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Sep 10 23:52:29.774215 kernel: loop3: detected capacity change from 0 to 107312
Sep 10 23:52:29.772650 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Sep 10 23:52:29.800127 systemd-tmpfiles[1234]: ACLs are not supported, ignoring.
Sep 10 23:52:29.800148 systemd-tmpfiles[1234]: ACLs are not supported, ignoring.
Sep 10 23:52:29.807302 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Sep 10 23:52:29.821227 kernel: loop4: detected capacity change from 0 to 138376
Sep 10 23:52:29.849227 kernel: loop5: detected capacity change from 0 to 203944
Sep 10 23:52:29.876218 kernel: loop6: detected capacity change from 0 to 8
Sep 10 23:52:29.880218 kernel: loop7: detected capacity change from 0 to 107312
Sep 10 23:52:29.896850 (sd-merge)[1238]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Sep 10 23:52:29.897890 (sd-merge)[1238]: Merged extensions into '/usr'.
Sep 10 23:52:29.905111 systemd[1]: Reload requested from client PID 1210 ('systemd-sysext') (unit systemd-sysext.service)...
Sep 10 23:52:29.905131 systemd[1]: Reloading...
Sep 10 23:52:30.039218 zram_generator::config[1262]: No configuration found.
Sep 10 23:52:30.110251 ldconfig[1205]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Sep 10 23:52:30.177617 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 10 23:52:30.252680 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Sep 10 23:52:30.252838 systemd[1]: Reloading finished in 346 ms.
Sep 10 23:52:30.270669 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Sep 10 23:52:30.273631 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Sep 10 23:52:30.286181 systemd[1]: Starting ensure-sysext.service...
Sep 10 23:52:30.290758 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Sep 10 23:52:30.312971 systemd[1]: Reload requested from client PID 1301 ('systemctl') (unit ensure-sysext.service)...
Sep 10 23:52:30.312990 systemd[1]: Reloading...
Sep 10 23:52:30.317408 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Sep 10 23:52:30.317459 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Sep 10 23:52:30.317708 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Sep 10 23:52:30.317920 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Sep 10 23:52:30.318650 systemd-tmpfiles[1302]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Sep 10 23:52:30.318908 systemd-tmpfiles[1302]: ACLs are not supported, ignoring.
Sep 10 23:52:30.318961 systemd-tmpfiles[1302]: ACLs are not supported, ignoring.
Sep 10 23:52:30.329549 systemd-tmpfiles[1302]: Detected autofs mount point /boot during canonicalization of boot.
Sep 10 23:52:30.329563 systemd-tmpfiles[1302]: Skipping /boot
Sep 10 23:52:30.343046 systemd-tmpfiles[1302]: Detected autofs mount point /boot during canonicalization of boot.
Sep 10 23:52:30.343063 systemd-tmpfiles[1302]: Skipping /boot
Sep 10 23:52:30.403220 zram_generator::config[1329]: No configuration found.
Sep 10 23:52:30.485344 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 10 23:52:30.558873 systemd[1]: Reloading finished in 245 ms.
Sep 10 23:52:30.587343 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Sep 10 23:52:30.589264 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Sep 10 23:52:30.602418 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 10 23:52:30.604847 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Sep 10 23:52:30.611718 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Sep 10 23:52:30.617523 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Sep 10 23:52:30.621168 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Sep 10 23:52:30.623655 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Sep 10 23:52:30.629983 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 10 23:52:30.634290 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Sep 10 23:52:30.641731 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Sep 10 23:52:30.645381 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Sep 10 23:52:30.645982 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 10 23:52:30.646086 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 10 23:52:30.650475 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Sep 10 23:52:30.653926 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 10 23:52:30.654068 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 10 23:52:30.654165 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 10 23:52:30.657547 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Sep 10 23:52:30.659670 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Sep 10 23:52:30.660591 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Sep 10 23:52:30.660788 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Sep 10 23:52:30.672028 systemd[1]: Finished ensure-sysext.service.
Sep 10 23:52:30.677109 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Sep 10 23:52:30.685647 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Sep 10 23:52:30.691512 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Sep 10 23:52:30.691960 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Sep 10 23:52:30.697492 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Sep 10 23:52:30.703480 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Sep 10 23:52:30.711294 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Sep 10 23:52:30.711538 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Sep 10 23:52:30.713627 systemd[1]: modprobe@loop.service: Deactivated successfully.
Sep 10 23:52:30.717799 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Sep 10 23:52:30.720071 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Sep 10 23:52:30.720152 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Sep 10 23:52:30.727093 systemd[1]: modprobe@drm.service: Deactivated successfully.
Sep 10 23:52:30.728334 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Sep 10 23:52:30.745269 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Sep 10 23:52:30.750993 systemd-udevd[1372]: Using default interface naming scheme 'v255'.
Sep 10 23:52:30.760366 augenrules[1409]: No rules
Sep 10 23:52:30.762736 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 10 23:52:30.763334 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 10 23:52:30.764846 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Sep 10 23:52:30.767664 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Sep 10 23:52:30.769182 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Sep 10 23:52:30.792032 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Sep 10 23:52:30.797528 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Sep 10 23:52:30.915706 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Sep 10 23:52:31.016264 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Sep 10 23:52:31.017439 systemd[1]: Reached target time-set.target - System Time Set.
Sep 10 23:52:31.021426 systemd-resolved[1371]: Positive Trust Anchors:
Sep 10 23:52:31.021906 systemd-resolved[1371]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Sep 10 23:52:31.022018 systemd-resolved[1371]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Sep 10 23:52:31.027999 systemd-resolved[1371]: Using system hostname 'ci-4372-1-0-n-c06092ab73'.
Sep 10 23:52:31.029712 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Sep 10 23:52:31.030608 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Sep 10 23:52:31.031589 systemd[1]: Reached target sysinit.target - System Initialization.
Sep 10 23:52:31.032272 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Sep 10 23:52:31.032990 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Sep 10 23:52:31.033975 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Sep 10 23:52:31.034855 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Sep 10 23:52:31.035680 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Sep 10 23:52:31.036851 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Sep 10 23:52:31.036882 systemd[1]: Reached target paths.target - Path Units.
Sep 10 23:52:31.037522 systemd[1]: Reached target timers.target - Timer Units.
Sep 10 23:52:31.040507 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Sep 10 23:52:31.043793 systemd[1]: Starting docker.socket - Docker Socket for the API...
Sep 10 23:52:31.049997 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Sep 10 23:52:31.052242 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Sep 10 23:52:31.053397 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Sep 10 23:52:31.087737 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Sep 10 23:52:31.091532 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Sep 10 23:52:31.101625 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Sep 10 23:52:31.125815 systemd[1]: Reached target sockets.target - Socket Units.
Sep 10 23:52:31.126642 systemd[1]: Reached target basic.target - Basic System.
Sep 10 23:52:31.127273 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Sep 10 23:52:31.127439 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Sep 10 23:52:31.129459 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Sep 10 23:52:31.131396 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Sep 10 23:52:31.135581 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Sep 10 23:52:31.141400 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Sep 10 23:52:31.145627 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Sep 10 23:52:31.146285 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Sep 10 23:52:31.151451 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Sep 10 23:52:31.155602 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Sep 10 23:52:31.159495 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Sep 10 23:52:31.161706 systemd-networkd[1425]: lo: Link UP
Sep 10 23:52:31.161721 systemd-networkd[1425]: lo: Gained carrier
Sep 10 23:52:31.167568 systemd-networkd[1425]: Enumeration completed
Sep 10 23:52:31.173868 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Sep 10 23:52:31.181263 systemd-networkd[1425]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 23:52:31.181267 systemd-networkd[1425]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 10 23:52:31.181902 systemd[1]: Starting systemd-logind.service - User Login Management...
Sep 10 23:52:31.185384 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Sep 10 23:52:31.185884 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Sep 10 23:52:31.186402 systemd-networkd[1425]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 23:52:31.186406 systemd-networkd[1425]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Sep 10 23:52:31.189680 systemd-networkd[1425]: eth0: Link UP
Sep 10 23:52:31.191052 systemd-networkd[1425]: eth0: Gained carrier
Sep 10 23:52:31.191076 systemd-networkd[1425]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 23:52:31.191102 systemd[1]: Starting update-engine.service - Update Engine...
Sep 10 23:52:31.196320 kernel: mousedev: PS/2 mouse device common for all mice
Sep 10 23:52:31.196244 systemd-networkd[1425]: eth1: Link UP
Sep 10 23:52:31.198875 systemd-networkd[1425]: eth1: Gained carrier
Sep 10 23:52:31.201441 extend-filesystems[1478]: Found /dev/sda6
Sep 10 23:52:31.198907 systemd-networkd[1425]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Sep 10 23:52:31.213099 jq[1476]: false
Sep 10 23:52:31.213311 extend-filesystems[1478]: Found /dev/sda9
Sep 10 23:52:31.202209 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Sep 10 23:52:31.224344 extend-filesystems[1478]: Checking size of /dev/sda9
Sep 10 23:52:31.203166 systemd[1]: Started systemd-networkd.service - Network Configuration.
Sep 10 23:52:31.205837 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Sep 10 23:52:31.215541 systemd[1]: Reached target network.target - Network.
Sep 10 23:52:31.235045 systemd[1]: Starting containerd.service - containerd container runtime...
Sep 10 23:52:31.238493 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Sep 10 23:52:31.252160 extend-filesystems[1478]: Resized partition /dev/sda9
Sep 10 23:52:31.245926 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Sep 10 23:52:31.248854 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Sep 10 23:52:31.249042 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Sep 10 23:52:31.249403 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Sep 10 23:52:31.249598 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Sep 10 23:52:31.256258 extend-filesystems[1514]: resize2fs 1.47.2 (1-Jan-2025)
Sep 10 23:52:31.272344 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
Sep 10 23:52:31.272704 tar[1493]: linux-arm64/helm
Sep 10 23:52:31.275485 systemd-networkd[1425]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1
Sep 10 23:52:31.279077 jq[1490]: true
Sep 10 23:52:31.324731 systemd[1]: motdgen.service: Deactivated successfully.
Sep 10 23:52:31.325327 dbus-daemon[1474]: [system] SELinux support is enabled
Sep 10 23:52:31.324990 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Sep 10 23:52:31.326791 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Sep 10 23:52:31.334093 update_engine[1487]: I20250910 23:52:31.328855 1487 main.cc:92] Flatcar Update Engine starting
Sep 10 23:52:31.332666 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Sep 10 23:52:31.332718 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Sep 10 23:52:31.334692 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Sep 10 23:52:31.334712 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Sep 10 23:52:31.346268 jq[1518]: true
Sep 10 23:52:31.351526 systemd[1]: Started update-engine.service - Update Engine.
Sep 10 23:52:31.355803 update_engine[1487]: I20250910 23:52:31.354754 1487 update_check_scheduler.cc:74] Next update check in 2m56s
Sep 10 23:52:31.357359 (ntainerd)[1522]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
Sep 10 23:52:31.366816 coreos-metadata[1473]: Sep 10 23:52:31.357 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
Sep 10 23:52:31.362308 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Sep 10 23:52:31.367670 coreos-metadata[1473]: Sep 10 23:52:31.367 INFO Failed to fetch: error sending request for url (http://169.254.169.254/hetzner/v1/metadata)
Sep 10 23:52:31.401473 systemd-networkd[1425]: eth0: DHCPv4 address 91.107.201.216/32, gateway 172.31.1.1 acquired from 172.31.1.1
Sep 10 23:52:31.402214 kernel: EXT4-fs (sda9): resized filesystem to 9393147
Sep 10 23:52:31.402798 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection.
Sep 10 23:52:31.406504 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection.
Sep 10 23:52:31.412237 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Sep 10 23:52:31.417427 extend-filesystems[1514]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
Sep 10 23:52:31.417427 extend-filesystems[1514]: old_desc_blocks = 1, new_desc_blocks = 5
Sep 10 23:52:31.417427 extend-filesystems[1514]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
Sep 10 23:52:31.416479 systemd[1]: extend-filesystems.service: Deactivated successfully.
Sep 10 23:52:31.425965 extend-filesystems[1478]: Resized filesystem in /dev/sda9
Sep 10 23:52:31.416742 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
Sep 10 23:52:31.478949 bash[1543]: Updated "/home/core/.ssh/authorized_keys"
Sep 10 23:52:31.479811 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Sep 10 23:52:31.486800 systemd[1]: Starting sshkeys.service...
Sep 10 23:52:31.522286 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
Sep 10 23:52:31.526087 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
Sep 10 23:52:31.532678 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Sep 10 23:52:31.537257 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Sep 10 23:52:31.540697 systemd-logind[1483]: New seat seat0.
Sep 10 23:52:31.542849 systemd[1]: Started systemd-logind.service - User Login Management.
Sep 10 23:52:31.549454 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Sep 10 23:52:31.559212 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
Sep 10 23:52:31.559315 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Sep 10 23:52:31.559329 kernel: [drm] features: -context_init
Sep 10 23:52:31.560200 kernel: [drm] number of scanouts: 1
Sep 10 23:52:31.560232 kernel: [drm] number of cap sets: 0
Sep 10 23:52:31.561128 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Sep 10 23:52:31.562203 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0
Sep 10 23:52:31.566411 kernel: Console: switching to colour frame buffer device 160x50
Sep 10 23:52:31.602207 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Sep 10 23:52:31.743059 coreos-metadata[1551]: Sep 10 23:52:31.743 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
Sep 10 23:52:31.745665 coreos-metadata[1551]: Sep 10 23:52:31.745 INFO Fetch successful
Sep 10 23:52:31.749530 unknown[1551]: wrote ssh authorized keys file for user: core
Sep 10 23:52:31.768540 systemd-logind[1483]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Sep 10 23:52:31.769740 systemd-logind[1483]: Watching system buttons on /dev/input/event0 (Power Button)
Sep 10 23:52:31.796401 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Sep 10 23:52:31.799618 locksmithd[1526]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Sep 10 23:52:31.817430 update-ssh-keys[1577]: Updated "/home/core/.ssh/authorized_keys"
Sep 10 23:52:31.818985 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
Sep 10 23:52:31.828478 systemd[1]: Finished sshkeys.service.
Sep 10 23:52:31.859614 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 23:52:31.871230 containerd[1522]: time="2025-09-10T23:52:31Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Sep 10 23:52:31.891218 containerd[1522]: time="2025-09-10T23:52:31.889064040Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
Sep 10 23:52:31.925049 containerd[1522]: time="2025-09-10T23:52:31.924994120Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.52µs"
Sep 10 23:52:31.925834 containerd[1522]: time="2025-09-10T23:52:31.925804520Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Sep 10 23:52:31.925943 containerd[1522]: time="2025-09-10T23:52:31.925926880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Sep 10 23:52:31.926637 containerd[1522]: time="2025-09-10T23:52:31.926612680Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Sep 10 23:52:31.926785 containerd[1522]: time="2025-09-10T23:52:31.926767640Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Sep 10 23:52:31.926970 containerd[1522]: time="2025-09-10T23:52:31.926952320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 10 23:52:31.927338 containerd[1522]: time="2025-09-10T23:52:31.927315120Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Sep 10 23:52:31.928229 containerd[1522]: time="2025-09-10T23:52:31.928206760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 10 23:52:31.928597 containerd[1522]: time="2025-09-10T23:52:31.928569880Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Sep 10 23:52:31.929232 containerd[1522]: time="2025-09-10T23:52:31.929039120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 10 23:52:31.929232 containerd[1522]: time="2025-09-10T23:52:31.929063800Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Sep 10 23:52:31.929232 containerd[1522]: time="2025-09-10T23:52:31.929075040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Sep 10 23:52:31.929232 containerd[1522]: time="2025-09-10T23:52:31.929156840Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Sep 10 23:52:31.930208 containerd[1522]: time="2025-09-10T23:52:31.929601080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 10 23:52:31.930208 containerd[1522]: time="2025-09-10T23:52:31.929640120Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Sep 10 23:52:31.930208 containerd[1522]: time="2025-09-10T23:52:31.929651040Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Sep 10 23:52:31.930362 containerd[1522]: time="2025-09-10T23:52:31.930326280Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Sep 10 23:52:31.931212 containerd[1522]: time="2025-09-10T23:52:31.931178160Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Sep 10 23:52:31.931364 containerd[1522]: time="2025-09-10T23:52:31.931335080Z" level=info msg="metadata content store policy set" policy=shared
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941079840Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941159920Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941178040Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941235240Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941250800Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941266760Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941279600Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941291880Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941304640Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941316440Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941327200Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941341920Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941528280Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
Sep 10 23:52:31.941835 containerd[1522]: time="2025-09-10T23:52:31.941550520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941566720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941578440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941595520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941606640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941622320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941637200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941673440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941686680Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
Sep 10 23:52:31.942139 containerd[1522]: time="2025-09-10T23:52:31.941697360Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
Sep 10 23:52:31.942461 containerd[1522]: time="2025-09-10T23:52:31.942417120Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
Sep 10 23:52:31.944510 containerd[1522]: time="2025-09-10T23:52:31.943230600Z" level=info msg="Start snapshots syncer"
Sep 10 23:52:31.944510 containerd[1522]: time="2025-09-10T23:52:31.943281000Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
Sep 10 23:52:31.945209 containerd[1522]: time="2025-09-10T23:52:31.944820000Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
Sep 10 23:52:31.945793 containerd[1522]: time="2025-09-10T23:52:31.945770840Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.946418800Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947687120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947714360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947726480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947740240Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947753800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947765040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947775760Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947804240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947815080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
Sep 10 23:52:31.947990 containerd[1522]: time="2025-09-10T23:52:31.947826520Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949230440Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949262920Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949327880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949338720Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949361520Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949372440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949383760Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949470160Z" level=info msg="runtime interface created"
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949477000Z" level=info msg="created NRI interface"
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949485960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949501840Z" level=info msg="Connect containerd service"
Sep 10 23:52:31.949677 containerd[1522]: time="2025-09-10T23:52:31.949540440Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
Sep 10 23:52:31.956240 containerd[1522]: time="2025-09-10T23:52:31.955760480Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Sep 10 23:52:31.971616 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Sep 10 23:52:31.973097 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 23:52:31.977936 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Sep 10 23:52:32.097925 containerd[1522]: time="2025-09-10T23:52:32.097791600Z" level=info msg="Start subscribing containerd event"
Sep 10 23:52:32.097925 containerd[1522]: time="2025-09-10T23:52:32.097889040Z" level=info msg="Start recovering state"
Sep 10 23:52:32.098163 containerd[1522]: time="2025-09-10T23:52:32.097980720Z" level=info msg="Start event monitor"
Sep 10 23:52:32.098219 containerd[1522]: time="2025-09-10T23:52:32.098165400Z" level=info msg="Start cni network conf syncer for default"
Sep 10 23:52:32.098219 containerd[1522]: time="2025-09-10T23:52:32.098179120Z" level=info msg="Start streaming server"
Sep 10 23:52:32.098269 containerd[1522]: time="2025-09-10T23:52:32.098224640Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
Sep 10 23:52:32.098269 containerd[1522]: time="2025-09-10T23:52:32.098235960Z" level=info msg="runtime interface starting up..."
Sep 10 23:52:32.098269 containerd[1522]: time="2025-09-10T23:52:32.098243120Z" level=info msg="starting plugins..."
Sep 10 23:52:32.098269 containerd[1522]: time="2025-09-10T23:52:32.098264560Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
Sep 10 23:52:32.098409 containerd[1522]: time="2025-09-10T23:52:32.098128720Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
Sep 10 23:52:32.098517 containerd[1522]: time="2025-09-10T23:52:32.098502920Z" level=info msg=serving... address=/run/containerd/containerd.sock
Sep 10 23:52:32.098767 systemd[1]: Started containerd.service - containerd container runtime.
Sep 10 23:52:32.101669 containerd[1522]: time="2025-09-10T23:52:32.101519320Z" level=info msg="containerd successfully booted in 0.230892s"
Sep 10 23:52:32.143482 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Sep 10 23:52:32.234394 systemd-networkd[1425]: eth0: Gained IPv6LL
Sep 10 23:52:32.235273 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection.
Sep 10 23:52:32.242147 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
Sep 10 23:52:32.244755 systemd[1]: Reached target network-online.target - Network is Online.
Sep 10 23:52:32.251756 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:52:32.256590 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
Sep 10 23:52:32.322597 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
Sep 10 23:52:32.325521 tar[1493]: linux-arm64/LICENSE
Sep 10 23:52:32.325521 tar[1493]: linux-arm64/README.md
Sep 10 23:52:32.347383 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
Sep 10 23:52:32.367779 coreos-metadata[1473]: Sep 10 23:52:32.367 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #2
Sep 10 23:52:32.368842 coreos-metadata[1473]: Sep 10 23:52:32.368 INFO Fetch successful
Sep 10 23:52:32.370230 coreos-metadata[1473]: Sep 10 23:52:32.369 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
Sep 10 23:52:32.370689 coreos-metadata[1473]: Sep 10 23:52:32.370 INFO Fetch successful
Sep 10 23:52:32.476870 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
Sep 10 23:52:32.477964 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
Sep 10 23:52:32.618388 systemd-networkd[1425]: eth1: Gained IPv6LL
Sep 10 23:52:32.619327 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection.
Sep 10 23:52:32.701742 sshd_keygen[1505]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Sep 10 23:52:32.731711 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Sep 10 23:52:32.737622 systemd[1]: Starting issuegen.service - Generate /run/issue...
Sep 10 23:52:32.761538 systemd[1]: issuegen.service: Deactivated successfully.
Sep 10 23:52:32.761907 systemd[1]: Finished issuegen.service - Generate /run/issue.
Sep 10 23:52:32.768022 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Sep 10 23:52:32.790046 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
Sep 10 23:52:32.795573 systemd[1]: Started getty@tty1.service - Getty on tty1.
Sep 10 23:52:32.798902 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
Sep 10 23:52:32.800308 systemd[1]: Reached target getty.target - Login Prompts.
Sep 10 23:52:33.150147 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:52:33.151624 systemd[1]: Reached target multi-user.target - Multi-User System.
Sep 10 23:52:33.153958 systemd[1]: Startup finished in 2.296s (kernel) + 5.128s (initrd) + 4.470s (userspace) = 11.895s.
Sep 10 23:52:33.159442 (kubelet)[1655]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 10 23:52:33.733947 kubelet[1655]: E0910 23:52:33.733870 1655 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 10 23:52:33.738770 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 10 23:52:33.739043 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 10 23:52:33.741298 systemd[1]: kubelet.service: Consumed 930ms CPU time, 256.3M memory peak.
Sep 10 23:52:43.959597 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
Sep 10 23:52:43.962238 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:52:44.124043 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:52:44.138048 (kubelet)[1674]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 10 23:52:44.193391 kubelet[1674]: E0910 23:52:44.193343 1674 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 10 23:52:44.196778 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 10 23:52:44.196911 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 10 23:52:44.197362 systemd[1]: kubelet.service: Consumed 178ms CPU time, 107.7M memory peak.
Sep 10 23:52:54.209482 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
Sep 10 23:52:54.213023 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:52:54.369463 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:52:54.381777 (kubelet)[1690]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 10 23:52:54.426148 kubelet[1690]: E0910 23:52:54.426069 1690 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 10 23:52:54.429460 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 10 23:52:54.429711 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 10 23:52:54.430309 systemd[1]: kubelet.service: Consumed 168ms CPU time, 106.9M memory peak.
Sep 10 23:53:02.813730 systemd-timesyncd[1386]: Contacted time server 162.159.200.1:123 (2.flatcar.pool.ntp.org).
Sep 10 23:53:02.813830 systemd-timesyncd[1386]: Initial clock synchronization to Wed 2025-09-10 23:53:03.123987 UTC.
Sep 10 23:53:04.459522 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Sep 10 23:53:04.462917 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:53:04.634482 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:53:04.649165 (kubelet)[1704]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 10 23:53:04.693800 kubelet[1704]: E0910 23:53:04.693750 1704 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 10 23:53:04.696762 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 10 23:53:04.696909 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 10 23:53:04.697555 systemd[1]: kubelet.service: Consumed 165ms CPU time, 107M memory peak.
Sep 10 23:53:12.140448 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
Sep 10 23:53:12.142504 systemd[1]: Started sshd@0-91.107.201.216:22-139.178.89.65:32914.service - OpenSSH per-connection server daemon (139.178.89.65:32914).
Sep 10 23:53:13.175413 sshd[1712]: Accepted publickey for core from 139.178.89.65 port 32914 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM
Sep 10 23:53:13.178624 sshd-session[1712]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 23:53:13.187146 systemd[1]: Created slice user-500.slice - User Slice of UID 500.
Sep 10 23:53:13.189262 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500...
Sep 10 23:53:13.197266 systemd-logind[1483]: New session 1 of user core.
Sep 10 23:53:13.215576 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500.
Sep 10 23:53:13.219838 systemd[1]: Starting user@500.service - User Manager for UID 500...
Sep 10 23:53:13.239061 (systemd)[1716]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0)
Sep 10 23:53:13.242429 systemd-logind[1483]: New session c1 of user core.
Sep 10 23:53:13.391864 systemd[1716]: Queued start job for default target default.target.
Sep 10 23:53:13.400189 systemd[1716]: Created slice app.slice - User Application Slice.
Sep 10 23:53:13.400281 systemd[1716]: Reached target paths.target - Paths.
Sep 10 23:53:13.400503 systemd[1716]: Reached target timers.target - Timers.
Sep 10 23:53:13.403256 systemd[1716]: Starting dbus.socket - D-Bus User Message Bus Socket...
Sep 10 23:53:13.414895 systemd[1716]: Listening on dbus.socket - D-Bus User Message Bus Socket.
Sep 10 23:53:13.415033 systemd[1716]: Reached target sockets.target - Sockets.
Sep 10 23:53:13.415123 systemd[1716]: Reached target basic.target - Basic System.
Sep 10 23:53:13.415169 systemd[1716]: Reached target default.target - Main User Target.
Sep 10 23:53:13.415220 systemd[1716]: Startup finished in 165ms.
Sep 10 23:53:13.415298 systemd[1]: Started user@500.service - User Manager for UID 500.
Sep 10 23:53:13.428531 systemd[1]: Started session-1.scope - Session 1 of User core.
Sep 10 23:53:14.134498 systemd[1]: Started sshd@1-91.107.201.216:22-139.178.89.65:32926.service - OpenSSH per-connection server daemon (139.178.89.65:32926).
Sep 10 23:53:14.709357 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4.
Sep 10 23:53:14.713042 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:53:14.875016 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:53:14.887745 (kubelet)[1737]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 10 23:53:14.933525 kubelet[1737]: E0910 23:53:14.933482 1737 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 10 23:53:14.936417 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 10 23:53:14.936660 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 10 23:53:14.937311 systemd[1]: kubelet.service: Consumed 160ms CPU time, 106.9M memory peak.
Sep 10 23:53:15.159018 sshd[1727]: Accepted publickey for core from 139.178.89.65 port 32926 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM
Sep 10 23:53:15.161651 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 23:53:15.168927 systemd-logind[1483]: New session 2 of user core.
Sep 10 23:53:15.175493 systemd[1]: Started session-2.scope - Session 2 of User core.
Sep 10 23:53:15.854288 sshd[1744]: Connection closed by 139.178.89.65 port 32926
Sep 10 23:53:15.855386 sshd-session[1727]: pam_unix(sshd:session): session closed for user core
Sep 10 23:53:15.860250 systemd-logind[1483]: Session 2 logged out. Waiting for processes to exit.
Sep 10 23:53:15.860760 systemd[1]: sshd@1-91.107.201.216:22-139.178.89.65:32926.service: Deactivated successfully.
Sep 10 23:53:15.863270 systemd[1]: session-2.scope: Deactivated successfully.
Sep 10 23:53:15.866818 systemd-logind[1483]: Removed session 2.
Sep 10 23:53:16.026486 systemd[1]: Started sshd@2-91.107.201.216:22-139.178.89.65:32936.service - OpenSSH per-connection server daemon (139.178.89.65:32936).
Sep 10 23:53:16.259328 update_engine[1487]: I20250910 23:53:16.258973 1487 update_attempter.cc:509] Updating boot flags...
Sep 10 23:53:17.024629 sshd[1750]: Accepted publickey for core from 139.178.89.65 port 32936 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM
Sep 10 23:53:17.026653 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 23:53:17.032110 systemd-logind[1483]: New session 3 of user core.
Sep 10 23:53:17.040522 systemd[1]: Started session-3.scope - Session 3 of User core.
Sep 10 23:53:17.699813 sshd[1772]: Connection closed by 139.178.89.65 port 32936
Sep 10 23:53:17.700764 sshd-session[1750]: pam_unix(sshd:session): session closed for user core
Sep 10 23:53:17.707569 systemd[1]: sshd@2-91.107.201.216:22-139.178.89.65:32936.service: Deactivated successfully.
Sep 10 23:53:17.710948 systemd[1]: session-3.scope: Deactivated successfully.
Sep 10 23:53:17.712666 systemd-logind[1483]: Session 3 logged out. Waiting for processes to exit.
Sep 10 23:53:17.714330 systemd-logind[1483]: Removed session 3.
Sep 10 23:53:17.878538 systemd[1]: Started sshd@3-91.107.201.216:22-139.178.89.65:32940.service - OpenSSH per-connection server daemon (139.178.89.65:32940).
Sep 10 23:53:18.895448 sshd[1778]: Accepted publickey for core from 139.178.89.65 port 32940 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM
Sep 10 23:53:18.897332 sshd-session[1778]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 23:53:18.903852 systemd-logind[1483]: New session 4 of user core.
Sep 10 23:53:18.910497 systemd[1]: Started session-4.scope - Session 4 of User core.
Sep 10 23:53:19.591604 sshd[1780]: Connection closed by 139.178.89.65 port 32940
Sep 10 23:53:19.592329 sshd-session[1778]: pam_unix(sshd:session): session closed for user core
Sep 10 23:53:19.602384 systemd[1]: sshd@3-91.107.201.216:22-139.178.89.65:32940.service: Deactivated successfully.
Sep 10 23:53:19.604103 systemd[1]: session-4.scope: Deactivated successfully.
Sep 10 23:53:19.605159 systemd-logind[1483]: Session 4 logged out. Waiting for processes to exit.
Sep 10 23:53:19.606650 systemd-logind[1483]: Removed session 4.
Sep 10 23:53:19.763577 systemd[1]: Started sshd@4-91.107.201.216:22-139.178.89.65:32954.service - OpenSSH per-connection server daemon (139.178.89.65:32954).
Sep 10 23:53:20.770820 sshd[1786]: Accepted publickey for core from 139.178.89.65 port 32954 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM
Sep 10 23:53:20.773568 sshd-session[1786]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 23:53:20.779516 systemd-logind[1483]: New session 5 of user core.
Sep 10 23:53:20.787440 systemd[1]: Started session-5.scope - Session 5 of User core.
Sep 10 23:53:21.302578 sudo[1789]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
Sep 10 23:53:21.303298 sudo[1789]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 10 23:53:21.319473 sudo[1789]: pam_unix(sudo:session): session closed for user root
Sep 10 23:53:21.480787 sshd[1788]: Connection closed by 139.178.89.65 port 32954
Sep 10 23:53:21.480574 sshd-session[1786]: pam_unix(sshd:session): session closed for user core
Sep 10 23:53:21.487472 systemd[1]: sshd@4-91.107.201.216:22-139.178.89.65:32954.service: Deactivated successfully.
Sep 10 23:53:21.489977 systemd[1]: session-5.scope: Deactivated successfully.
Sep 10 23:53:21.491143 systemd-logind[1483]: Session 5 logged out. Waiting for processes to exit.
Sep 10 23:53:21.493377 systemd-logind[1483]: Removed session 5.
Sep 10 23:53:21.657730 systemd[1]: Started sshd@5-91.107.201.216:22-139.178.89.65:35366.service - OpenSSH per-connection server daemon (139.178.89.65:35366).
Sep 10 23:53:22.662169 sshd[1795]: Accepted publickey for core from 139.178.89.65 port 35366 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM
Sep 10 23:53:22.664290 sshd-session[1795]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 23:53:22.670867 systemd-logind[1483]: New session 6 of user core.
Sep 10 23:53:22.676546 systemd[1]: Started session-6.scope - Session 6 of User core.
Sep 10 23:53:23.180953 sudo[1799]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
Sep 10 23:53:23.181349 sudo[1799]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 10 23:53:23.187045 sudo[1799]: pam_unix(sudo:session): session closed for user root
Sep 10 23:53:23.193396 sudo[1798]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
Sep 10 23:53:23.193656 sudo[1798]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 10 23:53:23.205475 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Sep 10 23:53:23.248442 augenrules[1821]: No rules
Sep 10 23:53:23.250295 systemd[1]: audit-rules.service: Deactivated successfully.
Sep 10 23:53:23.250529 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Sep 10 23:53:23.252125 sudo[1798]: pam_unix(sudo:session): session closed for user root
Sep 10 23:53:23.410742 sshd[1797]: Connection closed by 139.178.89.65 port 35366
Sep 10 23:53:23.411609 sshd-session[1795]: pam_unix(sshd:session): session closed for user core
Sep 10 23:53:23.417618 systemd[1]: sshd@5-91.107.201.216:22-139.178.89.65:35366.service: Deactivated successfully.
Sep 10 23:53:23.419469 systemd[1]: session-6.scope: Deactivated successfully.
Sep 10 23:53:23.420630 systemd-logind[1483]: Session 6 logged out. Waiting for processes to exit.
Sep 10 23:53:23.423076 systemd-logind[1483]: Removed session 6.
Sep 10 23:53:23.587052 systemd[1]: Started sshd@6-91.107.201.216:22-139.178.89.65:35368.service - OpenSSH per-connection server daemon (139.178.89.65:35368).
Sep 10 23:53:24.589943 sshd[1830]: Accepted publickey for core from 139.178.89.65 port 35368 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM
Sep 10 23:53:24.592362 sshd-session[1830]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Sep 10 23:53:24.598253 systemd-logind[1483]: New session 7 of user core.
Sep 10 23:53:24.604439 systemd[1]: Started session-7.scope - Session 7 of User core.
Sep 10 23:53:24.959250 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5.
Sep 10 23:53:24.961086 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:53:25.114758 sudo[1840]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
Sep 10 23:53:25.115473 sudo[1840]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
Sep 10 23:53:25.116095 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:53:25.124547 (kubelet)[1842]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 10 23:53:25.172849 kubelet[1842]: E0910 23:53:25.172735 1842 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 10 23:53:25.177293 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 10 23:53:25.177495 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 10 23:53:25.179271 systemd[1]: kubelet.service: Consumed 159ms CPU time, 106.8M memory peak.
Sep 10 23:53:25.469053 systemd[1]: Starting docker.service - Docker Application Container Engine...
Sep 10 23:53:25.492238 (dockerd)[1865]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
Sep 10 23:53:25.749164 dockerd[1865]: time="2025-09-10T23:53:25.748136112Z" level=info msg="Starting up"
Sep 10 23:53:25.751437 dockerd[1865]: time="2025-09-10T23:53:25.751406864Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
Sep 10 23:53:25.785661 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3356135369-merged.mount: Deactivated successfully.
Sep 10 23:53:25.809578 dockerd[1865]: time="2025-09-10T23:53:25.809175858Z" level=info msg="Loading containers: start."
Sep 10 23:53:25.820269 kernel: Initializing XFRM netlink socket
Sep 10 23:53:26.049375 systemd-networkd[1425]: docker0: Link UP
Sep 10 23:53:26.054786 dockerd[1865]: time="2025-09-10T23:53:26.054725469Z" level=info msg="Loading containers: done."
Sep 10 23:53:26.072858 dockerd[1865]: time="2025-09-10T23:53:26.072791959Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
Sep 10 23:53:26.073027 dockerd[1865]: time="2025-09-10T23:53:26.072920870Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
Sep 10 23:53:26.073090 dockerd[1865]: time="2025-09-10T23:53:26.073068656Z" level=info msg="Initializing buildkit"
Sep 10 23:53:26.100938 dockerd[1865]: time="2025-09-10T23:53:26.100854403Z" level=info msg="Completed buildkit initialization"
Sep 10 23:53:26.110680 dockerd[1865]: time="2025-09-10T23:53:26.110575263Z" level=info msg="Daemon has completed initialization"
Sep 10 23:53:26.112235 dockerd[1865]: time="2025-09-10T23:53:26.111259612Z" level=info msg="API listen on /run/docker.sock"
Sep 10 23:53:26.112345 systemd[1]: Started docker.service - Docker Application Container Engine.
Sep 10 23:53:26.782923 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck183079328-merged.mount: Deactivated successfully.
Sep 10 23:53:27.135216 containerd[1522]: time="2025-09-10T23:53:27.135032048Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\""
Sep 10 23:53:27.755941 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1437779491.mount: Deactivated successfully.
Sep 10 23:53:28.988933 containerd[1522]: time="2025-09-10T23:53:28.988864176Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:28.990783 containerd[1522]: time="2025-09-10T23:53:28.990750892Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.13: active requests=0, bytes read=25687423"
Sep 10 23:53:28.991864 containerd[1522]: time="2025-09-10T23:53:28.991829135Z" level=info msg="ImageCreate event name:\"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:28.995312 containerd[1522]: time="2025-09-10T23:53:28.995224126Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:28.997230 containerd[1522]: time="2025-09-10T23:53:28.996828653Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.13\" with image id \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.13\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9abeb8a2d3e53e356d1f2e5d5dc2081cf28f23242651b0552c9e38f4a7ae960e\", size \"25683924\" in 1.861745929s"
Sep 10 23:53:28.997230 containerd[1522]: time="2025-09-10T23:53:28.996897107Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.13\" returns image reference \"sha256:0b1c07d8fd4a3526d5c44502e682df3627a3b01c1e608e5e24c3519c8fb337b6\""
Sep 10 23:53:28.999671 containerd[1522]: time="2025-09-10T23:53:28.999637077Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\""
Sep 10 23:53:30.649422 containerd[1522]: time="2025-09-10T23:53:30.649361704Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:30.651914 containerd[1522]: time="2025-09-10T23:53:30.651342151Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.13: active requests=0, bytes read=22459787"
Sep 10 23:53:30.653095 containerd[1522]: time="2025-09-10T23:53:30.652996013Z" level=info msg="ImageCreate event name:\"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:30.658066 containerd[1522]: time="2025-09-10T23:53:30.658017905Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:30.659270 containerd[1522]: time="2025-09-10T23:53:30.659238030Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.13\" with image id \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.13\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:facc91288697a288a691520949fe4eec40059ef065c89da8e10481d14e131b09\", size \"24028542\" in 1.659469896s"
Sep 10 23:53:30.659383 containerd[1522]: time="2025-09-10T23:53:30.659365885Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.13\" returns image reference \"sha256:c359cb88f3d2147f2cb4c5ada4fbdeadc4b1c009d66c8f33f3856efaf04ee6ef\""
Sep 10 23:53:30.660772 containerd[1522]: time="2025-09-10T23:53:30.660738371Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\""
Sep 10 23:53:32.097092 containerd[1522]: time="2025-09-10T23:53:32.097021197Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:32.098816 containerd[1522]: time="2025-09-10T23:53:32.098785741Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.13: active requests=0, bytes read=17127526"
Sep 10 23:53:32.099913 containerd[1522]: time="2025-09-10T23:53:32.099524897Z" level=info msg="ImageCreate event name:\"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:32.102768 containerd[1522]: time="2025-09-10T23:53:32.102741172Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:32.103781 containerd[1522]: time="2025-09-10T23:53:32.103741339Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.13\" with image id \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.13\", repo digest \"registry.k8s.io/kube-scheduler@sha256:c5ce150dcce2419fdef9f9875fef43014355ccebf937846ed3a2971953f9b241\", size \"18696299\" in 1.442962568s"
Sep 10 23:53:32.103781 containerd[1522]: time="2025-09-10T23:53:32.103779730Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.13\" returns image reference \"sha256:5e3cbe2ba7db787c6aebfcf4484156dd4ebd7ede811ef72e8929593e59a5fa27\""
Sep 10 23:53:32.104356 containerd[1522]: time="2025-09-10T23:53:32.104335699Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\""
Sep 10 23:53:33.162405 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1175816878.mount: Deactivated successfully.
Sep 10 23:53:33.520250 containerd[1522]: time="2025-09-10T23:53:33.520077738Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.13\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:33.521181 containerd[1522]: time="2025-09-10T23:53:33.521093935Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.13: active requests=0, bytes read=26954933"
Sep 10 23:53:33.522240 containerd[1522]: time="2025-09-10T23:53:33.522095042Z" level=info msg="ImageCreate event name:\"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:33.525214 containerd[1522]: time="2025-09-10T23:53:33.524325057Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:33.527322 containerd[1522]: time="2025-09-10T23:53:33.527284226Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.13\" with image id \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\", repo tag \"registry.k8s.io/kube-proxy:v1.31.13\", repo digest \"registry.k8s.io/kube-proxy@sha256:a39637326e88d128d38da6ff2b2ceb4e856475887bfcb5f7a55734d4f63d9fae\", size \"26953926\" in 1.422796966s"
Sep 10 23:53:33.527463 containerd[1522]: time="2025-09-10T23:53:33.527448302Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.13\" returns image reference \"sha256:c15699f0b7002450249485b10f20211982dfd2bec4d61c86c35acebc659e794e\""
Sep 10 23:53:33.528354 containerd[1522]: time="2025-09-10T23:53:33.528323440Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
Sep 10 23:53:34.086893 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2767252495.mount: Deactivated successfully.
Sep 10 23:53:34.751602 containerd[1522]: time="2025-09-10T23:53:34.751554543Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:34.752895 containerd[1522]: time="2025-09-10T23:53:34.752866633Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714"
Sep 10 23:53:34.753589 containerd[1522]: time="2025-09-10T23:53:34.753566146Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:34.756410 containerd[1522]: time="2025-09-10T23:53:34.756381125Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:34.757531 containerd[1522]: time="2025-09-10T23:53:34.757499376Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.229140072s"
Sep 10 23:53:34.757641 containerd[1522]: time="2025-09-10T23:53:34.757625174Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\""
Sep 10 23:53:34.758630 containerd[1522]: time="2025-09-10T23:53:34.758608461Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
Sep 10 23:53:35.209052 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6.
Sep 10 23:53:35.212119 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:53:35.242521 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3473460254.mount: Deactivated successfully.
Sep 10 23:53:35.261217 containerd[1522]: time="2025-09-10T23:53:35.260981293Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 10 23:53:35.262470 containerd[1522]: time="2025-09-10T23:53:35.262432247Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Sep 10 23:53:35.262952 containerd[1522]: time="2025-09-10T23:53:35.262881129Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 10 23:53:35.266484 containerd[1522]: time="2025-09-10T23:53:35.266418637Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Sep 10 23:53:35.267644 containerd[1522]: time="2025-09-10T23:53:35.267564038Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 508.913951ms"
Sep 10 23:53:35.267644 containerd[1522]: time="2025-09-10T23:53:35.267605784Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Sep 10 23:53:35.269494 containerd[1522]: time="2025-09-10T23:53:35.269447664Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Sep 10 23:53:35.372036 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:53:35.383094 (kubelet)[2205]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
Sep 10 23:53:35.434055 kubelet[2205]: E0910 23:53:35.433992 2205 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
Sep 10 23:53:35.436875 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
Sep 10 23:53:35.437033 systemd[1]: kubelet.service: Failed with result 'exit-code'.
Sep 10 23:53:35.439291 systemd[1]: kubelet.service: Consumed 164ms CPU time, 104.6M memory peak.
Sep 10 23:53:35.816415 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2771318138.mount: Deactivated successfully.
Sep 10 23:53:37.363704 containerd[1522]: time="2025-09-10T23:53:37.363641035Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:37.365265 containerd[1522]: time="2025-09-10T23:53:37.365182381Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66537235"
Sep 10 23:53:37.369271 containerd[1522]: time="2025-09-10T23:53:37.369237259Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:37.372413 containerd[1522]: time="2025-09-10T23:53:37.372369659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:37.374539 containerd[1522]: time="2025-09-10T23:53:37.374270047Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.104761066s"
Sep 10 23:53:37.374539 containerd[1522]: time="2025-09-10T23:53:37.374308989Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Sep 10 23:53:42.602280 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:53:42.602885 systemd[1]: kubelet.service: Consumed 164ms CPU time, 104.6M memory peak.
Sep 10 23:53:42.607086 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:53:42.636261 systemd[1]: Reload requested from client PID 2293 ('systemctl') (unit session-7.scope)...
Sep 10 23:53:42.636278 systemd[1]: Reloading...
Sep 10 23:53:42.779218 zram_generator::config[2336]: No configuration found.
Sep 10 23:53:42.859738 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Sep 10 23:53:42.964891 systemd[1]: Reloading finished in 328 ms.
Sep 10 23:53:43.029937 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
Sep 10 23:53:43.030022 systemd[1]: kubelet.service: Failed with result 'signal'.
Sep 10 23:53:43.030579 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:53:43.030668 systemd[1]: kubelet.service: Consumed 113ms CPU time, 95M memory peak.
Sep 10 23:53:43.033019 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Sep 10 23:53:43.205142 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Sep 10 23:53:43.215506 (kubelet)[2385]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
Sep 10 23:53:43.265591 kubelet[2385]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 10 23:53:43.265956 kubelet[2385]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Sep 10 23:53:43.266002 kubelet[2385]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Sep 10 23:53:43.266170 kubelet[2385]: I0910 23:53:43.266136 2385 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Sep 10 23:53:43.907373 kubelet[2385]: I0910 23:53:43.907329 2385 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Sep 10 23:53:43.907525 kubelet[2385]: I0910 23:53:43.907514 2385 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 10 23:53:43.908239 kubelet[2385]: I0910 23:53:43.908219 2385 server.go:934] "Client rotation is on, will bootstrap in background"
Sep 10 23:53:43.938391 kubelet[2385]: E0910 23:53:43.938332 2385 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://91.107.201.216:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.107.201.216:6443: connect: connection refused" logger="UnhandledError"
Sep 10 23:53:43.938690 kubelet[2385]: I0910 23:53:43.938617 2385 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 10 23:53:43.949585 kubelet[2385]: I0910 23:53:43.949554 2385 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 10 23:53:43.954060 kubelet[2385]: I0910 23:53:43.954019 2385 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 10 23:53:43.954492 kubelet[2385]: I0910 23:53:43.954479 2385 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 10 23:53:43.954783 kubelet[2385]: I0910 23:53:43.954752 2385 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 10 23:53:43.955026 kubelet[2385]: I0910 23:53:43.954846 2385 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-1-0-n-c06092ab73","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 10 23:53:43.955225 kubelet[2385]: I0910 23:53:43.955211 2385 topology_manager.go:138] "Creating topology manager with none policy"
Sep 10 23:53:43.955285 kubelet[2385]: I0910 23:53:43.955277 2385 container_manager_linux.go:300] "Creating device plugin manager"
Sep 10 23:53:43.955564 kubelet[2385]: I0910 23:53:43.955549 2385 state_mem.go:36] "Initialized new in-memory state store"
Sep 10 23:53:43.959320 kubelet[2385]: I0910 23:53:43.959282 2385 kubelet.go:408] "Attempting to sync node with API server"
Sep 10 23:53:43.959434 kubelet[2385]: I0910 23:53:43.959422 2385 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 10 23:53:43.959501 kubelet[2385]: I0910 23:53:43.959491 2385 kubelet.go:314] "Adding apiserver pod source"
Sep 10 23:53:43.959557 kubelet[2385]: I0910 23:53:43.959548 2385 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 10 23:53:43.964444 kubelet[2385]: W0910 23:53:43.964373 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.107.201.216:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-1-0-n-c06092ab73&limit=500&resourceVersion=0": dial tcp 91.107.201.216:6443: connect: connection refused
Sep 10 23:53:43.964526 kubelet[2385]: E0910 23:53:43.964473 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://91.107.201.216:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-1-0-n-c06092ab73&limit=500&resourceVersion=0\": dial tcp 91.107.201.216:6443: connect: connection refused" logger="UnhandledError"
Sep 10 23:53:43.966413 kubelet[2385]: W0910 23:53:43.966360 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://91.107.201.216:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 91.107.201.216:6443: connect: connection refused
Sep 10 23:53:43.966512 kubelet[2385]: E0910 23:53:43.966421 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://91.107.201.216:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.107.201.216:6443: connect: connection refused" logger="UnhandledError"
Sep 10 23:53:43.966603 kubelet[2385]: I0910 23:53:43.966582 2385 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Sep 10 23:53:43.967454 kubelet[2385]: I0910 23:53:43.967412 2385 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 10 23:53:43.967613 kubelet[2385]: W0910 23:53:43.967594 2385 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Sep 10 23:53:43.969993 kubelet[2385]: I0910 23:53:43.969921 2385 server.go:1274] "Started kubelet"
Sep 10 23:53:43.978618 kubelet[2385]: I0910 23:53:43.978587 2385 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 10 23:53:43.979684 kubelet[2385]: E0910 23:53:43.977816 2385 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.107.201.216:6443/api/v1/namespaces/default/events\": dial tcp 91.107.201.216:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372-1-0-n-c06092ab73.186410face4600e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-1-0-n-c06092ab73,UID:ci-4372-1-0-n-c06092ab73,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-n-c06092ab73,},FirstTimestamp:2025-09-10 23:53:43.969898727 +0000 UTC m=+0.747043847,LastTimestamp:2025-09-10 23:53:43.969898727
+0000 UTC m=+0.747043847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-n-c06092ab73,}" Sep 10 23:53:43.981197 kubelet[2385]: I0910 23:53:43.981156 2385 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 10 23:53:43.983597 kubelet[2385]: I0910 23:53:43.981449 2385 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 10 23:53:43.984995 kubelet[2385]: I0910 23:53:43.984935 2385 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 10 23:53:43.985074 kubelet[2385]: I0910 23:53:43.983978 2385 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 10 23:53:43.985371 kubelet[2385]: I0910 23:53:43.982350 2385 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Sep 10 23:53:43.985371 kubelet[2385]: I0910 23:53:43.984015 2385 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Sep 10 23:53:43.985454 kubelet[2385]: I0910 23:53:43.985411 2385 reconciler.go:26] "Reconciler: start to sync state" Sep 10 23:53:43.985881 kubelet[2385]: E0910 23:53:43.984270 2385 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372-1-0-n-c06092ab73\" not found" Sep 10 23:53:43.986146 kubelet[2385]: I0910 23:53:43.984539 2385 server.go:449] "Adding debug handlers to kubelet server" Sep 10 23:53:43.987402 kubelet[2385]: E0910 23:53:43.987357 2385 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.107.201.216:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-n-c06092ab73?timeout=10s\": dial tcp 91.107.201.216:6443: connect: connection refused" interval="200ms" Sep 10 23:53:43.988767 kubelet[2385]: W0910 23:53:43.988720 2385 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.107.201.216:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.107.201.216:6443: connect: connection refused Sep 10 23:53:43.988871 kubelet[2385]: E0910 23:53:43.988774 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://91.107.201.216:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.107.201.216:6443: connect: connection refused" logger="UnhandledError" Sep 10 23:53:43.988913 kubelet[2385]: E0910 23:53:43.988900 2385 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Sep 10 23:53:43.989598 kubelet[2385]: I0910 23:53:43.989504 2385 factory.go:221] Registration of the containerd container factory successfully Sep 10 23:53:43.989598 kubelet[2385]: I0910 23:53:43.989521 2385 factory.go:221] Registration of the systemd container factory successfully Sep 10 23:53:43.989686 kubelet[2385]: I0910 23:53:43.989632 2385 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Sep 10 23:53:43.999093 kubelet[2385]: I0910 23:53:43.999008 2385 cpu_manager.go:214] "Starting CPU manager" policy="none" Sep 10 23:53:43.999093 kubelet[2385]: I0910 23:53:43.999027 2385 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Sep 10 23:53:43.999093 kubelet[2385]: I0910 23:53:43.999045 2385 state_mem.go:36] "Initialized new in-memory state store" Sep 10 23:53:44.001815 kubelet[2385]: I0910 23:53:44.001752 2385 policy_none.go:49] "None policy: Start" Sep 10 23:53:44.003098 kubelet[2385]: I0910 23:53:44.003021 2385 memory_manager.go:170] "Starting memorymanager" policy="None" Sep 10 
23:53:44.003098 kubelet[2385]: I0910 23:53:44.003052 2385 state_mem.go:35] "Initializing new in-memory state store" Sep 10 23:53:44.012772 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Sep 10 23:53:44.018092 kubelet[2385]: I0910 23:53:44.018036 2385 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Sep 10 23:53:44.020240 kubelet[2385]: I0910 23:53:44.020206 2385 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Sep 10 23:53:44.020370 kubelet[2385]: I0910 23:53:44.020358 2385 status_manager.go:217] "Starting to sync pod status with apiserver" Sep 10 23:53:44.020443 kubelet[2385]: I0910 23:53:44.020435 2385 kubelet.go:2321] "Starting kubelet main sync loop" Sep 10 23:53:44.020548 kubelet[2385]: E0910 23:53:44.020531 2385 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Sep 10 23:53:44.021937 kubelet[2385]: W0910 23:53:44.021732 2385 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.107.201.216:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.107.201.216:6443: connect: connection refused Sep 10 23:53:44.021937 kubelet[2385]: E0910 23:53:44.021798 2385 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://91.107.201.216:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.107.201.216:6443: connect: connection refused" logger="UnhandledError" Sep 10 23:53:44.027210 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Sep 10 23:53:44.032027 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
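The `Created slice kubepods.slice` / `kubepods-burstable.slice` / `kubepods-besteffort.slice` lines here (and the per-pod slices a moment later) follow the kubelet's systemd cgroup-driver naming: the cgroupfs path `/kubepods/<qos>/pod<uid>` is flattened into a dash-joined slice name. A minimal Python sketch of that naming pattern — the escaping rule for dashed UIDs is stated as an assumption, since the static-pod UIDs in this log are dash-free hex and never exercise it:

```python
def pod_slice(uid: str, qos: str = "burstable") -> str:
    """Reconstruct the per-pod systemd slice name seen in the
    'Created slice' log lines. Guaranteed pods sit directly under
    kubepods.slice; burstable/besteffort get an intermediate QoS
    slice. Dashes in the UID are assumed to be escaped to '_'
    (not observable here: these UIDs contain no dashes)."""
    base = "kubepods" if qos == "guaranteed" else f"kubepods-{qos}"
    return f"{base}-pod{uid.replace('-', '_')}.slice"

# Matches the slice created for the kube-controller-manager static pod:
print(pod_slice("acb7cc2c1c7e611e6c9eb92704a65c89"))
# kubepods-burstable-podacb7cc2c1c7e611e6c9eb92704a65c89.slice
```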
Sep 10 23:53:44.048398 kubelet[2385]: I0910 23:53:44.048322 2385 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Sep 10 23:53:44.049049 kubelet[2385]: I0910 23:53:44.048583 2385 eviction_manager.go:189] "Eviction manager: starting control loop" Sep 10 23:53:44.049049 kubelet[2385]: I0910 23:53:44.048607 2385 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Sep 10 23:53:44.049259 kubelet[2385]: I0910 23:53:44.049155 2385 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Sep 10 23:53:44.052750 kubelet[2385]: E0910 23:53:44.052721 2385 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372-1-0-n-c06092ab73\" not found" Sep 10 23:53:44.134982 systemd[1]: Created slice kubepods-burstable-podacb7cc2c1c7e611e6c9eb92704a65c89.slice - libcontainer container kubepods-burstable-podacb7cc2c1c7e611e6c9eb92704a65c89.slice. Sep 10 23:53:44.155227 kubelet[2385]: I0910 23:53:44.154651 2385 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.155791 kubelet[2385]: E0910 23:53:44.155765 2385 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.107.201.216:6443/api/v1/nodes\": dial tcp 91.107.201.216:6443: connect: connection refused" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.156231 systemd[1]: Created slice kubepods-burstable-pod165237a12d917652cf5c9760ecae4d9a.slice - libcontainer container kubepods-burstable-pod165237a12d917652cf5c9760ecae4d9a.slice. Sep 10 23:53:44.163642 systemd[1]: Created slice kubepods-burstable-pod67881df9adff5d81b062d3dfa04aa3d1.slice - libcontainer container kubepods-burstable-pod67881df9adff5d81b062d3dfa04aa3d1.slice. 
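The eviction control loop starting here enforces the `HardEvictionThresholds` from the `container_manager_linux.go:269` nodeConfig dump logged at startup. A small sketch decoding those thresholds (the JSON below is abridged from that dump — `GracePeriod`/`MinReclaim` fields dropped — each signal carries either an absolute quantity or a fraction):

```python
import json

# HardEvictionThresholds as logged in the nodeConfig dump (abridged).
thresholds = json.loads("""[
  {"Signal":"memory.available","Value":{"Quantity":"100Mi","Percentage":0}},
  {"Signal":"nodefs.available","Value":{"Quantity":null,"Percentage":0.1}},
  {"Signal":"nodefs.inodesFree","Value":{"Quantity":null,"Percentage":0.05}},
  {"Signal":"imagefs.available","Value":{"Quantity":null,"Percentage":0.15}},
  {"Signal":"imagefs.inodesFree","Value":{"Quantity":null,"Percentage":0.05}}
]""")

def render(t: dict) -> str:
    # A threshold is absolute (Quantity set) or relative (Percentage set).
    v = t["Value"]["Quantity"] or f"{t['Value']['Percentage']:.0%}"
    return f"{t['Signal']} < {v}"

for t in thresholds:
    print(render(t))
# memory.available < 100Mi
# nodefs.available < 10%
# nodefs.inodesFree < 5%
# imagefs.available < 15%
# imagefs.inodesFree < 5%
```

When any rendered condition holds, the eviction manager starts reclaiming — though at this point in the log it cannot even compute them ("failed to get summary stats"), because the node object does not exist yet.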
Sep 10 23:53:44.188963 kubelet[2385]: E0910 23:53:44.188684 2385 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.107.201.216:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-n-c06092ab73?timeout=10s\": dial tcp 91.107.201.216:6443: connect: connection refused" interval="400ms" Sep 10 23:53:44.288430 kubelet[2385]: I0910 23:53:44.288149 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/67881df9adff5d81b062d3dfa04aa3d1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-1-0-n-c06092ab73\" (UID: \"67881df9adff5d81b062d3dfa04aa3d1\") " pod="kube-system/kube-apiserver-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.288430 kubelet[2385]: I0910 23:53:44.288426 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-ca-certs\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.289024 kubelet[2385]: I0910 23:53:44.288512 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.289024 kubelet[2385]: I0910 23:53:44.288604 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-kubeconfig\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: 
\"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.289024 kubelet[2385]: I0910 23:53:44.288666 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.289024 kubelet[2385]: I0910 23:53:44.288707 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/67881df9adff5d81b062d3dfa04aa3d1-k8s-certs\") pod \"kube-apiserver-ci-4372-1-0-n-c06092ab73\" (UID: \"67881df9adff5d81b062d3dfa04aa3d1\") " pod="kube-system/kube-apiserver-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.289024 kubelet[2385]: I0910 23:53:44.288745 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-k8s-certs\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.289331 kubelet[2385]: I0910 23:53:44.288784 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/165237a12d917652cf5c9760ecae4d9a-kubeconfig\") pod \"kube-scheduler-ci-4372-1-0-n-c06092ab73\" (UID: \"165237a12d917652cf5c9760ecae4d9a\") " pod="kube-system/kube-scheduler-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.289331 kubelet[2385]: I0910 23:53:44.288825 2385 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/host-path/67881df9adff5d81b062d3dfa04aa3d1-ca-certs\") pod \"kube-apiserver-ci-4372-1-0-n-c06092ab73\" (UID: \"67881df9adff5d81b062d3dfa04aa3d1\") " pod="kube-system/kube-apiserver-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.359369 kubelet[2385]: I0910 23:53:44.359290 2385 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.359768 kubelet[2385]: E0910 23:53:44.359732 2385 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.107.201.216:6443/api/v1/nodes\": dial tcp 91.107.201.216:6443: connect: connection refused" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.451815 containerd[1522]: time="2025-09-10T23:53:44.451671781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-1-0-n-c06092ab73,Uid:acb7cc2c1c7e611e6c9eb92704a65c89,Namespace:kube-system,Attempt:0,}" Sep 10 23:53:44.462487 containerd[1522]: time="2025-09-10T23:53:44.462356715Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-1-0-n-c06092ab73,Uid:165237a12d917652cf5c9760ecae4d9a,Namespace:kube-system,Attempt:0,}" Sep 10 23:53:44.476911 containerd[1522]: time="2025-09-10T23:53:44.476708294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-1-0-n-c06092ab73,Uid:67881df9adff5d81b062d3dfa04aa3d1,Namespace:kube-system,Attempt:0,}" Sep 10 23:53:44.485539 containerd[1522]: time="2025-09-10T23:53:44.485415429Z" level=info msg="connecting to shim e8a761325458a3100818963d4815d4bbbe03665590d35c93b44c5005d2c1bd76" address="unix:///run/containerd/s/29baa797365d79cd702d6ad1f3e287bf17d79858962ae21867bba43015f9038f" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:53:44.514884 containerd[1522]: time="2025-09-10T23:53:44.514821816Z" level=info msg="connecting to shim 1eba38c580b6fa474eba1aac38b5804249e8ea0d4f6623cf7c2a4dfd788baf07" 
address="unix:///run/containerd/s/cac749d29dfcc7a5aaa7ead5ba1b6839734fa7551bb2e536e4d443469e8dd269" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:53:44.525428 containerd[1522]: time="2025-09-10T23:53:44.525390305Z" level=info msg="connecting to shim 1c7bce1fb15172784159db587e95f66a70eccb484c41ae1bdeb439b8109c4df7" address="unix:///run/containerd/s/7bbccaa64c0be431bbd7636f1fd91cacdfd8017272ea3b447057aaa7a6e47af0" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:53:44.528432 systemd[1]: Started cri-containerd-e8a761325458a3100818963d4815d4bbbe03665590d35c93b44c5005d2c1bd76.scope - libcontainer container e8a761325458a3100818963d4815d4bbbe03665590d35c93b44c5005d2c1bd76. Sep 10 23:53:44.570565 systemd[1]: Started cri-containerd-1c7bce1fb15172784159db587e95f66a70eccb484c41ae1bdeb439b8109c4df7.scope - libcontainer container 1c7bce1fb15172784159db587e95f66a70eccb484c41ae1bdeb439b8109c4df7. Sep 10 23:53:44.574139 systemd[1]: Started cri-containerd-1eba38c580b6fa474eba1aac38b5804249e8ea0d4f6623cf7c2a4dfd788baf07.scope - libcontainer container 1eba38c580b6fa474eba1aac38b5804249e8ea0d4f6623cf7c2a4dfd788baf07. 
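The repeated `dial tcp 91.107.201.216:6443: connect: connection refused` errors from the client-go reflectors are expected at this stage: the kubelet comes up before the kube-apiserver static pod it is about to launch, so nothing is listening on :6443 yet. A minimal Python sketch of what such a dial failure looks like at the socket level (classification names are mine; do not actually probe the address from the log):

```python
import socket

def probe(host: str, port: int, timeout: float = 1.0) -> str:
    """TCP-connect to host:port and classify the outcome roughly the
    way Go's net.Dial reports it: 'connection refused' means the host
    answered with a RST, i.e. no listener on that port yet."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        try:
            s.connect((host, port))
            return "open"
        except ConnectionRefusedError:
            return "connection refused"
        except OSError:
            return "unreachable"  # timeout, no route, etc.
```

Once the apiserver container started further down binds :6443, the same dial succeeds and the reflectors recover on their next relist.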
Sep 10 23:53:44.590133 kubelet[2385]: E0910 23:53:44.590059 2385 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.107.201.216:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-1-0-n-c06092ab73?timeout=10s\": dial tcp 91.107.201.216:6443: connect: connection refused" interval="800ms" Sep 10 23:53:44.598662 containerd[1522]: time="2025-09-10T23:53:44.598623283Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-1-0-n-c06092ab73,Uid:acb7cc2c1c7e611e6c9eb92704a65c89,Namespace:kube-system,Attempt:0,} returns sandbox id \"e8a761325458a3100818963d4815d4bbbe03665590d35c93b44c5005d2c1bd76\"" Sep 10 23:53:44.605333 containerd[1522]: time="2025-09-10T23:53:44.605147543Z" level=info msg="CreateContainer within sandbox \"e8a761325458a3100818963d4815d4bbbe03665590d35c93b44c5005d2c1bd76\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Sep 10 23:53:44.619666 containerd[1522]: time="2025-09-10T23:53:44.619606282Z" level=info msg="Container 77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:53:44.636554 containerd[1522]: time="2025-09-10T23:53:44.636491151Z" level=info msg="CreateContainer within sandbox \"e8a761325458a3100818963d4815d4bbbe03665590d35c93b44c5005d2c1bd76\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38\"" Sep 10 23:53:44.637994 containerd[1522]: time="2025-09-10T23:53:44.637964156Z" level=info msg="StartContainer for \"77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38\"" Sep 10 23:53:44.639654 containerd[1522]: time="2025-09-10T23:53:44.639403427Z" level=info msg="connecting to shim 77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38" address="unix:///run/containerd/s/29baa797365d79cd702d6ad1f3e287bf17d79858962ae21867bba43015f9038f" 
protocol=ttrpc version=3 Sep 10 23:53:44.647896 containerd[1522]: time="2025-09-10T23:53:44.647742582Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-1-0-n-c06092ab73,Uid:67881df9adff5d81b062d3dfa04aa3d1,Namespace:kube-system,Attempt:0,} returns sandbox id \"1c7bce1fb15172784159db587e95f66a70eccb484c41ae1bdeb439b8109c4df7\"" Sep 10 23:53:44.652900 containerd[1522]: time="2025-09-10T23:53:44.652687076Z" level=info msg="CreateContainer within sandbox \"1c7bce1fb15172784159db587e95f66a70eccb484c41ae1bdeb439b8109c4df7\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Sep 10 23:53:44.663697 containerd[1522]: time="2025-09-10T23:53:44.663632470Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-1-0-n-c06092ab73,Uid:165237a12d917652cf5c9760ecae4d9a,Namespace:kube-system,Attempt:0,} returns sandbox id \"1eba38c580b6fa474eba1aac38b5804249e8ea0d4f6623cf7c2a4dfd788baf07\"" Sep 10 23:53:44.667631 systemd[1]: Started cri-containerd-77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38.scope - libcontainer container 77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38. 
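The "Failed to ensure lease exists, will retry" errors show the node-lease controller's retry interval doubling across failed attempts: `interval="200ms"`, then `"400ms"`, then `"800ms"`. A sketch of that doubling schedule — the cap value below is purely illustrative, not taken from the log:

```python
from itertools import islice

def lease_retry_intervals(base_ms: int = 200, cap_ms: int = 7000):
    """Yield the doubling retry intervals observed in the
    controller.go:145 errors (200ms -> 400ms -> 800ms -> ...),
    clamped at an assumed maximum."""
    interval = base_ms
    while True:
        yield min(interval, cap_ms)
        interval *= 2

print(list(islice(lease_retry_intervals(), 4)))  # [200, 400, 800, 1600]
```

The retries stop mattering once the apiserver is reachable and the lease is created, which happens shortly after the static pods start.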
Sep 10 23:53:44.668414 containerd[1522]: time="2025-09-10T23:53:44.668272088Z" level=info msg="Container 4049ee71c1d830bc8203c6d366dc6619de5a2912e4015c457699e2176ede02e7: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:53:44.669458 containerd[1522]: time="2025-09-10T23:53:44.668859753Z" level=info msg="CreateContainer within sandbox \"1eba38c580b6fa474eba1aac38b5804249e8ea0d4f6623cf7c2a4dfd788baf07\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Sep 10 23:53:44.684382 containerd[1522]: time="2025-09-10T23:53:44.684328159Z" level=info msg="CreateContainer within sandbox \"1c7bce1fb15172784159db587e95f66a70eccb484c41ae1bdeb439b8109c4df7\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4049ee71c1d830bc8203c6d366dc6619de5a2912e4015c457699e2176ede02e7\"" Sep 10 23:53:44.685540 containerd[1522]: time="2025-09-10T23:53:44.685504090Z" level=info msg="StartContainer for \"4049ee71c1d830bc8203c6d366dc6619de5a2912e4015c457699e2176ede02e7\"" Sep 10 23:53:44.687399 containerd[1522]: time="2025-09-10T23:53:44.687356639Z" level=info msg="Container f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:53:44.689582 containerd[1522]: time="2025-09-10T23:53:44.689537115Z" level=info msg="connecting to shim 4049ee71c1d830bc8203c6d366dc6619de5a2912e4015c457699e2176ede02e7" address="unix:///run/containerd/s/7bbccaa64c0be431bbd7636f1fd91cacdfd8017272ea3b447057aaa7a6e47af0" protocol=ttrpc version=3 Sep 10 23:53:44.701777 containerd[1522]: time="2025-09-10T23:53:44.701723064Z" level=info msg="CreateContainer within sandbox \"1eba38c580b6fa474eba1aac38b5804249e8ea0d4f6623cf7c2a4dfd788baf07\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7\"" Sep 10 23:53:44.702684 containerd[1522]: time="2025-09-10T23:53:44.702576551Z" level=info msg="StartContainer for 
\"f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7\"" Sep 10 23:53:44.709632 containerd[1522]: time="2025-09-10T23:53:44.709538178Z" level=info msg="connecting to shim f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7" address="unix:///run/containerd/s/cac749d29dfcc7a5aaa7ead5ba1b6839734fa7551bb2e536e4d443469e8dd269" protocol=ttrpc version=3 Sep 10 23:53:44.727405 systemd[1]: Started cri-containerd-4049ee71c1d830bc8203c6d366dc6619de5a2912e4015c457699e2176ede02e7.scope - libcontainer container 4049ee71c1d830bc8203c6d366dc6619de5a2912e4015c457699e2176ede02e7. Sep 10 23:53:44.753021 containerd[1522]: time="2025-09-10T23:53:44.752887506Z" level=info msg="StartContainer for \"77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38\" returns successfully" Sep 10 23:53:44.755452 systemd[1]: Started cri-containerd-f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7.scope - libcontainer container f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7. 
Sep 10 23:53:44.763632 kubelet[2385]: I0910 23:53:44.763589 2385 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.764115 kubelet[2385]: E0910 23:53:44.764088 2385 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.107.201.216:6443/api/v1/nodes\": dial tcp 91.107.201.216:6443: connect: connection refused" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:44.814401 containerd[1522]: time="2025-09-10T23:53:44.814328967Z" level=info msg="StartContainer for \"4049ee71c1d830bc8203c6d366dc6619de5a2912e4015c457699e2176ede02e7\" returns successfully" Sep 10 23:53:44.843646 containerd[1522]: time="2025-09-10T23:53:44.843526473Z" level=info msg="StartContainer for \"f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7\" returns successfully" Sep 10 23:53:45.567217 kubelet[2385]: I0910 23:53:45.566934 2385 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:47.295179 kubelet[2385]: E0910 23:53:47.295016 2385 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372-1-0-n-c06092ab73\" not found" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:47.350299 kubelet[2385]: E0910 23:53:47.350172 2385 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4372-1-0-n-c06092ab73.186410face4600e7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-1-0-n-c06092ab73,UID:ci-4372-1-0-n-c06092ab73,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-n-c06092ab73,},FirstTimestamp:2025-09-10 23:53:43.969898727 +0000 UTC m=+0.747043847,LastTimestamp:2025-09-10 23:53:43.969898727 +0000 UTC m=+0.747043847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-n-c06092ab73,}" Sep 10 23:53:47.410090 kubelet[2385]: E0910 23:53:47.409878 2385 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4372-1-0-n-c06092ab73.186410facf67d1b0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-1-0-n-c06092ab73,UID:ci-4372-1-0-n-c06092ab73,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-n-c06092ab73,},FirstTimestamp:2025-09-10 23:53:43.98889208 +0000 UTC m=+0.766037200,LastTimestamp:2025-09-10 23:53:43.98889208 +0000 UTC m=+0.766037200,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-n-c06092ab73,}" Sep 10 23:53:47.415763 kubelet[2385]: I0910 23:53:47.415710 2385 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372-1-0-n-c06092ab73" Sep 10 23:53:47.470284 kubelet[2385]: E0910 23:53:47.470132 2385 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4372-1-0-n-c06092ab73.186410facff6e895 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-1-0-n-c06092ab73,UID:ci-4372-1-0-n-c06092ab73,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ci-4372-1-0-n-c06092ab73 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-n-c06092ab73,},FirstTimestamp:2025-09-10 23:53:43.998269589 +0000 UTC m=+0.775414709,LastTimestamp:2025-09-10 23:53:43.998269589 +0000 UTC m=+0.775414709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-n-c06092ab73,}" Sep 10 23:53:47.742579 kubelet[2385]: E0910 23:53:47.742531 2385 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-scheduler-ci-4372-1-0-n-c06092ab73\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4372-1-0-n-c06092ab73" Sep 10 23:53:47.968174 kubelet[2385]: I0910 23:53:47.968132 2385 apiserver.go:52] "Watching apiserver" Sep 10 23:53:47.986262 kubelet[2385]: I0910 23:53:47.986211 2385 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Sep 10 23:53:49.678900 systemd[1]: Reload requested from client PID 2654 ('systemctl') (unit session-7.scope)... Sep 10 23:53:49.678921 systemd[1]: Reloading... Sep 10 23:53:49.781278 zram_generator::config[2698]: No configuration found. Sep 10 23:53:49.866554 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Sep 10 23:53:49.983576 systemd[1]: Reloading finished in 304 ms. Sep 10 23:53:50.021511 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Sep 10 23:53:50.034881 systemd[1]: kubelet.service: Deactivated successfully. Sep 10 23:53:50.035502 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Sep 10 23:53:50.035561 systemd[1]: kubelet.service: Consumed 1.198s CPU time, 124.9M memory peak. Sep 10 23:53:50.039694 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Sep 10 23:53:50.192652 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Sep 10 23:53:50.205770 (kubelet)[2743]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Sep 10 23:53:50.263228 kubelet[2743]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 10 23:53:50.263228 kubelet[2743]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 10 23:53:50.263228 kubelet[2743]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 10 23:53:50.263228 kubelet[2743]: I0910 23:53:50.262858 2743 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 10 23:53:50.271024 kubelet[2743]: I0910 23:53:50.270968 2743 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Sep 10 23:53:50.271314 kubelet[2743]: I0910 23:53:50.271183 2743 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Sep 10 23:53:50.272244 kubelet[2743]: I0910 23:53:50.271651 2743 server.go:934] "Client rotation is on, will bootstrap in background" Sep 10 23:53:50.273740 kubelet[2743]: I0910 23:53:50.273707 2743 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
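Every kubelet line above carries a klog header of the form `<severity><MMDD> <HH:MM:SS.microseconds> <pid> <file>:<line>]`. A small parser for that header, handy when grepping logs like these — the regex is a best-effort reconstruction from the lines in this log, not klog's own grammar:

```python
import re

# Matches headers like "I0910 23:53:50.270968 2743 server.go:491]".
KLOG_HEADER = re.compile(
    r"(?P<severity>[IWEF])(?P<month>\d{2})(?P<day>\d{2})\s+"
    r"(?P<time>\d{2}:\d{2}:\d{2}\.\d{6})\s+(?P<pid>\d+)\s+"
    r"(?P<file>[^:]+):(?P<line>\d+)\]"
)

m = KLOG_HEADER.match("I0910 23:53:50.270968 2743 server.go:491]")
print(m.group("severity"), m.group("file"), m.group("line"))
# I server.go 491
```

Severity is a single letter (I/W/E/F for info, warning, error, fatal), which is how the warning-level reflector failures (`W0910 ...`) and error-level unhandled errors (`E0910 ...`) are distinguished throughout this log.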
Sep 10 23:53:50.276533 kubelet[2743]: I0910 23:53:50.276517 2743 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Sep 10 23:53:50.283103 kubelet[2743]: I0910 23:53:50.283081 2743 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Sep 10 23:53:50.288935 kubelet[2743]: I0910 23:53:50.288901 2743 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Sep 10 23:53:50.289183 kubelet[2743]: I0910 23:53:50.289169 2743 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Sep 10 23:53:50.289446 kubelet[2743]: I0910 23:53:50.289412 2743 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Sep 10 23:53:50.289744 kubelet[2743]: I0910 23:53:50.289500 2743 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-1-0-n-c06092ab73","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Sep 10 23:53:50.289872 kubelet[2743]: I0910 23:53:50.289858 2743 topology_manager.go:138] "Creating topology manager with none policy"
Sep 10 23:53:50.289925 kubelet[2743]: I0910 23:53:50.289918 2743 container_manager_linux.go:300] "Creating device plugin manager"
Sep 10 23:53:50.290058 kubelet[2743]: I0910 23:53:50.290003 2743 state_mem.go:36] "Initialized new in-memory state store"
Sep 10 23:53:50.291253 kubelet[2743]: I0910 23:53:50.290842 2743 kubelet.go:408] "Attempting to sync node with API server"
Sep 10 23:53:50.291253 kubelet[2743]: I0910 23:53:50.290863 2743 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Sep 10 23:53:50.291253 kubelet[2743]: I0910 23:53:50.290882 2743 kubelet.go:314] "Adding apiserver pod source"
Sep 10 23:53:50.291253 kubelet[2743]: I0910 23:53:50.290896 2743 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Sep 10 23:53:50.293819 kubelet[2743]: I0910 23:53:50.293788 2743 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Sep 10 23:53:50.295154 kubelet[2743]: I0910 23:53:50.295121 2743 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Sep 10 23:53:50.296316 kubelet[2743]: I0910 23:53:50.296297 2743 server.go:1274] "Started kubelet"
Sep 10 23:53:50.300616 kubelet[2743]: I0910 23:53:50.300592 2743 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Sep 10 23:53:50.314205 kubelet[2743]: I0910 23:53:50.313667 2743 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Sep 10 23:53:50.314677 kubelet[2743]: I0910 23:53:50.314651 2743 server.go:449] "Adding debug handlers to kubelet server"
Sep 10 23:53:50.318114 kubelet[2743]: I0910 23:53:50.318063 2743 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Sep 10 23:53:50.319365 kubelet[2743]: I0910 23:53:50.318411 2743 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Sep 10 23:53:50.319515 kubelet[2743]: I0910 23:53:50.319163 2743 volume_manager.go:289] "Starting Kubelet Volume Manager"
Sep 10 23:53:50.319799 kubelet[2743]: I0910 23:53:50.319778 2743 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Sep 10 23:53:50.322591 kubelet[2743]: I0910 23:53:50.319176 2743 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Sep 10 23:53:50.322816 kubelet[2743]: I0910 23:53:50.322803 2743 reconciler.go:26] "Reconciler: start to sync state"
Sep 10 23:53:50.322880 kubelet[2743]: E0910 23:53:50.319346 2743 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372-1-0-n-c06092ab73\" not found"
Sep 10 23:53:50.328593 kubelet[2743]: I0910 23:53:50.328549 2743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 10 23:53:50.331884 kubelet[2743]: I0910 23:53:50.331854 2743 factory.go:221] Registration of the systemd container factory successfully
Sep 10 23:53:50.332128 kubelet[2743]: I0910 23:53:50.332104 2743 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
Sep 10 23:53:50.332796 kubelet[2743]: I0910 23:53:50.332107 2743 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 10 23:53:50.332951 kubelet[2743]: I0910 23:53:50.332938 2743 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 10 23:53:50.333016 kubelet[2743]: I0910 23:53:50.333009 2743 kubelet.go:2321] "Starting kubelet main sync loop"
Sep 10 23:53:50.333152 kubelet[2743]: E0910 23:53:50.333130 2743 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 10 23:53:50.337740 kubelet[2743]: I0910 23:53:50.337719 2743 factory.go:221] Registration of the containerd container factory successfully
Sep 10 23:53:50.361752 kubelet[2743]: E0910 23:53:50.361221 2743 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
Sep 10 23:53:50.397988 kubelet[2743]: I0910 23:53:50.397953 2743 cpu_manager.go:214] "Starting CPU manager" policy="none"
Sep 10 23:53:50.398269 kubelet[2743]: I0910 23:53:50.398248 2743 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
Sep 10 23:53:50.398371 kubelet[2743]: I0910 23:53:50.398359 2743 state_mem.go:36] "Initialized new in-memory state store"
Sep 10 23:53:50.398609 kubelet[2743]: I0910 23:53:50.398591 2743 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Sep 10 23:53:50.398722 kubelet[2743]: I0910 23:53:50.398689 2743 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Sep 10 23:53:50.398793 kubelet[2743]: I0910 23:53:50.398783 2743 policy_none.go:49] "None policy: Start"
Sep 10 23:53:50.399703 kubelet[2743]: I0910 23:53:50.399677 2743 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 10 23:53:50.399703 kubelet[2743]: I0910 23:53:50.399701 2743 state_mem.go:35] "Initializing new in-memory state store"
Sep 10 23:53:50.399919 kubelet[2743]: I0910 23:53:50.399880 2743 state_mem.go:75] "Updated machine memory state"
Sep 10 23:53:50.405783 kubelet[2743]: I0910 23:53:50.405750 2743 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 10 23:53:50.407093 kubelet[2743]: I0910 23:53:50.406625 2743 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 10 23:53:50.407093 kubelet[2743]: I0910 23:53:50.406646 2743 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 10 23:53:50.407093 kubelet[2743]: I0910 23:53:50.406991 2743 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 10 23:53:50.512352 kubelet[2743]: I0910 23:53:50.512320 2743 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.525155 kubelet[2743]: I0910 23:53:50.525036 2743 kubelet_node_status.go:111] "Node was previously registered" node="ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.526645 kubelet[2743]: I0910 23:53:50.526274 2743 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.624305 kubelet[2743]: I0910 23:53:50.624246 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-kubeconfig\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.624667 kubelet[2743]: I0910 23:53:50.624563 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.624667 kubelet[2743]: I0910 23:53:50.624633 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/165237a12d917652cf5c9760ecae4d9a-kubeconfig\") pod \"kube-scheduler-ci-4372-1-0-n-c06092ab73\" (UID: \"165237a12d917652cf5c9760ecae4d9a\") " pod="kube-system/kube-scheduler-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.624977 kubelet[2743]: I0910 23:53:50.624868 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/67881df9adff5d81b062d3dfa04aa3d1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-1-0-n-c06092ab73\" (UID: \"67881df9adff5d81b062d3dfa04aa3d1\") " pod="kube-system/kube-apiserver-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.624977 kubelet[2743]: I0910 23:53:50.624934 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-k8s-certs\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.625368 kubelet[2743]: I0910 23:53:50.625223 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.625368 kubelet[2743]: I0910 23:53:50.625325 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/67881df9adff5d81b062d3dfa04aa3d1-ca-certs\") pod \"kube-apiserver-ci-4372-1-0-n-c06092ab73\" (UID: \"67881df9adff5d81b062d3dfa04aa3d1\") " pod="kube-system/kube-apiserver-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.625812 kubelet[2743]: I0910 23:53:50.625580 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/67881df9adff5d81b062d3dfa04aa3d1-k8s-certs\") pod \"kube-apiserver-ci-4372-1-0-n-c06092ab73\" (UID: \"67881df9adff5d81b062d3dfa04aa3d1\") " pod="kube-system/kube-apiserver-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:50.625812 kubelet[2743]: I0910 23:53:50.625639 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/acb7cc2c1c7e611e6c9eb92704a65c89-ca-certs\") pod \"kube-controller-manager-ci-4372-1-0-n-c06092ab73\" (UID: \"acb7cc2c1c7e611e6c9eb92704a65c89\") " pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73"
Sep 10 23:53:51.292778 kubelet[2743]: I0910 23:53:51.292466 2743 apiserver.go:52] "Watching apiserver"
Sep 10 23:53:51.322901 kubelet[2743]: I0910 23:53:51.322833 2743 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Sep 10 23:53:51.442759 kubelet[2743]: I0910 23:53:51.442681 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372-1-0-n-c06092ab73" podStartSLOduration=1.442655906 podStartE2EDuration="1.442655906s" podCreationTimestamp="2025-09-10 23:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 23:53:51.421883191 +0000 UTC m=+1.211948367" watchObservedRunningTime="2025-09-10 23:53:51.442655906 +0000 UTC m=+1.232721082"
Sep 10 23:53:51.468277 kubelet[2743]: I0910 23:53:51.468201 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372-1-0-n-c06092ab73" podStartSLOduration=1.4681615350000001 podStartE2EDuration="1.468161535s" podCreationTimestamp="2025-09-10 23:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 23:53:51.443503254 +0000 UTC m=+1.233568430" watchObservedRunningTime="2025-09-10 23:53:51.468161535 +0000 UTC m=+1.258226711"
Sep 10 23:53:51.494545 kubelet[2743]: I0910 23:53:51.494389 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372-1-0-n-c06092ab73" podStartSLOduration=1.494368114 podStartE2EDuration="1.494368114s" podCreationTimestamp="2025-09-10 23:53:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 23:53:51.46948025 +0000 UTC m=+1.259545426" watchObservedRunningTime="2025-09-10 23:53:51.494368114 +0000 UTC m=+1.284433330"
Sep 10 23:53:52.340436 systemd[1]: Started sshd@7-91.107.201.216:22-8.140.63.248:37092.service - OpenSSH per-connection server daemon (8.140.63.248:37092).
Sep 10 23:53:52.960212 sshd[2786]: Invalid user from 8.140.63.248 port 37092
Sep 10 23:53:55.752141 kubelet[2743]: I0910 23:53:55.752097 2743 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
Sep 10 23:53:55.753092 containerd[1522]: time="2025-09-10T23:53:55.753052895Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
Sep 10 23:53:55.754705 kubelet[2743]: I0910 23:53:55.753454 2743 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
Sep 10 23:53:56.645066 systemd[1]: Created slice kubepods-besteffort-pod725366e9_9c9b_4192_98e1_e84d45275342.slice - libcontainer container kubepods-besteffort-pod725366e9_9c9b_4192_98e1_e84d45275342.slice.
Sep 10 23:53:56.670912 kubelet[2743]: I0910 23:53:56.670843 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/725366e9-9c9b-4192-98e1-e84d45275342-xtables-lock\") pod \"kube-proxy-z6z8m\" (UID: \"725366e9-9c9b-4192-98e1-e84d45275342\") " pod="kube-system/kube-proxy-z6z8m"
Sep 10 23:53:56.670912 kubelet[2743]: I0910 23:53:56.670889 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/725366e9-9c9b-4192-98e1-e84d45275342-lib-modules\") pod \"kube-proxy-z6z8m\" (UID: \"725366e9-9c9b-4192-98e1-e84d45275342\") " pod="kube-system/kube-proxy-z6z8m"
Sep 10 23:53:56.670912 kubelet[2743]: I0910 23:53:56.670910 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ft22\" (UniqueName: \"kubernetes.io/projected/725366e9-9c9b-4192-98e1-e84d45275342-kube-api-access-6ft22\") pod \"kube-proxy-z6z8m\" (UID: \"725366e9-9c9b-4192-98e1-e84d45275342\") " pod="kube-system/kube-proxy-z6z8m"
Sep 10 23:53:56.671160 kubelet[2743]: I0910 23:53:56.670932 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/725366e9-9c9b-4192-98e1-e84d45275342-kube-proxy\") pod \"kube-proxy-z6z8m\" (UID: \"725366e9-9c9b-4192-98e1-e84d45275342\") " pod="kube-system/kube-proxy-z6z8m"
Sep 10 23:53:56.869396 systemd[1]: Created slice kubepods-besteffort-poda709ca6d_6746_446d_b3f4_b8e6002dcf78.slice - libcontainer container kubepods-besteffort-poda709ca6d_6746_446d_b3f4_b8e6002dcf78.slice.
Sep 10 23:53:56.956466 containerd[1522]: time="2025-09-10T23:53:56.955748014Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-z6z8m,Uid:725366e9-9c9b-4192-98e1-e84d45275342,Namespace:kube-system,Attempt:0,}"
Sep 10 23:53:56.974255 kubelet[2743]: I0910 23:53:56.973248 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a709ca6d-6746-446d-b3f4-b8e6002dcf78-var-lib-calico\") pod \"tigera-operator-58fc44c59b-9fcvb\" (UID: \"a709ca6d-6746-446d-b3f4-b8e6002dcf78\") " pod="tigera-operator/tigera-operator-58fc44c59b-9fcvb"
Sep 10 23:53:56.974255 kubelet[2743]: I0910 23:53:56.973304 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mqdr\" (UniqueName: \"kubernetes.io/projected/a709ca6d-6746-446d-b3f4-b8e6002dcf78-kube-api-access-8mqdr\") pod \"tigera-operator-58fc44c59b-9fcvb\" (UID: \"a709ca6d-6746-446d-b3f4-b8e6002dcf78\") " pod="tigera-operator/tigera-operator-58fc44c59b-9fcvb"
Sep 10 23:53:56.978152 containerd[1522]: time="2025-09-10T23:53:56.978040392Z" level=info msg="connecting to shim 0e8262b1aa072094bcc5986342e9162a149fe84c5a9c23cb23c1f2ddc20b8f29" address="unix:///run/containerd/s/0729928f2b39050152bc0493d838808a20fa682cfd8ff64f51381171b1bfbf0c" namespace=k8s.io protocol=ttrpc version=3
Sep 10 23:53:57.004444 systemd[1]: Started cri-containerd-0e8262b1aa072094bcc5986342e9162a149fe84c5a9c23cb23c1f2ddc20b8f29.scope - libcontainer container 0e8262b1aa072094bcc5986342e9162a149fe84c5a9c23cb23c1f2ddc20b8f29.
Sep 10 23:53:57.034814 containerd[1522]: time="2025-09-10T23:53:57.034771490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-z6z8m,Uid:725366e9-9c9b-4192-98e1-e84d45275342,Namespace:kube-system,Attempt:0,} returns sandbox id \"0e8262b1aa072094bcc5986342e9162a149fe84c5a9c23cb23c1f2ddc20b8f29\""
Sep 10 23:53:57.040464 containerd[1522]: time="2025-09-10T23:53:57.040409808Z" level=info msg="CreateContainer within sandbox \"0e8262b1aa072094bcc5986342e9162a149fe84c5a9c23cb23c1f2ddc20b8f29\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
Sep 10 23:53:57.055713 containerd[1522]: time="2025-09-10T23:53:57.055534994Z" level=info msg="Container f3861fb7885d0410b97758824c3d165e8bb83c912b8289d3e074ea8a1606b741: CDI devices from CRI Config.CDIDevices: []"
Sep 10 23:53:57.064448 containerd[1522]: time="2025-09-10T23:53:57.064324519Z" level=info msg="CreateContainer within sandbox \"0e8262b1aa072094bcc5986342e9162a149fe84c5a9c23cb23c1f2ddc20b8f29\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"f3861fb7885d0410b97758824c3d165e8bb83c912b8289d3e074ea8a1606b741\""
Sep 10 23:53:57.065222 containerd[1522]: time="2025-09-10T23:53:57.065171572Z" level=info msg="StartContainer for \"f3861fb7885d0410b97758824c3d165e8bb83c912b8289d3e074ea8a1606b741\""
Sep 10 23:53:57.068885 containerd[1522]: time="2025-09-10T23:53:57.068852728Z" level=info msg="connecting to shim f3861fb7885d0410b97758824c3d165e8bb83c912b8289d3e074ea8a1606b741" address="unix:///run/containerd/s/0729928f2b39050152bc0493d838808a20fa682cfd8ff64f51381171b1bfbf0c" protocol=ttrpc version=3
Sep 10 23:53:57.088396 systemd[1]: Started cri-containerd-f3861fb7885d0410b97758824c3d165e8bb83c912b8289d3e074ea8a1606b741.scope - libcontainer container f3861fb7885d0410b97758824c3d165e8bb83c912b8289d3e074ea8a1606b741.
Sep 10 23:53:57.139181 containerd[1522]: time="2025-09-10T23:53:57.139112395Z" level=info msg="StartContainer for \"f3861fb7885d0410b97758824c3d165e8bb83c912b8289d3e074ea8a1606b741\" returns successfully"
Sep 10 23:53:57.175282 containerd[1522]: time="2025-09-10T23:53:57.174861496Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-9fcvb,Uid:a709ca6d-6746-446d-b3f4-b8e6002dcf78,Namespace:tigera-operator,Attempt:0,}"
Sep 10 23:53:57.202719 containerd[1522]: time="2025-09-10T23:53:57.202677928Z" level=info msg="connecting to shim c28ab7d7ce1b314281ba7d56b4a3ddc9c0d352d24c1496c60c604eb9d098d14d" address="unix:///run/containerd/s/e447875bf92dee430eabd7c1b7aa3f427107c261e2e531ce7273e4742152c92a" namespace=k8s.io protocol=ttrpc version=3
Sep 10 23:53:57.236394 systemd[1]: Started cri-containerd-c28ab7d7ce1b314281ba7d56b4a3ddc9c0d352d24c1496c60c604eb9d098d14d.scope - libcontainer container c28ab7d7ce1b314281ba7d56b4a3ddc9c0d352d24c1496c60c604eb9d098d14d.
Sep 10 23:53:57.287137 containerd[1522]: time="2025-09-10T23:53:57.287063695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-58fc44c59b-9fcvb,Uid:a709ca6d-6746-446d-b3f4-b8e6002dcf78,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"c28ab7d7ce1b314281ba7d56b4a3ddc9c0d352d24c1496c60c604eb9d098d14d\""
Sep 10 23:53:57.290966 containerd[1522]: time="2025-09-10T23:53:57.290931010Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\""
Sep 10 23:53:57.422673 kubelet[2743]: I0910 23:53:57.422514 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-z6z8m" podStartSLOduration=1.422495465 podStartE2EDuration="1.422495465s" podCreationTimestamp="2025-09-10 23:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 23:53:57.422242493 +0000 UTC m=+7.212307749" watchObservedRunningTime="2025-09-10 23:53:57.422495465 +0000 UTC m=+7.212560641"
Sep 10 23:53:58.885456 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1086160014.mount: Deactivated successfully.
Sep 10 23:53:59.563150 containerd[1522]: time="2025-09-10T23:53:59.563008365Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:59.564244 containerd[1522]: time="2025-09-10T23:53:59.564179226Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.6: active requests=0, bytes read=22152365"
Sep 10 23:53:59.564883 containerd[1522]: time="2025-09-10T23:53:59.564842671Z" level=info msg="ImageCreate event name:\"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:59.568096 containerd[1522]: time="2025-09-10T23:53:59.567857121Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:53:59.568713 containerd[1522]: time="2025-09-10T23:53:59.568677356Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.6\" with image id \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\", repo tag \"quay.io/tigera/operator:v1.38.6\", repo digest \"quay.io/tigera/operator@sha256:00a7a9b62f9b9a4e0856128b078539783b8352b07f707bff595cb604cc580f6e\", size \"22148360\" in 2.277560948s"
Sep 10 23:53:59.568713 containerd[1522]: time="2025-09-10T23:53:59.568711442Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.6\" returns image reference \"sha256:dd2e197838b00861b08ae5f480dfbfb9a519722e35ced99346315722309cbe9f\""
Sep 10 23:53:59.572839 containerd[1522]: time="2025-09-10T23:53:59.572799335Z" level=info msg="CreateContainer within sandbox \"c28ab7d7ce1b314281ba7d56b4a3ddc9c0d352d24c1496c60c604eb9d098d14d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Sep 10 23:53:59.586222 containerd[1522]: time="2025-09-10T23:53:59.584801483Z" level=info msg="Container d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad: CDI devices from CRI Config.CDIDevices: []"
Sep 10 23:53:59.589645 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3509838510.mount: Deactivated successfully.
Sep 10 23:53:59.592542 containerd[1522]: time="2025-09-10T23:53:59.592482894Z" level=info msg="CreateContainer within sandbox \"c28ab7d7ce1b314281ba7d56b4a3ddc9c0d352d24c1496c60c604eb9d098d14d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\""
Sep 10 23:53:59.593660 containerd[1522]: time="2025-09-10T23:53:59.593639353Z" level=info msg="StartContainer for \"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\""
Sep 10 23:53:59.595647 containerd[1522]: time="2025-09-10T23:53:59.595038977Z" level=info msg="connecting to shim d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad" address="unix:///run/containerd/s/e447875bf92dee430eabd7c1b7aa3f427107c261e2e531ce7273e4742152c92a" protocol=ttrpc version=3
Sep 10 23:53:59.619371 systemd[1]: Started cri-containerd-d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad.scope - libcontainer container d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad.
Sep 10 23:53:59.662482 containerd[1522]: time="2025-09-10T23:53:59.662034596Z" level=info msg="StartContainer for \"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\" returns successfully"
Sep 10 23:54:00.327599 sshd[2786]: Connection closed by invalid user 8.140.63.248 port 37092 [preauth]
Sep 10 23:54:00.329891 systemd[1]: sshd@7-91.107.201.216:22-8.140.63.248:37092.service: Deactivated successfully.
Sep 10 23:54:01.487219 kubelet[2743]: I0910 23:54:01.486939 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-58fc44c59b-9fcvb" podStartSLOduration=3.20748029 podStartE2EDuration="5.486921645s" podCreationTimestamp="2025-09-10 23:53:56 +0000 UTC" firstStartedPulling="2025-09-10 23:53:57.290432427 +0000 UTC m=+7.080497603" lastFinishedPulling="2025-09-10 23:53:59.569873782 +0000 UTC m=+9.359938958" observedRunningTime="2025-09-10 23:54:00.428971154 +0000 UTC m=+10.219036330" watchObservedRunningTime="2025-09-10 23:54:01.486921645 +0000 UTC m=+11.276986821"
Sep 10 23:54:02.658166 systemd[1]: cri-containerd-d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad.scope: Deactivated successfully.
Sep 10 23:54:02.665554 containerd[1522]: time="2025-09-10T23:54:02.665502783Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\" id:\"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\" pid:3063 exit_status:1 exited_at:{seconds:1757548442 nanos:664952611}"
Sep 10 23:54:02.665554 containerd[1522]: time="2025-09-10T23:54:02.665513185Z" level=info msg="received exit event container_id:\"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\" id:\"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\" pid:3063 exit_status:1 exited_at:{seconds:1757548442 nanos:664952611}"
Sep 10 23:54:02.700088 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad-rootfs.mount: Deactivated successfully.
Sep 10 23:54:03.430937 kubelet[2743]: I0910 23:54:03.430080 2743 scope.go:117] "RemoveContainer" containerID="d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad"
Sep 10 23:54:03.438315 containerd[1522]: time="2025-09-10T23:54:03.438273102Z" level=info msg="CreateContainer within sandbox \"c28ab7d7ce1b314281ba7d56b4a3ddc9c0d352d24c1496c60c604eb9d098d14d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Sep 10 23:54:03.456680 containerd[1522]: time="2025-09-10T23:54:03.455375871Z" level=info msg="Container 33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5: CDI devices from CRI Config.CDIDevices: []"
Sep 10 23:54:03.477425 containerd[1522]: time="2025-09-10T23:54:03.477365591Z" level=info msg="CreateContainer within sandbox \"c28ab7d7ce1b314281ba7d56b4a3ddc9c0d352d24c1496c60c604eb9d098d14d\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5\""
Sep 10 23:54:03.479026 containerd[1522]: time="2025-09-10T23:54:03.478987973Z" level=info msg="StartContainer for \"33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5\""
Sep 10 23:54:03.480970 containerd[1522]: time="2025-09-10T23:54:03.480918806Z" level=info msg="connecting to shim 33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5" address="unix:///run/containerd/s/e447875bf92dee430eabd7c1b7aa3f427107c261e2e531ce7273e4742152c92a" protocol=ttrpc version=3
Sep 10 23:54:03.519419 systemd[1]: Started cri-containerd-33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5.scope - libcontainer container 33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5.
Sep 10 23:54:03.569366 containerd[1522]: time="2025-09-10T23:54:03.569316157Z" level=info msg="StartContainer for \"33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5\" returns successfully"
Sep 10 23:54:05.954639 sudo[1840]: pam_unix(sudo:session): session closed for user root
Sep 10 23:54:06.113970 sshd[1832]: Connection closed by 139.178.89.65 port 35368
Sep 10 23:54:06.114642 sshd-session[1830]: pam_unix(sshd:session): session closed for user core
Sep 10 23:54:06.119492 systemd-logind[1483]: Session 7 logged out. Waiting for processes to exit.
Sep 10 23:54:06.120368 systemd[1]: sshd@6-91.107.201.216:22-139.178.89.65:35368.service: Deactivated successfully.
Sep 10 23:54:06.122464 systemd[1]: session-7.scope: Deactivated successfully.
Sep 10 23:54:06.122785 systemd[1]: session-7.scope: Consumed 7.004s CPU time, 229.7M memory peak.
Sep 10 23:54:06.125212 systemd-logind[1483]: Removed session 7.
Sep 10 23:54:12.288700 systemd[1]: Created slice kubepods-besteffort-podff29ad7c_0224_457b_949b_6882b7eb9ce7.slice - libcontainer container kubepods-besteffort-podff29ad7c_0224_457b_949b_6882b7eb9ce7.slice.
Sep 10 23:54:12.371082 kubelet[2743]: I0910 23:54:12.371034 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phkbd\" (UniqueName: \"kubernetes.io/projected/ff29ad7c-0224-457b-949b-6882b7eb9ce7-kube-api-access-phkbd\") pod \"calico-typha-59656fbc59-82nff\" (UID: \"ff29ad7c-0224-457b-949b-6882b7eb9ce7\") " pod="calico-system/calico-typha-59656fbc59-82nff"
Sep 10 23:54:12.371082 kubelet[2743]: I0910 23:54:12.371085 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/ff29ad7c-0224-457b-949b-6882b7eb9ce7-typha-certs\") pod \"calico-typha-59656fbc59-82nff\" (UID: \"ff29ad7c-0224-457b-949b-6882b7eb9ce7\") " pod="calico-system/calico-typha-59656fbc59-82nff"
Sep 10 23:54:12.371506 kubelet[2743]: I0910 23:54:12.371108 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff29ad7c-0224-457b-949b-6882b7eb9ce7-tigera-ca-bundle\") pod \"calico-typha-59656fbc59-82nff\" (UID: \"ff29ad7c-0224-457b-949b-6882b7eb9ce7\") " pod="calico-system/calico-typha-59656fbc59-82nff"
Sep 10 23:54:12.472086 kubelet[2743]: I0910 23:54:12.472034 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/fd928deb-7f2c-408c-aa3e-945cc2d73815-node-certs\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472086 kubelet[2743]: I0910 23:54:12.472080 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-var-run-calico\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472265 kubelet[2743]: I0910 23:54:12.472097 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-lib-modules\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472265 kubelet[2743]: I0910 23:54:12.472113 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-xtables-lock\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472265 kubelet[2743]: I0910 23:54:12.472129 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-cni-log-dir\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472265 kubelet[2743]: I0910 23:54:12.472144 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-var-lib-calico\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472265 kubelet[2743]: I0910 23:54:12.472174 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-cni-bin-dir\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472797 kubelet[2743]: I0910 23:54:12.472760 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-policysync\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472797 kubelet[2743]: I0910 23:54:12.472794 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njmtf\" (UniqueName: \"kubernetes.io/projected/fd928deb-7f2c-408c-aa3e-945cc2d73815-kube-api-access-njmtf\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472876 kubelet[2743]: I0910 23:54:12.472813 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-flexvol-driver-host\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472876 kubelet[2743]: I0910 23:54:12.472829 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd928deb-7f2c-408c-aa3e-945cc2d73815-tigera-ca-bundle\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.472876 kubelet[2743]: I0910 23:54:12.472843 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/fd928deb-7f2c-408c-aa3e-945cc2d73815-cni-net-dir\") pod \"calico-node-5znmh\" (UID: \"fd928deb-7f2c-408c-aa3e-945cc2d73815\") " pod="calico-system/calico-node-5znmh"
Sep 10 23:54:12.476618 systemd[1]: Created slice
kubepods-besteffort-podfd928deb_7f2c_408c_aa3e_945cc2d73815.slice - libcontainer container kubepods-besteffort-podfd928deb_7f2c_408c_aa3e_945cc2d73815.slice. Sep 10 23:54:12.574219 kubelet[2743]: E0910 23:54:12.573519 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-px2tb" podUID="33c72a96-49c1-4e41-a8b3-15ab5f93e7db" Sep 10 23:54:12.576906 kubelet[2743]: E0910 23:54:12.576514 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.576906 kubelet[2743]: W0910 23:54:12.576652 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.576906 kubelet[2743]: E0910 23:54:12.576676 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.581738 kubelet[2743]: E0910 23:54:12.581702 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.581738 kubelet[2743]: W0910 23:54:12.581724 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.581903 kubelet[2743]: E0910 23:54:12.581748 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.584436 kubelet[2743]: E0910 23:54:12.584398 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.584436 kubelet[2743]: W0910 23:54:12.584423 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.584852 kubelet[2743]: E0910 23:54:12.584727 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.588214 kubelet[2743]: E0910 23:54:12.585984 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.588214 kubelet[2743]: W0910 23:54:12.587496 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.588214 kubelet[2743]: E0910 23:54:12.587629 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.589436 kubelet[2743]: E0910 23:54:12.589407 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.589436 kubelet[2743]: W0910 23:54:12.589429 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.589658 kubelet[2743]: E0910 23:54:12.589640 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.589807 kubelet[2743]: E0910 23:54:12.589788 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.589807 kubelet[2743]: W0910 23:54:12.589804 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.589904 kubelet[2743]: E0910 23:54:12.589889 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.590157 kubelet[2743]: E0910 23:54:12.590066 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.590157 kubelet[2743]: W0910 23:54:12.590128 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.590388 kubelet[2743]: E0910 23:54:12.590276 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.590586 kubelet[2743]: E0910 23:54:12.590561 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.590586 kubelet[2743]: W0910 23:54:12.590579 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.590652 kubelet[2743]: E0910 23:54:12.590593 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.591007 kubelet[2743]: E0910 23:54:12.590987 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.591813 kubelet[2743]: W0910 23:54:12.591219 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.591813 kubelet[2743]: E0910 23:54:12.591242 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.593222 kubelet[2743]: E0910 23:54:12.592299 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.593222 kubelet[2743]: W0910 23:54:12.593218 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.593222 kubelet[2743]: E0910 23:54:12.593240 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.593611 kubelet[2743]: E0910 23:54:12.593597 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.593681 kubelet[2743]: W0910 23:54:12.593669 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.593738 kubelet[2743]: E0910 23:54:12.593728 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.594110 containerd[1522]: time="2025-09-10T23:54:12.594045235Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59656fbc59-82nff,Uid:ff29ad7c-0224-457b-949b-6882b7eb9ce7,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:12.604783 kubelet[2743]: E0910 23:54:12.604756 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.605243 kubelet[2743]: W0910 23:54:12.604938 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.605243 kubelet[2743]: E0910 23:54:12.604964 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.635220 containerd[1522]: time="2025-09-10T23:54:12.635152426Z" level=info msg="connecting to shim ea4127d7ab14de6734cd816df4001d3cf6cfed1a62c84315ee47b8d9a238e9ce" address="unix:///run/containerd/s/d5e3d2cc18ae543cdb686d0c8b3483d49887f227336875643dcbc2a0057cffb4" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:12.668786 kubelet[2743]: E0910 23:54:12.668752 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.668786 kubelet[2743]: W0910 23:54:12.668776 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.668933 kubelet[2743]: E0910 23:54:12.668799 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.671218 kubelet[2743]: E0910 23:54:12.669043 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.671218 kubelet[2743]: W0910 23:54:12.669058 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.671218 kubelet[2743]: E0910 23:54:12.669069 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.671218 kubelet[2743]: E0910 23:54:12.669455 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.671218 kubelet[2743]: W0910 23:54:12.669472 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.671218 kubelet[2743]: E0910 23:54:12.669483 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.671218 kubelet[2743]: E0910 23:54:12.670426 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.671218 kubelet[2743]: W0910 23:54:12.670440 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.671218 kubelet[2743]: E0910 23:54:12.670463 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.671218 kubelet[2743]: E0910 23:54:12.670614 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.671551 kubelet[2743]: W0910 23:54:12.670623 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.671551 kubelet[2743]: E0910 23:54:12.670632 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.671551 kubelet[2743]: E0910 23:54:12.670751 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.671551 kubelet[2743]: W0910 23:54:12.670768 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.671551 kubelet[2743]: E0910 23:54:12.670776 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.671551 kubelet[2743]: E0910 23:54:12.670895 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.671551 kubelet[2743]: W0910 23:54:12.670902 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.671551 kubelet[2743]: E0910 23:54:12.670909 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.671551 kubelet[2743]: E0910 23:54:12.671175 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.671714 kubelet[2743]: W0910 23:54:12.671582 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.671714 kubelet[2743]: E0910 23:54:12.671606 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.671944 kubelet[2743]: E0910 23:54:12.671926 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.671944 kubelet[2743]: W0910 23:54:12.671944 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.672038 kubelet[2743]: E0910 23:54:12.671956 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.676213 kubelet[2743]: E0910 23:54:12.675363 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.676213 kubelet[2743]: W0910 23:54:12.675388 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.676213 kubelet[2743]: E0910 23:54:12.675405 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.676506 kubelet[2743]: E0910 23:54:12.676492 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.676756 kubelet[2743]: W0910 23:54:12.676688 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.676756 kubelet[2743]: E0910 23:54:12.676710 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.677351 kubelet[2743]: E0910 23:54:12.677334 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.678276 kubelet[2743]: W0910 23:54:12.678099 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.678276 kubelet[2743]: E0910 23:54:12.678149 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.678484 kubelet[2743]: E0910 23:54:12.678470 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.678550 kubelet[2743]: W0910 23:54:12.678537 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.678608 kubelet[2743]: E0910 23:54:12.678598 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.678786 kubelet[2743]: E0910 23:54:12.678774 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.679102 kubelet[2743]: W0910 23:54:12.678846 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.679102 kubelet[2743]: E0910 23:54:12.678863 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.680481 kubelet[2743]: E0910 23:54:12.680336 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.680481 kubelet[2743]: W0910 23:54:12.680360 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.680481 kubelet[2743]: E0910 23:54:12.680378 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.680736 kubelet[2743]: E0910 23:54:12.680723 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.680801 kubelet[2743]: W0910 23:54:12.680790 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.680865 kubelet[2743]: E0910 23:54:12.680853 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.682228 kubelet[2743]: E0910 23:54:12.681409 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.682572 kubelet[2743]: W0910 23:54:12.682400 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.682572 kubelet[2743]: E0910 23:54:12.682430 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.682734 kubelet[2743]: E0910 23:54:12.682719 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.682785 kubelet[2743]: W0910 23:54:12.682774 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.682847 kubelet[2743]: E0910 23:54:12.682835 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.683169 kubelet[2743]: E0910 23:54:12.683061 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.683169 kubelet[2743]: W0910 23:54:12.683074 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.683169 kubelet[2743]: E0910 23:54:12.683089 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.683605 systemd[1]: Started cri-containerd-ea4127d7ab14de6734cd816df4001d3cf6cfed1a62c84315ee47b8d9a238e9ce.scope - libcontainer container ea4127d7ab14de6734cd816df4001d3cf6cfed1a62c84315ee47b8d9a238e9ce. Sep 10 23:54:12.684330 kubelet[2743]: E0910 23:54:12.684312 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.684521 kubelet[2743]: W0910 23:54:12.684408 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.684521 kubelet[2743]: E0910 23:54:12.684427 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.685485 kubelet[2743]: E0910 23:54:12.685375 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.685485 kubelet[2743]: W0910 23:54:12.685390 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.685485 kubelet[2743]: E0910 23:54:12.686245 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.685485 kubelet[2743]: I0910 23:54:12.686309 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/33c72a96-49c1-4e41-a8b3-15ab5f93e7db-varrun\") pod \"csi-node-driver-px2tb\" (UID: \"33c72a96-49c1-4e41-a8b3-15ab5f93e7db\") " pod="calico-system/csi-node-driver-px2tb" Sep 10 23:54:12.687297 kubelet[2743]: E0910 23:54:12.687277 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.687457 kubelet[2743]: W0910 23:54:12.687441 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.687556 kubelet[2743]: E0910 23:54:12.687544 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.687661 kubelet[2743]: I0910 23:54:12.687646 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvkq\" (UniqueName: \"kubernetes.io/projected/33c72a96-49c1-4e41-a8b3-15ab5f93e7db-kube-api-access-2pvkq\") pod \"csi-node-driver-px2tb\" (UID: \"33c72a96-49c1-4e41-a8b3-15ab5f93e7db\") " pod="calico-system/csi-node-driver-px2tb" Sep 10 23:54:12.687873 kubelet[2743]: E0910 23:54:12.687854 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.687873 kubelet[2743]: W0910 23:54:12.687872 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.687958 kubelet[2743]: E0910 23:54:12.687898 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.688086 kubelet[2743]: E0910 23:54:12.688062 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.688086 kubelet[2743]: W0910 23:54:12.688074 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.688163 kubelet[2743]: E0910 23:54:12.688087 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.688632 kubelet[2743]: E0910 23:54:12.688247 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.688632 kubelet[2743]: W0910 23:54:12.688256 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.688632 kubelet[2743]: E0910 23:54:12.688269 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.688632 kubelet[2743]: I0910 23:54:12.688288 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/33c72a96-49c1-4e41-a8b3-15ab5f93e7db-kubelet-dir\") pod \"csi-node-driver-px2tb\" (UID: \"33c72a96-49c1-4e41-a8b3-15ab5f93e7db\") " pod="calico-system/csi-node-driver-px2tb" Sep 10 23:54:12.689542 kubelet[2743]: E0910 23:54:12.689089 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.689753 kubelet[2743]: W0910 23:54:12.689702 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.689896 kubelet[2743]: E0910 23:54:12.689737 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.690281 kubelet[2743]: E0910 23:54:12.690233 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.690281 kubelet[2743]: W0910 23:54:12.690252 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.690281 kubelet[2743]: E0910 23:54:12.690271 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.690281 kubelet[2743]: I0910 23:54:12.690290 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/33c72a96-49c1-4e41-a8b3-15ab5f93e7db-registration-dir\") pod \"csi-node-driver-px2tb\" (UID: \"33c72a96-49c1-4e41-a8b3-15ab5f93e7db\") " pod="calico-system/csi-node-driver-px2tb" Sep 10 23:54:12.691621 kubelet[2743]: E0910 23:54:12.691519 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.691747 kubelet[2743]: W0910 23:54:12.691692 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.691747 kubelet[2743]: E0910 23:54:12.691730 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.693165 kubelet[2743]: E0910 23:54:12.692463 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.693165 kubelet[2743]: W0910 23:54:12.692490 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.693165 kubelet[2743]: E0910 23:54:12.692512 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.693165 kubelet[2743]: E0910 23:54:12.692952 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.693165 kubelet[2743]: W0910 23:54:12.692965 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.693165 kubelet[2743]: E0910 23:54:12.692977 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.693678 kubelet[2743]: E0910 23:54:12.693626 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.693678 kubelet[2743]: W0910 23:54:12.693643 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.693678 kubelet[2743]: E0910 23:54:12.693655 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.694138 kubelet[2743]: E0910 23:54:12.694103 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.694138 kubelet[2743]: W0910 23:54:12.694125 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.694138 kubelet[2743]: E0910 23:54:12.694140 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.695354 kubelet[2743]: E0910 23:54:12.695305 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.695354 kubelet[2743]: W0910 23:54:12.695328 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.695578 kubelet[2743]: E0910 23:54:12.695343 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.695618 kubelet[2743]: I0910 23:54:12.695581 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/33c72a96-49c1-4e41-a8b3-15ab5f93e7db-socket-dir\") pod \"csi-node-driver-px2tb\" (UID: \"33c72a96-49c1-4e41-a8b3-15ab5f93e7db\") " pod="calico-system/csi-node-driver-px2tb" Sep 10 23:54:12.696172 kubelet[2743]: E0910 23:54:12.696144 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.696172 kubelet[2743]: W0910 23:54:12.696165 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.696388 kubelet[2743]: E0910 23:54:12.696180 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.696508 kubelet[2743]: E0910 23:54:12.696485 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.696508 kubelet[2743]: W0910 23:54:12.696503 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.696570 kubelet[2743]: E0910 23:54:12.696516 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.762148 containerd[1522]: time="2025-09-10T23:54:12.762081636Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-59656fbc59-82nff,Uid:ff29ad7c-0224-457b-949b-6882b7eb9ce7,Namespace:calico-system,Attempt:0,} returns sandbox id \"ea4127d7ab14de6734cd816df4001d3cf6cfed1a62c84315ee47b8d9a238e9ce\"" Sep 10 23:54:12.763757 containerd[1522]: time="2025-09-10T23:54:12.763708513Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\"" Sep 10 23:54:12.791218 containerd[1522]: time="2025-09-10T23:54:12.790768638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5znmh,Uid:fd928deb-7f2c-408c-aa3e-945cc2d73815,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:12.798204 kubelet[2743]: E0910 23:54:12.797310 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.798204 kubelet[2743]: W0910 23:54:12.797336 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.798204 kubelet[2743]: E0910 23:54:12.797373 2743 plugins.go:691] "Error dynamically probing plugins" 
err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.798204 kubelet[2743]: E0910 23:54:12.797549 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.798204 kubelet[2743]: W0910 23:54:12.797557 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.798204 kubelet[2743]: E0910 23:54:12.797566 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.798204 kubelet[2743]: E0910 23:54:12.797729 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.798204 kubelet[2743]: W0910 23:54:12.797737 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.798204 kubelet[2743]: E0910 23:54:12.797745 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.798204 kubelet[2743]: E0910 23:54:12.797893 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.798523 kubelet[2743]: W0910 23:54:12.797900 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.798523 kubelet[2743]: E0910 23:54:12.797907 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.798523 kubelet[2743]: E0910 23:54:12.798074 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.798523 kubelet[2743]: W0910 23:54:12.798081 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.798523 kubelet[2743]: E0910 23:54:12.798090 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.798523 kubelet[2743]: E0910 23:54:12.798289 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.798523 kubelet[2743]: W0910 23:54:12.798298 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.798523 kubelet[2743]: E0910 23:54:12.798307 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.799526 kubelet[2743]: E0910 23:54:12.798680 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.799526 kubelet[2743]: W0910 23:54:12.798697 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.799526 kubelet[2743]: E0910 23:54:12.798708 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.799526 kubelet[2743]: E0910 23:54:12.798850 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.799526 kubelet[2743]: W0910 23:54:12.798857 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.799526 kubelet[2743]: E0910 23:54:12.798864 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.799526 kubelet[2743]: E0910 23:54:12.799056 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.799526 kubelet[2743]: W0910 23:54:12.799065 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.799526 kubelet[2743]: E0910 23:54:12.799075 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.801425 kubelet[2743]: E0910 23:54:12.801287 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.801425 kubelet[2743]: W0910 23:54:12.801311 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.801425 kubelet[2743]: E0910 23:54:12.801331 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.801639 kubelet[2743]: E0910 23:54:12.801626 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.801798 kubelet[2743]: W0910 23:54:12.801709 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.801798 kubelet[2743]: E0910 23:54:12.801756 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.802550 kubelet[2743]: E0910 23:54:12.801915 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.802550 kubelet[2743]: W0910 23:54:12.801928 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.802550 kubelet[2743]: E0910 23:54:12.801953 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.802931 kubelet[2743]: E0910 23:54:12.802710 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.802931 kubelet[2743]: W0910 23:54:12.802723 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.802931 kubelet[2743]: E0910 23:54:12.802766 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.803593 kubelet[2743]: E0910 23:54:12.803368 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.803593 kubelet[2743]: W0910 23:54:12.803386 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.803593 kubelet[2743]: E0910 23:54:12.803421 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.803933 kubelet[2743]: E0910 23:54:12.803916 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.804046 kubelet[2743]: W0910 23:54:12.804012 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.804235 kubelet[2743]: E0910 23:54:12.804121 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.804487 kubelet[2743]: E0910 23:54:12.804376 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.804487 kubelet[2743]: W0910 23:54:12.804392 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.804487 kubelet[2743]: E0910 23:54:12.804418 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.805463 kubelet[2743]: E0910 23:54:12.805443 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.805659 kubelet[2743]: W0910 23:54:12.805552 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.805659 kubelet[2743]: E0910 23:54:12.805601 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.806547 kubelet[2743]: E0910 23:54:12.805810 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.806547 kubelet[2743]: W0910 23:54:12.805826 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.806547 kubelet[2743]: E0910 23:54:12.805855 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.806751 kubelet[2743]: E0910 23:54:12.806730 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.806825 kubelet[2743]: W0910 23:54:12.806811 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.807010 kubelet[2743]: E0910 23:54:12.806933 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.807316 kubelet[2743]: E0910 23:54:12.807278 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.807487 kubelet[2743]: W0910 23:54:12.807395 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.807487 kubelet[2743]: E0910 23:54:12.807441 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.807746 kubelet[2743]: E0910 23:54:12.807637 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.807746 kubelet[2743]: W0910 23:54:12.807649 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.807861 kubelet[2743]: E0910 23:54:12.807831 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.808350 kubelet[2743]: E0910 23:54:12.808115 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.808350 kubelet[2743]: W0910 23:54:12.808138 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.808350 kubelet[2743]: E0910 23:54:12.808167 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.808596 kubelet[2743]: E0910 23:54:12.808579 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.808673 kubelet[2743]: W0910 23:54:12.808658 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.808745 kubelet[2743]: E0910 23:54:12.808734 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.809223 kubelet[2743]: E0910 23:54:12.809181 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.809271 kubelet[2743]: W0910 23:54:12.809215 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.809271 kubelet[2743]: E0910 23:54:12.809269 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.810563 kubelet[2743]: E0910 23:54:12.810533 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.810563 kubelet[2743]: W0910 23:54:12.810562 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.810661 kubelet[2743]: E0910 23:54:12.810578 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:12.824602 containerd[1522]: time="2025-09-10T23:54:12.824467650Z" level=info msg="connecting to shim 1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7" address="unix:///run/containerd/s/db02a93a31ac4548d6e85c6ba41035123e8c987f22f031884b2ef4037d8ef741" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:12.837323 kubelet[2743]: E0910 23:54:12.837285 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:12.837323 kubelet[2743]: W0910 23:54:12.837313 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:12.837473 kubelet[2743]: E0910 23:54:12.837337 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:12.880145 systemd[1]: Started cri-containerd-1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7.scope - libcontainer container 1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7. Sep 10 23:54:12.947467 containerd[1522]: time="2025-09-10T23:54:12.947419857Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5znmh,Uid:fd928deb-7f2c-408c-aa3e-945cc2d73815,Namespace:calico-system,Attempt:0,} returns sandbox id \"1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7\"" Sep 10 23:54:14.095863 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount748137413.mount: Deactivated successfully. 
Sep 10 23:54:14.334461 kubelet[2743]: E0910 23:54:14.333351 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-px2tb" podUID="33c72a96-49c1-4e41-a8b3-15ab5f93e7db" Sep 10 23:54:15.272942 containerd[1522]: time="2025-09-10T23:54:15.272893535Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:15.274052 containerd[1522]: time="2025-09-10T23:54:15.274016901Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.3: active requests=0, bytes read=33105775" Sep 10 23:54:15.275488 containerd[1522]: time="2025-09-10T23:54:15.275453822Z" level=info msg="ImageCreate event name:\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:15.279378 containerd[1522]: time="2025-09-10T23:54:15.279343620Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:15.280468 containerd[1522]: time="2025-09-10T23:54:15.280436982Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.3\" with image id \"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f4a3d61ffda9c98a53adeb412c5af404ca3727a3cc2d0b4ef28d197bdd47ecaa\", size \"33105629\" in 2.516689465s" Sep 10 23:54:15.280515 containerd[1522]: time="2025-09-10T23:54:15.280477067Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.3\" returns image reference 
\"sha256:6a1496fdc48cc0b9ab3c10aef777497484efac5df9efbfbbdf9775e9583645cb\"" Sep 10 23:54:15.282118 containerd[1522]: time="2025-09-10T23:54:15.282085848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\"" Sep 10 23:54:15.295845 containerd[1522]: time="2025-09-10T23:54:15.295805989Z" level=info msg="CreateContainer within sandbox \"ea4127d7ab14de6734cd816df4001d3cf6cfed1a62c84315ee47b8d9a238e9ce\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Sep 10 23:54:15.304473 containerd[1522]: time="2025-09-10T23:54:15.303159935Z" level=info msg="Container 01761cbc82e880a9b63c671a2df1247797d52fb5aba6952267b35c21d58320d7: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:15.311876 containerd[1522]: time="2025-09-10T23:54:15.311800306Z" level=info msg="CreateContainer within sandbox \"ea4127d7ab14de6734cd816df4001d3cf6cfed1a62c84315ee47b8d9a238e9ce\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"01761cbc82e880a9b63c671a2df1247797d52fb5aba6952267b35c21d58320d7\"" Sep 10 23:54:15.312485 containerd[1522]: time="2025-09-10T23:54:15.312459340Z" level=info msg="StartContainer for \"01761cbc82e880a9b63c671a2df1247797d52fb5aba6952267b35c21d58320d7\"" Sep 10 23:54:15.314706 containerd[1522]: time="2025-09-10T23:54:15.314648346Z" level=info msg="connecting to shim 01761cbc82e880a9b63c671a2df1247797d52fb5aba6952267b35c21d58320d7" address="unix:///run/containerd/s/d5e3d2cc18ae543cdb686d0c8b3483d49887f227336875643dcbc2a0057cffb4" protocol=ttrpc version=3 Sep 10 23:54:15.345444 systemd[1]: Started cri-containerd-01761cbc82e880a9b63c671a2df1247797d52fb5aba6952267b35c21d58320d7.scope - libcontainer container 01761cbc82e880a9b63c671a2df1247797d52fb5aba6952267b35c21d58320d7. 
Sep 10 23:54:15.402776 containerd[1522]: time="2025-09-10T23:54:15.402739684Z" level=info msg="StartContainer for \"01761cbc82e880a9b63c671a2df1247797d52fb5aba6952267b35c21d58320d7\" returns successfully"
Sep 10 23:54:15.509854 kubelet[2743]: E0910 23:54:15.509587 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.511258 kubelet[2743]: W0910 23:54:15.509798 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.511258 kubelet[2743]: E0910 23:54:15.511258 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.512491 kubelet[2743]: E0910 23:54:15.512441 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.512491 kubelet[2743]: W0910 23:54:15.512477 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.512491 kubelet[2743]: E0910 23:54:15.512494 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.512706 kubelet[2743]: E0910 23:54:15.512674 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.512706 kubelet[2743]: W0910 23:54:15.512681 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.512824 kubelet[2743]: E0910 23:54:15.512712 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.512888 kubelet[2743]: E0910 23:54:15.512836 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.512888 kubelet[2743]: W0910 23:54:15.512845 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.512888 kubelet[2743]: E0910 23:54:15.512875 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.513034 kubelet[2743]: E0910 23:54:15.513007 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.513034 kubelet[2743]: W0910 23:54:15.513032 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.513131 kubelet[2743]: E0910 23:54:15.513040 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.513769 kubelet[2743]: E0910 23:54:15.513359 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.513769 kubelet[2743]: W0910 23:54:15.513371 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.513769 kubelet[2743]: E0910 23:54:15.513381 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.513769 kubelet[2743]: E0910 23:54:15.513550 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.513769 kubelet[2743]: W0910 23:54:15.513558 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.513769 kubelet[2743]: E0910 23:54:15.513566 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.513769 kubelet[2743]: E0910 23:54:15.513726 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.513769 kubelet[2743]: W0910 23:54:15.513754 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.513769 kubelet[2743]: E0910 23:54:15.513764 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.515649 kubelet[2743]: E0910 23:54:15.513890 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.515649 kubelet[2743]: W0910 23:54:15.513897 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.515649 kubelet[2743]: E0910 23:54:15.513905 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.515649 kubelet[2743]: E0910 23:54:15.514143 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.515649 kubelet[2743]: W0910 23:54:15.514150 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.515649 kubelet[2743]: E0910 23:54:15.514157 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.515649 kubelet[2743]: E0910 23:54:15.514306 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.515649 kubelet[2743]: W0910 23:54:15.514315 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.515649 kubelet[2743]: E0910 23:54:15.514324 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.515649 kubelet[2743]: E0910 23:54:15.514727 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.516517 kubelet[2743]: W0910 23:54:15.514741 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.516517 kubelet[2743]: E0910 23:54:15.514752 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.516517 kubelet[2743]: E0910 23:54:15.516031 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.516517 kubelet[2743]: W0910 23:54:15.516049 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.516517 kubelet[2743]: E0910 23:54:15.516165 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.516925 kubelet[2743]: E0910 23:54:15.516895 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.516925 kubelet[2743]: W0910 23:54:15.516921 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.516999 kubelet[2743]: E0910 23:54:15.516934 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.517607 kubelet[2743]: E0910 23:54:15.517571 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.517607 kubelet[2743]: W0910 23:54:15.517592 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.517607 kubelet[2743]: E0910 23:54:15.517605 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.519920 kubelet[2743]: E0910 23:54:15.519866 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.519920 kubelet[2743]: W0910 23:54:15.519885 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.519920 kubelet[2743]: E0910 23:54:15.519899 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.520910 kubelet[2743]: E0910 23:54:15.520894 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.521076 kubelet[2743]: W0910 23:54:15.520988 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.521076 kubelet[2743]: E0910 23:54:15.521016 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.521505 kubelet[2743]: E0910 23:54:15.521369 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.521505 kubelet[2743]: W0910 23:54:15.521383 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.521505 kubelet[2743]: E0910 23:54:15.521399 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.521654 kubelet[2743]: E0910 23:54:15.521644 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.521715 kubelet[2743]: W0910 23:54:15.521704 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.521772 kubelet[2743]: E0910 23:54:15.521763 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.522251 kubelet[2743]: E0910 23:54:15.522228 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.522251 kubelet[2743]: W0910 23:54:15.522248 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.522568 kubelet[2743]: E0910 23:54:15.522540 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.523278 kubelet[2743]: E0910 23:54:15.522803 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.523278 kubelet[2743]: W0910 23:54:15.522817 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.523278 kubelet[2743]: E0910 23:54:15.522838 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.524162 kubelet[2743]: E0910 23:54:15.524132 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.524162 kubelet[2743]: W0910 23:54:15.524158 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.524290 kubelet[2743]: E0910 23:54:15.524180 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.524765 kubelet[2743]: E0910 23:54:15.524744 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.525044 kubelet[2743]: W0910 23:54:15.524761 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.525519 kubelet[2743]: E0910 23:54:15.525445 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.526133 kubelet[2743]: E0910 23:54:15.526107 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.526133 kubelet[2743]: W0910 23:54:15.526127 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.526286 kubelet[2743]: E0910 23:54:15.526215 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.527437 kubelet[2743]: E0910 23:54:15.527401 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.527437 kubelet[2743]: W0910 23:54:15.527433 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.527793 kubelet[2743]: E0910 23:54:15.527751 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.528171 kubelet[2743]: E0910 23:54:15.528131 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.528391 kubelet[2743]: W0910 23:54:15.528149 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.529053 kubelet[2743]: E0910 23:54:15.528555 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.529746 kubelet[2743]: E0910 23:54:15.529604 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.529746 kubelet[2743]: W0910 23:54:15.529742 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.529984 kubelet[2743]: E0910 23:54:15.529853 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.530519 kubelet[2743]: E0910 23:54:15.530489 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.530519 kubelet[2743]: W0910 23:54:15.530513 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.530717 kubelet[2743]: E0910 23:54:15.530683 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.530846 kubelet[2743]: E0910 23:54:15.530828 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.530846 kubelet[2743]: W0910 23:54:15.530843 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.530978 kubelet[2743]: E0910 23:54:15.530930 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.531429 kubelet[2743]: E0910 23:54:15.531407 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.531429 kubelet[2743]: W0910 23:54:15.531425 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.531824 kubelet[2743]: E0910 23:54:15.531455 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.532409 kubelet[2743]: E0910 23:54:15.532384 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.532409 kubelet[2743]: W0910 23:54:15.532402 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.532570 kubelet[2743]: E0910 23:54:15.532503 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.533404 kubelet[2743]: E0910 23:54:15.533373 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.533404 kubelet[2743]: W0910 23:54:15.533397 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.533486 kubelet[2743]: E0910 23:54:15.533413 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.534211 kubelet[2743]: E0910 23:54:15.534160 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:15.534372 kubelet[2743]: W0910 23:54:15.534347 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:15.534408 kubelet[2743]: E0910 23:54:15.534383 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:15.883924 kubelet[2743]: I0910 23:54:15.883359 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-59656fbc59-82nff" podStartSLOduration=1.365580654 podStartE2EDuration="3.883328801s" podCreationTimestamp="2025-09-10 23:54:12 +0000 UTC" firstStartedPulling="2025-09-10 23:54:12.763402916 +0000 UTC m=+22.553468052" lastFinishedPulling="2025-09-10 23:54:15.281151023 +0000 UTC m=+25.071216199" observedRunningTime="2025-09-10 23:54:15.490175708 +0000 UTC m=+25.280240884" watchObservedRunningTime="2025-09-10 23:54:15.883328801 +0000 UTC m=+25.673393977"
Sep 10 23:54:16.334589 kubelet[2743]: E0910 23:54:16.334523 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-px2tb" podUID="33c72a96-49c1-4e41-a8b3-15ab5f93e7db"
Sep 10 23:54:16.525969 kubelet[2743]: E0910 23:54:16.525935 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.527144 kubelet[2743]: W0910 23:54:16.526349 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.527144 kubelet[2743]: E0910 23:54:16.526385 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.528951 kubelet[2743]: E0910 23:54:16.528933 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.528951 kubelet[2743]: W0910 23:54:16.528950 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.529261 kubelet[2743]: E0910 23:54:16.528967 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.529483 kubelet[2743]: E0910 23:54:16.529464 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.529483 kubelet[2743]: W0910 23:54:16.529480 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.529571 kubelet[2743]: E0910 23:54:16.529491 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.529942 kubelet[2743]: E0910 23:54:16.529925 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.529942 kubelet[2743]: W0910 23:54:16.529939 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.530306 kubelet[2743]: E0910 23:54:16.529950 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.530446 kubelet[2743]: E0910 23:54:16.530393 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.531743 kubelet[2743]: W0910 23:54:16.531257 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.531743 kubelet[2743]: E0910 23:54:16.531285 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.531743 kubelet[2743]: E0910 23:54:16.531482 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.531743 kubelet[2743]: W0910 23:54:16.531492 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.531743 kubelet[2743]: E0910 23:54:16.531501 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.531743 kubelet[2743]: E0910 23:54:16.531633 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.531743 kubelet[2743]: W0910 23:54:16.531641 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.531743 kubelet[2743]: E0910 23:54:16.531649 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.532207 kubelet[2743]: E0910 23:54:16.532015 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.532207 kubelet[2743]: W0910 23:54:16.532032 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.532207 kubelet[2743]: E0910 23:54:16.532043 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.532385 kubelet[2743]: E0910 23:54:16.532361 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.532549 kubelet[2743]: W0910 23:54:16.532499 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.532549 kubelet[2743]: E0910 23:54:16.532519 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.532952 kubelet[2743]: E0910 23:54:16.532854 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.532952 kubelet[2743]: W0910 23:54:16.532881 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.532952 kubelet[2743]: E0910 23:54:16.532892 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.533434 kubelet[2743]: E0910 23:54:16.533294 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.533434 kubelet[2743]: W0910 23:54:16.533309 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.533434 kubelet[2743]: E0910 23:54:16.533327 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.533924 kubelet[2743]: E0910 23:54:16.533772 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.533924 kubelet[2743]: W0910 23:54:16.533792 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.533924 kubelet[2743]: E0910 23:54:16.533805 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.534179 kubelet[2743]: E0910 23:54:16.534148 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.534339 kubelet[2743]: W0910 23:54:16.534255 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.534339 kubelet[2743]: E0910 23:54:16.534276 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.534687 kubelet[2743]: E0910 23:54:16.534672 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.534902 kubelet[2743]: W0910 23:54:16.534746 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.534902 kubelet[2743]: E0910 23:54:16.534764 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.535374 kubelet[2743]: E0910 23:54:16.535029 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.535374 kubelet[2743]: W0910 23:54:16.535042 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.535374 kubelet[2743]: E0910 23:54:16.535052 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.535765 kubelet[2743]: E0910 23:54:16.535748 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.535931 kubelet[2743]: W0910 23:54:16.535913 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.536007 kubelet[2743]: E0910 23:54:16.535996 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.536636 kubelet[2743]: E0910 23:54:16.536616 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.536842 kubelet[2743]: W0910 23:54:16.536717 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.536842 kubelet[2743]: E0910 23:54:16.536749 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.537038 kubelet[2743]: E0910 23:54:16.537025 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.537280 kubelet[2743]: W0910 23:54:16.537095 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.537280 kubelet[2743]: E0910 23:54:16.537117 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.537624 kubelet[2743]: E0910 23:54:16.537522 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.537624 kubelet[2743]: W0910 23:54:16.537539 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.537624 kubelet[2743]: E0910 23:54:16.537569 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.537908 kubelet[2743]: E0910 23:54:16.537894 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.538053 kubelet[2743]: W0910 23:54:16.537969 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.538090 kubelet[2743]: E0910 23:54:16.538055 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Sep 10 23:54:16.538507 kubelet[2743]: E0910 23:54:16.538350 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Sep 10 23:54:16.538507 kubelet[2743]: W0910 23:54:16.538365 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Sep 10 23:54:16.538507 kubelet[2743]: E0910 23:54:16.538452 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:16.538781 kubelet[2743]: E0910 23:54:16.538722 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.538851 kubelet[2743]: W0910 23:54:16.538839 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.539039 kubelet[2743]: E0910 23:54:16.538944 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:16.539318 kubelet[2743]: E0910 23:54:16.539275 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.539426 kubelet[2743]: W0910 23:54:16.539410 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.539685 kubelet[2743]: E0910 23:54:16.539555 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:16.539946 kubelet[2743]: E0910 23:54:16.539924 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.540059 kubelet[2743]: W0910 23:54:16.540044 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.540242 kubelet[2743]: E0910 23:54:16.540216 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:16.540849 kubelet[2743]: E0910 23:54:16.540831 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.541010 kubelet[2743]: W0910 23:54:16.540910 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.541010 kubelet[2743]: E0910 23:54:16.540943 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:16.541354 kubelet[2743]: E0910 23:54:16.541338 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.541488 kubelet[2743]: W0910 23:54:16.541429 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.541609 kubelet[2743]: E0910 23:54:16.541564 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:16.541981 kubelet[2743]: E0910 23:54:16.541963 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.542135 kubelet[2743]: W0910 23:54:16.542042 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.542179 kubelet[2743]: E0910 23:54:16.542132 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:16.542610 kubelet[2743]: E0910 23:54:16.542493 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.542610 kubelet[2743]: W0910 23:54:16.542518 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.542610 kubelet[2743]: E0910 23:54:16.542544 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:16.542854 kubelet[2743]: E0910 23:54:16.542839 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.543050 kubelet[2743]: W0910 23:54:16.542903 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.543050 kubelet[2743]: E0910 23:54:16.542927 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:16.543513 kubelet[2743]: E0910 23:54:16.543459 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.544020 kubelet[2743]: W0910 23:54:16.543776 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.544020 kubelet[2743]: E0910 23:54:16.543810 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:16.544642 kubelet[2743]: E0910 23:54:16.544495 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.544642 kubelet[2743]: W0910 23:54:16.544518 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.544642 kubelet[2743]: E0910 23:54:16.544538 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:16.545004 kubelet[2743]: E0910 23:54:16.544986 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.545176 kubelet[2743]: W0910 23:54:16.545063 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.545176 kubelet[2743]: E0910 23:54:16.545094 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Sep 10 23:54:16.545690 kubelet[2743]: E0910 23:54:16.545634 2743 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Sep 10 23:54:16.545690 kubelet[2743]: W0910 23:54:16.545649 2743 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Sep 10 23:54:16.545690 kubelet[2743]: E0910 23:54:16.545661 2743 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Sep 10 23:54:16.588876 containerd[1522]: time="2025-09-10T23:54:16.588699458Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:16.590955 containerd[1522]: time="2025-09-10T23:54:16.590887738Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3: active requests=0, bytes read=4266814" Sep 10 23:54:16.592351 containerd[1522]: time="2025-09-10T23:54:16.591611297Z" level=info msg="ImageCreate event name:\"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:16.595547 containerd[1522]: time="2025-09-10T23:54:16.595499644Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:16.596042 containerd[1522]: time="2025-09-10T23:54:16.596006339Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" with image id \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:81bdfcd9dbd36624dc35354e8c181c75631ba40e6c7df5820f5f56cea36f0ef9\", size \"5636015\" in 1.313891248s" Sep 10 23:54:16.596084 containerd[1522]: time="2025-09-10T23:54:16.596042463Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.3\" returns image reference \"sha256:29e6f31ad72882b1b817dd257df6b7981e4d7d31d872b7fe2cf102c6e2af27a5\"" Sep 10 23:54:16.600016 containerd[1522]: time="2025-09-10T23:54:16.599975775Z" level=info msg="CreateContainer within sandbox \"1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Sep 10 23:54:16.610899 containerd[1522]: time="2025-09-10T23:54:16.610861889Z" level=info msg="Container 212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:16.619849 containerd[1522]: time="2025-09-10T23:54:16.619796029Z" level=info msg="CreateContainer within sandbox \"1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85\"" Sep 10 23:54:16.620694 containerd[1522]: time="2025-09-10T23:54:16.620630921Z" level=info msg="StartContainer for \"212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85\"" Sep 10 23:54:16.624881 containerd[1522]: time="2025-09-10T23:54:16.624813900Z" level=info msg="connecting to shim 212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85" address="unix:///run/containerd/s/db02a93a31ac4548d6e85c6ba41035123e8c987f22f031884b2ef4037d8ef741" protocol=ttrpc version=3 Sep 10 23:54:16.654408 systemd[1]: Started cri-containerd-212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85.scope - libcontainer container 212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85. Sep 10 23:54:16.728014 containerd[1522]: time="2025-09-10T23:54:16.727795918Z" level=info msg="StartContainer for \"212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85\" returns successfully" Sep 10 23:54:16.751621 systemd[1]: cri-containerd-212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85.scope: Deactivated successfully. 
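The dozens of near-identical kubelet entries above are one failure repeated by the FlexVolume prober: the plugin directory `nodeagent~uds` exists, but its `uds` executable is not found in `$PATH`, so every `init` call returns empty output and the JSON unmarshal fails. As a sketch (not part of the log), lines in this klog format can be collapsed into a per-source summary; the regex below assumes the `E0910 23:54:16.534672 2743 driver-call.go:262]` shape seen here:

```python
import re
from collections import Counter

# Matches klog-style kubelet prefixes as they appear in this log, e.g.
#   E0910 23:54:16.534672 2743 driver-call.go:262] Failed to unmarshal ...
KLOG = re.compile(r"([EWI])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+\d+\s+([\w.-]+:\d+)\]")

def summarize(lines):
    """Count repeated kubelet messages by (severity, source file:line)."""
    counts = Counter()
    for line in lines:
        m = KLOG.search(line)
        if m:
            severity, _date, _time, location = m.groups()
            counts[(severity, location)] += 1
    return counts

# Sample prefixes copied from the entries above.
sample = [
    "E0910 23:54:16.534672 2743 driver-call.go:262] Failed to unmarshal output",
    "W0910 23:54:16.534746 2743 driver-call.go:149] FlexVolume: driver call failed",
    "E0910 23:54:16.535029 2743 driver-call.go:262] Failed to unmarshal output",
]
print(summarize(sample))
```

Grouping by `file:line` rather than full message text keeps the summary stable even though each entry carries a unique timestamp.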
Sep 10 23:54:16.758040 containerd[1522]: time="2025-09-10T23:54:16.757900300Z" level=info msg="received exit event container_id:\"212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85\" id:\"212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85\" pid:3501 exited_at:{seconds:1757548456 nanos:756989400}" Sep 10 23:54:16.758919 containerd[1522]: time="2025-09-10T23:54:16.758879128Z" level=info msg="TaskExit event in podsandbox handler container_id:\"212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85\" id:\"212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85\" pid:3501 exited_at:{seconds:1757548456 nanos:756989400}" Sep 10 23:54:16.788961 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-212957758231af5748dca0d88f9a528b799a300499472de65db6f5a3dddcde85-rootfs.mount: Deactivated successfully. Sep 10 23:54:17.482428 containerd[1522]: time="2025-09-10T23:54:17.482377150Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\"" Sep 10 23:54:18.334770 kubelet[2743]: E0910 23:54:18.334055 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-px2tb" podUID="33c72a96-49c1-4e41-a8b3-15ab5f93e7db" Sep 10 23:54:19.809980 containerd[1522]: time="2025-09-10T23:54:19.809120107Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:19.809980 containerd[1522]: time="2025-09-10T23:54:19.809941151Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.3: active requests=0, bytes read=65913477" Sep 10 23:54:19.811046 containerd[1522]: time="2025-09-10T23:54:19.811012581Z" level=info msg="ImageCreate event name:\"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:19.813583 containerd[1522]: time="2025-09-10T23:54:19.813530560Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:19.814758 containerd[1522]: time="2025-09-10T23:54:19.814703120Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.3\" with image id \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87\", size \"67282718\" in 2.332202877s" Sep 10 23:54:19.814907 containerd[1522]: time="2025-09-10T23:54:19.814885539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.3\" returns image reference \"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23\"" Sep 10 23:54:19.820672 containerd[1522]: time="2025-09-10T23:54:19.820626129Z" level=info msg="CreateContainer within sandbox \"1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Sep 10 23:54:19.831986 containerd[1522]: time="2025-09-10T23:54:19.831930970Z" level=info msg="Container 1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:19.843143 containerd[1522]: time="2025-09-10T23:54:19.843084916Z" level=info msg="CreateContainer within sandbox \"1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65\"" Sep 10 23:54:19.844665 containerd[1522]: time="2025-09-10T23:54:19.844587030Z" level=info msg="StartContainer for 
\"1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65\"" Sep 10 23:54:19.847140 containerd[1522]: time="2025-09-10T23:54:19.846671964Z" level=info msg="connecting to shim 1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65" address="unix:///run/containerd/s/db02a93a31ac4548d6e85c6ba41035123e8c987f22f031884b2ef4037d8ef741" protocol=ttrpc version=3 Sep 10 23:54:19.876434 systemd[1]: Started cri-containerd-1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65.scope - libcontainer container 1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65. Sep 10 23:54:19.925007 containerd[1522]: time="2025-09-10T23:54:19.924956045Z" level=info msg="StartContainer for \"1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65\" returns successfully" Sep 10 23:54:20.334162 kubelet[2743]: E0910 23:54:20.334113 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-px2tb" podUID="33c72a96-49c1-4e41-a8b3-15ab5f93e7db" Sep 10 23:54:20.498408 systemd[1]: cri-containerd-1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65.scope: Deactivated successfully. Sep 10 23:54:20.499124 systemd[1]: cri-containerd-1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65.scope: Consumed 498ms CPU time, 196.1M memory peak, 165.8M written to disk. 
Sep 10 23:54:20.504638 containerd[1522]: time="2025-09-10T23:54:20.504594553Z" level=info msg="received exit event container_id:\"1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65\" id:\"1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65\" pid:3561 exited_at:{seconds:1757548460 nanos:503762029}" Sep 10 23:54:20.504778 containerd[1522]: time="2025-09-10T23:54:20.504756609Z" level=info msg="TaskExit event in podsandbox handler container_id:\"1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65\" id:\"1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65\" pid:3561 exited_at:{seconds:1757548460 nanos:503762029}" Sep 10 23:54:20.512322 kubelet[2743]: I0910 23:54:20.512247 2743 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Sep 10 23:54:20.553708 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1c753c17223361097206a096397bb0e6a6249fc202ac70472503d109faf68b65-rootfs.mount: Deactivated successfully. Sep 10 23:54:20.590619 systemd[1]: Created slice kubepods-besteffort-poda43d71ea_8b5a_4dec_a972_0788b2e2b853.slice - libcontainer container kubepods-besteffort-poda43d71ea_8b5a_4dec_a972_0788b2e2b853.slice. 
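containerd reports each completed pull above with the image reference, resolved size, and wall-clock duration ("... size \"67282718\" in 2.332202877s"). A small sketch for extracting those fields, assuming the unescaped message format shown in these `PullImage` entries:

```python
import re

# Matches containerd's pull-completion message as it appears in this log.
PULLED = re.compile(
    r'Pulled image "(?P<image>[^"]+)".*size "(?P<size>\d+)" in (?P<secs>[\d.]+)s'
)

def parse_pull(msg):
    """Return (image ref, size in bytes, duration in seconds), or None."""
    m = PULLED.search(msg)
    if not m:
        return None
    return m.group("image"), int(m.group("size")), float(m.group("secs"))

# Message content copied from the calico/cni pull entry above.
msg = ('Pulled image "ghcr.io/flatcar/calico/cni:v3.30.3" with image id '
      '"sha256:7077a1dc632ee598cbfa626f9e3c9bca5b20c0d1e1e557995890125b2e8d2e23", '
      'repo tag "ghcr.io/flatcar/calico/cni:v3.30.3", repo digest '
      '"ghcr.io/flatcar/calico/cni@sha256:73d1e391050490d54e5bee8ff2b1a50a8be1746c98dc530361b00e8c0ab63f87", '
      'size "67282718" in 2.332202877s')
print(parse_pull(msg))
```

Collecting these tuples across a boot makes it easy to see which pulls dominate startup time (here, the ~66 MB CNI image took roughly 2.3 s versus 1.3 s for the flexvol image).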
Sep 10 23:54:20.598001 kubelet[2743]: W0910 23:54:20.596791 2743 reflector.go:561] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ci-4372-1-0-n-c06092ab73" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object Sep 10 23:54:20.598001 kubelet[2743]: E0910 23:54:20.596847 2743 reflector.go:158] "Unhandled Error" err="object-\"kube-system\"/\"coredns\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ci-4372-1-0-n-c06092ab73\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object" logger="UnhandledError" Sep 10 23:54:20.606429 systemd[1]: Created slice kubepods-burstable-podde82da2e_d798_4452_8450_fbfaa1f65b47.slice - libcontainer container kubepods-burstable-podde82da2e_d798_4452_8450_fbfaa1f65b47.slice. 
Sep 10 23:54:20.622054 kubelet[2743]: W0910 23:54:20.622007 2743 reflector.go:561] object-"calico-system"/"goldmane-key-pair": failed to list *v1.Secret: secrets "goldmane-key-pair" is forbidden: User "system:node:ci-4372-1-0-n-c06092ab73" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object Sep 10 23:54:20.624634 kubelet[2743]: E0910 23:54:20.624235 2743 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-key-pair\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"goldmane-key-pair\" is forbidden: User \"system:node:ci-4372-1-0-n-c06092ab73\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object" logger="UnhandledError" Sep 10 23:54:20.624788 kubelet[2743]: W0910 23:54:20.622511 2743 reflector.go:561] object-"calico-system"/"goldmane-ca-bundle": failed to list *v1.ConfigMap: configmaps "goldmane-ca-bundle" is forbidden: User "system:node:ci-4372-1-0-n-c06092ab73" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object Sep 10 23:54:20.626263 kubelet[2743]: E0910 23:54:20.626236 2743 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"goldmane-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane-ca-bundle\" is forbidden: User \"system:node:ci-4372-1-0-n-c06092ab73\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object" logger="UnhandledError" Sep 10 23:54:20.626382 kubelet[2743]: W0910 23:54:20.622551 2743 reflector.go:561] object-"calico-system"/"goldmane": failed to list *v1.ConfigMap: configmaps "goldmane" is forbidden: User 
"system:node:ci-4372-1-0-n-c06092ab73" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object Sep 10 23:54:20.626462 kubelet[2743]: E0910 23:54:20.626446 2743 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"goldmane\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"goldmane\" is forbidden: User \"system:node:ci-4372-1-0-n-c06092ab73\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object" logger="UnhandledError" Sep 10 23:54:20.634041 systemd[1]: Created slice kubepods-burstable-podb46d3443_7658_4ea6_8bfe_1899b5945225.slice - libcontainer container kubepods-burstable-podb46d3443_7658_4ea6_8bfe_1899b5945225.slice. Sep 10 23:54:20.647237 systemd[1]: Created slice kubepods-besteffort-pod5af1d038_1e6a_4009_a16f_7e6eca2aca84.slice - libcontainer container kubepods-besteffort-pod5af1d038_1e6a_4009_a16f_7e6eca2aca84.slice. Sep 10 23:54:20.656556 systemd[1]: Created slice kubepods-besteffort-pod974afa17_d878_4288_8c80_49d63baee70d.slice - libcontainer container kubepods-besteffort-pod974afa17_d878_4288_8c80_49d63baee70d.slice. Sep 10 23:54:20.664305 systemd[1]: Created slice kubepods-besteffort-pod8f5dde21_0592_4365_967a_d0767c076647.slice - libcontainer container kubepods-besteffort-pod8f5dde21_0592_4365_967a_d0767c076647.slice. 
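The reflector warnings above ("no relationship found between node '...' and this object") come from the Kubernetes node authorizer: a kubelet may only read Secrets and ConfigMaps referenced by pods already bound to its node, so these list calls fail transiently while the goldmane and coredns pods are still being scheduled. A hedged sketch that pulls the denied object out of such a message, assuming the format shown in this log:

```python
import re

# Matches the node-authorizer denial embedded in the reflector warnings, e.g.
#   configmaps "coredns" is forbidden: User "system:node:..." cannot list ...
DENIED = re.compile(
    r'(?P<resource>\w+) "(?P<name>[^"]+)" is forbidden: '
    r'User "(?P<user>[^"]+)" cannot list resource "[^"]*" in API group "[^"]*" '
    r'in the namespace "(?P<namespace>[^"]+)"'
)

def parse_denial(msg):
    """Extract resource/name/user/namespace from a denial message, or None."""
    m = DENIED.search(msg)
    return m.groupdict() if m else None

# Message content copied from the coredns reflector warning above.
msg = ('failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: '
      'User "system:node:ci-4372-1-0-n-c06092ab73" cannot list resource '
      '"configmaps" in API group "" in the namespace "kube-system": '
      "no relationship found between node 'ci-4372-1-0-n-c06092ab73' and this object")
print(parse_denial(msg))
```

If denials like these persist after the pods are running, that usually points at a real RBAC or scheduling problem rather than this expected startup race.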
Sep 10 23:54:20.669113 kubelet[2743]: I0910 23:54:20.669065 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf2bq\" (UniqueName: \"kubernetes.io/projected/0dd1dd97-6eb7-401b-871e-a9aed197c21b-kube-api-access-qf2bq\") pod \"goldmane-7988f88666-hswkn\" (UID: \"0dd1dd97-6eb7-401b-871e-a9aed197c21b\") " pod="calico-system/goldmane-7988f88666-hswkn" Sep 10 23:54:20.669567 kubelet[2743]: I0910 23:54:20.669512 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzcc\" (UniqueName: \"kubernetes.io/projected/8f5dde21-0592-4365-967a-d0767c076647-kube-api-access-mgzcc\") pod \"calico-apiserver-cd5495d6f-tcd74\" (UID: \"8f5dde21-0592-4365-967a-d0767c076647\") " pod="calico-apiserver/calico-apiserver-cd5495d6f-tcd74" Sep 10 23:54:20.671071 kubelet[2743]: I0910 23:54:20.670246 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4rfj\" (UniqueName: \"kubernetes.io/projected/a43d71ea-8b5a-4dec-a972-0788b2e2b853-kube-api-access-t4rfj\") pod \"calico-apiserver-cd5495d6f-mbcq8\" (UID: \"a43d71ea-8b5a-4dec-a972-0788b2e2b853\") " pod="calico-apiserver/calico-apiserver-cd5495d6f-mbcq8" Sep 10 23:54:20.671071 kubelet[2743]: I0910 23:54:20.670394 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0dd1dd97-6eb7-401b-871e-a9aed197c21b-config\") pod \"goldmane-7988f88666-hswkn\" (UID: \"0dd1dd97-6eb7-401b-871e-a9aed197c21b\") " pod="calico-system/goldmane-7988f88666-hswkn" Sep 10 23:54:20.671071 kubelet[2743]: I0910 23:54:20.670418 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8f5dde21-0592-4365-967a-d0767c076647-calico-apiserver-certs\") pod \"calico-apiserver-cd5495d6f-tcd74\" 
(UID: \"8f5dde21-0592-4365-967a-d0767c076647\") " pod="calico-apiserver/calico-apiserver-cd5495d6f-tcd74" Sep 10 23:54:20.671071 kubelet[2743]: I0910 23:54:20.670543 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/0dd1dd97-6eb7-401b-871e-a9aed197c21b-goldmane-key-pair\") pod \"goldmane-7988f88666-hswkn\" (UID: \"0dd1dd97-6eb7-401b-871e-a9aed197c21b\") " pod="calico-system/goldmane-7988f88666-hswkn" Sep 10 23:54:20.671071 kubelet[2743]: I0910 23:54:20.670565 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk7cr\" (UniqueName: \"kubernetes.io/projected/de82da2e-d798-4452-8450-fbfaa1f65b47-kube-api-access-zk7cr\") pod \"coredns-7c65d6cfc9-pw9tk\" (UID: \"de82da2e-d798-4452-8450-fbfaa1f65b47\") " pod="kube-system/coredns-7c65d6cfc9-pw9tk" Sep 10 23:54:20.671751 kubelet[2743]: I0910 23:54:20.670587 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0dd1dd97-6eb7-401b-871e-a9aed197c21b-goldmane-ca-bundle\") pod \"goldmane-7988f88666-hswkn\" (UID: \"0dd1dd97-6eb7-401b-871e-a9aed197c21b\") " pod="calico-system/goldmane-7988f88666-hswkn" Sep 10 23:54:20.671751 kubelet[2743]: I0910 23:54:20.671722 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a43d71ea-8b5a-4dec-a972-0788b2e2b853-calico-apiserver-certs\") pod \"calico-apiserver-cd5495d6f-mbcq8\" (UID: \"a43d71ea-8b5a-4dec-a972-0788b2e2b853\") " pod="calico-apiserver/calico-apiserver-cd5495d6f-mbcq8" Sep 10 23:54:20.672034 kubelet[2743]: I0910 23:54:20.671952 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/de82da2e-d798-4452-8450-fbfaa1f65b47-config-volume\") pod \"coredns-7c65d6cfc9-pw9tk\" (UID: \"de82da2e-d798-4452-8450-fbfaa1f65b47\") " pod="kube-system/coredns-7c65d6cfc9-pw9tk" Sep 10 23:54:20.677666 systemd[1]: Created slice kubepods-besteffort-pod0dd1dd97_6eb7_401b_871e_a9aed197c21b.slice - libcontainer container kubepods-besteffort-pod0dd1dd97_6eb7_401b_871e_a9aed197c21b.slice. Sep 10 23:54:20.773884 kubelet[2743]: I0910 23:54:20.773833 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5af1d038-1e6a-4009-a16f-7e6eca2aca84-tigera-ca-bundle\") pod \"calico-kube-controllers-5747f59d78-md7gk\" (UID: \"5af1d038-1e6a-4009-a16f-7e6eca2aca84\") " pod="calico-system/calico-kube-controllers-5747f59d78-md7gk" Sep 10 23:54:20.774555 kubelet[2743]: I0910 23:54:20.774363 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmnq2\" (UniqueName: \"kubernetes.io/projected/5af1d038-1e6a-4009-a16f-7e6eca2aca84-kube-api-access-kmnq2\") pod \"calico-kube-controllers-5747f59d78-md7gk\" (UID: \"5af1d038-1e6a-4009-a16f-7e6eca2aca84\") " pod="calico-system/calico-kube-controllers-5747f59d78-md7gk" Sep 10 23:54:20.775057 kubelet[2743]: I0910 23:54:20.774987 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8rcs\" (UniqueName: \"kubernetes.io/projected/974afa17-d878-4288-8c80-49d63baee70d-kube-api-access-h8rcs\") pod \"whisker-5dfcd65957-5m4vn\" (UID: \"974afa17-d878-4288-8c80-49d63baee70d\") " pod="calico-system/whisker-5dfcd65957-5m4vn" Sep 10 23:54:20.775358 kubelet[2743]: I0910 23:54:20.775328 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974afa17-d878-4288-8c80-49d63baee70d-whisker-ca-bundle\") pod 
\"whisker-5dfcd65957-5m4vn\" (UID: \"974afa17-d878-4288-8c80-49d63baee70d\") " pod="calico-system/whisker-5dfcd65957-5m4vn" Sep 10 23:54:20.775695 kubelet[2743]: I0910 23:54:20.775559 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4tr4\" (UniqueName: \"kubernetes.io/projected/b46d3443-7658-4ea6-8bfe-1899b5945225-kube-api-access-v4tr4\") pod \"coredns-7c65d6cfc9-rdn9k\" (UID: \"b46d3443-7658-4ea6-8bfe-1899b5945225\") " pod="kube-system/coredns-7c65d6cfc9-rdn9k" Sep 10 23:54:20.776132 kubelet[2743]: I0910 23:54:20.776077 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/974afa17-d878-4288-8c80-49d63baee70d-whisker-backend-key-pair\") pod \"whisker-5dfcd65957-5m4vn\" (UID: \"974afa17-d878-4288-8c80-49d63baee70d\") " pod="calico-system/whisker-5dfcd65957-5m4vn" Sep 10 23:54:20.776553 kubelet[2743]: I0910 23:54:20.776442 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b46d3443-7658-4ea6-8bfe-1899b5945225-config-volume\") pod \"coredns-7c65d6cfc9-rdn9k\" (UID: \"b46d3443-7658-4ea6-8bfe-1899b5945225\") " pod="kube-system/coredns-7c65d6cfc9-rdn9k" Sep 10 23:54:20.908916 containerd[1522]: time="2025-09-10T23:54:20.908783084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd5495d6f-mbcq8,Uid:a43d71ea-8b5a-4dec-a972-0788b2e2b853,Namespace:calico-apiserver,Attempt:0,}" Sep 10 23:54:20.955091 containerd[1522]: time="2025-09-10T23:54:20.955033940Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5747f59d78-md7gk,Uid:5af1d038-1e6a-4009-a16f-7e6eca2aca84,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:20.963605 containerd[1522]: time="2025-09-10T23:54:20.963398702Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-5dfcd65957-5m4vn,Uid:974afa17-d878-4288-8c80-49d63baee70d,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:20.980384 containerd[1522]: time="2025-09-10T23:54:20.980344928Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd5495d6f-tcd74,Uid:8f5dde21-0592-4365-967a-d0767c076647,Namespace:calico-apiserver,Attempt:0,}" Sep 10 23:54:21.040896 containerd[1522]: time="2025-09-10T23:54:21.040763015Z" level=error msg="Failed to destroy network for sandbox \"ae1542456f0f004a0d70d38da94963abb3a9b0b0d972d452d5f7ffa2083cbfe0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.042952 containerd[1522]: time="2025-09-10T23:54:21.042899266Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd5495d6f-mbcq8,Uid:a43d71ea-8b5a-4dec-a972-0788b2e2b853,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae1542456f0f004a0d70d38da94963abb3a9b0b0d972d452d5f7ffa2083cbfe0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.043578 kubelet[2743]: E0910 23:54:21.043508 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae1542456f0f004a0d70d38da94963abb3a9b0b0d972d452d5f7ffa2083cbfe0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.044253 kubelet[2743]: E0910 23:54:21.043716 2743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"ae1542456f0f004a0d70d38da94963abb3a9b0b0d972d452d5f7ffa2083cbfe0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd5495d6f-mbcq8" Sep 10 23:54:21.044253 kubelet[2743]: E0910 23:54:21.043742 2743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ae1542456f0f004a0d70d38da94963abb3a9b0b0d972d452d5f7ffa2083cbfe0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd5495d6f-mbcq8" Sep 10 23:54:21.044253 kubelet[2743]: E0910 23:54:21.043930 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cd5495d6f-mbcq8_calico-apiserver(a43d71ea-8b5a-4dec-a972-0788b2e2b853)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cd5495d6f-mbcq8_calico-apiserver(a43d71ea-8b5a-4dec-a972-0788b2e2b853)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ae1542456f0f004a0d70d38da94963abb3a9b0b0d972d452d5f7ffa2083cbfe0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cd5495d6f-mbcq8" podUID="a43d71ea-8b5a-4dec-a972-0788b2e2b853" Sep 10 23:54:21.067391 containerd[1522]: time="2025-09-10T23:54:21.067321717Z" level=error msg="Failed to destroy network for sandbox \"a327f421e124de711d6f434cde45f6ecc6343d9b55b2ad1309d3e27cb5c272b9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" Sep 10 23:54:21.069593 containerd[1522]: time="2025-09-10T23:54:21.069537736Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd5495d6f-tcd74,Uid:8f5dde21-0592-4365-967a-d0767c076647,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a327f421e124de711d6f434cde45f6ecc6343d9b55b2ad1309d3e27cb5c272b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.070036 kubelet[2743]: E0910 23:54:21.069969 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a327f421e124de711d6f434cde45f6ecc6343d9b55b2ad1309d3e27cb5c272b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.070097 kubelet[2743]: E0910 23:54:21.070060 2743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a327f421e124de711d6f434cde45f6ecc6343d9b55b2ad1309d3e27cb5c272b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd5495d6f-tcd74" Sep 10 23:54:21.070097 kubelet[2743]: E0910 23:54:21.070078 2743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a327f421e124de711d6f434cde45f6ecc6343d9b55b2ad1309d3e27cb5c272b9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and 
has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd5495d6f-tcd74" Sep 10 23:54:21.070172 kubelet[2743]: E0910 23:54:21.070127 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cd5495d6f-tcd74_calico-apiserver(8f5dde21-0592-4365-967a-d0767c076647)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cd5495d6f-tcd74_calico-apiserver(8f5dde21-0592-4365-967a-d0767c076647)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a327f421e124de711d6f434cde45f6ecc6343d9b55b2ad1309d3e27cb5c272b9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cd5495d6f-tcd74" podUID="8f5dde21-0592-4365-967a-d0767c076647" Sep 10 23:54:21.072363 containerd[1522]: time="2025-09-10T23:54:21.072180397Z" level=error msg="Failed to destroy network for sandbox \"c4d5724ff580acaa6b96ec27a95de90271c7f4ed214a974ea9eb3bbf215c8fa8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.073776 containerd[1522]: time="2025-09-10T23:54:21.073688266Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5747f59d78-md7gk,Uid:5af1d038-1e6a-4009-a16f-7e6eca2aca84,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4d5724ff580acaa6b96ec27a95de90271c7f4ed214a974ea9eb3bbf215c8fa8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.073980 kubelet[2743]: E0910 23:54:21.073915 2743 log.go:32] "RunPodSandbox 
from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4d5724ff580acaa6b96ec27a95de90271c7f4ed214a974ea9eb3bbf215c8fa8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.073980 kubelet[2743]: E0910 23:54:21.073972 2743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4d5724ff580acaa6b96ec27a95de90271c7f4ed214a974ea9eb3bbf215c8fa8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5747f59d78-md7gk" Sep 10 23:54:21.074120 kubelet[2743]: E0910 23:54:21.074001 2743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4d5724ff580acaa6b96ec27a95de90271c7f4ed214a974ea9eb3bbf215c8fa8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5747f59d78-md7gk" Sep 10 23:54:21.074120 kubelet[2743]: E0910 23:54:21.074043 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5747f59d78-md7gk_calico-system(5af1d038-1e6a-4009-a16f-7e6eca2aca84)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5747f59d78-md7gk_calico-system(5af1d038-1e6a-4009-a16f-7e6eca2aca84)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c4d5724ff580acaa6b96ec27a95de90271c7f4ed214a974ea9eb3bbf215c8fa8\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5747f59d78-md7gk" podUID="5af1d038-1e6a-4009-a16f-7e6eca2aca84" Sep 10 23:54:21.088416 containerd[1522]: time="2025-09-10T23:54:21.088325392Z" level=error msg="Failed to destroy network for sandbox \"333679a8b1a8d2e6a6f4531746f4154749cddf27e7d546217eba4db067e6ccfe\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.090052 containerd[1522]: time="2025-09-10T23:54:21.089999397Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5dfcd65957-5m4vn,Uid:974afa17-d878-4288-8c80-49d63baee70d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"333679a8b1a8d2e6a6f4531746f4154749cddf27e7d546217eba4db067e6ccfe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.090317 kubelet[2743]: E0910 23:54:21.090276 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"333679a8b1a8d2e6a6f4531746f4154749cddf27e7d546217eba4db067e6ccfe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:21.090430 kubelet[2743]: E0910 23:54:21.090347 2743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"333679a8b1a8d2e6a6f4531746f4154749cddf27e7d546217eba4db067e6ccfe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5dfcd65957-5m4vn" Sep 10 23:54:21.090430 kubelet[2743]: E0910 23:54:21.090371 2743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"333679a8b1a8d2e6a6f4531746f4154749cddf27e7d546217eba4db067e6ccfe\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5dfcd65957-5m4vn" Sep 10 23:54:21.090494 kubelet[2743]: E0910 23:54:21.090417 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5dfcd65957-5m4vn_calico-system(974afa17-d878-4288-8c80-49d63baee70d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5dfcd65957-5m4vn_calico-system(974afa17-d878-4288-8c80-49d63baee70d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"333679a8b1a8d2e6a6f4531746f4154749cddf27e7d546217eba4db067e6ccfe\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5dfcd65957-5m4vn" podUID="974afa17-d878-4288-8c80-49d63baee70d" Sep 10 23:54:21.508238 containerd[1522]: time="2025-09-10T23:54:21.508059203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\"" Sep 10 23:54:21.779286 kubelet[2743]: E0910 23:54:21.778423 2743 configmap.go:193] Couldn't get configMap calico-system/goldmane: failed to sync configmap cache: timed out waiting for the condition Sep 10 23:54:21.779286 kubelet[2743]: E0910 23:54:21.778551 2743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0dd1dd97-6eb7-401b-871e-a9aed197c21b-config 
podName:0dd1dd97-6eb7-401b-871e-a9aed197c21b nodeName:}" failed. No retries permitted until 2025-09-10 23:54:22.278522712 +0000 UTC m=+32.068587888 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0dd1dd97-6eb7-401b-871e-a9aed197c21b-config") pod "goldmane-7988f88666-hswkn" (UID: "0dd1dd97-6eb7-401b-871e-a9aed197c21b") : failed to sync configmap cache: timed out waiting for the condition Sep 10 23:54:21.779286 kubelet[2743]: E0910 23:54:21.778872 2743 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Sep 10 23:54:21.779286 kubelet[2743]: E0910 23:54:21.778945 2743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de82da2e-d798-4452-8450-fbfaa1f65b47-config-volume podName:de82da2e-d798-4452-8450-fbfaa1f65b47 nodeName:}" failed. No retries permitted until 2025-09-10 23:54:22.27890059 +0000 UTC m=+32.068965766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/de82da2e-d798-4452-8450-fbfaa1f65b47-config-volume") pod "coredns-7c65d6cfc9-pw9tk" (UID: "de82da2e-d798-4452-8450-fbfaa1f65b47") : failed to sync configmap cache: timed out waiting for the condition Sep 10 23:54:21.779286 kubelet[2743]: E0910 23:54:21.778969 2743 configmap.go:193] Couldn't get configMap calico-system/goldmane-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Sep 10 23:54:21.779943 kubelet[2743]: E0910 23:54:21.778999 2743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0dd1dd97-6eb7-401b-871e-a9aed197c21b-goldmane-ca-bundle podName:0dd1dd97-6eb7-401b-871e-a9aed197c21b nodeName:}" failed. No retries permitted until 2025-09-10 23:54:22.278989359 +0000 UTC m=+32.069054535 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "goldmane-ca-bundle" (UniqueName: "kubernetes.io/configmap/0dd1dd97-6eb7-401b-871e-a9aed197c21b-goldmane-ca-bundle") pod "goldmane-7988f88666-hswkn" (UID: "0dd1dd97-6eb7-401b-871e-a9aed197c21b") : failed to sync configmap cache: timed out waiting for the condition Sep 10 23:54:21.783467 kubelet[2743]: E0910 23:54:21.783385 2743 secret.go:189] Couldn't get secret calico-system/goldmane-key-pair: failed to sync secret cache: timed out waiting for the condition Sep 10 23:54:21.783719 kubelet[2743]: E0910 23:54:21.783664 2743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0dd1dd97-6eb7-401b-871e-a9aed197c21b-goldmane-key-pair podName:0dd1dd97-6eb7-401b-871e-a9aed197c21b nodeName:}" failed. No retries permitted until 2025-09-10 23:54:22.283638058 +0000 UTC m=+32.073703234 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "goldmane-key-pair" (UniqueName: "kubernetes.io/secret/0dd1dd97-6eb7-401b-871e-a9aed197c21b-goldmane-key-pair") pod "goldmane-7988f88666-hswkn" (UID: "0dd1dd97-6eb7-401b-871e-a9aed197c21b") : failed to sync secret cache: timed out waiting for the condition Sep 10 23:54:21.879848 kubelet[2743]: E0910 23:54:21.879544 2743 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Sep 10 23:54:21.879848 kubelet[2743]: E0910 23:54:21.879627 2743 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b46d3443-7658-4ea6-8bfe-1899b5945225-config-volume podName:b46d3443-7658-4ea6-8bfe-1899b5945225 nodeName:}" failed. No retries permitted until 2025-09-10 23:54:22.379607935 +0000 UTC m=+32.169673111 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/b46d3443-7658-4ea6-8bfe-1899b5945225-config-volume") pod "coredns-7c65d6cfc9-rdn9k" (UID: "b46d3443-7658-4ea6-8bfe-1899b5945225") : failed to sync configmap cache: timed out waiting for the condition Sep 10 23:54:22.343727 systemd[1]: Created slice kubepods-besteffort-pod33c72a96_49c1_4e41_a8b3_15ab5f93e7db.slice - libcontainer container kubepods-besteffort-pod33c72a96_49c1_4e41_a8b3_15ab5f93e7db.slice. Sep 10 23:54:22.347931 containerd[1522]: time="2025-09-10T23:54:22.347868603Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-px2tb,Uid:33c72a96-49c1-4e41-a8b3-15ab5f93e7db,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:22.402034 containerd[1522]: time="2025-09-10T23:54:22.401986210Z" level=error msg="Failed to destroy network for sandbox \"2d3f63141c2e504d8b1f679a694a65711144ab70285f87e9635d5e268d89a7c1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.404602 systemd[1]: run-netns-cni\x2d374b9434\x2d865e\x2debe7\x2d1c66\x2da899103daba9.mount: Deactivated successfully. 
Sep 10 23:54:22.405986 containerd[1522]: time="2025-09-10T23:54:22.405917112Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-px2tb,Uid:33c72a96-49c1-4e41-a8b3-15ab5f93e7db,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d3f63141c2e504d8b1f679a694a65711144ab70285f87e9635d5e268d89a7c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.406170 kubelet[2743]: E0910 23:54:22.406141 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d3f63141c2e504d8b1f679a694a65711144ab70285f87e9635d5e268d89a7c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.406359 kubelet[2743]: E0910 23:54:22.406253 2743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d3f63141c2e504d8b1f679a694a65711144ab70285f87e9635d5e268d89a7c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-px2tb" Sep 10 23:54:22.406359 kubelet[2743]: E0910 23:54:22.406275 2743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2d3f63141c2e504d8b1f679a694a65711144ab70285f87e9635d5e268d89a7c1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-px2tb" 
Sep 10 23:54:22.406359 kubelet[2743]: E0910 23:54:22.406333 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-px2tb_calico-system(33c72a96-49c1-4e41-a8b3-15ab5f93e7db)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-px2tb_calico-system(33c72a96-49c1-4e41-a8b3-15ab5f93e7db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2d3f63141c2e504d8b1f679a694a65711144ab70285f87e9635d5e268d89a7c1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-px2tb" podUID="33c72a96-49c1-4e41-a8b3-15ab5f93e7db" Sep 10 23:54:22.422716 containerd[1522]: time="2025-09-10T23:54:22.422603610Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-pw9tk,Uid:de82da2e-d798-4452-8450-fbfaa1f65b47,Namespace:kube-system,Attempt:0,}" Sep 10 23:54:22.443995 containerd[1522]: time="2025-09-10T23:54:22.443710576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-rdn9k,Uid:b46d3443-7658-4ea6-8bfe-1899b5945225,Namespace:kube-system,Attempt:0,}" Sep 10 23:54:22.485242 containerd[1522]: time="2025-09-10T23:54:22.484756116Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-hswkn,Uid:0dd1dd97-6eb7-401b-871e-a9aed197c21b,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:22.499086 containerd[1522]: time="2025-09-10T23:54:22.499042661Z" level=error msg="Failed to destroy network for sandbox \"ffd77368e34b415758d2b93eee0f55d02ac1706f3926f21c3927932cf8daef5b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.502379 containerd[1522]: time="2025-09-10T23:54:22.502333900Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-pw9tk,Uid:de82da2e-d798-4452-8450-fbfaa1f65b47,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffd77368e34b415758d2b93eee0f55d02ac1706f3926f21c3927932cf8daef5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.502834 kubelet[2743]: E0910 23:54:22.502781 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffd77368e34b415758d2b93eee0f55d02ac1706f3926f21c3927932cf8daef5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.503440 kubelet[2743]: E0910 23:54:22.502957 2743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffd77368e34b415758d2b93eee0f55d02ac1706f3926f21c3927932cf8daef5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-pw9tk" Sep 10 23:54:22.503440 kubelet[2743]: E0910 23:54:22.503233 2743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ffd77368e34b415758d2b93eee0f55d02ac1706f3926f21c3927932cf8daef5b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-pw9tk" Sep 10 23:54:22.503440 kubelet[2743]: E0910 23:54:22.503313 2743 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-pw9tk_kube-system(de82da2e-d798-4452-8450-fbfaa1f65b47)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-pw9tk_kube-system(de82da2e-d798-4452-8450-fbfaa1f65b47)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ffd77368e34b415758d2b93eee0f55d02ac1706f3926f21c3927932cf8daef5b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-pw9tk" podUID="de82da2e-d798-4452-8450-fbfaa1f65b47" Sep 10 23:54:22.520617 containerd[1522]: time="2025-09-10T23:54:22.520564148Z" level=error msg="Failed to destroy network for sandbox \"8c5374c0384f34f415499a86ae53ab4d8ed48c7b9dd6c512beb22f37c9f25437\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.522851 containerd[1522]: time="2025-09-10T23:54:22.522265393Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-rdn9k,Uid:b46d3443-7658-4ea6-8bfe-1899b5945225,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c5374c0384f34f415499a86ae53ab4d8ed48c7b9dd6c512beb22f37c9f25437\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.523041 kubelet[2743]: E0910 23:54:22.522495 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c5374c0384f34f415499a86ae53ab4d8ed48c7b9dd6c512beb22f37c9f25437\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.523041 kubelet[2743]: E0910 23:54:22.522547 2743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c5374c0384f34f415499a86ae53ab4d8ed48c7b9dd6c512beb22f37c9f25437\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-rdn9k" Sep 10 23:54:22.523041 kubelet[2743]: E0910 23:54:22.522573 2743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c5374c0384f34f415499a86ae53ab4d8ed48c7b9dd6c512beb22f37c9f25437\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-rdn9k" Sep 10 23:54:22.523135 kubelet[2743]: E0910 23:54:22.522608 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-rdn9k_kube-system(b46d3443-7658-4ea6-8bfe-1899b5945225)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-rdn9k_kube-system(b46d3443-7658-4ea6-8bfe-1899b5945225)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c5374c0384f34f415499a86ae53ab4d8ed48c7b9dd6c512beb22f37c9f25437\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-rdn9k" podUID="b46d3443-7658-4ea6-8bfe-1899b5945225" Sep 10 23:54:22.557562 containerd[1522]: time="2025-09-10T23:54:22.557451724Z" 
level=error msg="Failed to destroy network for sandbox \"ed94487520bf50da88fb4602aa71c6797ad8dfd031dbb9a6e26a2155cfdc2ac0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.559369 containerd[1522]: time="2025-09-10T23:54:22.559313465Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-hswkn,Uid:0dd1dd97-6eb7-401b-871e-a9aed197c21b,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed94487520bf50da88fb4602aa71c6797ad8dfd031dbb9a6e26a2155cfdc2ac0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.559770 kubelet[2743]: E0910 23:54:22.559727 2743 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed94487520bf50da88fb4602aa71c6797ad8dfd031dbb9a6e26a2155cfdc2ac0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Sep 10 23:54:22.560061 kubelet[2743]: E0910 23:54:22.559943 2743 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed94487520bf50da88fb4602aa71c6797ad8dfd031dbb9a6e26a2155cfdc2ac0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-hswkn" Sep 10 23:54:22.560061 kubelet[2743]: E0910 23:54:22.560022 2743 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed 
to setup network for sandbox \"ed94487520bf50da88fb4602aa71c6797ad8dfd031dbb9a6e26a2155cfdc2ac0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7988f88666-hswkn" Sep 10 23:54:22.560863 kubelet[2743]: E0910 23:54:22.560814 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7988f88666-hswkn_calico-system(0dd1dd97-6eb7-401b-871e-a9aed197c21b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7988f88666-hswkn_calico-system(0dd1dd97-6eb7-401b-871e-a9aed197c21b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed94487520bf50da88fb4602aa71c6797ad8dfd031dbb9a6e26a2155cfdc2ac0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7988f88666-hswkn" podUID="0dd1dd97-6eb7-401b-871e-a9aed197c21b" Sep 10 23:54:22.833248 systemd[1]: run-netns-cni\x2d8bde0880\x2de875\x2da783\x2d2858\x2d3c21afdda8db.mount: Deactivated successfully. Sep 10 23:54:25.535375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3220087863.mount: Deactivated successfully. 
Sep 10 23:54:25.560823 containerd[1522]: time="2025-09-10T23:54:25.560778995Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:25.562228 containerd[1522]: time="2025-09-10T23:54:25.562173763Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.3: active requests=0, bytes read=151100457" Sep 10 23:54:25.564007 containerd[1522]: time="2025-09-10T23:54:25.563292907Z" level=info msg="ImageCreate event name:\"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:25.565274 containerd[1522]: time="2025-09-10T23:54:25.565227685Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:25.566579 containerd[1522]: time="2025-09-10T23:54:25.566555808Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.3\" with image id \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:bcb8146fcaeced1e1c88fad3eaa697f1680746bd23c3e7e8d4535bc484c6f2a1\", size \"151100319\" in 4.058323148s" Sep 10 23:54:25.566676 containerd[1522]: time="2025-09-10T23:54:25.566661697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.3\" returns image reference \"sha256:2b8abd2140fc4464ed664d225fe38e5b90bbfcf62996b484b0fc0e0537b6a4a9\"" Sep 10 23:54:25.587224 containerd[1522]: time="2025-09-10T23:54:25.587156867Z" level=info msg="CreateContainer within sandbox \"1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Sep 10 23:54:25.605813 containerd[1522]: time="2025-09-10T23:54:25.605750382Z" level=info msg="Container 
296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:25.626506 containerd[1522]: time="2025-09-10T23:54:25.626436689Z" level=info msg="CreateContainer within sandbox \"1142a2e143fef3a1a4055808f377ce37caa2ee8b8c46fbabbb1b285f6b14bfb7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\"" Sep 10 23:54:25.627416 containerd[1522]: time="2025-09-10T23:54:25.627241804Z" level=info msg="StartContainer for \"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\"" Sep 10 23:54:25.629182 containerd[1522]: time="2025-09-10T23:54:25.628953481Z" level=info msg="connecting to shim 296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0" address="unix:///run/containerd/s/db02a93a31ac4548d6e85c6ba41035123e8c987f22f031884b2ef4037d8ef741" protocol=ttrpc version=3 Sep 10 23:54:25.681375 systemd[1]: Started cri-containerd-296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0.scope - libcontainer container 296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0. Sep 10 23:54:25.758292 containerd[1522]: time="2025-09-10T23:54:25.758170357Z" level=info msg="StartContainer for \"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" returns successfully" Sep 10 23:54:25.894374 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Sep 10 23:54:25.894489 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Sep 10 23:54:26.118146 kubelet[2743]: I0910 23:54:26.118062 2743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/974afa17-d878-4288-8c80-49d63baee70d-whisker-backend-key-pair\") pod \"974afa17-d878-4288-8c80-49d63baee70d\" (UID: \"974afa17-d878-4288-8c80-49d63baee70d\") " Sep 10 23:54:26.119034 kubelet[2743]: I0910 23:54:26.118272 2743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974afa17-d878-4288-8c80-49d63baee70d-whisker-ca-bundle\") pod \"974afa17-d878-4288-8c80-49d63baee70d\" (UID: \"974afa17-d878-4288-8c80-49d63baee70d\") " Sep 10 23:54:26.119034 kubelet[2743]: I0910 23:54:26.118342 2743 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8rcs\" (UniqueName: \"kubernetes.io/projected/974afa17-d878-4288-8c80-49d63baee70d-kube-api-access-h8rcs\") pod \"974afa17-d878-4288-8c80-49d63baee70d\" (UID: \"974afa17-d878-4288-8c80-49d63baee70d\") " Sep 10 23:54:26.119617 kubelet[2743]: I0910 23:54:26.119566 2743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/974afa17-d878-4288-8c80-49d63baee70d-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "974afa17-d878-4288-8c80-49d63baee70d" (UID: "974afa17-d878-4288-8c80-49d63baee70d"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 10 23:54:26.123046 kubelet[2743]: I0910 23:54:26.123000 2743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/974afa17-d878-4288-8c80-49d63baee70d-kube-api-access-h8rcs" (OuterVolumeSpecName: "kube-api-access-h8rcs") pod "974afa17-d878-4288-8c80-49d63baee70d" (UID: "974afa17-d878-4288-8c80-49d63baee70d"). InnerVolumeSpecName "kube-api-access-h8rcs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 10 23:54:26.123553 kubelet[2743]: I0910 23:54:26.123393 2743 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/974afa17-d878-4288-8c80-49d63baee70d-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "974afa17-d878-4288-8c80-49d63baee70d" (UID: "974afa17-d878-4288-8c80-49d63baee70d"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 10 23:54:26.219044 kubelet[2743]: I0910 23:54:26.218980 2743 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/974afa17-d878-4288-8c80-49d63baee70d-whisker-ca-bundle\") on node \"ci-4372-1-0-n-c06092ab73\" DevicePath \"\"" Sep 10 23:54:26.219044 kubelet[2743]: I0910 23:54:26.219026 2743 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8rcs\" (UniqueName: \"kubernetes.io/projected/974afa17-d878-4288-8c80-49d63baee70d-kube-api-access-h8rcs\") on node \"ci-4372-1-0-n-c06092ab73\" DevicePath \"\"" Sep 10 23:54:26.219044 kubelet[2743]: I0910 23:54:26.219041 2743 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/974afa17-d878-4288-8c80-49d63baee70d-whisker-backend-key-pair\") on node \"ci-4372-1-0-n-c06092ab73\" DevicePath \"\"" Sep 10 23:54:26.350124 systemd[1]: Removed slice kubepods-besteffort-pod974afa17_d878_4288_8c80_49d63baee70d.slice - libcontainer container kubepods-besteffort-pod974afa17_d878_4288_8c80_49d63baee70d.slice. Sep 10 23:54:26.538283 systemd[1]: var-lib-kubelet-pods-974afa17\x2dd878\x2d4288\x2d8c80\x2d49d63baee70d-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dh8rcs.mount: Deactivated successfully. 
Sep 10 23:54:26.539249 systemd[1]: var-lib-kubelet-pods-974afa17\x2dd878\x2d4288\x2d8c80\x2d49d63baee70d-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Sep 10 23:54:26.566631 kubelet[2743]: I0910 23:54:26.566520 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5znmh" podStartSLOduration=1.94965516 podStartE2EDuration="14.566498389s" podCreationTimestamp="2025-09-10 23:54:12 +0000 UTC" firstStartedPulling="2025-09-10 23:54:12.951210597 +0000 UTC m=+22.741275773" lastFinishedPulling="2025-09-10 23:54:25.568053786 +0000 UTC m=+35.358119002" observedRunningTime="2025-09-10 23:54:26.564052407 +0000 UTC m=+36.354117623" watchObservedRunningTime="2025-09-10 23:54:26.566498389 +0000 UTC m=+36.356563565" Sep 10 23:54:26.661833 systemd[1]: Created slice kubepods-besteffort-podf3e33739_3521_4bf4_bdea_a23412e15a2c.slice - libcontainer container kubepods-besteffort-podf3e33739_3521_4bf4_bdea_a23412e15a2c.slice. 
Sep 10 23:54:26.722280 kubelet[2743]: I0910 23:54:26.722235 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95zjr\" (UniqueName: \"kubernetes.io/projected/f3e33739-3521-4bf4-bdea-a23412e15a2c-kube-api-access-95zjr\") pod \"whisker-6c76fd745b-cpg8f\" (UID: \"f3e33739-3521-4bf4-bdea-a23412e15a2c\") " pod="calico-system/whisker-6c76fd745b-cpg8f" Sep 10 23:54:26.723091 kubelet[2743]: I0910 23:54:26.722991 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f3e33739-3521-4bf4-bdea-a23412e15a2c-whisker-backend-key-pair\") pod \"whisker-6c76fd745b-cpg8f\" (UID: \"f3e33739-3521-4bf4-bdea-a23412e15a2c\") " pod="calico-system/whisker-6c76fd745b-cpg8f" Sep 10 23:54:26.723091 kubelet[2743]: I0910 23:54:26.723040 2743 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3e33739-3521-4bf4-bdea-a23412e15a2c-whisker-ca-bundle\") pod \"whisker-6c76fd745b-cpg8f\" (UID: \"f3e33739-3521-4bf4-bdea-a23412e15a2c\") " pod="calico-system/whisker-6c76fd745b-cpg8f" Sep 10 23:54:26.787336 containerd[1522]: time="2025-09-10T23:54:26.787291722Z" level=info msg="TaskExit event in podsandbox handler container_id:\"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" id:\"f2c2e3f83ba8745c3fda3c2a602aed397fa57949f25ce9c3f3b15dbd69d9dc89\" pid:3886 exit_status:1 exited_at:{seconds:1757548466 nanos:785827069}" Sep 10 23:54:26.968818 containerd[1522]: time="2025-09-10T23:54:26.968754884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c76fd745b-cpg8f,Uid:f3e33739-3521-4bf4-bdea-a23412e15a2c,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:27.165540 systemd-networkd[1425]: cali29d4a90d941: Link UP Sep 10 23:54:27.166932 systemd-networkd[1425]: cali29d4a90d941: Gained carrier Sep 10 
23:54:27.188566 containerd[1522]: 2025-09-10 23:54:26.997 [INFO][3898] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Sep 10 23:54:27.188566 containerd[1522]: 2025-09-10 23:54:27.044 [INFO][3898] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0 whisker-6c76fd745b- calico-system f3e33739-3521-4bf4-bdea-a23412e15a2c 901 0 2025-09-10 23:54:26 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:6c76fd745b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372-1-0-n-c06092ab73 whisker-6c76fd745b-cpg8f eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali29d4a90d941 [] [] }} ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Namespace="calico-system" Pod="whisker-6c76fd745b-cpg8f" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-" Sep 10 23:54:27.188566 containerd[1522]: 2025-09-10 23:54:27.044 [INFO][3898] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Namespace="calico-system" Pod="whisker-6c76fd745b-cpg8f" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" Sep 10 23:54:27.188566 containerd[1522]: 2025-09-10 23:54:27.093 [INFO][3910] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" HandleID="k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Workload="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.093 [INFO][3910] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" HandleID="k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Workload="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b7b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-n-c06092ab73", "pod":"whisker-6c76fd745b-cpg8f", "timestamp":"2025-09-10 23:54:27.09324243 +0000 UTC"}, Hostname:"ci-4372-1-0-n-c06092ab73", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.093 [INFO][3910] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.093 [INFO][3910] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.093 [INFO][3910] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-n-c06092ab73' Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.106 [INFO][3910] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.117 [INFO][3910] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.126 [INFO][3910] ipam/ipam.go 511: Trying affinity for 192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.129 [INFO][3910] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.188994 containerd[1522]: 2025-09-10 23:54:27.132 [INFO][3910] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.190160 containerd[1522]: 2025-09-10 23:54:27.132 [INFO][3910] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.190160 containerd[1522]: 2025-09-10 23:54:27.135 [INFO][3910] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71 Sep 10 23:54:27.190160 containerd[1522]: 2025-09-10 23:54:27.140 [INFO][3910] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.190160 containerd[1522]: 2025-09-10 23:54:27.150 [INFO][3910] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.91.65/26] block=192.168.91.64/26 handle="k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.190160 containerd[1522]: 2025-09-10 23:54:27.150 [INFO][3910] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.65/26] handle="k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:27.190160 containerd[1522]: 2025-09-10 23:54:27.150 [INFO][3910] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 23:54:27.190160 containerd[1522]: 2025-09-10 23:54:27.150 [INFO][3910] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.65/26] IPv6=[] ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" HandleID="k8s-pod-network.b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Workload="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" Sep 10 23:54:27.190579 containerd[1522]: 2025-09-10 23:54:27.154 [INFO][3898] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Namespace="calico-system" Pod="whisker-6c76fd745b-cpg8f" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0", GenerateName:"whisker-6c76fd745b-", Namespace:"calico-system", SelfLink:"", UID:"f3e33739-3521-4bf4-bdea-a23412e15a2c", ResourceVersion:"901", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6c76fd745b", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"", Pod:"whisker-6c76fd745b-cpg8f", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.91.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali29d4a90d941", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:27.190579 containerd[1522]: 2025-09-10 23:54:27.154 [INFO][3898] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.65/32] ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Namespace="calico-system" Pod="whisker-6c76fd745b-cpg8f" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" Sep 10 23:54:27.190807 containerd[1522]: 2025-09-10 23:54:27.154 [INFO][3898] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali29d4a90d941 ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Namespace="calico-system" Pod="whisker-6c76fd745b-cpg8f" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" Sep 10 23:54:27.190807 containerd[1522]: 2025-09-10 23:54:27.166 [INFO][3898] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Namespace="calico-system" Pod="whisker-6c76fd745b-cpg8f" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" Sep 10 23:54:27.190887 containerd[1522]: 2025-09-10 23:54:27.166 [INFO][3898] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Namespace="calico-system" Pod="whisker-6c76fd745b-cpg8f" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0", GenerateName:"whisker-6c76fd745b-", Namespace:"calico-system", SelfLink:"", UID:"f3e33739-3521-4bf4-bdea-a23412e15a2c", ResourceVersion:"901", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"6c76fd745b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71", Pod:"whisker-6c76fd745b-cpg8f", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.91.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali29d4a90d941", MAC:"46:04:fe:7f:45:79", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:27.191023 containerd[1522]: 2025-09-10 23:54:27.184 [INFO][3898] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" Namespace="calico-system" Pod="whisker-6c76fd745b-cpg8f" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-whisker--6c76fd745b--cpg8f-eth0" Sep 10 23:54:27.248285 containerd[1522]: time="2025-09-10T23:54:27.248105294Z" level=info msg="connecting to shim b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71" address="unix:///run/containerd/s/602ce42306190878027d3e37587490b7fd7707dc313857a3550240541ca5381c" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:27.279413 systemd[1]: Started cri-containerd-b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71.scope - libcontainer container b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71. Sep 10 23:54:27.330763 containerd[1522]: time="2025-09-10T23:54:27.330707768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c76fd745b-cpg8f,Uid:f3e33739-3521-4bf4-bdea-a23412e15a2c,Namespace:calico-system,Attempt:0,} returns sandbox id \"b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71\"" Sep 10 23:54:27.333540 containerd[1522]: time="2025-09-10T23:54:27.333487617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\"" Sep 10 23:54:27.841157 containerd[1522]: time="2025-09-10T23:54:27.840999691Z" level=info msg="TaskExit event in podsandbox handler container_id:\"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" id:\"0b227d9fedbeb9b75b30b2ca11efe2d7e7039045bac8eb69705f676e96adf7ad\" pid:4070 exit_status:1 exited_at:{seconds:1757548467 nanos:840427319}" Sep 10 23:54:28.061248 systemd-networkd[1425]: vxlan.calico: Link UP Sep 10 23:54:28.061260 systemd-networkd[1425]: vxlan.calico: Gained carrier Sep 10 23:54:28.338289 kubelet[2743]: I0910 23:54:28.336957 2743 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="974afa17-d878-4288-8c80-49d63baee70d" path="/var/lib/kubelet/pods/974afa17-d878-4288-8c80-49d63baee70d/volumes" Sep 10 
23:54:28.586985 systemd-networkd[1425]: cali29d4a90d941: Gained IPv6LL Sep 10 23:54:28.891238 containerd[1522]: time="2025-09-10T23:54:28.891149816Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:28.892498 containerd[1522]: time="2025-09-10T23:54:28.892454971Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.3: active requests=0, bytes read=4605606" Sep 10 23:54:28.893726 containerd[1522]: time="2025-09-10T23:54:28.893336529Z" level=info msg="ImageCreate event name:\"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:28.895946 containerd[1522]: time="2025-09-10T23:54:28.895901595Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:28.896858 containerd[1522]: time="2025-09-10T23:54:28.896824277Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.3\" with image id \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:e7113761fc7633d515882f0d48b5c8d0b8e62f3f9d34823f2ee194bb16d2ec44\", size \"5974839\" in 1.563087437s" Sep 10 23:54:28.896858 containerd[1522]: time="2025-09-10T23:54:28.896858480Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.3\" returns image reference \"sha256:270a0129ec34c3ad6ae6d56c0afce111eb0baa25dfdacb63722ec5887bafd3c5\"" Sep 10 23:54:28.900953 containerd[1522]: time="2025-09-10T23:54:28.900855393Z" level=info msg="CreateContainer within sandbox \"b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Sep 10 23:54:28.911344 
containerd[1522]: time="2025-09-10T23:54:28.910374913Z" level=info msg="Container d6ef574336cb805d58c96c6be0d6d7819fb94e9a76c8bcaae8f81882425165c5: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:28.922864 containerd[1522]: time="2025-09-10T23:54:28.922808771Z" level=info msg="CreateContainer within sandbox \"b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"d6ef574336cb805d58c96c6be0d6d7819fb94e9a76c8bcaae8f81882425165c5\"" Sep 10 23:54:28.923820 containerd[1522]: time="2025-09-10T23:54:28.923758375Z" level=info msg="StartContainer for \"d6ef574336cb805d58c96c6be0d6d7819fb94e9a76c8bcaae8f81882425165c5\"" Sep 10 23:54:28.925293 containerd[1522]: time="2025-09-10T23:54:28.925253747Z" level=info msg="connecting to shim d6ef574336cb805d58c96c6be0d6d7819fb94e9a76c8bcaae8f81882425165c5" address="unix:///run/containerd/s/602ce42306190878027d3e37587490b7fd7707dc313857a3550240541ca5381c" protocol=ttrpc version=3 Sep 10 23:54:28.947483 systemd[1]: Started cri-containerd-d6ef574336cb805d58c96c6be0d6d7819fb94e9a76c8bcaae8f81882425165c5.scope - libcontainer container d6ef574336cb805d58c96c6be0d6d7819fb94e9a76c8bcaae8f81882425165c5. Sep 10 23:54:28.999641 containerd[1522]: time="2025-09-10T23:54:28.999581430Z" level=info msg="StartContainer for \"d6ef574336cb805d58c96c6be0d6d7819fb94e9a76c8bcaae8f81882425165c5\" returns successfully" Sep 10 23:54:29.004289 containerd[1522]: time="2025-09-10T23:54:29.004243839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\"" Sep 10 23:54:29.994450 systemd-networkd[1425]: vxlan.calico: Gained IPv6LL Sep 10 23:54:31.502664 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount777971283.mount: Deactivated successfully. 
Sep 10 23:54:31.522510 containerd[1522]: time="2025-09-10T23:54:31.521175372Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:31.522510 containerd[1522]: time="2025-09-10T23:54:31.522456041Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.3: active requests=0, bytes read=30823700" Sep 10 23:54:31.523138 containerd[1522]: time="2025-09-10T23:54:31.522898959Z" level=info msg="ImageCreate event name:\"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:31.525174 containerd[1522]: time="2025-09-10T23:54:31.525123428Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:31.526537 containerd[1522]: time="2025-09-10T23:54:31.526487904Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" with image id \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:29becebc47401da9997a2a30f4c25c511a5f379d17275680b048224829af71a5\", size \"30823530\" in 2.522197662s" Sep 10 23:54:31.526749 containerd[1522]: time="2025-09-10T23:54:31.526700122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.3\" returns image reference \"sha256:e210e86234bc99f018431b30477c5ca2ad6f7ecf67ef011498f7beb48fb0b21f\"" Sep 10 23:54:31.530547 containerd[1522]: time="2025-09-10T23:54:31.530517367Z" level=info msg="CreateContainer within sandbox \"b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Sep 10 23:54:31.540398 
containerd[1522]: time="2025-09-10T23:54:31.540350764Z" level=info msg="Container 15d2d3722af41d6ac738195b908ea8a622af1f6769db79ac674682acf2e92b54: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:31.554307 containerd[1522]: time="2025-09-10T23:54:31.554264548Z" level=info msg="CreateContainer within sandbox \"b8cddc6081fe2dde498aa2d07c0f21cfe04cbacc3706030a7d67d692784ecd71\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"15d2d3722af41d6ac738195b908ea8a622af1f6769db79ac674682acf2e92b54\"" Sep 10 23:54:31.555416 containerd[1522]: time="2025-09-10T23:54:31.555384043Z" level=info msg="StartContainer for \"15d2d3722af41d6ac738195b908ea8a622af1f6769db79ac674682acf2e92b54\"" Sep 10 23:54:31.557065 containerd[1522]: time="2025-09-10T23:54:31.556987979Z" level=info msg="connecting to shim 15d2d3722af41d6ac738195b908ea8a622af1f6769db79ac674682acf2e92b54" address="unix:///run/containerd/s/602ce42306190878027d3e37587490b7fd7707dc313857a3550240541ca5381c" protocol=ttrpc version=3 Sep 10 23:54:31.581633 systemd[1]: Started cri-containerd-15d2d3722af41d6ac738195b908ea8a622af1f6769db79ac674682acf2e92b54.scope - libcontainer container 15d2d3722af41d6ac738195b908ea8a622af1f6769db79ac674682acf2e92b54. 
Sep 10 23:54:31.633524 containerd[1522]: time="2025-09-10T23:54:31.633399960Z" level=info msg="StartContainer for \"15d2d3722af41d6ac738195b908ea8a622af1f6769db79ac674682acf2e92b54\" returns successfully" Sep 10 23:54:32.335111 containerd[1522]: time="2025-09-10T23:54:32.334991576Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5747f59d78-md7gk,Uid:5af1d038-1e6a-4009-a16f-7e6eca2aca84,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:32.493251 systemd-networkd[1425]: calibad206a67c7: Link UP Sep 10 23:54:32.493665 systemd-networkd[1425]: calibad206a67c7: Gained carrier Sep 10 23:54:32.516135 containerd[1522]: 2025-09-10 23:54:32.384 [INFO][4268] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0 calico-kube-controllers-5747f59d78- calico-system 5af1d038-1e6a-4009-a16f-7e6eca2aca84 828 0 2025-09-10 23:54:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5747f59d78 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372-1-0-n-c06092ab73 calico-kube-controllers-5747f59d78-md7gk eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calibad206a67c7 [] [] }} ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Namespace="calico-system" Pod="calico-kube-controllers-5747f59d78-md7gk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-" Sep 10 23:54:32.516135 containerd[1522]: 2025-09-10 23:54:32.384 [INFO][4268] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Namespace="calico-system" Pod="calico-kube-controllers-5747f59d78-md7gk" 
WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" Sep 10 23:54:32.516135 containerd[1522]: 2025-09-10 23:54:32.419 [INFO][4280] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" HandleID="k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.420 [INFO][4280] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" HandleID="k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3020), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-n-c06092ab73", "pod":"calico-kube-controllers-5747f59d78-md7gk", "timestamp":"2025-09-10 23:54:32.419729785 +0000 UTC"}, Hostname:"ci-4372-1-0-n-c06092ab73", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.420 [INFO][4280] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.420 [INFO][4280] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.420 [INFO][4280] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-n-c06092ab73' Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.431 [INFO][4280] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.443 [INFO][4280] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.451 [INFO][4280] ipam/ipam.go 511: Trying affinity for 192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.454 [INFO][4280] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516446 containerd[1522]: 2025-09-10 23:54:32.459 [INFO][4280] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516648 containerd[1522]: 2025-09-10 23:54:32.461 [INFO][4280] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516648 containerd[1522]: 2025-09-10 23:54:32.466 [INFO][4280] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8 Sep 10 23:54:32.516648 containerd[1522]: 2025-09-10 23:54:32.472 [INFO][4280] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516648 containerd[1522]: 2025-09-10 23:54:32.485 [INFO][4280] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.91.66/26] block=192.168.91.64/26 handle="k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516648 containerd[1522]: 2025-09-10 23:54:32.486 [INFO][4280] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.66/26] handle="k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:32.516648 containerd[1522]: 2025-09-10 23:54:32.486 [INFO][4280] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 23:54:32.516648 containerd[1522]: 2025-09-10 23:54:32.486 [INFO][4280] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.66/26] IPv6=[] ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" HandleID="k8s-pod-network.f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" Sep 10 23:54:32.516857 containerd[1522]: 2025-09-10 23:54:32.488 [INFO][4268] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Namespace="calico-system" Pod="calico-kube-controllers-5747f59d78-md7gk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0", GenerateName:"calico-kube-controllers-5747f59d78-", Namespace:"calico-system", SelfLink:"", UID:"5af1d038-1e6a-4009-a16f-7e6eca2aca84", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5747f59d78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"", Pod:"calico-kube-controllers-5747f59d78-md7gk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibad206a67c7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:32.516908 containerd[1522]: 2025-09-10 23:54:32.488 [INFO][4268] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.66/32] ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Namespace="calico-system" Pod="calico-kube-controllers-5747f59d78-md7gk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" Sep 10 23:54:32.516908 containerd[1522]: 2025-09-10 23:54:32.488 [INFO][4268] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calibad206a67c7 ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Namespace="calico-system" Pod="calico-kube-controllers-5747f59d78-md7gk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" Sep 10 23:54:32.516908 containerd[1522]: 2025-09-10 23:54:32.492 [INFO][4268] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Namespace="calico-system" Pod="calico-kube-controllers-5747f59d78-md7gk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" Sep 10 23:54:32.516980 containerd[1522]: 2025-09-10 23:54:32.492 [INFO][4268] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Namespace="calico-system" Pod="calico-kube-controllers-5747f59d78-md7gk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0", GenerateName:"calico-kube-controllers-5747f59d78-", Namespace:"calico-system", SelfLink:"", UID:"5af1d038-1e6a-4009-a16f-7e6eca2aca84", ResourceVersion:"828", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5747f59d78", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8", Pod:"calico-kube-controllers-5747f59d78-md7gk", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.91.66/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calibad206a67c7", MAC:"62:ec:6f:79:bf:32", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:32.517032 containerd[1522]: 2025-09-10 23:54:32.509 [INFO][4268] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" Namespace="calico-system" Pod="calico-kube-controllers-5747f59d78-md7gk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--kube--controllers--5747f59d78--md7gk-eth0" Sep 10 23:54:32.561209 containerd[1522]: time="2025-09-10T23:54:32.561035314Z" level=info msg="connecting to shim f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8" address="unix:///run/containerd/s/75d07cb5991bba1c7dda9e53e679a71fddff54a02300f9a4bdfdc9d939745a43" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:32.602494 systemd[1]: Started cri-containerd-f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8.scope - libcontainer container f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8. 
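The Calico CNI sequence above follows a fixed IPAM pattern per sandbox: acquire the host-wide IPAM lock, look up the host's block affinity (here `192.168.91.64/26`), claim one address from the block, write the block back, then release the lock and log the result via `ipam_plugin.go 283`. A minimal sketch for pulling those final assignments out of a journal dump like this one (the regex is an assumption tied to this exact log format):

```python
import re

# Final IPAM result line emitted by Calico's ipam_plugin.go.
# Assumption: format matches this journal dump, e.g.
#   Calico CNI IPAM assigned addresses IPv4=[192.168.91.66/26] IPv6=[] ContainerID="f0fc36a6..."
ASSIGN_RE = re.compile(
    r'Calico CNI IPAM assigned addresses '
    r'IPv4=\[(?P<ips>[^\]]*)\] IPv6=\[[^\]]*\] '
    r'ContainerID="(?P<cid>[0-9a-f]+)"'
)

def assigned_ips(journal_text: str) -> dict[str, list[str]]:
    """Map each sandbox ContainerID to the IPv4 addresses IPAM claimed for it."""
    result: dict[str, list[str]] = {}
    for m in ASSIGN_RE.finditer(journal_text):
        ips = [ip.strip() for ip in m.group("ips").split(",") if ip.strip()]
        result.setdefault(m.group("cid"), []).extend(ips)
    return result
```

Against this log the function would associate `f0fc36a64f51…` with `192.168.91.66/26`, and the later sandboxes with `.67` and `.68` from the same block.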
Sep 10 23:54:32.651947 containerd[1522]: time="2025-09-10T23:54:32.651802391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5747f59d78-md7gk,Uid:5af1d038-1e6a-4009-a16f-7e6eca2aca84,Namespace:calico-system,Attempt:0,} returns sandbox id \"f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8\"" Sep 10 23:54:32.653689 containerd[1522]: time="2025-09-10T23:54:32.653654027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\"" Sep 10 23:54:33.335101 containerd[1522]: time="2025-09-10T23:54:33.334991976Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd5495d6f-mbcq8,Uid:a43d71ea-8b5a-4dec-a972-0788b2e2b853,Namespace:calico-apiserver,Attempt:0,}" Sep 10 23:54:33.496815 systemd-networkd[1425]: calid358cab9c05: Link UP Sep 10 23:54:33.497397 systemd-networkd[1425]: calid358cab9c05: Gained carrier Sep 10 23:54:33.520941 kubelet[2743]: I0910 23:54:33.520829 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-6c76fd745b-cpg8f" podStartSLOduration=3.325620408 podStartE2EDuration="7.520791284s" podCreationTimestamp="2025-09-10 23:54:26 +0000 UTC" firstStartedPulling="2025-09-10 23:54:27.332455645 +0000 UTC m=+37.122520821" lastFinishedPulling="2025-09-10 23:54:31.527626521 +0000 UTC m=+41.317691697" observedRunningTime="2025-09-10 23:54:32.58772912 +0000 UTC m=+42.377794296" watchObservedRunningTime="2025-09-10 23:54:33.520791284 +0000 UTC m=+43.310856500" Sep 10 23:54:33.523613 containerd[1522]: 2025-09-10 23:54:33.393 [INFO][4352] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0 calico-apiserver-cd5495d6f- calico-apiserver a43d71ea-8b5a-4dec-a972-0788b2e2b853 825 0 2025-09-10 23:54:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:cd5495d6f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-n-c06092ab73 calico-apiserver-cd5495d6f-mbcq8 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calid358cab9c05 [] [] }} ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-mbcq8" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-" Sep 10 23:54:33.523613 containerd[1522]: 2025-09-10 23:54:33.393 [INFO][4352] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-mbcq8" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" Sep 10 23:54:33.523613 containerd[1522]: 2025-09-10 23:54:33.420 [INFO][4364] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" HandleID="k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.421 [INFO][4364] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" HandleID="k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-n-c06092ab73", "pod":"calico-apiserver-cd5495d6f-mbcq8", "timestamp":"2025-09-10 
23:54:33.420809761 +0000 UTC"}, Hostname:"ci-4372-1-0-n-c06092ab73", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.421 [INFO][4364] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.421 [INFO][4364] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.421 [INFO][4364] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-n-c06092ab73' Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.437 [INFO][4364] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.446 [INFO][4364] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.457 [INFO][4364] ipam/ipam.go 511: Trying affinity for 192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.462 [INFO][4364] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525011 containerd[1522]: 2025-09-10 23:54:33.468 [INFO][4364] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525716 containerd[1522]: 2025-09-10 23:54:33.469 [INFO][4364] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525716 containerd[1522]: 
2025-09-10 23:54:33.472 [INFO][4364] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c Sep 10 23:54:33.525716 containerd[1522]: 2025-09-10 23:54:33.479 [INFO][4364] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525716 containerd[1522]: 2025-09-10 23:54:33.490 [INFO][4364] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.91.67/26] block=192.168.91.64/26 handle="k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525716 containerd[1522]: 2025-09-10 23:54:33.490 [INFO][4364] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.67/26] handle="k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:33.525716 containerd[1522]: 2025-09-10 23:54:33.490 [INFO][4364] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 10 23:54:33.525716 containerd[1522]: 2025-09-10 23:54:33.490 [INFO][4364] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.67/26] IPv6=[] ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" HandleID="k8s-pod-network.1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" Sep 10 23:54:33.526095 containerd[1522]: 2025-09-10 23:54:33.493 [INFO][4352] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-mbcq8" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0", GenerateName:"calico-apiserver-cd5495d6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"a43d71ea-8b5a-4dec-a972-0788b2e2b853", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd5495d6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"", Pod:"calico-apiserver-cd5495d6f-mbcq8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.91.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid358cab9c05", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:33.526162 containerd[1522]: 2025-09-10 23:54:33.493 [INFO][4352] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.67/32] ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-mbcq8" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" Sep 10 23:54:33.526162 containerd[1522]: 2025-09-10 23:54:33.494 [INFO][4352] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid358cab9c05 ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-mbcq8" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" Sep 10 23:54:33.526162 containerd[1522]: 2025-09-10 23:54:33.498 [INFO][4352] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-mbcq8" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" Sep 10 23:54:33.526794 containerd[1522]: 2025-09-10 23:54:33.498 [INFO][4352] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-mbcq8" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0", GenerateName:"calico-apiserver-cd5495d6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"a43d71ea-8b5a-4dec-a972-0788b2e2b853", ResourceVersion:"825", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd5495d6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c", Pod:"calico-apiserver-cd5495d6f-mbcq8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calid358cab9c05", MAC:"4a:c3:b5:cf:42:fe", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:33.526853 containerd[1522]: 2025-09-10 23:54:33.518 [INFO][4352] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-mbcq8" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--mbcq8-eth0" Sep 10 23:54:33.570223 containerd[1522]: time="2025-09-10T23:54:33.568851286Z" level=info 
msg="connecting to shim 1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c" address="unix:///run/containerd/s/4d2ffc0259e68840072cbb4a3b0798c70c5794eae6a044f83160a4024efeaf69" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:33.596374 systemd[1]: Started cri-containerd-1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c.scope - libcontainer container 1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c. Sep 10 23:54:33.654065 containerd[1522]: time="2025-09-10T23:54:33.654022496Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd5495d6f-mbcq8,Uid:a43d71ea-8b5a-4dec-a972-0788b2e2b853,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c\"" Sep 10 23:54:34.093623 systemd-networkd[1425]: calibad206a67c7: Gained IPv6LL Sep 10 23:54:34.336785 containerd[1522]: time="2025-09-10T23:54:34.336426831Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd5495d6f-tcd74,Uid:8f5dde21-0592-4365-967a-d0767c076647,Namespace:calico-apiserver,Attempt:0,}" Sep 10 23:54:34.337783 containerd[1522]: time="2025-09-10T23:54:34.337313864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-hswkn,Uid:0dd1dd97-6eb7-401b-871e-a9aed197c21b,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:34.528488 systemd-networkd[1425]: cali39afd11cc1a: Link UP Sep 10 23:54:34.529947 systemd-networkd[1425]: cali39afd11cc1a: Gained carrier Sep 10 23:54:34.552174 containerd[1522]: 2025-09-10 23:54:34.406 [INFO][4429] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0 calico-apiserver-cd5495d6f- calico-apiserver 8f5dde21-0592-4365-967a-d0767c076647 834 0 2025-09-10 23:54:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:cd5495d6f projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-1-0-n-c06092ab73 calico-apiserver-cd5495d6f-tcd74 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali39afd11cc1a [] [] }} ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-tcd74" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-" Sep 10 23:54:34.552174 containerd[1522]: 2025-09-10 23:54:34.407 [INFO][4429] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-tcd74" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" Sep 10 23:54:34.552174 containerd[1522]: 2025-09-10 23:54:34.452 [INFO][4451] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" HandleID="k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.452 [INFO][4451] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" HandleID="k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb640), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-1-0-n-c06092ab73", "pod":"calico-apiserver-cd5495d6f-tcd74", "timestamp":"2025-09-10 
23:54:34.452596206 +0000 UTC"}, Hostname:"ci-4372-1-0-n-c06092ab73", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.452 [INFO][4451] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.452 [INFO][4451] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.452 [INFO][4451] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-n-c06092ab73' Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.468 [INFO][4451] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.475 [INFO][4451] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.484 [INFO][4451] ipam/ipam.go 511: Trying affinity for 192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.487 [INFO][4451] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552402 containerd[1522]: 2025-09-10 23:54:34.493 [INFO][4451] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552587 containerd[1522]: 2025-09-10 23:54:34.495 [INFO][4451] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552587 containerd[1522]: 
2025-09-10 23:54:34.498 [INFO][4451] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912 Sep 10 23:54:34.552587 containerd[1522]: 2025-09-10 23:54:34.504 [INFO][4451] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552587 containerd[1522]: 2025-09-10 23:54:34.515 [INFO][4451] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.91.68/26] block=192.168.91.64/26 handle="k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552587 containerd[1522]: 2025-09-10 23:54:34.516 [INFO][4451] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.68/26] handle="k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.552587 containerd[1522]: 2025-09-10 23:54:34.516 [INFO][4451] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Sep 10 23:54:34.552587 containerd[1522]: 2025-09-10 23:54:34.516 [INFO][4451] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.68/26] IPv6=[] ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" HandleID="k8s-pod-network.b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Workload="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" Sep 10 23:54:34.552720 containerd[1522]: 2025-09-10 23:54:34.521 [INFO][4429] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-tcd74" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0", GenerateName:"calico-apiserver-cd5495d6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"8f5dde21-0592-4365-967a-d0767c076647", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd5495d6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"", Pod:"calico-apiserver-cd5495d6f-tcd74", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.91.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali39afd11cc1a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:34.552767 containerd[1522]: 2025-09-10 23:54:34.521 [INFO][4429] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.68/32] ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-tcd74" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" Sep 10 23:54:34.552767 containerd[1522]: 2025-09-10 23:54:34.521 [INFO][4429] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali39afd11cc1a ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-tcd74" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" Sep 10 23:54:34.552767 containerd[1522]: 2025-09-10 23:54:34.531 [INFO][4429] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-tcd74" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" Sep 10 23:54:34.552828 containerd[1522]: 2025-09-10 23:54:34.534 [INFO][4429] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-tcd74" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0", GenerateName:"calico-apiserver-cd5495d6f-", Namespace:"calico-apiserver", SelfLink:"", UID:"8f5dde21-0592-4365-967a-d0767c076647", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd5495d6f", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912", Pod:"calico-apiserver-cd5495d6f-tcd74", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.91.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali39afd11cc1a", MAC:"6e:bf:86:c7:77:7a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:34.552873 containerd[1522]: 2025-09-10 23:54:34.550 [INFO][4429] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" Namespace="calico-apiserver" Pod="calico-apiserver-cd5495d6f-tcd74" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-calico--apiserver--cd5495d6f--tcd74-eth0" Sep 10 23:54:34.587048 containerd[1522]: time="2025-09-10T23:54:34.585367469Z" level=info 
msg="connecting to shim b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912" address="unix:///run/containerd/s/6d8d41a5bdd674f10df39659a739d244e7701b11df35ad8fd8fc7b113546512f" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:34.626391 systemd[1]: Started cri-containerd-b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912.scope - libcontainer container b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912. Sep 10 23:54:34.646361 systemd-networkd[1425]: cali57d2be5e282: Link UP Sep 10 23:54:34.647301 systemd-networkd[1425]: cali57d2be5e282: Gained carrier Sep 10 23:54:34.677461 containerd[1522]: 2025-09-10 23:54:34.415 [INFO][4425] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0 goldmane-7988f88666- calico-system 0dd1dd97-6eb7-401b-871e-a9aed197c21b 836 0 2025-09-10 23:54:12 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7988f88666 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372-1-0-n-c06092ab73 goldmane-7988f88666-hswkn eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali57d2be5e282 [] [] }} ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Namespace="calico-system" Pod="goldmane-7988f88666-hswkn" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-" Sep 10 23:54:34.677461 containerd[1522]: 2025-09-10 23:54:34.416 [INFO][4425] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Namespace="calico-system" Pod="goldmane-7988f88666-hswkn" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" Sep 10 23:54:34.677461 containerd[1522]: 2025-09-10 23:54:34.459 [INFO][4456] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" HandleID="k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Workload="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.459 [INFO][4456] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" HandleID="k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Workload="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb0d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-n-c06092ab73", "pod":"goldmane-7988f88666-hswkn", "timestamp":"2025-09-10 23:54:34.45834636 +0000 UTC"}, Hostname:"ci-4372-1-0-n-c06092ab73", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.460 [INFO][4456] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.516 [INFO][4456] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.516 [INFO][4456] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-n-c06092ab73' Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.570 [INFO][4456] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.585 [INFO][4456] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.598 [INFO][4456] ipam/ipam.go 511: Trying affinity for 192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.603 [INFO][4456] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677666 containerd[1522]: 2025-09-10 23:54:34.611 [INFO][4456] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677849 containerd[1522]: 2025-09-10 23:54:34.611 [INFO][4456] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677849 containerd[1522]: 2025-09-10 23:54:34.617 [INFO][4456] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45 Sep 10 23:54:34.677849 containerd[1522]: 2025-09-10 23:54:34.628 [INFO][4456] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677849 containerd[1522]: 2025-09-10 23:54:34.637 [INFO][4456] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.91.69/26] block=192.168.91.64/26 handle="k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677849 containerd[1522]: 2025-09-10 23:54:34.637 [INFO][4456] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.69/26] handle="k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:34.677849 containerd[1522]: 2025-09-10 23:54:34.637 [INFO][4456] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 23:54:34.677849 containerd[1522]: 2025-09-10 23:54:34.637 [INFO][4456] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.69/26] IPv6=[] ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" HandleID="k8s-pod-network.f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Workload="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" Sep 10 23:54:34.678042 containerd[1522]: 2025-09-10 23:54:34.641 [INFO][4425] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Namespace="calico-system" Pod="goldmane-7988f88666-hswkn" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"0dd1dd97-6eb7-401b-871e-a9aed197c21b", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"", Pod:"goldmane-7988f88666-hswkn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.91.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali57d2be5e282", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:34.678101 containerd[1522]: 2025-09-10 23:54:34.641 [INFO][4425] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.69/32] ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Namespace="calico-system" Pod="goldmane-7988f88666-hswkn" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" Sep 10 23:54:34.678101 containerd[1522]: 2025-09-10 23:54:34.641 [INFO][4425] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali57d2be5e282 ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Namespace="calico-system" Pod="goldmane-7988f88666-hswkn" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" Sep 10 23:54:34.678101 containerd[1522]: 2025-09-10 23:54:34.651 [INFO][4425] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Namespace="calico-system" Pod="goldmane-7988f88666-hswkn" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" Sep 10 23:54:34.678688 containerd[1522]: 2025-09-10 23:54:34.652 [INFO][4425] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Namespace="calico-system" Pod="goldmane-7988f88666-hswkn" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0", GenerateName:"goldmane-7988f88666-", Namespace:"calico-system", SelfLink:"", UID:"0dd1dd97-6eb7-401b-871e-a9aed197c21b", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7988f88666", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45", Pod:"goldmane-7988f88666-hswkn", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.91.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali57d2be5e282", MAC:"62:00:1f:a1:be:f0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:34.678886 containerd[1522]: 2025-09-10 23:54:34.670 [INFO][4425] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" Namespace="calico-system" Pod="goldmane-7988f88666-hswkn" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-goldmane--7988f88666--hswkn-eth0" Sep 10 23:54:34.733076 containerd[1522]: time="2025-09-10T23:54:34.732956514Z" level=info msg="connecting to shim f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45" address="unix:///run/containerd/s/4f86a4e9185fd1879ee19495be40dde6e202a10270e73dc6507232a5a345b108" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:34.774256 containerd[1522]: time="2025-09-10T23:54:34.774219075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd5495d6f-tcd74,Uid:8f5dde21-0592-4365-967a-d0767c076647,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912\"" Sep 10 23:54:34.775455 systemd[1]: Started cri-containerd-f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45.scope - libcontainer container f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45. 
Sep 10 23:54:34.825412 containerd[1522]: time="2025-09-10T23:54:34.825334768Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7988f88666-hswkn,Uid:0dd1dd97-6eb7-401b-871e-a9aed197c21b,Namespace:calico-system,Attempt:0,} returns sandbox id \"f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45\"" Sep 10 23:54:34.986441 systemd-networkd[1425]: calid358cab9c05: Gained IPv6LL Sep 10 23:54:35.335007 containerd[1522]: time="2025-09-10T23:54:35.334968195Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-px2tb,Uid:33c72a96-49c1-4e41-a8b3-15ab5f93e7db,Namespace:calico-system,Attempt:0,}" Sep 10 23:54:35.524088 systemd-networkd[1425]: calid6f21d9e90e: Link UP Sep 10 23:54:35.527538 systemd-networkd[1425]: calid6f21d9e90e: Gained carrier Sep 10 23:54:35.556482 containerd[1522]: 2025-09-10 23:54:35.393 [INFO][4581] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0 csi-node-driver- calico-system 33c72a96-49c1-4e41-a8b3-15ab5f93e7db 701 0 2025-09-10 23:54:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:856c6b598f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372-1-0-n-c06092ab73 csi-node-driver-px2tb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid6f21d9e90e [] [] }} ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Namespace="calico-system" Pod="csi-node-driver-px2tb" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-" Sep 10 23:54:35.556482 containerd[1522]: 2025-09-10 23:54:35.393 [INFO][4581] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Namespace="calico-system" Pod="csi-node-driver-px2tb" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" Sep 10 23:54:35.556482 containerd[1522]: 2025-09-10 23:54:35.429 [INFO][4592] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" HandleID="k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Workload="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.430 [INFO][4592] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" HandleID="k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Workload="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-1-0-n-c06092ab73", "pod":"csi-node-driver-px2tb", "timestamp":"2025-09-10 23:54:35.42994535 +0000 UTC"}, Hostname:"ci-4372-1-0-n-c06092ab73", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.430 [INFO][4592] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.430 [INFO][4592] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.430 [INFO][4592] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-n-c06092ab73' Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.445 [INFO][4592] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.469 [INFO][4592] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.481 [INFO][4592] ipam/ipam.go 511: Trying affinity for 192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.485 [INFO][4592] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556722 containerd[1522]: 2025-09-10 23:54:35.489 [INFO][4592] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556922 containerd[1522]: 2025-09-10 23:54:35.489 [INFO][4592] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556922 containerd[1522]: 2025-09-10 23:54:35.492 [INFO][4592] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171 Sep 10 23:54:35.556922 containerd[1522]: 2025-09-10 23:54:35.499 [INFO][4592] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556922 containerd[1522]: 2025-09-10 23:54:35.513 [INFO][4592] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.91.70/26] block=192.168.91.64/26 handle="k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556922 containerd[1522]: 2025-09-10 23:54:35.513 [INFO][4592] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.70/26] handle="k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:35.556922 containerd[1522]: 2025-09-10 23:54:35.513 [INFO][4592] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 23:54:35.556922 containerd[1522]: 2025-09-10 23:54:35.513 [INFO][4592] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.70/26] IPv6=[] ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" HandleID="k8s-pod-network.dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Workload="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" Sep 10 23:54:35.557112 containerd[1522]: 2025-09-10 23:54:35.517 [INFO][4581] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Namespace="calico-system" Pod="csi-node-driver-px2tb" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"33c72a96-49c1-4e41-a8b3-15ab5f93e7db", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", 
"name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"", Pod:"csi-node-driver-px2tb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid6f21d9e90e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:35.557173 containerd[1522]: 2025-09-10 23:54:35.517 [INFO][4581] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.70/32] ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Namespace="calico-system" Pod="csi-node-driver-px2tb" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" Sep 10 23:54:35.557173 containerd[1522]: 2025-09-10 23:54:35.517 [INFO][4581] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid6f21d9e90e ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Namespace="calico-system" Pod="csi-node-driver-px2tb" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" Sep 10 23:54:35.557173 containerd[1522]: 2025-09-10 23:54:35.530 [INFO][4581] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Namespace="calico-system" Pod="csi-node-driver-px2tb" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" Sep 10 23:54:35.557265 
containerd[1522]: 2025-09-10 23:54:35.532 [INFO][4581] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Namespace="calico-system" Pod="csi-node-driver-px2tb" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"33c72a96-49c1-4e41-a8b3-15ab5f93e7db", ResourceVersion:"701", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 54, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"856c6b598f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171", Pod:"csi-node-driver-px2tb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.91.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid6f21d9e90e", MAC:"ca:06:11:5e:f8:b5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:35.557316 containerd[1522]: 
2025-09-10 23:54:35.551 [INFO][4581] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" Namespace="calico-system" Pod="csi-node-driver-px2tb" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-csi--node--driver--px2tb-eth0" Sep 10 23:54:35.597660 containerd[1522]: time="2025-09-10T23:54:35.597546634Z" level=info msg="connecting to shim dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171" address="unix:///run/containerd/s/fa44cd799f7766bd4c090ca6f8e743a6450a07035da74b9e7695dbcd50522cc7" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:35.638433 systemd[1]: Started cri-containerd-dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171.scope - libcontainer container dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171. Sep 10 23:54:35.687748 containerd[1522]: time="2025-09-10T23:54:35.687625948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-px2tb,Uid:33c72a96-49c1-4e41-a8b3-15ab5f93e7db,Namespace:calico-system,Attempt:0,} returns sandbox id \"dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171\"" Sep 10 23:54:35.818330 systemd-networkd[1425]: cali57d2be5e282: Gained IPv6LL Sep 10 23:54:35.838244 containerd[1522]: time="2025-09-10T23:54:35.837897498Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:35.839241 containerd[1522]: time="2025-09-10T23:54:35.839170482Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.3: active requests=0, bytes read=48134957" Sep 10 23:54:35.840240 containerd[1522]: time="2025-09-10T23:54:35.839741648Z" level=info msg="ImageCreate event name:\"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:35.843120 containerd[1522]: 
time="2025-09-10T23:54:35.843056879Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Sep 10 23:54:35.844073 containerd[1522]: time="2025-09-10T23:54:35.844020678Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" with image id \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:27c4187717f08f0a5727019d8beb7597665eb47e69eaa1d7d091a7e28913e577\", size \"49504166\" in 3.190323848s" Sep 10 23:54:35.844282 containerd[1522]: time="2025-09-10T23:54:35.844252057Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.3\" returns image reference \"sha256:34117caf92350e1565610f2254377d7455b11e36666b5ce11b4a13670720432a\"" Sep 10 23:54:35.846045 containerd[1522]: time="2025-09-10T23:54:35.846020361Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\"" Sep 10 23:54:35.863820 containerd[1522]: time="2025-09-10T23:54:35.863706165Z" level=info msg="CreateContainer within sandbox \"f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Sep 10 23:54:35.873994 containerd[1522]: time="2025-09-10T23:54:35.872402915Z" level=info msg="Container 3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:35.882381 containerd[1522]: time="2025-09-10T23:54:35.882326885Z" level=info msg="CreateContainer within sandbox \"f0fc36a64f51d1a9398060dcc44619b533470eb0ede52cce20680e42c6d394f8\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\"" Sep 10 23:54:35.883180 
containerd[1522]: time="2025-09-10T23:54:35.883072426Z" level=info msg="StartContainer for \"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\"" Sep 10 23:54:35.884580 containerd[1522]: time="2025-09-10T23:54:35.884543626Z" level=info msg="connecting to shim 3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1" address="unix:///run/containerd/s/75d07cb5991bba1c7dda9e53e679a71fddff54a02300f9a4bdfdc9d939745a43" protocol=ttrpc version=3 Sep 10 23:54:35.905473 systemd[1]: Started cri-containerd-3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1.scope - libcontainer container 3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1. Sep 10 23:54:35.959270 containerd[1522]: time="2025-09-10T23:54:35.958415458Z" level=info msg="StartContainer for \"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" returns successfully" Sep 10 23:54:36.138646 systemd-networkd[1425]: cali39afd11cc1a: Gained IPv6LL Sep 10 23:54:36.335662 containerd[1522]: time="2025-09-10T23:54:36.335614091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-rdn9k,Uid:b46d3443-7658-4ea6-8bfe-1899b5945225,Namespace:kube-system,Attempt:0,}" Sep 10 23:54:36.491414 systemd-networkd[1425]: cali3cf2af56384: Link UP Sep 10 23:54:36.493476 systemd-networkd[1425]: cali3cf2af56384: Gained carrier Sep 10 23:54:36.517232 containerd[1522]: 2025-09-10 23:54:36.382 [INFO][4700] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0 coredns-7c65d6cfc9- kube-system b46d3443-7658-4ea6-8bfe-1899b5945225 837 0 2025-09-10 23:53:56 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-1-0-n-c06092ab73 coredns-7c65d6cfc9-rdn9k eth0 coredns [] [] [kns.kube-system 
ksa.kube-system.coredns] cali3cf2af56384 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rdn9k" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-" Sep 10 23:54:36.517232 containerd[1522]: 2025-09-10 23:54:36.382 [INFO][4700] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rdn9k" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" Sep 10 23:54:36.517232 containerd[1522]: 2025-09-10 23:54:36.413 [INFO][4712] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" HandleID="k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Workload="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.416 [INFO][4712] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" HandleID="k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Workload="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d30a0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-1-0-n-c06092ab73", "pod":"coredns-7c65d6cfc9-rdn9k", "timestamp":"2025-09-10 23:54:36.413515515 +0000 UTC"}, Hostname:"ci-4372-1-0-n-c06092ab73", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.416 
[INFO][4712] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.416 [INFO][4712] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.416 [INFO][4712] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-n-c06092ab73' Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.437 [INFO][4712] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.443 [INFO][4712] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.450 [INFO][4712] ipam/ipam.go 511: Trying affinity for 192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.454 [INFO][4712] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517445 containerd[1522]: 2025-09-10 23:54:36.457 [INFO][4712] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517681 containerd[1522]: 2025-09-10 23:54:36.457 [INFO][4712] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517681 containerd[1522]: 2025-09-10 23:54:36.460 [INFO][4712] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c Sep 10 23:54:36.517681 containerd[1522]: 2025-09-10 23:54:36.468 [INFO][4712] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.64/26 
handle="k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517681 containerd[1522]: 2025-09-10 23:54:36.480 [INFO][4712] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.91.71/26] block=192.168.91.64/26 handle="k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517681 containerd[1522]: 2025-09-10 23:54:36.481 [INFO][4712] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.71/26] handle="k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:36.517681 containerd[1522]: 2025-09-10 23:54:36.481 [INFO][4712] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 23:54:36.517681 containerd[1522]: 2025-09-10 23:54:36.481 [INFO][4712] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.71/26] IPv6=[] ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" HandleID="k8s-pod-network.a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Workload="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" Sep 10 23:54:36.517838 containerd[1522]: 2025-09-10 23:54:36.484 [INFO][4700] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rdn9k" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b46d3443-7658-4ea6-8bfe-1899b5945225", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 53, 56, 
0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"", Pod:"coredns-7c65d6cfc9-rdn9k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3cf2af56384", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:36.517838 containerd[1522]: 2025-09-10 23:54:36.484 [INFO][4700] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.71/32] ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rdn9k" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" Sep 10 23:54:36.517838 containerd[1522]: 2025-09-10 23:54:36.484 [INFO][4700] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3cf2af56384 ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-rdn9k" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" Sep 10 23:54:36.517838 containerd[1522]: 2025-09-10 23:54:36.491 [INFO][4700] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rdn9k" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" Sep 10 23:54:36.517838 containerd[1522]: 2025-09-10 23:54:36.492 [INFO][4700] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rdn9k" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"b46d3443-7658-4ea6-8bfe-1899b5945225", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 53, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c", Pod:"coredns-7c65d6cfc9-rdn9k", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.91.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3cf2af56384", MAC:"2e:11:3e:41:ce:63", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:36.517838 containerd[1522]: 2025-09-10 23:54:36.512 [INFO][4700] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" Namespace="kube-system" Pod="coredns-7c65d6cfc9-rdn9k" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--rdn9k-eth0" Sep 10 23:54:36.542720 containerd[1522]: time="2025-09-10T23:54:36.542669726Z" level=info msg="connecting to shim a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c" address="unix:///run/containerd/s/0c2a47d0c7af86624c63ab8116515cfd8fecf4f21c53e967f469cb9ad44efb85" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:36.567643 systemd[1]: Started cri-containerd-a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c.scope - libcontainer container a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c. 
Sep 10 23:54:36.641452 containerd[1522]: time="2025-09-10T23:54:36.641391674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-rdn9k,Uid:b46d3443-7658-4ea6-8bfe-1899b5945225,Namespace:kube-system,Attempt:0,} returns sandbox id \"a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c\"" Sep 10 23:54:36.646244 containerd[1522]: time="2025-09-10T23:54:36.645667740Z" level=info msg="CreateContainer within sandbox \"a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 10 23:54:36.679107 containerd[1522]: time="2025-09-10T23:54:36.679029960Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"b6477bd387f3336951774dec75ec2e6b3eba012a5b111b74ef4d31b30bd4cd0e\" pid:4782 exited_at:{seconds:1757548476 nanos:651932967}" Sep 10 23:54:36.706625 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1571595683.mount: Deactivated successfully. 
Sep 10 23:54:36.710685 containerd[1522]: time="2025-09-10T23:54:36.710649799Z" level=info msg="Container e91934134e04ad0cb28c6001d7ca2b690b0d965c85301d24859fe5a77909f2b1: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:36.734008 containerd[1522]: time="2025-09-10T23:54:36.733964685Z" level=info msg="CreateContainer within sandbox \"a4ec3f5003b041d9fde4d4b47e08d5f044258fb92688a75c2f97fd32bea0c35c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e91934134e04ad0cb28c6001d7ca2b690b0d965c85301d24859fe5a77909f2b1\"" Sep 10 23:54:36.735234 containerd[1522]: time="2025-09-10T23:54:36.734944045Z" level=info msg="StartContainer for \"e91934134e04ad0cb28c6001d7ca2b690b0d965c85301d24859fe5a77909f2b1\"" Sep 10 23:54:36.738215 kubelet[2743]: I0910 23:54:36.738030 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5747f59d78-md7gk" podStartSLOduration=21.54591122 podStartE2EDuration="24.738009013s" podCreationTimestamp="2025-09-10 23:54:12 +0000 UTC" firstStartedPulling="2025-09-10 23:54:32.653249073 +0000 UTC m=+42.443314249" lastFinishedPulling="2025-09-10 23:54:35.845346866 +0000 UTC m=+45.635412042" observedRunningTime="2025-09-10 23:54:36.612729395 +0000 UTC m=+46.402794571" watchObservedRunningTime="2025-09-10 23:54:36.738009013 +0000 UTC m=+46.528074189" Sep 10 23:54:36.739868 containerd[1522]: time="2025-09-10T23:54:36.739837881Z" level=info msg="connecting to shim e91934134e04ad0cb28c6001d7ca2b690b0d965c85301d24859fe5a77909f2b1" address="unix:///run/containerd/s/0c2a47d0c7af86624c63ab8116515cfd8fecf4f21c53e967f469cb9ad44efb85" protocol=ttrpc version=3 Sep 10 23:54:36.773642 systemd[1]: Started cri-containerd-e91934134e04ad0cb28c6001d7ca2b690b0d965c85301d24859fe5a77909f2b1.scope - libcontainer container e91934134e04ad0cb28c6001d7ca2b690b0d965c85301d24859fe5a77909f2b1. 
Sep 10 23:54:36.824062 containerd[1522]: time="2025-09-10T23:54:36.824003931Z" level=info msg="StartContainer for \"e91934134e04ad0cb28c6001d7ca2b690b0d965c85301d24859fe5a77909f2b1\" returns successfully" Sep 10 23:54:37.334998 containerd[1522]: time="2025-09-10T23:54:37.334939288Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-pw9tk,Uid:de82da2e-d798-4452-8450-fbfaa1f65b47,Namespace:kube-system,Attempt:0,}" Sep 10 23:54:37.510426 systemd-networkd[1425]: cali0433fba7bfb: Link UP Sep 10 23:54:37.510715 systemd-networkd[1425]: cali0433fba7bfb: Gained carrier Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.393 [INFO][4830] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0 coredns-7c65d6cfc9- kube-system de82da2e-d798-4452-8450-fbfaa1f65b47 835 0 2025-09-10 23:53:56 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-1-0-n-c06092ab73 coredns-7c65d6cfc9-pw9tk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali0433fba7bfb [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-pw9tk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.393 [INFO][4830] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-pw9tk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.429 [INFO][4842] ipam/ipam_plugin.go 
225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" HandleID="k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Workload="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.430 [INFO][4842] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" HandleID="k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Workload="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b730), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-1-0-n-c06092ab73", "pod":"coredns-7c65d6cfc9-pw9tk", "timestamp":"2025-09-10 23:54:37.429950431 +0000 UTC"}, Hostname:"ci-4372-1-0-n-c06092ab73", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.430 [INFO][4842] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.430 [INFO][4842] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.430 [INFO][4842] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-1-0-n-c06092ab73' Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.446 [INFO][4842] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.456 [INFO][4842] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.464 [INFO][4842] ipam/ipam.go 511: Trying affinity for 192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.467 [INFO][4842] ipam/ipam.go 158: Attempting to load block cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.474 [INFO][4842] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.91.64/26 host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.474 [INFO][4842] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.91.64/26 handle="k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.477 [INFO][4842] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.484 [INFO][4842] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.91.64/26 handle="k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.497 [INFO][4842] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.91.72/26] block=192.168.91.64/26 handle="k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.497 [INFO][4842] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.91.72/26] handle="k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" host="ci-4372-1-0-n-c06092ab73" Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.498 [INFO][4842] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Sep 10 23:54:37.535139 containerd[1522]: 2025-09-10 23:54:37.499 [INFO][4842] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.91.72/26] IPv6=[] ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" HandleID="k8s-pod-network.8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Workload="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" Sep 10 23:54:37.535861 containerd[1522]: 2025-09-10 23:54:37.503 [INFO][4830] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-pw9tk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"de82da2e-d798-4452-8450-fbfaa1f65b47", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 53, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"", Pod:"coredns-7c65d6cfc9-pw9tk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0433fba7bfb", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:37.535861 containerd[1522]: 2025-09-10 23:54:37.503 [INFO][4830] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.91.72/32] ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-pw9tk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" Sep 10 23:54:37.535861 containerd[1522]: 2025-09-10 23:54:37.503 [INFO][4830] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0433fba7bfb ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-pw9tk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" Sep 10 23:54:37.535861 containerd[1522]: 2025-09-10 23:54:37.511 [INFO][4830] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-pw9tk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" Sep 10 23:54:37.535861 containerd[1522]: 2025-09-10 23:54:37.512 [INFO][4830] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-pw9tk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"de82da2e-d798-4452-8450-fbfaa1f65b47", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.September, 10, 23, 53, 56, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-1-0-n-c06092ab73", ContainerID:"8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e", Pod:"coredns-7c65d6cfc9-pw9tk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.91.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali0433fba7bfb", 
MAC:"0e:80:ca:e9:1a:5a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Sep 10 23:54:37.535861 containerd[1522]: 2025-09-10 23:54:37.530 [INFO][4830] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" Namespace="kube-system" Pod="coredns-7c65d6cfc9-pw9tk" WorkloadEndpoint="ci--4372--1--0--n--c06092ab73-k8s-coredns--7c65d6cfc9--pw9tk-eth0" Sep 10 23:54:37.546708 systemd-networkd[1425]: calid6f21d9e90e: Gained IPv6LL Sep 10 23:54:37.569697 containerd[1522]: time="2025-09-10T23:54:37.569439743Z" level=info msg="connecting to shim 8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e" address="unix:///run/containerd/s/53f015f4f07dd372f3e65ea3ecbb3cec76a1cc7cb7cee28db038565a47c24e5f" namespace=k8s.io protocol=ttrpc version=3 Sep 10 23:54:37.618536 systemd[1]: Started cri-containerd-8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e.scope - libcontainer container 8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e. 
Sep 10 23:54:37.628564 kubelet[2743]: I0910 23:54:37.628465 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-rdn9k" podStartSLOduration=41.628446158 podStartE2EDuration="41.628446158s" podCreationTimestamp="2025-09-10 23:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 23:54:37.625139252 +0000 UTC m=+47.415204428" watchObservedRunningTime="2025-09-10 23:54:37.628446158 +0000 UTC m=+47.418511334" Sep 10 23:54:37.704173 containerd[1522]: time="2025-09-10T23:54:37.704135511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-pw9tk,Uid:de82da2e-d798-4452-8450-fbfaa1f65b47,Namespace:kube-system,Attempt:0,} returns sandbox id \"8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e\"" Sep 10 23:54:37.711224 containerd[1522]: time="2025-09-10T23:54:37.711079748Z" level=info msg="CreateContainer within sandbox \"8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Sep 10 23:54:37.737589 containerd[1522]: time="2025-09-10T23:54:37.734178921Z" level=info msg="Container e8fd45aa13095389e94bbf86c7c5558d3bda32df3c637d8e2e134784c3afd43d: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:54:37.739289 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount446582743.mount: Deactivated successfully. 
Sep 10 23:54:37.747066 containerd[1522]: time="2025-09-10T23:54:37.747022432Z" level=info msg="CreateContainer within sandbox \"8a31451f7dd160a402c98d10045a7999ead001052c062c9f49ff5f04ac18ac1e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e8fd45aa13095389e94bbf86c7c5558d3bda32df3c637d8e2e134784c3afd43d\""
Sep 10 23:54:37.748139 containerd[1522]: time="2025-09-10T23:54:37.748109919Z" level=info msg="StartContainer for \"e8fd45aa13095389e94bbf86c7c5558d3bda32df3c637d8e2e134784c3afd43d\""
Sep 10 23:54:37.749831 containerd[1522]: time="2025-09-10T23:54:37.749685005Z" level=info msg="connecting to shim e8fd45aa13095389e94bbf86c7c5558d3bda32df3c637d8e2e134784c3afd43d" address="unix:///run/containerd/s/53f015f4f07dd372f3e65ea3ecbb3cec76a1cc7cb7cee28db038565a47c24e5f" protocol=ttrpc version=3
Sep 10 23:54:37.772650 systemd[1]: Started cri-containerd-e8fd45aa13095389e94bbf86c7c5558d3bda32df3c637d8e2e134784c3afd43d.scope - libcontainer container e8fd45aa13095389e94bbf86c7c5558d3bda32df3c637d8e2e134784c3afd43d.
Sep 10 23:54:37.812907 containerd[1522]: time="2025-09-10T23:54:37.812871035Z" level=info msg="StartContainer for \"e8fd45aa13095389e94bbf86c7c5558d3bda32df3c637d8e2e134784c3afd43d\" returns successfully"
Sep 10 23:54:38.379490 systemd-networkd[1425]: cali3cf2af56384: Gained IPv6LL
Sep 10 23:54:38.684020 kubelet[2743]: I0910 23:54:38.682789 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-pw9tk" podStartSLOduration=42.682765196 podStartE2EDuration="42.682765196s" podCreationTimestamp="2025-09-10 23:53:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-10 23:54:38.655767447 +0000 UTC m=+48.445832623" watchObservedRunningTime="2025-09-10 23:54:38.682765196 +0000 UTC m=+48.472830412"
Sep 10 23:54:38.820650 containerd[1522]: time="2025-09-10T23:54:38.820579886Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:38.822157 containerd[1522]: time="2025-09-10T23:54:38.821869908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=44530807"
Sep 10 23:54:38.823075 containerd[1522]: time="2025-09-10T23:54:38.823036361Z" level=info msg="ImageCreate event name:\"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:38.825815 containerd[1522]: time="2025-09-10T23:54:38.825764058Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:38.826815 containerd[1522]: time="2025-09-10T23:54:38.826775859Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 2.980324263s"
Sep 10 23:54:38.826815 containerd[1522]: time="2025-09-10T23:54:38.826812662Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 10 23:54:38.828394 containerd[1522]: time="2025-09-10T23:54:38.828370106Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\""
Sep 10 23:54:38.830226 containerd[1522]: time="2025-09-10T23:54:38.829820261Z" level=info msg="CreateContainer within sandbox \"1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 10 23:54:38.843601 containerd[1522]: time="2025-09-10T23:54:38.843471148Z" level=info msg="Container 09a65b5db0f054744ae5a952833972ba76ea34bb358e9721d44e10d5b0a147a9: CDI devices from CRI Config.CDIDevices: []"
Sep 10 23:54:38.847637 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1817656084.mount: Deactivated successfully.
Sep 10 23:54:38.858902 containerd[1522]: time="2025-09-10T23:54:38.858763365Z" level=info msg="CreateContainer within sandbox \"1b22884b623d6cf998ab73f8b6886c02fe8bc460966340a6db16569798b9d66c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"09a65b5db0f054744ae5a952833972ba76ea34bb358e9721d44e10d5b0a147a9\""
Sep 10 23:54:38.860223 containerd[1522]: time="2025-09-10T23:54:38.859623473Z" level=info msg="StartContainer for \"09a65b5db0f054744ae5a952833972ba76ea34bb358e9721d44e10d5b0a147a9\""
Sep 10 23:54:38.861554 containerd[1522]: time="2025-09-10T23:54:38.861523745Z" level=info msg="connecting to shim 09a65b5db0f054744ae5a952833972ba76ea34bb358e9721d44e10d5b0a147a9" address="unix:///run/containerd/s/4d2ffc0259e68840072cbb4a3b0798c70c5794eae6a044f83160a4024efeaf69" protocol=ttrpc version=3
Sep 10 23:54:38.886584 systemd[1]: Started cri-containerd-09a65b5db0f054744ae5a952833972ba76ea34bb358e9721d44e10d5b0a147a9.scope - libcontainer container 09a65b5db0f054744ae5a952833972ba76ea34bb358e9721d44e10d5b0a147a9.
Sep 10 23:54:38.890662 systemd-networkd[1425]: cali0433fba7bfb: Gained IPv6LL
Sep 10 23:54:38.938735 containerd[1522]: time="2025-09-10T23:54:38.938553716Z" level=info msg="StartContainer for \"09a65b5db0f054744ae5a952833972ba76ea34bb358e9721d44e10d5b0a147a9\" returns successfully"
Sep 10 23:54:39.213096 containerd[1522]: time="2025-09-10T23:54:39.212995873Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:39.214926 containerd[1522]: time="2025-09-10T23:54:39.214886663Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.3: active requests=0, bytes read=77"
Sep 10 23:54:39.216921 containerd[1522]: time="2025-09-10T23:54:39.216882953Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" with image id \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:6a24147f11c1edce9d6ba79bdb0c2beadec53853fb43438a287291e67b41e51b\", size \"45900064\" in 388.368475ms"
Sep 10 23:54:39.216993 containerd[1522]: time="2025-09-10T23:54:39.216950757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.3\" returns image reference \"sha256:632fbde00b1918016ac07458e79cc438ccda83cb762bfd5fc50a26721abced08\""
Sep 10 23:54:39.218677 containerd[1522]: time="2025-09-10T23:54:39.218635703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\""
Sep 10 23:54:39.220112 containerd[1522]: time="2025-09-10T23:54:39.220083874Z" level=info msg="CreateContainer within sandbox \"b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
Sep 10 23:54:39.236808 containerd[1522]: time="2025-09-10T23:54:39.236761959Z" level=info msg="Container b261d4052cf785b83e2460329752626fc73037ce2a0559616447a07700babca3: CDI devices from CRI Config.CDIDevices: []"
Sep 10 23:54:39.249941 containerd[1522]: time="2025-09-10T23:54:39.249894663Z" level=info msg="CreateContainer within sandbox \"b8b41b7f20cd5d4db3700fabcf147c0a7c4e7c1b1e2b09fb697d22bdd82d7912\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b261d4052cf785b83e2460329752626fc73037ce2a0559616447a07700babca3\""
Sep 10 23:54:39.252060 containerd[1522]: time="2025-09-10T23:54:39.250785759Z" level=info msg="StartContainer for \"b261d4052cf785b83e2460329752626fc73037ce2a0559616447a07700babca3\""
Sep 10 23:54:39.253676 containerd[1522]: time="2025-09-10T23:54:39.253494289Z" level=info msg="connecting to shim b261d4052cf785b83e2460329752626fc73037ce2a0559616447a07700babca3" address="unix:///run/containerd/s/6d8d41a5bdd674f10df39659a739d244e7701b11df35ad8fd8fc7b113546512f" protocol=ttrpc version=3
Sep 10 23:54:39.282406 systemd[1]: Started cri-containerd-b261d4052cf785b83e2460329752626fc73037ce2a0559616447a07700babca3.scope - libcontainer container b261d4052cf785b83e2460329752626fc73037ce2a0559616447a07700babca3.
Sep 10 23:54:39.350881 containerd[1522]: time="2025-09-10T23:54:39.350836153Z" level=info msg="StartContainer for \"b261d4052cf785b83e2460329752626fc73037ce2a0559616447a07700babca3\" returns successfully"
Sep 10 23:54:39.664245 kubelet[2743]: I0910 23:54:39.663671 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cd5495d6f-mbcq8" podStartSLOduration=26.490955578 podStartE2EDuration="31.66364957s" podCreationTimestamp="2025-09-10 23:54:08 +0000 UTC" firstStartedPulling="2025-09-10 23:54:33.655519821 +0000 UTC m=+43.445584997" lastFinishedPulling="2025-09-10 23:54:38.828213733 +0000 UTC m=+48.618278989" observedRunningTime="2025-09-10 23:54:39.663563884 +0000 UTC m=+49.453629060" watchObservedRunningTime="2025-09-10 23:54:39.66364957 +0000 UTC m=+49.453714746"
Sep 10 23:54:40.645665 kubelet[2743]: I0910 23:54:40.644598 2743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 10 23:54:40.647204 kubelet[2743]: I0910 23:54:40.646863 2743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 10 23:54:41.700079 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3245925737.mount: Deactivated successfully.
Sep 10 23:54:42.104949 containerd[1522]: time="2025-09-10T23:54:42.104870466Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:42.106244 containerd[1522]: time="2025-09-10T23:54:42.106124340Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.3: active requests=0, bytes read=61845332"
Sep 10 23:54:42.107321 containerd[1522]: time="2025-09-10T23:54:42.107240464Z" level=info msg="ImageCreate event name:\"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:42.110689 containerd[1522]: time="2025-09-10T23:54:42.110365970Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:42.111127 containerd[1522]: time="2025-09-10T23:54:42.111097439Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" with image id \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:46297703ab3739331a00a58f0d6a5498c8d3b6523ad947eed68592ee0f3e79f0\", size \"61845178\" in 2.892423615s"
Sep 10 23:54:42.111230 containerd[1522]: time="2025-09-10T23:54:42.111130557Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.3\" returns image reference \"sha256:14088376331a0622b7f6a2fbc2f2932806a6eafdd7b602f6139d3b985bf1e685\""
Sep 10 23:54:42.113286 containerd[1522]: time="2025-09-10T23:54:42.112437388Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\""
Sep 10 23:54:42.115287 containerd[1522]: time="2025-09-10T23:54:42.114485447Z" level=info msg="CreateContainer within sandbox \"f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}"
Sep 10 23:54:42.124412 containerd[1522]: time="2025-09-10T23:54:42.124368369Z" level=info msg="Container 3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22: CDI devices from CRI Config.CDIDevices: []"
Sep 10 23:54:42.144958 containerd[1522]: time="2025-09-10T23:54:42.144915560Z" level=info msg="CreateContainer within sandbox \"f4e61518e6a045729882683dd967c26a5724df5586e2dde4a13fc456b84fce45\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\""
Sep 10 23:54:42.146256 containerd[1522]: time="2025-09-10T23:54:42.145952449Z" level=info msg="StartContainer for \"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\""
Sep 10 23:54:42.149663 containerd[1522]: time="2025-09-10T23:54:42.149589320Z" level=info msg="connecting to shim 3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22" address="unix:///run/containerd/s/4f86a4e9185fd1879ee19495be40dde6e202a10270e73dc6507232a5a345b108" protocol=ttrpc version=3
Sep 10 23:54:42.173419 systemd[1]: Started cri-containerd-3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22.scope - libcontainer container 3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22.
Sep 10 23:54:42.225846 containerd[1522]: time="2025-09-10T23:54:42.225815532Z" level=info msg="StartContainer for \"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" returns successfully"
Sep 10 23:54:42.678003 kubelet[2743]: I0910 23:54:42.677557 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-7988f88666-hswkn" podStartSLOduration=23.393352397 podStartE2EDuration="30.677534994s" podCreationTimestamp="2025-09-10 23:54:12 +0000 UTC" firstStartedPulling="2025-09-10 23:54:34.827861777 +0000 UTC m=+44.617926953" lastFinishedPulling="2025-09-10 23:54:42.112044335 +0000 UTC m=+51.902109550" observedRunningTime="2025-09-10 23:54:42.676016098 +0000 UTC m=+52.466081394" watchObservedRunningTime="2025-09-10 23:54:42.677534994 +0000 UTC m=+52.467600170"
Sep 10 23:54:42.679533 kubelet[2743]: I0910 23:54:42.678089 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cd5495d6f-tcd74" podStartSLOduration=30.236613958 podStartE2EDuration="34.678078316s" podCreationTimestamp="2025-09-10 23:54:08 +0000 UTC" firstStartedPulling="2025-09-10 23:54:34.776497663 +0000 UTC m=+44.566562839" lastFinishedPulling="2025-09-10 23:54:39.217962021 +0000 UTC m=+49.008027197" observedRunningTime="2025-09-10 23:54:39.684719171 +0000 UTC m=+49.474784347" watchObservedRunningTime="2025-09-10 23:54:42.678078316 +0000 UTC m=+52.468143532"
Sep 10 23:54:42.773982 containerd[1522]: time="2025-09-10T23:54:42.773886066Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"0901b5fef29d6ef54f0fc548ebe3d1f9f68b0867584566f349169082ea1f6723\" pid:5096 exit_status:1 exited_at:{seconds:1757548482 nanos:772841418}"
Sep 10 23:54:43.562234 containerd[1522]: time="2025-09-10T23:54:43.562164951Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:43.563605 containerd[1522]: time="2025-09-10T23:54:43.563286638Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.3: active requests=0, bytes read=8227489"
Sep 10 23:54:43.564409 containerd[1522]: time="2025-09-10T23:54:43.564356249Z" level=info msg="ImageCreate event name:\"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:43.567702 containerd[1522]: time="2025-09-10T23:54:43.567613758Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:43.569137 containerd[1522]: time="2025-09-10T23:54:43.568744245Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.3\" with image id \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:f22c88018d8b58c4ef0052f594b216a13bd6852166ac131a538c5ab2fba23bb2\", size \"9596730\" in 1.456271979s"
Sep 10 23:54:43.569137 containerd[1522]: time="2025-09-10T23:54:43.568777922Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.3\" returns image reference \"sha256:5e2b30128ce4b607acd97d3edef62ce1a90be0259903090a51c360adbe4a8f3b\""
Sep 10 23:54:43.573293 containerd[1522]: time="2025-09-10T23:54:43.573262152Z" level=info msg="CreateContainer within sandbox \"dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
Sep 10 23:54:43.594858 containerd[1522]: time="2025-09-10T23:54:43.594314829Z" level=info msg="Container 9dad47d8eda04f0a10c9d84f31233a4fb5f0ded802e3bcd292e78f1d90b33b33: CDI devices from CRI Config.CDIDevices: []"
Sep 10 23:54:43.613312 containerd[1522]: time="2025-09-10T23:54:43.613266721Z" level=info msg="CreateContainer within sandbox \"dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9dad47d8eda04f0a10c9d84f31233a4fb5f0ded802e3bcd292e78f1d90b33b33\""
Sep 10 23:54:43.614181 containerd[1522]: time="2025-09-10T23:54:43.614108707Z" level=info msg="StartContainer for \"9dad47d8eda04f0a10c9d84f31233a4fb5f0ded802e3bcd292e78f1d90b33b33\""
Sep 10 23:54:43.617320 containerd[1522]: time="2025-09-10T23:54:43.617293021Z" level=info msg="connecting to shim 9dad47d8eda04f0a10c9d84f31233a4fb5f0ded802e3bcd292e78f1d90b33b33" address="unix:///run/containerd/s/fa44cd799f7766bd4c090ca6f8e743a6450a07035da74b9e7695dbcd50522cc7" protocol=ttrpc version=3
Sep 10 23:54:43.641433 systemd[1]: Started cri-containerd-9dad47d8eda04f0a10c9d84f31233a4fb5f0ded802e3bcd292e78f1d90b33b33.scope - libcontainer container 9dad47d8eda04f0a10c9d84f31233a4fb5f0ded802e3bcd292e78f1d90b33b33.
Sep 10 23:54:43.708383 containerd[1522]: time="2025-09-10T23:54:43.708313607Z" level=info msg="StartContainer for \"9dad47d8eda04f0a10c9d84f31233a4fb5f0ded802e3bcd292e78f1d90b33b33\" returns successfully"
Sep 10 23:54:43.711514 containerd[1522]: time="2025-09-10T23:54:43.711267375Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\""
Sep 10 23:54:43.766699 containerd[1522]: time="2025-09-10T23:54:43.766615271Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"b26cb8eea867383d075ff7cc8745ceca526cf90d31e2c0ecd8c4f6950f4b63bc\" pid:5147 exit_status:1 exited_at:{seconds:1757548483 nanos:766244895}"
Sep 10 23:54:44.000850 kubelet[2743]: I0910 23:54:44.000777 2743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 10 23:54:45.494872 containerd[1522]: time="2025-09-10T23:54:45.494647067Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:45.498090 containerd[1522]: time="2025-09-10T23:54:45.498029353Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3: active requests=0, bytes read=13761208"
Sep 10 23:54:45.499681 containerd[1522]: time="2025-09-10T23:54:45.499604262Z" level=info msg="ImageCreate event name:\"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:45.504911 containerd[1522]: time="2025-09-10T23:54:45.504868880Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Sep 10 23:54:45.507075 containerd[1522]: time="2025-09-10T23:54:45.506837487Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" with image id \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:731ab232ca708102ab332340b1274d5cd656aa896ecc5368ee95850b811df86f\", size \"15130401\" in 1.795530274s"
Sep 10 23:54:45.507075 containerd[1522]: time="2025-09-10T23:54:45.506881404Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.3\" returns image reference \"sha256:a319b5bdc1001e98875b68e2943279adb74bcb19d09f1db857bc27959a078a65\""
Sep 10 23:54:45.525077 containerd[1522]: time="2025-09-10T23:54:45.524915928Z" level=info msg="CreateContainer within sandbox \"dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
Sep 10 23:54:45.574616 containerd[1522]: time="2025-09-10T23:54:45.574563955Z" level=info msg="Container 9f9654af1f09b9211ba15710ea24e8d2f52f550dd940e1dc62641ff1e6e2c677: CDI devices from CRI Config.CDIDevices: []"
Sep 10 23:54:45.590940 containerd[1522]: time="2025-09-10T23:54:45.590860018Z" level=info msg="CreateContainer within sandbox \"dce484bb0cd01dc59be3a6d6250476ec9a737909d4172a20276e900d9e7d7171\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9f9654af1f09b9211ba15710ea24e8d2f52f550dd940e1dc62641ff1e6e2c677\""
Sep 10 23:54:45.592354 containerd[1522]: time="2025-09-10T23:54:45.592234819Z" level=info msg="StartContainer for \"9f9654af1f09b9211ba15710ea24e8d2f52f550dd940e1dc62641ff1e6e2c677\""
Sep 10 23:54:45.594651 containerd[1522]: time="2025-09-10T23:54:45.594615762Z" level=info msg="connecting to shim 9f9654af1f09b9211ba15710ea24e8d2f52f550dd940e1dc62641ff1e6e2c677" address="unix:///run/containerd/s/fa44cd799f7766bd4c090ca6f8e743a6450a07035da74b9e7695dbcd50522cc7" protocol=ttrpc version=3
Sep 10 23:54:45.619727 systemd[1]: Started cri-containerd-9f9654af1f09b9211ba15710ea24e8d2f52f550dd940e1dc62641ff1e6e2c677.scope - libcontainer container 9f9654af1f09b9211ba15710ea24e8d2f52f550dd940e1dc62641ff1e6e2c677.
Sep 10 23:54:45.667268 containerd[1522]: time="2025-09-10T23:54:45.667166953Z" level=info msg="StartContainer for \"9f9654af1f09b9211ba15710ea24e8d2f52f550dd940e1dc62641ff1e6e2c677\" returns successfully"
Sep 10 23:54:45.706575 kubelet[2743]: I0910 23:54:45.706290 2743 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-px2tb" podStartSLOduration=23.884864545 podStartE2EDuration="33.706267386s" podCreationTimestamp="2025-09-10 23:54:12 +0000 UTC" firstStartedPulling="2025-09-10 23:54:35.689433136 +0000 UTC m=+45.479498312" lastFinishedPulling="2025-09-10 23:54:45.510835977 +0000 UTC m=+55.300901153" observedRunningTime="2025-09-10 23:54:45.705778694 +0000 UTC m=+55.495843910" watchObservedRunningTime="2025-09-10 23:54:45.706267386 +0000 UTC m=+55.496332562"
Sep 10 23:54:46.461808 kubelet[2743]: I0910 23:54:46.461351 2743 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
Sep 10 23:54:46.461808 kubelet[2743]: I0910 23:54:46.461425 2743 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
Sep 10 23:54:50.993175 containerd[1522]: time="2025-09-10T23:54:50.993118122Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"7ace80beeab74ddd4ff0df9a34361ea52e1c2b8aa3a82d126d803da89f423526\" pid:5227 exited_at:{seconds:1757548490 nanos:992832694}"
Sep 10 23:54:52.555991 containerd[1522]: time="2025-09-10T23:54:52.555897615Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"dbad8ac4ee02225b3d632ca6a760c892a2ad15126855dd46f5de60bd8646f712\" pid:5248 exited_at:{seconds:1757548492 nanos:555344955}"
Sep 10 23:54:54.296389 containerd[1522]: time="2025-09-10T23:54:54.296345520Z" level=info msg="TaskExit event in podsandbox handler container_id:\"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" id:\"f72743adfea367c8b772d20e2f8e29415eb93d320e091d937de654d9da1607e2\" pid:5272 exited_at:{seconds:1757548494 nanos:296042809}"
Sep 10 23:55:09.553234 kubelet[2743]: I0910 23:55:09.553130 2743 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 10 23:55:14.906285 containerd[1522]: time="2025-09-10T23:55:14.905882849Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"dbc28cdc91716792e052c9e5c0854d67c5a330d5972cb31663401c3f1bbda43c\" pid:5310 exited_at:{seconds:1757548514 nanos:905211722}"
Sep 10 23:55:17.453704 containerd[1522]: time="2025-09-10T23:55:17.453655501Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"70130736bd17f4bfd0d931b70b10b59b72b48dc0987602f9e45f940fc3fdf0cb\" pid:5333 exited_at:{seconds:1757548517 nanos:453373337}"
Sep 10 23:55:20.998600 containerd[1522]: time="2025-09-10T23:55:20.998484837Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"3b0043bab6340b7e6c278daa6335c8515fcd7817ebb1947774b968367722538a\" pid:5358 exited_at:{seconds:1757548520 nanos:997455540}"
Sep 10 23:55:22.592990 containerd[1522]: time="2025-09-10T23:55:22.592936327Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"a2003d0d593a9b06c92e4fad7be093106560c50ffc92d3c95b1a23ad35bc1175\" pid:5377 exited_at:{seconds:1757548522 nanos:592079510}"
Sep 10 23:55:24.300758 containerd[1522]: time="2025-09-10T23:55:24.300711672Z" level=info msg="TaskExit event in podsandbox handler container_id:\"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" id:\"389d5e336ec029f159c8f70a2ba8a339a850c22290b36fd77f57c54649b00aa4\" pid:5401 exited_at:{seconds:1757548524 nanos:300421385}"
Sep 10 23:55:27.299365 update_engine[1487]: I20250910 23:55:27.299285 1487 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
Sep 10 23:55:27.299365 update_engine[1487]: I20250910 23:55:27.299358 1487 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
Sep 10 23:55:27.299918 update_engine[1487]: I20250910 23:55:27.299664 1487 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
Sep 10 23:55:27.301656 update_engine[1487]: I20250910 23:55:27.301581 1487 omaha_request_params.cc:62] Current group set to beta
Sep 10 23:55:27.304396 update_engine[1487]: I20250910 23:55:27.304227 1487 update_attempter.cc:499] Already updated boot flags. Skipping.
Sep 10 23:55:27.304396 update_engine[1487]: I20250910 23:55:27.304265 1487 update_attempter.cc:643] Scheduling an action processor start.
Sep 10 23:55:27.304396 update_engine[1487]: I20250910 23:55:27.304286 1487 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 10 23:55:27.316059 update_engine[1487]: I20250910 23:55:27.314318 1487 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Sep 10 23:55:27.316059 update_engine[1487]: I20250910 23:55:27.314549 1487 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 10 23:55:27.316059 update_engine[1487]: I20250910 23:55:27.314560 1487 omaha_request_action.cc:272] Request: Sep 10 23:55:27.316059 update_engine[1487]: Sep 10 23:55:27.316059 update_engine[1487]: Sep 10 23:55:27.316059 update_engine[1487]: Sep 10 23:55:27.316059 update_engine[1487]: Sep 10 23:55:27.316059 update_engine[1487]: Sep 10 23:55:27.316059 update_engine[1487]: Sep 10 23:55:27.316059 update_engine[1487]: Sep 10 23:55:27.316059 update_engine[1487]: Sep 10 23:55:27.316059 update_engine[1487]: I20250910 23:55:27.314567 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 10 23:55:27.321201 update_engine[1487]: I20250910 23:55:27.320843 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 10 23:55:27.322914 update_engine[1487]: I20250910 23:55:27.322868 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 10 23:55:27.323591 locksmithd[1526]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Sep 10 23:55:27.323957 update_engine[1487]: E20250910 23:55:27.323913 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 10 23:55:27.324006 update_engine[1487]: I20250910 23:55:27.323990 1487 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Sep 10 23:55:37.267643 update_engine[1487]: I20250910 23:55:37.267235 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 10 23:55:37.267643 update_engine[1487]: I20250910 23:55:37.267470 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 10 23:55:37.268416 update_engine[1487]: I20250910 23:55:37.268350 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Sep 10 23:55:37.268903 update_engine[1487]: E20250910 23:55:37.268853 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 10 23:55:37.268959 update_engine[1487]: I20250910 23:55:37.268929 1487 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Sep 10 23:55:47.266512 update_engine[1487]: I20250910 23:55:47.266422 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 10 23:55:47.266943 update_engine[1487]: I20250910 23:55:47.266734 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 10 23:55:47.267215 update_engine[1487]: I20250910 23:55:47.267156 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 10 23:55:47.268097 update_engine[1487]: E20250910 23:55:47.267639 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 10 23:55:47.268097 update_engine[1487]: I20250910 23:55:47.267729 1487 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Sep 10 23:55:50.995358 containerd[1522]: time="2025-09-10T23:55:50.995296320Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"e7177ab63163d7f23db0cab89e56edec3825205495df41316e9f2d5912ca2b37\" pid:5436 exited_at:{seconds:1757548550 nanos:994502369}" Sep 10 23:55:52.552511 containerd[1522]: time="2025-09-10T23:55:52.552457612Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"23dabb6d5c24058c724973eb5f603248ee6e8ee0ad377c7a5c8f4ad04d7a0a2d\" pid:5458 exited_at:{seconds:1757548552 nanos:551507773}" Sep 10 23:55:54.289984 containerd[1522]: time="2025-09-10T23:55:54.289941082Z" level=info msg="TaskExit event in podsandbox handler container_id:\"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" id:\"5f6ccf65eb9e0e3131211146da8a7faea28aa6a8f0e6653a46bf846cded9f824\" pid:5480 exited_at:{seconds:1757548554 nanos:289441022}" Sep 10 23:55:57.266494 update_engine[1487]: I20250910 23:55:57.266377 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 10 23:55:57.267258 update_engine[1487]: I20250910 23:55:57.266762 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 10 23:55:57.267611 update_engine[1487]: I20250910 23:55:57.267531 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 10 23:55:57.267828 update_engine[1487]: E20250910 23:55:57.267788 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 10 23:55:57.267923 update_engine[1487]: I20250910 23:55:57.267858 1487 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 10 23:55:57.267923 update_engine[1487]: I20250910 23:55:57.267874 1487 omaha_request_action.cc:617] Omaha request response: Sep 10 23:55:57.268041 update_engine[1487]: E20250910 23:55:57.267982 1487 omaha_request_action.cc:636] Omaha request network transfer failed. Sep 10 23:55:57.268041 update_engine[1487]: I20250910 23:55:57.268005 1487 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Sep 10 23:55:57.268041 update_engine[1487]: I20250910 23:55:57.268015 1487 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 10 23:55:57.268041 update_engine[1487]: I20250910 23:55:57.268024 1487 update_attempter.cc:306] Processing Done. Sep 10 23:55:57.268340 update_engine[1487]: E20250910 23:55:57.268044 1487 update_attempter.cc:619] Update failed. Sep 10 23:55:57.268340 update_engine[1487]: I20250910 23:55:57.268054 1487 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Sep 10 23:55:57.268340 update_engine[1487]: I20250910 23:55:57.268062 1487 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Sep 10 23:55:57.268340 update_engine[1487]: I20250910 23:55:57.268072 1487 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
Sep 10 23:55:57.268562 update_engine[1487]: I20250910 23:55:57.268512 1487 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Sep 10 23:55:57.268622 update_engine[1487]: I20250910 23:55:57.268572 1487 omaha_request_action.cc:271] Posting an Omaha request to disabled Sep 10 23:55:57.268622 update_engine[1487]: I20250910 23:55:57.268589 1487 omaha_request_action.cc:272] Request: Sep 10 23:55:57.268622 update_engine[1487]: Sep 10 23:55:57.268622 update_engine[1487]: Sep 10 23:55:57.268622 update_engine[1487]: Sep 10 23:55:57.268622 update_engine[1487]: Sep 10 23:55:57.268622 update_engine[1487]: Sep 10 23:55:57.268622 update_engine[1487]: Sep 10 23:55:57.268622 update_engine[1487]: I20250910 23:55:57.268599 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Sep 10 23:55:57.269010 update_engine[1487]: I20250910 23:55:57.268850 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Sep 10 23:55:57.269427 update_engine[1487]: I20250910 23:55:57.269234 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Sep 10 23:55:57.269522 locksmithd[1526]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Sep 10 23:55:57.269951 update_engine[1487]: E20250910 23:55:57.269519 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Sep 10 23:55:57.269951 update_engine[1487]: I20250910 23:55:57.269578 1487 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Sep 10 23:55:57.269951 update_engine[1487]: I20250910 23:55:57.269590 1487 omaha_request_action.cc:617] Omaha request response: Sep 10 23:55:57.269951 update_engine[1487]: I20250910 23:55:57.269601 1487 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 10 23:55:57.269951 update_engine[1487]: I20250910 23:55:57.269610 1487 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Sep 10 23:55:57.269951 update_engine[1487]: I20250910 23:55:57.269619 1487 update_attempter.cc:306] Processing Done. Sep 10 23:55:57.269951 update_engine[1487]: I20250910 23:55:57.269629 1487 update_attempter.cc:310] Error event sent. 
Sep 10 23:55:57.269951 update_engine[1487]: I20250910 23:55:57.269642 1487 update_check_scheduler.cc:74] Next update check in 45m44s Sep 10 23:55:57.270435 locksmithd[1526]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Sep 10 23:56:14.855299 containerd[1522]: time="2025-09-10T23:56:14.855229129Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"7e7a7408b1a31767ae5eadf61945e059b07d22a7818bc39a55c8969fd7eb2899\" pid:5532 exited_at:{seconds:1757548574 nanos:854872272}" Sep 10 23:56:17.444411 containerd[1522]: time="2025-09-10T23:56:17.444373888Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"e6d90fd416fcb8270d44a207fa1be53f74802c0e7813cdf28632ea022694a36c\" pid:5555 exited_at:{seconds:1757548577 nanos:443908866}" Sep 10 23:56:21.000900 containerd[1522]: time="2025-09-10T23:56:21.000827578Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"bdf65192cb56154f9d618eddb6dfbbbe0cba65bae12fd1b695352fadbc6fda78\" pid:5576 exited_at:{seconds:1757548581 nanos:349835}" Sep 10 23:56:22.561553 containerd[1522]: time="2025-09-10T23:56:22.561319931Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"e3233ebab52f89b907f28b07f9d0799b8b5483bccca95c5471ee2646a28053b1\" pid:5597 exited_at:{seconds:1757548582 nanos:560654178}" Sep 10 23:56:24.291811 containerd[1522]: time="2025-09-10T23:56:24.291714196Z" level=info msg="TaskExit event in podsandbox handler container_id:\"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" id:\"c995f410f0c70ab18ef17a2fbb5edee86e17930f18c86547f2f0404465e888f4\" pid:5621 exited_at:{seconds:1757548584 nanos:291072925}"
Sep 10 23:56:27.892723 systemd[1]: Started sshd@8-91.107.201.216:22-139.178.89.65:38588.service - OpenSSH per-connection server daemon (139.178.89.65:38588). Sep 10 23:56:28.909580 sshd[5639]: Accepted publickey for core from 139.178.89.65 port 38588 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:56:28.911949 sshd-session[5639]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:56:28.919701 systemd-logind[1483]: New session 8 of user core. Sep 10 23:56:28.926471 systemd[1]: Started session-8.scope - Session 8 of User core. Sep 10 23:56:29.689292 sshd[5641]: Connection closed by 139.178.89.65 port 38588 Sep 10 23:56:29.690119 sshd-session[5639]: pam_unix(sshd:session): session closed for user core Sep 10 23:56:29.695932 systemd[1]: sshd@8-91.107.201.216:22-139.178.89.65:38588.service: Deactivated successfully. Sep 10 23:56:29.698351 systemd[1]: session-8.scope: Deactivated successfully. Sep 10 23:56:29.701979 systemd-logind[1483]: Session 8 logged out. Waiting for processes to exit. Sep 10 23:56:29.703943 systemd-logind[1483]: Removed session 8. Sep 10 23:56:34.871121 systemd[1]: Started sshd@9-91.107.201.216:22-139.178.89.65:48636.service - OpenSSH per-connection server daemon (139.178.89.65:48636). Sep 10 23:56:35.929570 sshd[5654]: Accepted publickey for core from 139.178.89.65 port 48636 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:56:35.933000 sshd-session[5654]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:56:35.938796 systemd-logind[1483]: New session 9 of user core. Sep 10 23:56:35.944624 systemd[1]: Started session-9.scope - Session 9 of User core. Sep 10 23:56:36.746228 sshd[5656]: Connection closed by 139.178.89.65 port 48636 Sep 10 23:56:36.747368 sshd-session[5654]: pam_unix(sshd:session): session closed for user core Sep 10 23:56:36.752465 systemd-logind[1483]: Session 9 logged out. Waiting for processes to exit.
Sep 10 23:56:36.752654 systemd[1]: sshd@9-91.107.201.216:22-139.178.89.65:48636.service: Deactivated successfully. Sep 10 23:56:36.755934 systemd[1]: session-9.scope: Deactivated successfully. Sep 10 23:56:36.759586 systemd-logind[1483]: Removed session 9. Sep 10 23:56:41.921411 systemd[1]: Started sshd@10-91.107.201.216:22-139.178.89.65:39296.service - OpenSSH per-connection server daemon (139.178.89.65:39296). Sep 10 23:56:42.918141 sshd[5669]: Accepted publickey for core from 139.178.89.65 port 39296 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:56:42.920629 sshd-session[5669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:56:42.927432 systemd-logind[1483]: New session 10 of user core. Sep 10 23:56:42.935419 systemd[1]: Started session-10.scope - Session 10 of User core. Sep 10 23:56:43.676512 sshd[5671]: Connection closed by 139.178.89.65 port 39296 Sep 10 23:56:43.678427 sshd-session[5669]: pam_unix(sshd:session): session closed for user core Sep 10 23:56:43.683857 systemd-logind[1483]: Session 10 logged out. Waiting for processes to exit. Sep 10 23:56:43.684507 systemd[1]: sshd@10-91.107.201.216:22-139.178.89.65:39296.service: Deactivated successfully. Sep 10 23:56:43.688487 systemd[1]: session-10.scope: Deactivated successfully. Sep 10 23:56:43.690579 systemd-logind[1483]: Removed session 10. Sep 10 23:56:43.848776 systemd[1]: Started sshd@11-91.107.201.216:22-139.178.89.65:39298.service - OpenSSH per-connection server daemon (139.178.89.65:39298). Sep 10 23:56:44.849982 sshd[5684]: Accepted publickey for core from 139.178.89.65 port 39298 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:56:44.851920 sshd-session[5684]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:56:44.858016 systemd-logind[1483]: New session 11 of user core. Sep 10 23:56:44.867512 systemd[1]: Started session-11.scope - Session 11 of User core. 
Sep 10 23:56:45.659680 sshd[5686]: Connection closed by 139.178.89.65 port 39298 Sep 10 23:56:45.661940 sshd-session[5684]: pam_unix(sshd:session): session closed for user core Sep 10 23:56:45.669858 systemd[1]: sshd@11-91.107.201.216:22-139.178.89.65:39298.service: Deactivated successfully. Sep 10 23:56:45.670156 systemd-logind[1483]: Session 11 logged out. Waiting for processes to exit. Sep 10 23:56:45.673383 systemd[1]: session-11.scope: Deactivated successfully. Sep 10 23:56:45.676408 systemd-logind[1483]: Removed session 11. Sep 10 23:56:45.833380 systemd[1]: Started sshd@12-91.107.201.216:22-139.178.89.65:39314.service - OpenSSH per-connection server daemon (139.178.89.65:39314). Sep 10 23:56:46.831407 sshd[5702]: Accepted publickey for core from 139.178.89.65 port 39314 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:56:46.833672 sshd-session[5702]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:56:46.839511 systemd-logind[1483]: New session 12 of user core. Sep 10 23:56:46.847498 systemd[1]: Started session-12.scope - Session 12 of User core. Sep 10 23:56:47.592796 sshd[5704]: Connection closed by 139.178.89.65 port 39314 Sep 10 23:56:47.593362 sshd-session[5702]: pam_unix(sshd:session): session closed for user core Sep 10 23:56:47.598936 systemd[1]: sshd@12-91.107.201.216:22-139.178.89.65:39314.service: Deactivated successfully. Sep 10 23:56:47.602868 systemd[1]: session-12.scope: Deactivated successfully. Sep 10 23:56:47.604115 systemd-logind[1483]: Session 12 logged out. Waiting for processes to exit. Sep 10 23:56:47.606860 systemd-logind[1483]: Removed session 12. 
Sep 10 23:56:50.987361 containerd[1522]: time="2025-09-10T23:56:50.987102919Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"182995882d85234f8d0f1cf80543505dcf3da1a8ccc7cc243a148f2e405ab6d8\" pid:5730 exited_at:{seconds:1757548610 nanos:986795970}" Sep 10 23:56:52.568484 containerd[1522]: time="2025-09-10T23:56:52.568385335Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"99075d0326eefce6ba95feaaa13d026a92ea9032f4a902b519613acaafd5adc9\" pid:5753 exited_at:{seconds:1757548612 nanos:567930149}" Sep 10 23:56:52.771619 systemd[1]: Started sshd@13-91.107.201.216:22-139.178.89.65:47126.service - OpenSSH per-connection server daemon (139.178.89.65:47126). Sep 10 23:56:53.790353 sshd[5764]: Accepted publickey for core from 139.178.89.65 port 47126 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:56:53.792563 sshd-session[5764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:56:53.797511 systemd-logind[1483]: New session 13 of user core. Sep 10 23:56:53.808482 systemd[1]: Started session-13.scope - Session 13 of User core. Sep 10 23:56:54.304269 containerd[1522]: time="2025-09-10T23:56:54.304217131Z" level=info msg="TaskExit event in podsandbox handler container_id:\"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" id:\"fc3423baa4db713bda599adfe46fc18beed96245c00d6a0c30d310c10306b402\" pid:5779 exited_at:{seconds:1757548614 nanos:302922129}" Sep 10 23:56:54.564160 sshd[5766]: Connection closed by 139.178.89.65 port 47126 Sep 10 23:56:54.563258 sshd-session[5764]: pam_unix(sshd:session): session closed for user core Sep 10 23:56:54.567539 systemd-logind[1483]: Session 13 logged out. Waiting for processes to exit. 
Sep 10 23:56:54.569320 systemd[1]: sshd@13-91.107.201.216:22-139.178.89.65:47126.service: Deactivated successfully. Sep 10 23:56:54.573161 systemd[1]: session-13.scope: Deactivated successfully. Sep 10 23:56:54.577223 systemd-logind[1483]: Removed session 13. Sep 10 23:56:54.744857 systemd[1]: Started sshd@14-91.107.201.216:22-139.178.89.65:47132.service - OpenSSH per-connection server daemon (139.178.89.65:47132). Sep 10 23:56:55.745576 sshd[5802]: Accepted publickey for core from 139.178.89.65 port 47132 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:56:55.747767 sshd-session[5802]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:56:55.755688 systemd-logind[1483]: New session 14 of user core. Sep 10 23:56:55.762404 systemd[1]: Started session-14.scope - Session 14 of User core. Sep 10 23:56:56.662373 sshd[5804]: Connection closed by 139.178.89.65 port 47132 Sep 10 23:56:56.663924 sshd-session[5802]: pam_unix(sshd:session): session closed for user core Sep 10 23:56:56.669882 systemd[1]: sshd@14-91.107.201.216:22-139.178.89.65:47132.service: Deactivated successfully. Sep 10 23:56:56.672402 systemd[1]: session-14.scope: Deactivated successfully. Sep 10 23:56:56.674446 systemd-logind[1483]: Session 14 logged out. Waiting for processes to exit. Sep 10 23:56:56.676485 systemd-logind[1483]: Removed session 14. Sep 10 23:56:56.834318 systemd[1]: Started sshd@15-91.107.201.216:22-139.178.89.65:47136.service - OpenSSH per-connection server daemon (139.178.89.65:47136). Sep 10 23:56:57.833078 sshd[5814]: Accepted publickey for core from 139.178.89.65 port 47136 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:56:57.835086 sshd-session[5814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:56:57.841006 systemd-logind[1483]: New session 15 of user core. Sep 10 23:56:57.849470 systemd[1]: Started session-15.scope - Session 15 of User core. 
Sep 10 23:57:00.367036 sshd[5818]: Connection closed by 139.178.89.65 port 47136 Sep 10 23:57:00.367875 sshd-session[5814]: pam_unix(sshd:session): session closed for user core Sep 10 23:57:00.373507 systemd[1]: sshd@15-91.107.201.216:22-139.178.89.65:47136.service: Deactivated successfully. Sep 10 23:57:00.378933 systemd[1]: session-15.scope: Deactivated successfully. Sep 10 23:57:00.379253 systemd[1]: session-15.scope: Consumed 603ms CPU time, 77M memory peak. Sep 10 23:57:00.381030 systemd-logind[1483]: Session 15 logged out. Waiting for processes to exit. Sep 10 23:57:00.383921 systemd-logind[1483]: Removed session 15. Sep 10 23:57:00.550236 systemd[1]: Started sshd@16-91.107.201.216:22-139.178.89.65:57048.service - OpenSSH per-connection server daemon (139.178.89.65:57048). Sep 10 23:57:01.626893 sshd[5841]: Accepted publickey for core from 139.178.89.65 port 57048 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:57:01.629792 sshd-session[5841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:57:01.635133 systemd-logind[1483]: New session 16 of user core. Sep 10 23:57:01.645850 systemd[1]: Started session-16.scope - Session 16 of User core. Sep 10 23:57:02.546469 sshd[5843]: Connection closed by 139.178.89.65 port 57048 Sep 10 23:57:02.547622 sshd-session[5841]: pam_unix(sshd:session): session closed for user core Sep 10 23:57:02.552446 systemd[1]: sshd@16-91.107.201.216:22-139.178.89.65:57048.service: Deactivated successfully. Sep 10 23:57:02.556123 systemd[1]: session-16.scope: Deactivated successfully. Sep 10 23:57:02.558559 systemd-logind[1483]: Session 16 logged out. Waiting for processes to exit. Sep 10 23:57:02.560093 systemd-logind[1483]: Removed session 16. Sep 10 23:57:02.721801 systemd[1]: Started sshd@17-91.107.201.216:22-139.178.89.65:57062.service - OpenSSH per-connection server daemon (139.178.89.65:57062). 
Sep 10 23:57:03.709899 sshd[5853]: Accepted publickey for core from 139.178.89.65 port 57062 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:57:03.711579 sshd-session[5853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:57:03.718954 systemd-logind[1483]: New session 17 of user core. Sep 10 23:57:03.724449 systemd[1]: Started session-17.scope - Session 17 of User core. Sep 10 23:57:04.465015 sshd[5855]: Connection closed by 139.178.89.65 port 57062 Sep 10 23:57:04.464876 sshd-session[5853]: pam_unix(sshd:session): session closed for user core Sep 10 23:57:04.471474 systemd[1]: sshd@17-91.107.201.216:22-139.178.89.65:57062.service: Deactivated successfully. Sep 10 23:57:04.474796 systemd[1]: session-17.scope: Deactivated successfully. Sep 10 23:57:04.477023 systemd-logind[1483]: Session 17 logged out. Waiting for processes to exit. Sep 10 23:57:04.480615 systemd-logind[1483]: Removed session 17. Sep 10 23:57:09.640737 systemd[1]: Started sshd@18-91.107.201.216:22-139.178.89.65:57068.service - OpenSSH per-connection server daemon (139.178.89.65:57068). Sep 10 23:57:10.637620 sshd[5876]: Accepted publickey for core from 139.178.89.65 port 57068 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:57:10.640361 sshd-session[5876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:57:10.645270 systemd-logind[1483]: New session 18 of user core. Sep 10 23:57:10.657954 systemd[1]: Started session-18.scope - Session 18 of User core. Sep 10 23:57:11.417912 sshd[5878]: Connection closed by 139.178.89.65 port 57068 Sep 10 23:57:11.419936 sshd-session[5876]: pam_unix(sshd:session): session closed for user core Sep 10 23:57:11.424050 systemd[1]: sshd@18-91.107.201.216:22-139.178.89.65:57068.service: Deactivated successfully. Sep 10 23:57:11.427937 systemd[1]: session-18.scope: Deactivated successfully. 
Sep 10 23:57:11.434084 systemd-logind[1483]: Session 18 logged out. Waiting for processes to exit. Sep 10 23:57:11.436304 systemd-logind[1483]: Removed session 18. Sep 10 23:57:14.865251 containerd[1522]: time="2025-09-10T23:57:14.865141839Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"029c9562fbdfecbc86896b6c81bda2a2dfac56e3f4c3cfa35c3e77caca22f244\" pid:5901 exited_at:{seconds:1757548634 nanos:864209567}" Sep 10 23:57:16.591405 systemd[1]: Started sshd@19-91.107.201.216:22-139.178.89.65:42520.service - OpenSSH per-connection server daemon (139.178.89.65:42520). Sep 10 23:57:17.446386 containerd[1522]: time="2025-09-10T23:57:17.446343838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"4f2498583d130a927ca080c29bc38a1cd5944ed4840e2ed856e8de969b839d60\" pid:5927 exited_at:{seconds:1757548637 nanos:446129080}" Sep 10 23:57:17.603608 sshd[5912]: Accepted publickey for core from 139.178.89.65 port 42520 ssh2: RSA SHA256:WhS/KOZ1o/uklv7h/4WLPYbUs/Yyh1JayZdeMawA7QM Sep 10 23:57:17.605667 sshd-session[5912]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Sep 10 23:57:17.611041 systemd-logind[1483]: New session 19 of user core. Sep 10 23:57:17.615365 systemd[1]: Started session-19.scope - Session 19 of User core. Sep 10 23:57:18.373601 sshd[5936]: Connection closed by 139.178.89.65 port 42520 Sep 10 23:57:18.374543 sshd-session[5912]: pam_unix(sshd:session): session closed for user core Sep 10 23:57:18.379940 systemd[1]: sshd@19-91.107.201.216:22-139.178.89.65:42520.service: Deactivated successfully. Sep 10 23:57:18.380090 systemd-logind[1483]: Session 19 logged out. Waiting for processes to exit. Sep 10 23:57:18.384762 systemd[1]: session-19.scope: Deactivated successfully. Sep 10 23:57:18.388504 systemd-logind[1483]: Removed session 19. 
Sep 10 23:57:20.986804 containerd[1522]: time="2025-09-10T23:57:20.986749562Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3042355787539ec33867586a8e7b5c696a3acfe781c23d2b305989da1cfcfaa1\" id:\"385570b9a29b808498c20ba4340485b728db946c59e30ea08386c738dcf937d8\" pid:5961 exited_at:{seconds:1757548640 nanos:986274724}" Sep 10 23:57:22.575251 containerd[1522]: time="2025-09-10T23:57:22.575120062Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3d4887b1f13e72160318a4c2eab7a59585b79f1b69b010035906796839b12e22\" id:\"b5e7307c1002225ad12dc37484baaa356853373ee0c1a8cbaaa79d0d2246c992\" pid:5982 exited_at:{seconds:1757548642 nanos:573914666}" Sep 10 23:57:24.295438 containerd[1522]: time="2025-09-10T23:57:24.295264034Z" level=info msg="TaskExit event in podsandbox handler container_id:\"296cbc8d5df525b21444371f1c154cc68257e3af8c60d311e62f7b31ca19feb0\" id:\"c6445969e1d4cf241b4efb56988b1d8ad273f476dfcac16d01632d824874849a\" pid:6003 exited_at:{seconds:1757548644 nanos:294831595}" Sep 10 23:57:33.873155 systemd[1]: cri-containerd-33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5.scope: Deactivated successfully. Sep 10 23:57:33.875304 systemd[1]: cri-containerd-33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5.scope: Consumed 19.239s CPU time, 120.4M memory peak, 4.5M read from disk. 
Sep 10 23:57:33.877064 containerd[1522]: time="2025-09-10T23:57:33.876788016Z" level=info msg="received exit event container_id:\"33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5\" id:\"33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5\" pid:3136 exit_status:1 exited_at:{seconds:1757548653 nanos:876470935}" Sep 10 23:57:33.878127 containerd[1522]: time="2025-09-10T23:57:33.876961857Z" level=info msg="TaskExit event in podsandbox handler container_id:\"33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5\" id:\"33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5\" pid:3136 exit_status:1 exited_at:{seconds:1757548653 nanos:876470935}" Sep 10 23:57:33.898912 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5-rootfs.mount: Deactivated successfully. Sep 10 23:57:33.967748 systemd[1]: cri-containerd-77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38.scope: Deactivated successfully. Sep 10 23:57:33.968042 systemd[1]: cri-containerd-77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38.scope: Consumed 4.543s CPU time, 63.3M memory peak, 2.3M read from disk. 
Sep 10 23:57:33.970801 containerd[1522]: time="2025-09-10T23:57:33.970761664Z" level=info msg="received exit event container_id:\"77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38\" id:\"77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38\" pid:2566 exit_status:1 exited_at:{seconds:1757548653 nanos:969773379}" Sep 10 23:57:33.971097 containerd[1522]: time="2025-09-10T23:57:33.971073065Z" level=info msg="TaskExit event in podsandbox handler container_id:\"77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38\" id:\"77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38\" pid:2566 exit_status:1 exited_at:{seconds:1757548653 nanos:969773379}" Sep 10 23:57:33.995355 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38-rootfs.mount: Deactivated successfully. Sep 10 23:57:34.190907 kubelet[2743]: E0910 23:57:34.190092 2743 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:47736->10.0.0.2:2379: read: connection timed out" Sep 10 23:57:34.198546 kubelet[2743]: I0910 23:57:34.198507 2743 scope.go:117] "RemoveContainer" containerID="77c56c596d0c4ba611fc95417832031c711525d16f57310ede66f5ae357e0e38" Sep 10 23:57:34.202523 kubelet[2743]: I0910 23:57:34.202467 2743 scope.go:117] "RemoveContainer" containerID="d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad" Sep 10 23:57:34.203735 kubelet[2743]: I0910 23:57:34.203689 2743 scope.go:117] "RemoveContainer" containerID="33297a90db7f8cdd9697668a43ab136caeee437394decd46ecefe7df0c3498d5" Sep 10 23:57:34.204612 containerd[1522]: time="2025-09-10T23:57:34.203177917Z" level=info msg="CreateContainer within sandbox \"e8a761325458a3100818963d4815d4bbbe03665590d35c93b44c5005d2c1bd76\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Sep 10 23:57:34.205521 containerd[1522]: time="2025-09-10T23:57:34.205487569Z" level=info msg="RemoveContainer for \"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\"" Sep 10 23:57:34.207046 kubelet[2743]: E0910 23:57:34.206988 2743 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-58fc44c59b-9fcvb_tigera-operator(a709ca6d-6746-446d-b3f4-b8e6002dcf78)\"" pod="tigera-operator/tigera-operator-58fc44c59b-9fcvb" podUID="a709ca6d-6746-446d-b3f4-b8e6002dcf78" Sep 10 23:57:34.224638 containerd[1522]: time="2025-09-10T23:57:34.222366492Z" level=info msg="Container 841bc69e6b266b7789fc057817a4a5631856b788d6c64fe8bb3e648f7541a2d1: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:57:34.231402 containerd[1522]: time="2025-09-10T23:57:34.231179536Z" level=info msg="RemoveContainer for \"d4422c79d8dbe6c9e2bec53241825cb1fe0c6aef10bfe29a1d3a62676dd035ad\" returns successfully" Sep 10 23:57:34.239825 containerd[1522]: time="2025-09-10T23:57:34.239776579Z" level=info msg="CreateContainer within sandbox \"e8a761325458a3100818963d4815d4bbbe03665590d35c93b44c5005d2c1bd76\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"841bc69e6b266b7789fc057817a4a5631856b788d6c64fe8bb3e648f7541a2d1\"" Sep 10 23:57:34.240366 containerd[1522]: time="2025-09-10T23:57:34.240308741Z" level=info msg="StartContainer for \"841bc69e6b266b7789fc057817a4a5631856b788d6c64fe8bb3e648f7541a2d1\"" Sep 10 23:57:34.241629 containerd[1522]: time="2025-09-10T23:57:34.241604348Z" level=info msg="connecting to shim 841bc69e6b266b7789fc057817a4a5631856b788d6c64fe8bb3e648f7541a2d1" address="unix:///run/containerd/s/29baa797365d79cd702d6ad1f3e287bf17d79858962ae21867bba43015f9038f" protocol=ttrpc version=3
Sep 10 23:57:34.262546 systemd[1]: Started cri-containerd-841bc69e6b266b7789fc057817a4a5631856b788d6c64fe8bb3e648f7541a2d1.scope - libcontainer container 841bc69e6b266b7789fc057817a4a5631856b788d6c64fe8bb3e648f7541a2d1. Sep 10 23:57:34.307596 containerd[1522]: time="2025-09-10T23:57:34.307538674Z" level=info msg="StartContainer for \"841bc69e6b266b7789fc057817a4a5631856b788d6c64fe8bb3e648f7541a2d1\" returns successfully" Sep 10 23:57:37.914290 kubelet[2743]: E0910 23:57:37.906607 2743 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:47556->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4372-1-0-n-c06092ab73.1864112ed662d0b7 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4372-1-0-n-c06092ab73,UID:67881df9adff5d81b062d3dfa04aa3d1,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4372-1-0-n-c06092ab73,},FirstTimestamp:2025-09-10 23:57:27.444304055 +0000 UTC m=+217.234369271,LastTimestamp:2025-09-10 23:57:27.444304055 +0000 UTC m=+217.234369271,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-1-0-n-c06092ab73,}" Sep 10 23:57:39.483418 systemd[1]: cri-containerd-f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7.scope: Deactivated successfully. Sep 10 23:57:39.483938 systemd[1]: cri-containerd-f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7.scope: Consumed 1.782s CPU time, 23.9M memory peak, 3.1M read from disk.
Sep 10 23:57:39.486758 containerd[1522]: time="2025-09-10T23:57:39.486598525Z" level=info msg="received exit event container_id:\"f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7\" id:\"f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7\" pid:2615 exit_status:1 exited_at:{seconds:1757548659 nanos:486022280}" Sep 10 23:57:39.487403 containerd[1522]: time="2025-09-10T23:57:39.486849127Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7\" id:\"f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7\" pid:2615 exit_status:1 exited_at:{seconds:1757548659 nanos:486022280}" Sep 10 23:57:39.510628 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7-rootfs.mount: Deactivated successfully. Sep 10 23:57:40.230236 kubelet[2743]: I0910 23:57:40.230148 2743 scope.go:117] "RemoveContainer" containerID="f4aa9ef0788d534b6e307c1dbb8b7f785203362c440100ce98119f89af0761a7" Sep 10 23:57:40.232651 containerd[1522]: time="2025-09-10T23:57:40.232573540Z" level=info msg="CreateContainer within sandbox \"1eba38c580b6fa474eba1aac38b5804249e8ea0d4f6623cf7c2a4dfd788baf07\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Sep 10 23:57:40.247222 containerd[1522]: time="2025-09-10T23:57:40.245671411Z" level=info msg="Container c89e49a568a06fa1923694fdc0bc981635f067837ba05b574e9e610252814057: CDI devices from CRI Config.CDIDevices: []" Sep 10 23:57:40.257528 containerd[1522]: time="2025-09-10T23:57:40.257479351Z" level=info msg="CreateContainer within sandbox \"1eba38c580b6fa474eba1aac38b5804249e8ea0d4f6623cf7c2a4dfd788baf07\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"c89e49a568a06fa1923694fdc0bc981635f067837ba05b574e9e610252814057\""
Sep 10 23:57:40.258633 containerd[1522]: time="2025-09-10T23:57:40.258369998Z" level=info msg="StartContainer for \"c89e49a568a06fa1923694fdc0bc981635f067837ba05b574e9e610252814057\"" Sep 10 23:57:40.260030 containerd[1522]: time="2025-09-10T23:57:40.259990892Z" level=info msg="connecting to shim c89e49a568a06fa1923694fdc0bc981635f067837ba05b574e9e610252814057" address="unix:///run/containerd/s/cac749d29dfcc7a5aaa7ead5ba1b6839734fa7551bb2e536e4d443469e8dd269" protocol=ttrpc version=3 Sep 10 23:57:40.292548 systemd[1]: Started cri-containerd-c89e49a568a06fa1923694fdc0bc981635f067837ba05b574e9e610252814057.scope - libcontainer container c89e49a568a06fa1923694fdc0bc981635f067837ba05b574e9e610252814057. Sep 10 23:57:40.343876 containerd[1522]: time="2025-09-10T23:57:40.343839960Z" level=info msg="StartContainer for \"c89e49a568a06fa1923694fdc0bc981635f067837ba05b574e9e610252814057\" returns successfully"