May 13 23:48:01.893952 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] May 13 23:48:01.893979 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 13 22:16:18 -00 2025 May 13 23:48:01.893990 kernel: KASLR enabled May 13 23:48:01.893996 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II May 13 23:48:01.894002 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218 May 13 23:48:01.894008 kernel: random: crng init done May 13 23:48:01.894015 kernel: secureboot: Secure boot disabled May 13 23:48:01.894021 kernel: ACPI: Early table checksum verification disabled May 13 23:48:01.894027 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) May 13 23:48:01.894034 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) May 13 23:48:01.894041 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894046 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894052 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894058 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894065 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894073 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894079 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894086 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894092 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 13 23:48:01.894098 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) May 13 23:48:01.894104 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 May 13 23:48:01.894110 kernel: NUMA: Failed to initialise from firmware May 13 23:48:01.894116 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] May 13 23:48:01.894122 kernel: NUMA: NODE_DATA [mem 0x13966d800-0x139672fff] May 13 23:48:01.894128 kernel: Zone ranges: May 13 23:48:01.894136 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] May 13 23:48:01.894142 kernel: DMA32 empty May 13 23:48:01.894148 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] May 13 23:48:01.894154 kernel: Movable zone start for each node May 13 23:48:01.894160 kernel: Early memory node ranges May 13 23:48:01.894166 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff] May 13 23:48:01.894172 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff] May 13 23:48:01.894178 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff] May 13 23:48:01.894184 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] May 13 23:48:01.894190 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] May 13 23:48:01.894196 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] May 13 23:48:01.894202 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] May 13 23:48:01.894210 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] May 13 23:48:01.894216 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff] May 13 23:48:01.894222 kernel: Initmem setup node 0 
[mem 0x0000000040000000-0x0000000139ffffff] May 13 23:48:01.894231 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges May 13 23:48:01.894238 kernel: psci: probing for conduit method from ACPI. May 13 23:48:01.894244 kernel: psci: PSCIv1.1 detected in firmware. May 13 23:48:01.894253 kernel: psci: Using standard PSCI v0.2 function IDs May 13 23:48:01.894259 kernel: psci: Trusted OS migration not required May 13 23:48:01.894265 kernel: psci: SMC Calling Convention v1.1 May 13 23:48:01.894272 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) May 13 23:48:01.894279 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976 May 13 23:48:01.894285 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096 May 13 23:48:01.894292 kernel: pcpu-alloc: [0] 0 [0] 1 May 13 23:48:01.894298 kernel: Detected PIPT I-cache on CPU0 May 13 23:48:01.894304 kernel: CPU features: detected: GIC system register CPU interface May 13 23:48:01.894311 kernel: CPU features: detected: Hardware dirty bit management May 13 23:48:01.894319 kernel: CPU features: detected: Spectre-v4 May 13 23:48:01.894325 kernel: CPU features: detected: Spectre-BHB May 13 23:48:01.894331 kernel: CPU features: kernel page table isolation forced ON by KASLR May 13 23:48:01.894338 kernel: CPU features: detected: Kernel page table isolation (KPTI) May 13 23:48:01.894345 kernel: CPU features: detected: ARM erratum 1418040 May 13 23:48:01.894351 kernel: CPU features: detected: SSBS not fully self-synchronizing May 13 23:48:01.894357 kernel: alternatives: applying boot alternatives May 13 23:48:01.894365 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=3174b2682629aa8ad4069807ed6fd62c10f62266ee1e150a1104f2a2fb6489b5 May 13 23:48:01.894372 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 13 23:48:01.894378 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 13 23:48:01.894385 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 13 23:48:01.894393 kernel: Fallback order for Node 0: 0 May 13 23:48:01.896413 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000 May 13 23:48:01.896443 kernel: Policy zone: Normal May 13 23:48:01.896451 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 13 23:48:01.896458 kernel: software IO TLB: area num 2. May 13 23:48:01.896465 kernel: software IO TLB: mapped [mem 0x00000000f95c0000-0x00000000fd5c0000] (64MB) May 13 23:48:01.896473 kernel: Memory: 3883696K/4096000K available (10368K kernel code, 2186K rwdata, 8100K rodata, 38464K init, 897K bss, 212304K reserved, 0K cma-reserved) May 13 23:48:01.896480 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 13 23:48:01.896487 kernel: rcu: Preemptible hierarchical RCU implementation. May 13 23:48:01.896494 kernel: rcu: RCU event tracing is enabled. May 13 23:48:01.896501 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 13 23:48:01.896508 kernel: Trampoline variant of Tasks RCU enabled. May 13 23:48:01.896522 kernel: Tracing variant of Tasks RCU enabled. 
May 13 23:48:01.896529 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 13 23:48:01.896535 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 13 23:48:01.896542 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 May 13 23:48:01.896548 kernel: GICv3: 256 SPIs implemented May 13 23:48:01.896555 kernel: GICv3: 0 Extended SPIs implemented May 13 23:48:01.896562 kernel: Root IRQ handler: gic_handle_irq May 13 23:48:01.896568 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI May 13 23:48:01.896575 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 May 13 23:48:01.896581 kernel: ITS [mem 0x08080000-0x0809ffff] May 13 23:48:01.896588 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1014c0000 (indirect, esz 8, psz 64K, shr 1) May 13 23:48:01.896597 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1014d0000 (flat, esz 8, psz 64K, shr 1) May 13 23:48:01.896603 kernel: GICv3: using LPI property table @0x00000001014e0000 May 13 23:48:01.896610 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001014f0000 May 13 23:48:01.896617 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 13 23:48:01.896624 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 13 23:48:01.896630 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). May 13 23:48:01.896637 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns May 13 23:48:01.896644 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns May 13 23:48:01.896651 kernel: Console: colour dummy device 80x25 May 13 23:48:01.896658 kernel: ACPI: Core revision 20230628 May 13 23:48:01.896665 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) May 13 23:48:01.896673 kernel: pid_max: default: 32768 minimum: 301 May 13 23:48:01.896680 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity May 13 23:48:01.896687 kernel: landlock: Up and running. May 13 23:48:01.896693 kernel: SELinux: Initializing. May 13 23:48:01.896700 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 13 23:48:01.896707 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 13 23:48:01.896713 kernel: ACPI PPTT: PPTT table found, but unable to locate core 1 (1) May 13 23:48:01.896721 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 13 23:48:01.896728 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 13 23:48:01.896736 kernel: rcu: Hierarchical SRCU implementation. May 13 23:48:01.896743 kernel: rcu: Max phase no-delay instances is 400. May 13 23:48:01.896750 kernel: Platform MSI: ITS@0x8080000 domain created May 13 23:48:01.896757 kernel: PCI/MSI: ITS@0x8080000 domain created May 13 23:48:01.896763 kernel: Remapping and enabling EFI services. May 13 23:48:01.896770 kernel: smp: Bringing up secondary CPUs ... 
May 13 23:48:01.896777 kernel: Detected PIPT I-cache on CPU1 May 13 23:48:01.896784 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 May 13 23:48:01.896790 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000101500000 May 13 23:48:01.896799 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 May 13 23:48:01.896806 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] May 13 23:48:01.896818 kernel: smp: Brought up 1 node, 2 CPUs May 13 23:48:01.896827 kernel: SMP: Total of 2 processors activated. May 13 23:48:01.896834 kernel: CPU features: detected: 32-bit EL0 Support May 13 23:48:01.896850 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence May 13 23:48:01.896858 kernel: CPU features: detected: Common not Private translations May 13 23:48:01.896865 kernel: CPU features: detected: CRC32 instructions May 13 23:48:01.896872 kernel: CPU features: detected: Enhanced Virtualization Traps May 13 23:48:01.896879 kernel: CPU features: detected: RCpc load-acquire (LDAPR) May 13 23:48:01.896889 kernel: CPU features: detected: LSE atomic instructions May 13 23:48:01.896896 kernel: CPU features: detected: Privileged Access Never May 13 23:48:01.896903 kernel: CPU features: detected: RAS Extension Support May 13 23:48:01.896910 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) May 13 23:48:01.896917 kernel: CPU: All CPU(s) started at EL1 May 13 23:48:01.896924 kernel: alternatives: applying system-wide alternatives May 13 23:48:01.896931 kernel: devtmpfs: initialized May 13 23:48:01.896940 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 13 23:48:01.896948 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 13 23:48:01.896955 kernel: pinctrl core: initialized pinctrl subsystem May 13 23:48:01.896962 kernel: SMBIOS 3.0.0 present. May 13 23:48:01.896969 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 May 13 23:48:01.896977 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 13 23:48:01.896984 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations May 13 23:48:01.896991 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations May 13 23:48:01.896998 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations May 13 23:48:01.897007 kernel: audit: initializing netlink subsys (disabled) May 13 23:48:01.897014 kernel: audit: type=2000 audit(0.011:1): state=initialized audit_enabled=0 res=1 May 13 23:48:01.897022 kernel: thermal_sys: Registered thermal governor 'step_wise' May 13 23:48:01.897029 kernel: cpuidle: using governor menu May 13 23:48:01.897036 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
May 13 23:48:01.897043 kernel: ASID allocator initialised with 32768 entries May 13 23:48:01.897050 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 13 23:48:01.897057 kernel: Serial: AMBA PL011 UART driver May 13 23:48:01.897064 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL May 13 23:48:01.897073 kernel: Modules: 0 pages in range for non-PLT usage May 13 23:48:01.897080 kernel: Modules: 509232 pages in range for PLT usage May 13 23:48:01.897087 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 13 23:48:01.897094 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page May 13 23:48:01.897102 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages May 13 23:48:01.897109 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page May 13 23:48:01.897116 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 13 23:48:01.897123 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page May 13 23:48:01.897130 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages May 13 23:48:01.897138 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page May 13 23:48:01.897145 kernel: ACPI: Added _OSI(Module Device) May 13 23:48:01.897152 kernel: ACPI: Added _OSI(Processor Device) May 13 23:48:01.897159 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 13 23:48:01.897167 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 13 23:48:01.897174 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 13 23:48:01.897182 kernel: ACPI: Interpreter enabled May 13 23:48:01.897189 kernel: ACPI: Using GIC for interrupt routing May 13 23:48:01.897196 kernel: ACPI: MCFG table detected, 1 entries May 13 23:48:01.897205 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA May 13 23:48:01.897212 kernel: printk: console [ttyAMA0] enabled May 13 23:48:01.897219 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 13 23:48:01.897384 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 13 23:48:01.897489 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] May 13 23:48:01.897556 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] May 13 23:48:01.897619 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 May 13 23:48:01.897687 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] May 13 23:48:01.897696 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] May 13 23:48:01.897703 kernel: PCI host bridge to bus 0000:00 May 13 23:48:01.897778 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] May 13 23:48:01.897838 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] May 13 23:48:01.897946 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] May 13 23:48:01.898005 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 13 23:48:01.898087 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 May 13 23:48:01.898168 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 May 13 23:48:01.898246 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff] May 13 23:48:01.898314 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref] May 13 23:48:01.898387 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.899079 kernel: pci 
0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff] May 13 23:48:01.899159 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.899234 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff] May 13 23:48:01.899306 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.899370 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff] May 13 23:48:01.900710 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.900799 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff] May 13 23:48:01.900891 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.900993 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff] May 13 23:48:01.901071 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.901136 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff] May 13 23:48:01.901206 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.901270 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff] May 13 23:48:01.901343 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.901464 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff] May 13 23:48:01.901541 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 May 13 23:48:01.901618 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff] May 13 23:48:01.901691 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 May 13 23:48:01.901755 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007] May 13 23:48:01.901829 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 May 13 23:48:01.901964 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff] May 13 23:48:01.902038 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref] May 13 23:48:01.902106 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref] May 13 23:48:01.902183 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 May 13 23:48:01.902253 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit] May 13 23:48:01.902329 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 May 13 23:48:01.902396 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff] May 13 23:48:01.903577 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref] May 13 23:48:01.903668 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 May 13 23:48:01.903737 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref] May 13 23:48:01.903812 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 May 13 23:48:01.903900 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff] May 13 23:48:01.903969 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref] May 13 23:48:01.904049 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 May 13 23:48:01.904116 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff] May 13 23:48:01.904181 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref] May 13 23:48:01.904254 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 May 13 23:48:01.904320 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff] May 13 23:48:01.904384 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref] May 13 23:48:01.905581 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref] May 13 23:48:01.905666 kernel: pci 0000:00:02.0: bridge 
window [io 0x1000-0x0fff] to [bus 01] add_size 1000 May 13 23:48:01.905732 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 May 13 23:48:01.905794 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 May 13 23:48:01.905879 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 May 13 23:48:01.905946 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 May 13 23:48:01.906008 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 May 13 23:48:01.906077 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 May 13 23:48:01.906147 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 May 13 23:48:01.906210 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 May 13 23:48:01.906287 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 May 13 23:48:01.906350 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 May 13 23:48:01.907169 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 May 13 23:48:01.908623 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 May 13 23:48:01.908692 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 May 13 23:48:01.908762 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 May 13 23:48:01.908830 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 May 13 23:48:01.908950 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 May 13 23:48:01.909018 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 May 13 23:48:01.909085 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 May 13 23:48:01.909148 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 May 13 23:48:01.909211 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 May 13 23:48:01.909286 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 May 13 23:48:01.909351 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 May 13 23:48:01.909431 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 May 13 23:48:01.909500 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 May 13 23:48:01.909567 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 May 13 23:48:01.909632 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 May 13 23:48:01.909700 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 
0x10000000-0x101fffff] May 13 23:48:01.909764 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref] May 13 23:48:01.909835 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff] May 13 23:48:01.909916 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref] May 13 23:48:01.909984 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff] May 13 23:48:01.910048 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref] May 13 23:48:01.910114 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff] May 13 23:48:01.910179 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref] May 13 23:48:01.910243 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff] May 13 23:48:01.910311 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref] May 13 23:48:01.910375 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff] May 13 23:48:01.910455 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref] May 13 23:48:01.910524 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff] May 13 23:48:01.910589 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref] May 13 23:48:01.910698 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff] May 13 23:48:01.910773 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref] May 13 23:48:01.910853 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff] May 13 23:48:01.910924 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref] May 13 23:48:01.910995 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref] May 13 23:48:01.911060 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff] May 13 23:48:01.911124 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff] May 13 23:48:01.911188 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff] May 13 23:48:01.911254 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff] May 13 23:48:01.911323 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff] May 13 23:48:01.911390 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff] May 13 23:48:01.911468 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff] May 13 23:48:01.911534 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff] May 13 23:48:01.911599 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff] May 13 23:48:01.911664 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff] May 13 23:48:01.911727 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff] May 13 23:48:01.911793 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff] May 13 23:48:01.911912 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff] May 13 23:48:01.911981 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff] May 13 23:48:01.912047 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff] May 13 23:48:01.912111 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff] May 13 23:48:01.912175 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff] May 13 23:48:01.912239 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff] May 13 23:48:01.912303 kernel: pci 0000:00:03.0: BAR 13: assigned [io 
0x9000-0x9fff] May 13 23:48:01.912373 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007] May 13 23:48:01.913566 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref] May 13 23:48:01.913657 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref] May 13 23:48:01.913729 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff] May 13 23:48:01.913797 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] May 13 23:48:01.913883 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] May 13 23:48:01.913952 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] May 13 23:48:01.914017 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] May 13 23:48:01.914101 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit] May 13 23:48:01.914169 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] May 13 23:48:01.914234 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] May 13 23:48:01.914298 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] May 13 23:48:01.914361 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] May 13 23:48:01.914482 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref] May 13 23:48:01.914554 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff] May 13 23:48:01.914621 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] May 13 23:48:01.914686 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] May 13 23:48:01.914749 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] May 13 23:48:01.914813 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] May 13 23:48:01.914931 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref] May 13 23:48:01.915003 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] May 13 23:48:01.915077 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] May 13 23:48:01.915141 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] May 13 23:48:01.915205 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] May 13 23:48:01.915277 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref] May 13 23:48:01.915345 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff] May 13 23:48:01.917476 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] May 13 23:48:01.917583 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] May 13 23:48:01.917652 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] May 13 23:48:01.917728 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] May 13 23:48:01.917803 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref] May 13 23:48:01.917897 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff] May 13 23:48:01.917982 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] May 13 23:48:01.918047 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] May 13 23:48:01.918114 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] May 13 23:48:01.918178 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] May 13 23:48:01.918249 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref] May 13 23:48:01.918320 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref] May 13 23:48:01.918386 kernel: pci 
0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff] May 13 23:48:01.918489 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] May 13 23:48:01.918555 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] May 13 23:48:01.918617 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] May 13 23:48:01.918679 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] May 13 23:48:01.918744 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] May 13 23:48:01.918814 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] May 13 23:48:01.918895 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] May 13 23:48:01.918961 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] May 13 23:48:01.919028 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] May 13 23:48:01.919094 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] May 13 23:48:01.919157 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] May 13 23:48:01.919220 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] May 13 23:48:01.919288 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] May 13 23:48:01.919350 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] May 13 23:48:01.919551 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] May 13 23:48:01.919634 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] May 13 23:48:01.919693 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] May 13 23:48:01.919756 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] May 13 23:48:01.919824 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] May 13 23:48:01.919926 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] May 13 23:48:01.920008 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] May 13 23:48:01.920075 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] May 13 23:48:01.920137 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] May 13 23:48:01.920195 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] May 13 23:48:01.920263 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] May 13 23:48:01.920324 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] May 13 23:48:01.920387 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] May 13 23:48:01.920475 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] May 13 23:48:01.920537 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] May 13 23:48:01.920602 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] May 13 23:48:01.920677 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] May 13 23:48:01.920739 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] May 13 23:48:01.920800 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] May 13 23:48:01.920882 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] May 13 23:48:01.920944 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] May 13 23:48:01.921004 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] May 13 23:48:01.921070 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] May 13 23:48:01.921133 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] May 13 23:48:01.921192 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] May 13 23:48:01.921261 kernel: 
pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] May 13 23:48:01.921321 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] May 13 23:48:01.921380 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] May 13 23:48:01.921389 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 May 13 23:48:01.921397 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 May 13 23:48:01.923916 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 May 13 23:48:01.923934 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 May 13 23:48:01.923942 kernel: iommu: Default domain type: Translated May 13 23:48:01.923952 kernel: iommu: DMA domain TLB invalidation policy: strict mode May 13 23:48:01.923960 kernel: efivars: Registered efivars operations May 13 23:48:01.923968 kernel: vgaarb: loaded May 13 23:48:01.923977 kernel: clocksource: Switched to clocksource arch_sys_counter May 13 23:48:01.923985 kernel: VFS: Disk quotas dquot_6.6.0 May 13 23:48:01.923993 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 13 23:48:01.924000 kernel: pnp: PnP ACPI init May 13 23:48:01.924137 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved May 13 23:48:01.924151 kernel: pnp: PnP ACPI: found 1 devices May 13 23:48:01.924158 kernel: NET: Registered PF_INET protocol family May 13 23:48:01.924166 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 13 23:48:01.924173 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 13 23:48:01.924181 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 13 23:48:01.924189 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 13 23:48:01.924196 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 13 23:48:01.924207 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 13 23:48:01.924215 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 13 23:48:01.924222 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 13 23:48:01.924230 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 13 23:48:01.924309 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) May 13 23:48:01.924320 kernel: PCI: CLS 0 bytes, default 64 May 13 23:48:01.924328 kernel: kvm [1]: HYP mode not available May 13 23:48:01.924335 kernel: Initialise system trusted keyrings May 13 23:48:01.924343 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 13 23:48:01.924352 kernel: Key type asymmetric registered May 13 23:48:01.924359 kernel: Asymmetric key parser 'x509' registered May 13 23:48:01.924367 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 13 23:48:01.924374 kernel: io scheduler mq-deadline registered May 13 23:48:01.924382 kernel: io scheduler kyber registered May 13 23:48:01.924389 kernel: io scheduler bfq registered May 13 23:48:01.924397 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 May 13 23:48:01.925550 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 May 13 23:48:01.925636 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 May 13 23:48:01.927586 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.927679 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 May 13 23:48:01.927745 kernel: pcieport 
0000:00:02.1: AER: enabled with IRQ 51 May 13 23:48:01.927810 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.927933 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 May 13 23:48:01.928021 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 May 13 23:48:01.928087 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.928155 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 May 13 23:48:01.928221 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 May 13 23:48:01.928284 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.928352 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 May 13 23:48:01.928438 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 May 13 23:48:01.928506 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.928574 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 May 13 23:48:01.928639 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 May 13 23:48:01.928704 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.928773 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 May 13 23:48:01.928850 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 May 13 23:48:01.928920 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.928987 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 May 13 23:48:01.929053 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 May 13 23:48:01.929117 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.929128 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 May 13 23:48:01.929194 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 May 13 23:48:01.929260 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 May 13 23:48:01.929324 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:01.929334 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 May 13 23:48:01.929341 kernel: ACPI: button: Power Button [PWRB] May 13 23:48:01.929349 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 May 13 23:48:01.932017 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) May 13 23:48:01.932125 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) May 13 23:48:01.932143 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 13 23:48:01.932152 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 May 13 23:48:01.932222 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) May 13 23:48:01.932235 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A May 13 23:48:01.932243 kernel: thunder_xcv, ver 1.0 May 13 23:48:01.932250 kernel: thunder_bgx, ver 1.0 May 13 23:48:01.932257 kernel: nicpf, ver 1.0 May 13 23:48:01.932265 kernel: nicvf, ver 
1.0 May 13 23:48:01.932348 kernel: rtc-efi rtc-efi.0: registered as rtc0 May 13 23:48:01.933151 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-13T23:48:01 UTC (1747180081) May 13 23:48:01.933174 kernel: hid: raw HID events driver (C) Jiri Kosina May 13 23:48:01.933182 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available May 13 23:48:01.933191 kernel: watchdog: Delayed init of the lockup detector failed: -19 May 13 23:48:01.933199 kernel: watchdog: Hard watchdog permanently disabled May 13 23:48:01.933207 kernel: NET: Registered PF_INET6 protocol family May 13 23:48:01.933214 kernel: Segment Routing with IPv6 May 13 23:48:01.933222 kernel: In-situ OAM (IOAM) with IPv6 May 13 23:48:01.933236 kernel: NET: Registered PF_PACKET protocol family May 13 23:48:01.933244 kernel: Key type dns_resolver registered May 13 23:48:01.933251 kernel: registered taskstats version 1 May 13 23:48:01.933259 kernel: Loading compiled-in X.509 certificates May 13 23:48:01.933266 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 568a15bbab977599d8f910f319ba50c03c8a57bd' May 13 23:48:01.933274 kernel: Key type .fscrypt registered May 13 23:48:01.933281 kernel: Key type fscrypt-provisioning registered May 13 23:48:01.933289 kernel: ima: No TPM chip found, activating TPM-bypass! May 13 23:48:01.933297 kernel: ima: Allocated hash algorithm: sha1 May 13 23:48:01.933307 kernel: ima: No architecture policies found May 13 23:48:01.933315 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) May 13 23:48:01.933322 kernel: clk: Disabling unused clocks May 13 23:48:01.933329 kernel: Freeing unused kernel memory: 38464K May 13 23:48:01.933337 kernel: Run /init as init process May 13 23:48:01.933344 kernel: with arguments: May 13 23:48:01.933352 kernel: /init May 13 23:48:01.933359 kernel: with environment: May 13 23:48:01.933366 kernel: HOME=/ May 13 23:48:01.933376 kernel: TERM=linux May 13 23:48:01.933383 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 13 23:48:01.933391 systemd[1]: Successfully made /usr/ read-only. May 13 23:48:01.933420 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 23:48:01.933429 systemd[1]: Detected virtualization kvm. May 13 23:48:01.933437 systemd[1]: Detected architecture arm64. May 13 23:48:01.933445 systemd[1]: Running in initrd. May 13 23:48:01.933456 systemd[1]: No hostname configured, using default hostname. May 13 23:48:01.933465 systemd[1]: Hostname set to . May 13 23:48:01.933473 systemd[1]: Initializing machine ID from VM UUID. May 13 23:48:01.933481 systemd[1]: Queued start job for default target initrd.target. May 13 23:48:01.933489 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 23:48:01.933524 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 23:48:01.933534 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 13 23:48:01.933543 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... 
May 13 23:48:01.933554 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 13 23:48:01.933563 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 13 23:48:01.933572 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 13 23:48:01.933580 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 13 23:48:01.933588 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 23:48:01.933597 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 13 23:48:01.933604 systemd[1]: Reached target paths.target - Path Units. May 13 23:48:01.933614 systemd[1]: Reached target slices.target - Slice Units. May 13 23:48:01.933622 systemd[1]: Reached target swap.target - Swaps. May 13 23:48:01.933630 systemd[1]: Reached target timers.target - Timer Units. May 13 23:48:01.933638 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 13 23:48:01.933673 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 23:48:01.933685 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 13 23:48:01.933693 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 13 23:48:01.933701 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 23:48:01.933710 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 23:48:01.933720 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 13 23:48:01.933728 systemd[1]: Reached target sockets.target - Socket Units. May 13 23:48:01.933736 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 13 23:48:01.933744 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 23:48:01.933752 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 13 23:48:01.933760 systemd[1]: Starting systemd-fsck-usr.service... May 13 23:48:01.933768 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 23:48:01.933776 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 23:48:01.933786 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:48:01.933794 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 13 23:48:01.933802 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 23:48:01.933811 systemd[1]: Finished systemd-fsck-usr.service. May 13 23:48:01.933819 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 23:48:01.933896 systemd-journald[236]: Collecting audit messages is disabled. May 13 23:48:01.933920 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:48:01.933930 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:48:01.933938 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 13 23:48:01.933949 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
May 13 23:48:01.933957 kernel: Bridge firewalling registered May 13 23:48:01.933965 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 23:48:01.933974 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 23:48:01.933982 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 23:48:01.933991 systemd-journald[236]: Journal started May 13 23:48:01.934012 systemd-journald[236]: Runtime Journal (/run/log/journal/932b4659872a4a0486b76bd08d218774) is 8M, max 76.6M, 68.6M free. May 13 23:48:01.882759 systemd-modules-load[238]: Inserted module 'overlay' May 13 23:48:01.937053 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:48:01.906517 systemd-modules-load[238]: Inserted module 'br_netfilter' May 13 23:48:01.939622 systemd[1]: Started systemd-journald.service - Journal Service. May 13 23:48:01.941613 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 23:48:01.950815 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 23:48:01.957886 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:48:01.964619 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 13 23:48:01.970018 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 23:48:01.975604 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 23:48:01.988442 dracut-cmdline[273]: dracut-dracut-053 May 13 23:48:01.993688 dracut-cmdline[273]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=3174b2682629aa8ad4069807ed6fd62c10f62266ee1e150a1104f2a2fb6489b5 May 13 23:48:02.014619 systemd-resolved[275]: Positive Trust Anchors: May 13 23:48:02.014634 systemd-resolved[275]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 23:48:02.014667 systemd-resolved[275]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 23:48:02.020622 systemd-resolved[275]: Defaulting to hostname 'linux'. May 13 23:48:02.021810 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 23:48:02.022449 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 23:48:02.109481 kernel: SCSI subsystem initialized May 13 23:48:02.116482 kernel: Loading iSCSI transport class v2.0-870. May 13 23:48:02.125462 kernel: iscsi: registered transport (tcp) May 13 23:48:02.139484 kernel: iscsi: registered transport (qla4xxx) May 13 23:48:02.139551 kernel: QLogic iSCSI HBA Driver May 13 23:48:02.197471 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. 
May 13 23:48:02.202637 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 13 23:48:02.230087 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 13 23:48:02.230172 kernel: device-mapper: uevent: version 1.0.3 May 13 23:48:02.230193 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 13 23:48:02.282452 kernel: raid6: neonx8 gen() 14860 MB/s May 13 23:48:02.299485 kernel: raid6: neonx4 gen() 15410 MB/s May 13 23:48:02.316497 kernel: raid6: neonx2 gen() 12670 MB/s May 13 23:48:02.333475 kernel: raid6: neonx1 gen() 9495 MB/s May 13 23:48:02.350476 kernel: raid6: int64x8 gen() 4954 MB/s May 13 23:48:02.367478 kernel: raid6: int64x4 gen() 7262 MB/s May 13 23:48:02.384459 kernel: raid6: int64x2 gen() 5915 MB/s May 13 23:48:02.401465 kernel: raid6: int64x1 gen() 3581 MB/s May 13 23:48:02.401537 kernel: raid6: using algorithm neonx4 gen() 15410 MB/s May 13 23:48:02.418466 kernel: raid6: .... xor() 12089 MB/s, rmw enabled May 13 23:48:02.418538 kernel: raid6: using neon recovery algorithm May 13 23:48:02.423462 kernel: xor: measuring software checksum speed May 13 23:48:02.423529 kernel: 8regs : 21584 MB/sec May 13 23:48:02.424514 kernel: 32regs : 21676 MB/sec May 13 23:48:02.424552 kernel: arm64_neon : 23955 MB/sec May 13 23:48:02.424565 kernel: xor: using function: arm64_neon (23955 MB/sec) May 13 23:48:02.476462 kernel: Btrfs loaded, zoned=no, fsverity=no May 13 23:48:02.498589 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 13 23:48:02.503609 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:48:02.532732 systemd-udevd[457]: Using default interface naming scheme 'v255'. May 13 23:48:02.539236 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:48:02.544592 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 13 23:48:02.578437 dracut-pre-trigger[470]: rd.md=0: removing MD RAID activation May 13 23:48:02.616197 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 13 23:48:02.618492 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 23:48:02.680970 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:48:02.684644 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 13 23:48:02.717475 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 13 23:48:02.719969 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 13 23:48:02.723869 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 23:48:02.726390 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 23:48:02.729916 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 13 23:48:02.757154 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
May 13 23:48:02.783786 kernel: scsi host0: Virtio SCSI HBA May 13 23:48:02.789467 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 May 13 23:48:02.793715 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 May 13 23:48:02.812559 kernel: ACPI: bus type USB registered May 13 23:48:02.812617 kernel: usbcore: registered new interface driver usbfs May 13 23:48:02.813417 kernel: usbcore: registered new interface driver hub May 13 23:48:02.813444 kernel: usbcore: registered new device driver usb May 13 23:48:02.827806 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 13 23:48:02.828011 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 May 13 23:48:02.830453 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 May 13 23:48:02.832739 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller May 13 23:48:02.832966 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 May 13 23:48:02.833055 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed May 13 23:48:02.835776 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 23:48:02.836426 kernel: hub 1-0:1.0: USB hub found May 13 23:48:02.835915 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:48:02.839316 kernel: hub 1-0:1.0: 4 ports detected May 13 23:48:02.839493 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. May 13 23:48:02.839592 kernel: hub 2-0:1.0: USB hub found May 13 23:48:02.839681 kernel: hub 2-0:1.0: 4 ports detected May 13 23:48:02.839334 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:48:02.840324 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:48:02.840643 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:48:02.848816 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:48:02.851226 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:48:02.864464 kernel: sr 0:0:0:0: Power-on or device reset occurred May 13 23:48:02.871582 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray May 13 23:48:02.871793 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 13 23:48:02.879736 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 May 13 23:48:02.879986 kernel: sd 0:0:0:1: Power-on or device reset occurred May 13 23:48:02.881631 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) May 13 23:48:02.881865 kernel: sd 0:0:0:1: [sda] Write Protect is off May 13 23:48:02.881981 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 May 13 23:48:02.882951 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA May 13 23:48:02.886205 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:48:02.888917 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 13 23:48:02.888970 kernel: GPT:17805311 != 80003071 May 13 23:48:02.888982 kernel: GPT:Alternate GPT header not at the end of the disk. May 13 23:48:02.888992 kernel: GPT:17805311 != 80003071 May 13 23:48:02.889657 kernel: GPT: Use GNU Parted to correct GPT errors. 
May 13 23:48:02.889675 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:48:02.890421 kernel: sd 0:0:0:1: [sda] Attached SCSI disk May 13 23:48:02.890563 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 13 23:48:02.931020 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:48:02.945428 kernel: BTRFS: device fsid ee830c17-a93d-4109-bd12-3fec8ef6763d devid 1 transid 41 /dev/sda3 scanned by (udev-worker) (516) May 13 23:48:02.957458 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (515) May 13 23:48:02.972068 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. May 13 23:48:02.981746 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. May 13 23:48:02.982468 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. May 13 23:48:02.998815 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. May 13 23:48:03.006947 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 13 23:48:03.011368 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 13 23:48:03.031314 disk-uuid[578]: Primary Header is updated. May 13 23:48:03.031314 disk-uuid[578]: Secondary Entries is updated. May 13 23:48:03.031314 disk-uuid[578]: Secondary Header is updated. May 13 23:48:03.040288 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:48:03.088430 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd May 13 23:48:03.227068 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 May 13 23:48:03.227148 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 May 13 23:48:03.227865 kernel: usbcore: registered new interface driver usbhid May 13 23:48:03.227896 kernel: usbhid: USB HID core driver May 13 23:48:03.331463 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd May 13 23:48:03.461461 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 May 13 23:48:03.514495 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 May 13 23:48:04.050463 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 May 13 23:48:04.052550 disk-uuid[579]: The operation has completed successfully. May 13 23:48:04.117864 systemd[1]: disk-uuid.service: Deactivated successfully. May 13 23:48:04.117989 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 13 23:48:04.137537 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 13 23:48:04.157938 sh[594]: Success May 13 23:48:04.171438 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" May 13 23:48:04.225904 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 13 23:48:04.231366 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 13 23:48:04.249493 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
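verity-setup.service above activates /dev/mapper/usr, with the kernel reporting the sha256-ce implementation for it. dm-verity authenticates the /usr partition by hashing every data block into a tree whose root hash is trusted from the kernel command line; the sketch below shows only the hash-tree idea, with the block size, salt handling and on-disk layout simplified compared to the real dm-verity format.

```python
# Rough sketch of the hash-tree idea behind verity-setup for /dev/mapper/usr:
# every data block is hashed, the hashes are grouped into blocks and hashed
# again, up to a single root hash the kernel can verify reads against.
# Block size, salt handling and layout are simplified vs. real dm-verity.
import hashlib

BLOCK = 4096

def blocks(data: bytes, size: int = BLOCK):
    for off in range(0, len(data), size):
        yield data[off:off + size].ljust(size, b"\0")

def verity_root(data: bytes, salt: bytes = b"") -> bytes:
    level = [hashlib.sha256(salt + blk).digest() for blk in blocks(data)]
    while len(level) > 1:
        joined = b"".join(level)
        level = [hashlib.sha256(salt + blk).digest() for blk in blocks(joined)]
    return level[0]

print(verity_root(b"example usr partition contents").hex())  # toy input
```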
May 13 23:48:04.264830 kernel: BTRFS info (device dm-0): first mount of filesystem ee830c17-a93d-4109-bd12-3fec8ef6763d May 13 23:48:04.264882 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm May 13 23:48:04.265727 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 13 23:48:04.265791 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 13 23:48:04.265836 kernel: BTRFS info (device dm-0): using free space tree May 13 23:48:04.271429 kernel: BTRFS info (device dm-0): enabling ssd optimizations May 13 23:48:04.273963 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 13 23:48:04.275005 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 13 23:48:04.277301 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 13 23:48:04.290239 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 13 23:48:04.312677 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:48:04.312755 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 13 23:48:04.312768 kernel: BTRFS info (device sda6): using free space tree May 13 23:48:04.317437 kernel: BTRFS info (device sda6): enabling ssd optimizations May 13 23:48:04.317496 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:48:04.324459 kernel: BTRFS info (device sda6): last unmount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:48:04.326929 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 13 23:48:04.330619 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 13 23:48:04.436781 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 23:48:04.441620 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 23:48:04.461384 ignition[682]: Ignition 2.20.0 May 13 23:48:04.461421 ignition[682]: Stage: fetch-offline May 13 23:48:04.461460 ignition[682]: no configs at "/usr/lib/ignition/base.d" May 13 23:48:04.461469 ignition[682]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:04.461670 ignition[682]: parsed url from cmdline: "" May 13 23:48:04.463943 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 13 23:48:04.461681 ignition[682]: no config URL provided May 13 23:48:04.461687 ignition[682]: reading system config file "/usr/lib/ignition/user.ign" May 13 23:48:04.461695 ignition[682]: no config at "/usr/lib/ignition/user.ign" May 13 23:48:04.461700 ignition[682]: failed to fetch config: resource requires networking May 13 23:48:04.461922 ignition[682]: Ignition finished successfully May 13 23:48:04.485657 systemd-networkd[779]: lo: Link UP May 13 23:48:04.485672 systemd-networkd[779]: lo: Gained carrier May 13 23:48:04.487802 systemd-networkd[779]: Enumeration completed May 13 23:48:04.488347 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:04.488352 systemd-networkd[779]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:48:04.489848 systemd[1]: Started systemd-networkd.service - Network Configuration. 
May 13 23:48:04.490159 systemd-networkd[779]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:04.490162 systemd-networkd[779]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:48:04.491958 systemd-networkd[779]: eth0: Link UP May 13 23:48:04.491962 systemd-networkd[779]: eth0: Gained carrier May 13 23:48:04.491970 systemd-networkd[779]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:04.492875 systemd[1]: Reached target network.target - Network. May 13 23:48:04.496375 systemd-networkd[779]: eth1: Link UP May 13 23:48:04.496379 systemd-networkd[779]: eth1: Gained carrier May 13 23:48:04.496390 systemd-networkd[779]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:04.496588 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 13 23:48:04.519586 ignition[784]: Ignition 2.20.0 May 13 23:48:04.519596 ignition[784]: Stage: fetch May 13 23:48:04.519768 ignition[784]: no configs at "/usr/lib/ignition/base.d" May 13 23:48:04.519778 ignition[784]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:04.519914 ignition[784]: parsed url from cmdline: "" May 13 23:48:04.519917 ignition[784]: no config URL provided May 13 23:48:04.519922 ignition[784]: reading system config file "/usr/lib/ignition/user.ign" May 13 23:48:04.519931 ignition[784]: no config at "/usr/lib/ignition/user.ign" May 13 23:48:04.520016 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 May 13 23:48:04.520832 ignition[784]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable May 13 23:48:04.531559 systemd-networkd[779]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 23:48:04.565511 systemd-networkd[779]: eth0: DHCPv4 address 91.99.1.97/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 13 23:48:04.721018 ignition[784]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 May 13 23:48:04.726575 ignition[784]: GET result: OK May 13 23:48:04.726723 ignition[784]: parsing config with SHA512: b1e22a97305fbc8576d3c4640c04411619a0ef63caa2e86aab8b628ac4aa60a2090f7d7c5ecf33ad09b0c05d1a7a4332eef68944ee3da2e7edf79e9c460c477d May 13 23:48:04.735179 unknown[784]: fetched base config from "system" May 13 23:48:04.735190 unknown[784]: fetched base config from "system" May 13 23:48:04.735705 ignition[784]: fetch: fetch complete May 13 23:48:04.735196 unknown[784]: fetched user config from "hetzner" May 13 23:48:04.735712 ignition[784]: fetch: fetch passed May 13 23:48:04.735779 ignition[784]: Ignition finished successfully May 13 23:48:04.739479 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 13 23:48:04.742592 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
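The fetch stage above fails its first GET against the Hetzner metadata service while the network is still unreachable, then succeeds once DHCP completes and logs the SHA512 of the retrieved config. A rough sketch of that retry-then-hash flow; only the URL comes from the log, while the retry count, delay and function name are illustrative.

```python
# Illustrative sketch of what the fetch stage above does: retry the
# metadata endpoint until the network is up, then hash the config the
# way the "parsing config with SHA512: ..." line reports.
# The URL is the one in the log; retry policy is made up.
import hashlib
import time
import urllib.error
import urllib.request

USERDATA_URL = "http://169.254.169.254/hetzner/v1/userdata"

def fetch_userdata(retries: int = 5, delay: float = 2.0) -> bytes:
    for attempt in range(1, retries + 1):
        try:
            with urllib.request.urlopen(USERDATA_URL, timeout=5) as resp:
                return resp.read()
        except (urllib.error.URLError, OSError) as err:
            print(f"GET attempt #{attempt} failed: {err}")
            time.sleep(delay)
    raise RuntimeError("could not reach metadata service")

if __name__ == "__main__":
    config = fetch_userdata()
    print("parsing config with SHA512:", hashlib.sha512(config).hexdigest())
```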
May 13 23:48:04.771118 ignition[792]: Ignition 2.20.0 May 13 23:48:04.771134 ignition[792]: Stage: kargs May 13 23:48:04.771305 ignition[792]: no configs at "/usr/lib/ignition/base.d" May 13 23:48:04.771315 ignition[792]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:04.772274 ignition[792]: kargs: kargs passed May 13 23:48:04.772325 ignition[792]: Ignition finished successfully May 13 23:48:04.776043 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 13 23:48:04.779615 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 13 23:48:04.804149 ignition[799]: Ignition 2.20.0 May 13 23:48:04.804167 ignition[799]: Stage: disks May 13 23:48:04.804343 ignition[799]: no configs at "/usr/lib/ignition/base.d" May 13 23:48:04.804353 ignition[799]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:04.805252 ignition[799]: disks: disks passed May 13 23:48:04.805306 ignition[799]: Ignition finished successfully May 13 23:48:04.807896 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 13 23:48:04.808667 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 13 23:48:04.810488 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 13 23:48:04.812155 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 23:48:04.813368 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:48:04.814702 systemd[1]: Reached target basic.target - Basic System. May 13 23:48:04.817152 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 13 23:48:04.845668 systemd-fsck[807]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks May 13 23:48:04.848575 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 13 23:48:04.852294 systemd[1]: Mounting sysroot.mount - /sysroot... May 13 23:48:04.922448 kernel: EXT4-fs (sda9): mounted filesystem 9f8d74e6-c079-469f-823a-18a62077a2c7 r/w with ordered data mode. Quota mode: none. May 13 23:48:04.924762 systemd[1]: Mounted sysroot.mount - /sysroot. May 13 23:48:04.926753 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 13 23:48:04.930373 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:48:04.933509 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 13 23:48:04.936612 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 13 23:48:04.939508 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 13 23:48:04.939548 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 13 23:48:04.947629 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 13 23:48:04.957010 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (815) May 13 23:48:04.957072 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:48:04.957096 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 13 23:48:04.957110 kernel: BTRFS info (device sda6): using free space tree May 13 23:48:04.957883 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
May 13 23:48:04.962051 kernel: BTRFS info (device sda6): enabling ssd optimizations May 13 23:48:04.962078 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:48:04.965728 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 13 23:48:05.024332 coreos-metadata[817]: May 13 23:48:05.024 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 May 13 23:48:05.027051 coreos-metadata[817]: May 13 23:48:05.026 INFO Fetch successful May 13 23:48:05.028257 coreos-metadata[817]: May 13 23:48:05.028 INFO wrote hostname ci-4284-0-0-n-cba8e36126 to /sysroot/etc/hostname May 13 23:48:05.030534 initrd-setup-root[842]: cut: /sysroot/etc/passwd: No such file or directory May 13 23:48:05.033619 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 13 23:48:05.039852 initrd-setup-root[850]: cut: /sysroot/etc/group: No such file or directory May 13 23:48:05.045369 initrd-setup-root[857]: cut: /sysroot/etc/shadow: No such file or directory May 13 23:48:05.050666 initrd-setup-root[864]: cut: /sysroot/etc/gshadow: No such file or directory May 13 23:48:05.148943 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 13 23:48:05.152370 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 13 23:48:05.154548 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 13 23:48:05.174431 kernel: BTRFS info (device sda6): last unmount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:48:05.194730 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 13 23:48:05.198392 ignition[932]: INFO : Ignition 2.20.0 May 13 23:48:05.198392 ignition[932]: INFO : Stage: mount May 13 23:48:05.199546 ignition[932]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:48:05.199546 ignition[932]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:05.199546 ignition[932]: INFO : mount: mount passed May 13 23:48:05.202064 ignition[932]: INFO : Ignition finished successfully May 13 23:48:05.202567 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 13 23:48:05.204683 systemd[1]: Starting ignition-files.service - Ignition (files)... May 13 23:48:05.266609 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 13 23:48:05.269593 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:48:05.287585 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (944) May 13 23:48:05.289544 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:48:05.289614 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 13 23:48:05.289630 kernel: BTRFS info (device sda6): using free space tree May 13 23:48:05.293483 kernel: BTRFS info (device sda6): enabling ssd optimizations May 13 23:48:05.293574 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:48:05.296104 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 23:48:05.331412 ignition[962]: INFO : Ignition 2.20.0 May 13 23:48:05.331412 ignition[962]: INFO : Stage: files May 13 23:48:05.332639 ignition[962]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:48:05.332639 ignition[962]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:05.332639 ignition[962]: DEBUG : files: compiled without relabeling support, skipping May 13 23:48:05.334692 ignition[962]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 13 23:48:05.334692 ignition[962]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 13 23:48:05.337769 ignition[962]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 13 23:48:05.338657 ignition[962]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 13 23:48:05.339791 unknown[962]: wrote ssh authorized keys file for user: core May 13 23:48:05.341082 ignition[962]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 13 23:48:05.342766 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" May 13 23:48:05.344475 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 May 13 23:48:05.445786 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 13 23:48:05.563532 systemd-networkd[779]: eth0: Gained IPv6LL May 13 23:48:05.563859 systemd-networkd[779]: eth1: Gained IPv6LL May 13 23:48:05.680106 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> 
"/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" May 13 23:48:05.681865 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1 May 13 23:48:06.295385 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 13 23:48:07.406444 ignition[962]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" May 13 23:48:07.406444 ignition[962]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 13 23:48:07.409592 ignition[962]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:48:07.409592 ignition[962]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:48:07.409592 ignition[962]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 13 23:48:07.409592 ignition[962]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 13 23:48:07.409592 ignition[962]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 13 23:48:07.409592 ignition[962]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 13 23:48:07.409592 ignition[962]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 13 23:48:07.409592 ignition[962]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" May 13 23:48:07.409592 ignition[962]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" May 13 23:48:07.409592 ignition[962]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" May 13 23:48:07.409592 ignition[962]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" May 13 23:48:07.409592 ignition[962]: INFO : files: files passed May 13 23:48:07.409592 ignition[962]: INFO : Ignition finished successfully May 13 23:48:07.410707 systemd[1]: Finished ignition-files.service - Ignition (files). May 13 23:48:07.419615 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 13 23:48:07.423551 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 13 23:48:07.438170 systemd[1]: ignition-quench.service: Deactivated successfully. May 13 23:48:07.439135 systemd[1]: Finished ignition-quench.service - Ignition (record completion). 
May 13 23:48:07.448130 initrd-setup-root-after-ignition[990]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 23:48:07.448130 initrd-setup-root-after-ignition[990]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 13 23:48:07.451080 initrd-setup-root-after-ignition[994]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 23:48:07.454562 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 23:48:07.456218 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 13 23:48:07.458564 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 13 23:48:07.524196 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 13 23:48:07.524349 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 13 23:48:07.547878 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. May 13 23:48:07.549857 systemd[1]: Reached target initrd.target - Initrd Default Target. May 13 23:48:07.550841 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 13 23:48:07.552016 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 13 23:48:07.581485 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 23:48:07.583582 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 13 23:48:07.606717 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 13 23:48:07.607471 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 23:48:07.608990 systemd[1]: Stopped target timers.target - Timer Units. May 13 23:48:07.610432 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 13 23:48:07.610568 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 23:48:07.612328 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 13 23:48:07.612986 systemd[1]: Stopped target basic.target - Basic System. May 13 23:48:07.613939 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 13 23:48:07.614982 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 13 23:48:07.616006 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 13 23:48:07.617014 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 13 23:48:07.618036 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 13 23:48:07.619231 systemd[1]: Stopped target sysinit.target - System Initialization. May 13 23:48:07.620174 systemd[1]: Stopped target local-fs.target - Local File Systems. May 13 23:48:07.621251 systemd[1]: Stopped target swap.target - Swaps. May 13 23:48:07.622368 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 13 23:48:07.622521 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 13 23:48:07.623729 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 13 23:48:07.624601 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 23:48:07.625571 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 13 23:48:07.625648 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
May 13 23:48:07.626662 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 13 23:48:07.626822 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 13 23:48:07.628292 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 13 23:48:07.628436 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 23:48:07.629532 systemd[1]: ignition-files.service: Deactivated successfully. May 13 23:48:07.629629 systemd[1]: Stopped ignition-files.service - Ignition (files). May 13 23:48:07.630728 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 13 23:48:07.630843 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 13 23:48:07.634608 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 13 23:48:07.639357 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 13 23:48:07.640265 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 13 23:48:07.640416 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:48:07.641509 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 13 23:48:07.641607 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 13 23:48:07.655365 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 13 23:48:07.655847 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 13 23:48:07.667320 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 13 23:48:07.671088 ignition[1014]: INFO : Ignition 2.20.0 May 13 23:48:07.671088 ignition[1014]: INFO : Stage: umount May 13 23:48:07.671088 ignition[1014]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:48:07.671088 ignition[1014]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:07.671088 ignition[1014]: INFO : umount: umount passed May 13 23:48:07.671088 ignition[1014]: INFO : Ignition finished successfully May 13 23:48:07.672183 systemd[1]: ignition-mount.service: Deactivated successfully. May 13 23:48:07.672315 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 13 23:48:07.673482 systemd[1]: sysroot-boot.service: Deactivated successfully. May 13 23:48:07.673588 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 13 23:48:07.675751 systemd[1]: ignition-disks.service: Deactivated successfully. May 13 23:48:07.676012 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 13 23:48:07.678859 systemd[1]: ignition-kargs.service: Deactivated successfully. May 13 23:48:07.678927 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 13 23:48:07.680564 systemd[1]: ignition-fetch.service: Deactivated successfully. May 13 23:48:07.680614 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 13 23:48:07.682434 systemd[1]: Stopped target network.target - Network. May 13 23:48:07.683162 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 13 23:48:07.683223 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 13 23:48:07.684128 systemd[1]: Stopped target paths.target - Path Units. May 13 23:48:07.684871 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 13 23:48:07.691025 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. 
May 13 23:48:07.693190 systemd[1]: Stopped target slices.target - Slice Units. May 13 23:48:07.695038 systemd[1]: Stopped target sockets.target - Socket Units. May 13 23:48:07.696650 systemd[1]: iscsid.socket: Deactivated successfully. May 13 23:48:07.696705 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 13 23:48:07.697715 systemd[1]: iscsiuio.socket: Deactivated successfully. May 13 23:48:07.697751 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 23:48:07.698555 systemd[1]: ignition-setup.service: Deactivated successfully. May 13 23:48:07.698608 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 13 23:48:07.699445 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 13 23:48:07.699490 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 13 23:48:07.700291 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 13 23:48:07.700335 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 13 23:48:07.701334 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 13 23:48:07.702089 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 13 23:48:07.712382 systemd[1]: systemd-resolved.service: Deactivated successfully. May 13 23:48:07.712620 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 13 23:48:07.719019 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 13 23:48:07.719542 systemd[1]: systemd-networkd.service: Deactivated successfully. May 13 23:48:07.719723 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 13 23:48:07.723626 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 13 23:48:07.724341 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 13 23:48:07.724736 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 13 23:48:07.727010 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 13 23:48:07.727563 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 13 23:48:07.727623 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 23:48:07.728353 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 13 23:48:07.728396 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 13 23:48:07.730526 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 13 23:48:07.730578 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 13 23:48:07.731876 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 13 23:48:07.731931 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 23:48:07.736024 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:48:07.744307 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 13 23:48:07.744379 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 13 23:48:07.763688 systemd[1]: systemd-udevd.service: Deactivated successfully. May 13 23:48:07.763890 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:48:07.767495 systemd[1]: network-cleanup.service: Deactivated successfully. 
May 13 23:48:07.767675 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 13 23:48:07.769387 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 13 23:48:07.769840 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 13 23:48:07.770901 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 13 23:48:07.770942 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 13 23:48:07.772000 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 13 23:48:07.772060 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 13 23:48:07.773472 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 13 23:48:07.773527 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 13 23:48:07.774844 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 23:48:07.774892 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:48:07.777583 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 13 23:48:07.778712 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 13 23:48:07.778787 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:48:07.779555 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 13 23:48:07.779597 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 23:48:07.780237 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 13 23:48:07.780276 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 13 23:48:07.781094 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:48:07.781137 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:48:07.784555 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. May 13 23:48:07.784612 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 13 23:48:07.805227 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 13 23:48:07.805496 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 13 23:48:07.808762 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 13 23:48:07.812072 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 13 23:48:07.835764 systemd[1]: Switching root. May 13 23:48:07.872441 systemd-journald[236]: Received SIGTERM from PID 1 (systemd). 
May 13 23:48:07.872505 systemd-journald[236]: Journal stopped May 13 23:48:08.944329 kernel: SELinux: policy capability network_peer_controls=1 May 13 23:48:08.945531 kernel: SELinux: policy capability open_perms=1 May 13 23:48:08.945570 kernel: SELinux: policy capability extended_socket_class=1 May 13 23:48:08.945587 kernel: SELinux: policy capability always_check_network=0 May 13 23:48:08.945596 kernel: SELinux: policy capability cgroup_seclabel=1 May 13 23:48:08.945611 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 13 23:48:08.945624 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 13 23:48:08.945633 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 13 23:48:08.945646 kernel: audit: type=1403 audit(1747180088.043:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 13 23:48:08.945658 systemd[1]: Successfully loaded SELinux policy in 42.626ms. May 13 23:48:08.945678 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 12.619ms. May 13 23:48:08.945689 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 23:48:08.945702 systemd[1]: Detected virtualization kvm. May 13 23:48:08.945712 systemd[1]: Detected architecture arm64. May 13 23:48:08.945722 systemd[1]: Detected first boot. May 13 23:48:08.945732 systemd[1]: Hostname set to <ci-4284-0-0-n-cba8e36126>. May 13 23:48:08.945742 systemd[1]: Initializing machine ID from VM UUID. May 13 23:48:08.945752 zram_generator::config[1059]: No configuration found. May 13 23:48:08.945781 kernel: NET: Registered PF_VSOCK protocol family May 13 23:48:08.945795 systemd[1]: Populated /etc with preset unit settings. May 13 23:48:08.945806 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 13 23:48:08.945817 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 13 23:48:08.945830 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 13 23:48:08.945840 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 13 23:48:08.945851 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 13 23:48:08.945862 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 13 23:48:08.945872 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 13 23:48:08.945882 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 13 23:48:08.945894 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 13 23:48:08.945905 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 13 23:48:08.945915 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 13 23:48:08.945926 systemd[1]: Created slice user.slice - User and Session Slice. May 13 23:48:08.945943 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 23:48:08.945953 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 23:48:08.945963 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 13 23:48:08.945973 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 13 23:48:08.945984 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 13 23:48:08.945995 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 13 23:48:08.946005 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... May 13 23:48:08.946016 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 23:48:08.946026 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 13 23:48:08.946041 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 13 23:48:08.946050 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 13 23:48:08.946062 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 13 23:48:08.946072 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 23:48:08.946082 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 13 23:48:08.946093 systemd[1]: Reached target slices.target - Slice Units. May 13 23:48:08.946103 systemd[1]: Reached target swap.target - Swaps. May 13 23:48:08.946113 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 13 23:48:08.946123 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 13 23:48:08.946133 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 13 23:48:08.946143 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 13 23:48:08.946157 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 13 23:48:08.946171 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 13 23:48:08.946182 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 13 23:48:08.946192 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... May 13 23:48:08.946203 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 13 23:48:08.946213 systemd[1]: Mounting media.mount - External Media Directory... May 13 23:48:08.946225 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 13 23:48:08.946236 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 13 23:48:08.946247 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 13 23:48:08.946257 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 13 23:48:08.946268 systemd[1]: Reached target machines.target - Containers. May 13 23:48:08.946278 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 13 23:48:08.946289 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:48:08.946299 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 13 23:48:08.946312 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 13 23:48:08.946322 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:48:08.946332 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... 
May 13 23:48:08.946347 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:48:08.946357 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 13 23:48:08.946367 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:48:08.946377 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 13 23:48:08.946387 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 13 23:48:08.951428 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 13 23:48:08.951482 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 13 23:48:08.951494 systemd[1]: Stopped systemd-fsck-usr.service. May 13 23:48:08.951506 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:48:08.951516 systemd[1]: Starting systemd-journald.service - Journal Service... May 13 23:48:08.951529 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 13 23:48:08.951540 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 13 23:48:08.951550 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 13 23:48:08.951561 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 13 23:48:08.951571 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 13 23:48:08.951582 systemd[1]: verity-setup.service: Deactivated successfully. May 13 23:48:08.951592 systemd[1]: Stopped verity-setup.service. May 13 23:48:08.951602 kernel: fuse: init (API version 7.39) May 13 23:48:08.951613 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 13 23:48:08.951626 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 13 23:48:08.951636 systemd[1]: Mounted media.mount - External Media Directory. May 13 23:48:08.951648 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 13 23:48:08.951658 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 13 23:48:08.951668 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 13 23:48:08.951680 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 13 23:48:08.951691 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 13 23:48:08.951701 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 13 23:48:08.951712 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 13 23:48:08.951722 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:48:08.951732 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:48:08.951743 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:48:08.951753 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:48:08.951774 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 13 23:48:08.951790 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 13 23:48:08.951803 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... 
May 13 23:48:08.951815 kernel: ACPI: bus type drm_connector registered May 13 23:48:08.951826 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 13 23:48:08.951836 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 13 23:48:08.951849 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 23:48:08.951860 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 23:48:08.951870 kernel: loop: module loaded May 13 23:48:08.951882 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 13 23:48:08.951892 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 13 23:48:08.951902 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 13 23:48:08.951913 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:48:08.951923 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:48:08.951933 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 13 23:48:08.951944 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). May 13 23:48:08.951954 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 23:48:08.951967 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 13 23:48:08.951979 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 13 23:48:08.951989 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 13 23:48:08.952000 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:48:08.952010 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 13 23:48:08.952021 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:48:08.952031 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 13 23:48:08.952041 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:48:08.952083 systemd-journald[1127]: Collecting audit messages is disabled. May 13 23:48:08.952110 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 13 23:48:08.952122 systemd-journald[1127]: Journal started May 13 23:48:08.952149 systemd-journald[1127]: Runtime Journal (/run/log/journal/932b4659872a4a0486b76bd08d218774) is 8M, max 76.6M, 68.6M free. May 13 23:48:08.602957 systemd[1]: Queued start job for default target multi-user.target. May 13 23:48:08.617506 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. May 13 23:48:08.618352 systemd[1]: systemd-journald.service: Deactivated successfully. May 13 23:48:08.955922 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 13 23:48:08.974328 systemd[1]: Started systemd-journald.service - Journal Service. May 13 23:48:08.972063 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 13 23:48:08.973124 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. 
May 13 23:48:08.978316 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 13 23:48:08.993972 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 13 23:48:09.001470 kernel: loop0: detected capacity change from 0 to 126448 May 13 23:48:09.004590 systemd-tmpfiles[1154]: ACLs are not supported, ignoring. May 13 23:48:09.004605 systemd-tmpfiles[1154]: ACLs are not supported, ignoring. May 13 23:48:09.026955 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 23:48:09.033389 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 13 23:48:09.037558 systemd[1]: Reached target network-pre.target - Preparation for Network. May 13 23:48:09.042192 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 13 23:48:09.051916 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 13 23:48:09.062632 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 13 23:48:09.065557 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 13 23:48:09.077846 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 13 23:48:09.089583 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:48:09.095073 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... May 13 23:48:09.103439 kernel: loop1: detected capacity change from 0 to 103832 May 13 23:48:09.111026 systemd-journald[1127]: Time spent on flushing to /var/log/journal/932b4659872a4a0486b76bd08d218774 is 40.064ms for 1154 entries. May 13 23:48:09.111026 systemd-journald[1127]: System Journal (/var/log/journal/932b4659872a4a0486b76bd08d218774) is 8M, max 584.8M, 576.8M free. May 13 23:48:09.166083 systemd-journald[1127]: Received client request to flush runtime journal. May 13 23:48:09.166142 kernel: loop2: detected capacity change from 0 to 8 May 13 23:48:09.134652 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 13 23:48:09.156511 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 13 23:48:09.157811 udevadm[1195]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. May 13 23:48:09.161735 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 13 23:48:09.170130 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 13 23:48:09.195443 kernel: loop3: detected capacity change from 0 to 194096 May 13 23:48:09.202629 systemd-tmpfiles[1201]: ACLs are not supported, ignoring. May 13 23:48:09.202645 systemd-tmpfiles[1201]: ACLs are not supported, ignoring. May 13 23:48:09.213216 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:48:09.266450 kernel: loop4: detected capacity change from 0 to 126448 May 13 23:48:09.290491 kernel: loop5: detected capacity change from 0 to 103832 May 13 23:48:09.300750 kernel: loop6: detected capacity change from 0 to 8 May 13 23:48:09.303485 kernel: loop7: detected capacity change from 0 to 194096 May 13 23:48:09.321650 (sd-merge)[1208]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. May 13 23:48:09.323306 (sd-merge)[1208]: Merged extensions into '/usr'. 
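The (sd-merge) lines above show systemd-sysext finding four extension images ('containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner') and overlaying them onto /usr. Before merging, each image's extension-release file has to declare an OS it is compatible with. The sketch below illustrates that compatibility check in simplified form; the release contents are placeholders, and the real matching rules (architecture, ID_LIKE, scopes, etc.) are more involved.

```python
# Simplified sketch of the compatibility check systemd-sysext performs
# before the "Merged extensions into '/usr'" step above: the extension's
# extension-release file must name the same ID (and VERSION_ID or
# SYSEXT_LEVEL) as the host os-release. Values below are placeholders.
def parse_release(text: str) -> dict[str, str]:
    out = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        out[key] = value.strip().strip('"')
    return out

def extension_compatible(host_os_release: str, extension_release: str) -> bool:
    host, ext = parse_release(host_os_release), parse_release(extension_release)
    if ext.get("ID") not in ("_any", host.get("ID")):
        return False
    if "SYSEXT_LEVEL" in ext:
        return ext["SYSEXT_LEVEL"] == host.get("SYSEXT_LEVEL", host.get("VERSION_ID"))
    if "VERSION_ID" in ext:
        return ext["VERSION_ID"] == host.get("VERSION_ID")
    return True

host = 'ID=flatcar\nVERSION_ID=4284.0.0\nSYSEXT_LEVEL=1.0\n'   # placeholder host os-release
ext = 'ID=flatcar\nSYSEXT_LEVEL=1.0\n'                         # placeholder extension-release
print(extension_compatible(host, ext))
```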
May 13 23:48:09.331878 systemd[1]: Reload requested from client PID 1162 ('systemd-sysext') (unit systemd-sysext.service)... May 13 23:48:09.331896 systemd[1]: Reloading... May 13 23:48:09.448426 zram_generator::config[1236]: No configuration found. May 13 23:48:09.592652 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:48:09.676045 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 13 23:48:09.677021 systemd[1]: Reloading finished in 341 ms. May 13 23:48:09.693906 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 13 23:48:09.704824 ldconfig[1157]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 13 23:48:09.708593 systemd[1]: Starting ensure-sysext.service... May 13 23:48:09.714559 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 13 23:48:09.734726 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 13 23:48:09.741359 systemd[1]: Reload requested from client PID 1272 ('systemctl') (unit ensure-sysext.service)... May 13 23:48:09.741487 systemd[1]: Reloading... May 13 23:48:09.750145 systemd-tmpfiles[1273]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. May 13 23:48:09.750348 systemd-tmpfiles[1273]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 13 23:48:09.751111 systemd-tmpfiles[1273]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 13 23:48:09.751310 systemd-tmpfiles[1273]: ACLs are not supported, ignoring. May 13 23:48:09.751355 systemd-tmpfiles[1273]: ACLs are not supported, ignoring. May 13 23:48:09.755108 systemd-tmpfiles[1273]: Detected autofs mount point /boot during canonicalization of boot. May 13 23:48:09.755121 systemd-tmpfiles[1273]: Skipping /boot May 13 23:48:09.769165 systemd-tmpfiles[1273]: Detected autofs mount point /boot during canonicalization of boot. May 13 23:48:09.769182 systemd-tmpfiles[1273]: Skipping /boot May 13 23:48:09.843439 zram_generator::config[1303]: No configuration found. May 13 23:48:09.965145 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:48:10.026065 systemd[1]: Reloading finished in 284 ms. May 13 23:48:10.044389 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 13 23:48:10.050987 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 13 23:48:10.061928 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 23:48:10.066576 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 13 23:48:10.070631 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 13 23:48:10.074598 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 13 23:48:10.079462 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:48:10.085977 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... 
May 13 23:48:10.092973 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:48:10.095652 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:48:10.102498 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:48:10.115510 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:48:10.116578 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:48:10.116718 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:48:10.120191 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:48:10.120344 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:48:10.121485 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:48:10.125261 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 13 23:48:10.132244 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 13 23:48:10.135076 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:48:10.136559 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:48:10.139134 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:48:10.140575 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:48:10.149282 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:48:10.154934 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:48:10.164956 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 13 23:48:10.174098 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:48:10.177963 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:48:10.180679 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 13 23:48:10.186913 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 13 23:48:10.193304 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 13 23:48:10.203045 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 13 23:48:10.210346 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:48:10.210535 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:48:10.218054 systemd[1]: Finished ensure-sysext.service. May 13 23:48:10.226231 systemd-udevd[1346]: Using default interface naming scheme 'v255'. 
May 13 23:48:10.232345 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:48:10.232546 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:48:10.235225 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:48:10.241432 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 13 23:48:10.242031 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 23:48:10.242257 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 13 23:48:10.244356 systemd[1]: modprobe@drm.service: Deactivated successfully. May 13 23:48:10.245670 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 13 23:48:10.249466 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 13 23:48:10.259698 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:48:10.259921 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:48:10.261103 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:48:10.277520 augenrules[1393]: No rules May 13 23:48:10.280355 systemd[1]: audit-rules.service: Deactivated successfully. May 13 23:48:10.281138 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 23:48:10.296361 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:48:10.303153 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 13 23:48:10.349450 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 13 23:48:10.350276 systemd[1]: Reached target time-set.target - System Time Set. May 13 23:48:10.381672 systemd-resolved[1345]: Positive Trust Anchors: May 13 23:48:10.381689 systemd-resolved[1345]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 13 23:48:10.381721 systemd-resolved[1345]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 13 23:48:10.391507 systemd-resolved[1345]: Using system hostname 'ci-4284-0-0-n-cba8e36126'. May 13 23:48:10.394143 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 13 23:48:10.395125 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 13 23:48:10.428391 systemd-networkd[1403]: lo: Link UP May 13 23:48:10.428415 systemd-networkd[1403]: lo: Gained carrier May 13 23:48:10.443492 systemd-networkd[1403]: Enumeration completed May 13 23:48:10.443734 systemd[1]: Started systemd-networkd.service - Network Configuration. May 13 23:48:10.445053 systemd[1]: Reached target network.target - Network. 
May 13 23:48:10.449555 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 13 23:48:10.452619 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 13 23:48:10.463713 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. May 13 23:48:10.477232 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 13 23:48:10.482704 systemd-networkd[1403]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:10.482715 systemd-networkd[1403]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:48:10.483633 systemd-networkd[1403]: eth1: Link UP May 13 23:48:10.483641 systemd-networkd[1403]: eth1: Gained carrier May 13 23:48:10.483662 systemd-networkd[1403]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:10.511354 systemd-networkd[1403]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 23:48:10.512572 systemd-timesyncd[1384]: Network configuration changed, trying to establish connection. May 13 23:48:10.514629 systemd-timesyncd[1384]: Network configuration changed, trying to establish connection. May 13 23:48:10.549902 kernel: mousedev: PS/2 mouse device common for all mice May 13 23:48:10.553879 systemd-networkd[1403]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:10.553893 systemd-networkd[1403]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 13 23:48:10.555503 systemd-timesyncd[1384]: Network configuration changed, trying to establish connection. May 13 23:48:10.555528 systemd-networkd[1403]: eth0: Link UP May 13 23:48:10.555532 systemd-networkd[1403]: eth0: Gained carrier May 13 23:48:10.555652 systemd-networkd[1403]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:10.560250 systemd-timesyncd[1384]: Network configuration changed, trying to establish connection. May 13 23:48:10.577475 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1404) May 13 23:48:10.593466 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. May 13 23:48:10.593606 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 13 23:48:10.596054 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 13 23:48:10.602610 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 13 23:48:10.605698 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 13 23:48:10.606323 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 13 23:48:10.606367 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
May 13 23:48:10.606392 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 13 23:48:10.618502 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 May 13 23:48:10.618564 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 13 23:48:10.618576 kernel: [drm] features: -context_init May 13 23:48:10.619424 kernel: [drm] number of scanouts: 1 May 13 23:48:10.619472 kernel: [drm] number of cap sets: 0 May 13 23:48:10.620527 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0 May 13 23:48:10.631757 systemd-networkd[1403]: eth0: DHCPv4 address 91.99.1.97/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 13 23:48:10.635504 systemd-timesyncd[1384]: Network configuration changed, trying to establish connection. May 13 23:48:10.655744 kernel: Console: switching to colour frame buffer device 160x50 May 13 23:48:10.668658 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device May 13 23:48:10.687639 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 13 23:48:10.689459 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 13 23:48:10.690389 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 13 23:48:10.690607 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 13 23:48:10.692944 systemd[1]: modprobe@loop.service: Deactivated successfully. May 13 23:48:10.693339 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 13 23:48:10.695272 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 13 23:48:10.695358 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 13 23:48:10.733384 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:48:10.755650 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. May 13 23:48:10.759837 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 13 23:48:10.768688 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:48:10.770800 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:48:10.773978 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 13 23:48:10.783648 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 13 23:48:10.786496 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 13 23:48:10.861880 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:48:10.893226 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 13 23:48:10.895928 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 13 23:48:10.919447 lvm[1468]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 13 23:48:10.945938 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 13 23:48:10.947874 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. 
May 13 23:48:10.948854 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:48:10.949775 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 13 23:48:10.950678 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 13 23:48:10.951586 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 13 23:48:10.952278 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 13 23:48:10.953012 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 13 23:48:10.954166 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 13 23:48:10.954209 systemd[1]: Reached target paths.target - Path Units. May 13 23:48:10.955103 systemd[1]: Reached target timers.target - Timer Units. May 13 23:48:10.957226 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 13 23:48:10.959621 systemd[1]: Starting docker.socket - Docker Socket for the API... May 13 23:48:10.963632 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 13 23:48:10.964553 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 13 23:48:10.965195 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 13 23:48:10.978116 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 13 23:48:10.979562 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 13 23:48:10.981827 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... May 13 23:48:10.983685 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 13 23:48:10.984449 systemd[1]: Reached target sockets.target - Socket Units. May 13 23:48:10.985004 systemd[1]: Reached target basic.target - Basic System. May 13 23:48:10.985534 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 13 23:48:10.985563 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 13 23:48:10.988529 systemd[1]: Starting containerd.service - containerd container runtime... May 13 23:48:10.994693 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 13 23:48:10.998812 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 13 23:48:11.003450 lvm[1472]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 13 23:48:11.005544 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 13 23:48:11.009612 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 13 23:48:11.010233 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 13 23:48:11.012248 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 13 23:48:11.020270 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 13 23:48:11.026689 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. May 13 23:48:11.033674 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
May 13 23:48:11.043651 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 13 23:48:11.055995 systemd[1]: Starting systemd-logind.service - User Login Management... May 13 23:48:11.058795 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 13 23:48:11.059329 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 13 23:48:11.064128 jq[1476]: false May 13 23:48:11.063682 systemd[1]: Starting update-engine.service - Update Engine... May 13 23:48:11.070922 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 13 23:48:11.077085 dbus-daemon[1475]: [system] SELinux support is enabled May 13 23:48:11.079921 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 13 23:48:11.087583 coreos-metadata[1474]: May 13 23:48:11.084 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 May 13 23:48:11.088238 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 13 23:48:11.092042 coreos-metadata[1474]: May 13 23:48:11.088 INFO Fetch successful May 13 23:48:11.092042 coreos-metadata[1474]: May 13 23:48:11.091 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 May 13 23:48:11.092761 coreos-metadata[1474]: May 13 23:48:11.092 INFO Fetch successful May 13 23:48:11.097928 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 13 23:48:11.098214 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 13 23:48:11.121347 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 13 23:48:11.122799 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 13 23:48:11.124544 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 13 23:48:11.124577 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 13 23:48:11.128880 extend-filesystems[1477]: Found loop4 May 13 23:48:11.131423 extend-filesystems[1477]: Found loop5 May 13 23:48:11.131423 extend-filesystems[1477]: Found loop6 May 13 23:48:11.131423 extend-filesystems[1477]: Found loop7 May 13 23:48:11.131423 extend-filesystems[1477]: Found sda May 13 23:48:11.131423 extend-filesystems[1477]: Found sda1 May 13 23:48:11.131423 extend-filesystems[1477]: Found sda2 May 13 23:48:11.131423 extend-filesystems[1477]: Found sda3 May 13 23:48:11.131423 extend-filesystems[1477]: Found usr May 13 23:48:11.131423 extend-filesystems[1477]: Found sda4 May 13 23:48:11.131423 extend-filesystems[1477]: Found sda6 May 13 23:48:11.131423 extend-filesystems[1477]: Found sda7 May 13 23:48:11.131423 extend-filesystems[1477]: Found sda9 May 13 23:48:11.131423 extend-filesystems[1477]: Checking size of /dev/sda9 May 13 23:48:11.173502 jq[1488]: true May 13 23:48:11.136677 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 13 23:48:11.136906 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
May 13 23:48:11.154786 systemd[1]: motdgen.service: Deactivated successfully. May 13 23:48:11.155006 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 13 23:48:11.183106 extend-filesystems[1477]: Resized partition /dev/sda9 May 13 23:48:11.190419 extend-filesystems[1520]: resize2fs 1.47.2 (1-Jan-2025) May 13 23:48:11.197350 jq[1507]: true May 13 23:48:11.200455 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks May 13 23:48:11.201466 update_engine[1487]: I20250513 23:48:11.201081 1487 main.cc:92] Flatcar Update Engine starting May 13 23:48:11.202684 tar[1492]: linux-arm64/helm May 13 23:48:11.206768 systemd[1]: Started update-engine.service - Update Engine. May 13 23:48:11.207535 update_engine[1487]: I20250513 23:48:11.207119 1487 update_check_scheduler.cc:74] Next update check in 6m43s May 13 23:48:11.223430 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 13 23:48:11.246711 systemd-logind[1485]: New seat seat0. May 13 23:48:11.250572 systemd-logind[1485]: Watching system buttons on /dev/input/event0 (Power Button) May 13 23:48:11.250588 systemd-logind[1485]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) May 13 23:48:11.250820 systemd[1]: Started systemd-logind.service - User Login Management. May 13 23:48:11.270455 (ntainerd)[1515]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 13 23:48:11.343205 kernel: EXT4-fs (sda9): resized filesystem to 9393147 May 13 23:48:11.359172 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 13 23:48:11.361117 extend-filesystems[1520]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required May 13 23:48:11.361117 extend-filesystems[1520]: old_desc_blocks = 1, new_desc_blocks = 5 May 13 23:48:11.361117 extend-filesystems[1520]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. May 13 23:48:11.377010 extend-filesystems[1477]: Resized filesystem in /dev/sda9 May 13 23:48:11.377010 extend-filesystems[1477]: Found sr0 May 13 23:48:11.365103 systemd[1]: extend-filesystems.service: Deactivated successfully. May 13 23:48:11.383713 bash[1543]: Updated "/home/core/.ssh/authorized_keys" May 13 23:48:11.366150 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 13 23:48:11.376536 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 13 23:48:11.387180 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 13 23:48:11.393349 systemd[1]: Starting sshkeys.service... May 13 23:48:11.401961 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1411) May 13 23:48:11.469128 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 13 23:48:11.471935 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
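For context, the EXT4 messages above record an online resize of /dev/sda9 from 1617920 to 9393147 blocks of 4 KiB. A small sketch of the arithmetic, using only the figures copied from the log (nothing else about the host is implied):

#!/usr/bin/env python3
"""Sketch: sizes implied by the EXT4 online resize logged above."""

BLOCK_SIZE = 4096          # "9393147 (4k) blocks" per the resize2fs output
OLD_BLOCKS = 1_617_920     # from "resizing filesystem from 1617920 ..."
NEW_BLOCKS = 9_393_147     # from "... to 9393147 blocks"

def gib(blocks: int) -> float:
    return blocks * BLOCK_SIZE / 2**30

if __name__ == "__main__":
    print(f"before: {gib(OLD_BLOCKS):.2f} GiB")                  # ~6.17 GiB
    print(f"after:  {gib(NEW_BLOCKS):.2f} GiB")                  # ~35.83 GiB
    print(f"growth: {gib(NEW_BLOCKS - OLD_BLOCKS):.2f} GiB")     # ~29.66 GiB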
May 13 23:48:11.545066 locksmithd[1523]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 13 23:48:11.546029 coreos-metadata[1554]: May 13 23:48:11.545 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 May 13 23:48:11.548482 coreos-metadata[1554]: May 13 23:48:11.548 INFO Fetch successful May 13 23:48:11.552719 unknown[1554]: wrote ssh authorized keys file for user: core May 13 23:48:11.596049 update-ssh-keys[1564]: Updated "/home/core/.ssh/authorized_keys" May 13 23:48:11.599913 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 13 23:48:11.606829 systemd[1]: Finished sshkeys.service. May 13 23:48:11.677873 containerd[1515]: time="2025-05-13T23:48:11Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 13 23:48:11.680891 containerd[1515]: time="2025-05-13T23:48:11.680817400Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 13 23:48:11.713421 containerd[1515]: time="2025-05-13T23:48:11.711223560Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.72µs" May 13 23:48:11.717491 containerd[1515]: time="2025-05-13T23:48:11.717427880Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 13 23:48:11.717651 containerd[1515]: time="2025-05-13T23:48:11.717635520Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 13 23:48:11.718048 containerd[1515]: time="2025-05-13T23:48:11.718019920Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 13 23:48:11.718159 containerd[1515]: time="2025-05-13T23:48:11.718142760Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 13 23:48:11.718242 containerd[1515]: time="2025-05-13T23:48:11.718229280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 23:48:11.718496 containerd[1515]: time="2025-05-13T23:48:11.718390080Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 13 23:48:11.718594 containerd[1515]: time="2025-05-13T23:48:11.718577680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 23:48:11.720537 containerd[1515]: time="2025-05-13T23:48:11.720491280Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 13 23:48:11.720698 containerd[1515]: time="2025-05-13T23:48:11.720626320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 23:48:11.720698 containerd[1515]: time="2025-05-13T23:48:11.720648720Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 13 23:48:11.720698 containerd[1515]: time="2025-05-13T23:48:11.720658360Z" level=info msg="loading 
plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 13 23:48:11.721154 containerd[1515]: time="2025-05-13T23:48:11.721131120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 13 23:48:11.722572 containerd[1515]: time="2025-05-13T23:48:11.722545440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 23:48:11.722696 containerd[1515]: time="2025-05-13T23:48:11.722680120Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 13 23:48:11.722790 containerd[1515]: time="2025-05-13T23:48:11.722775440Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 13 23:48:11.722884 containerd[1515]: time="2025-05-13T23:48:11.722870320Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 13 23:48:11.724153 containerd[1515]: time="2025-05-13T23:48:11.723851760Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 13 23:48:11.725358 containerd[1515]: time="2025-05-13T23:48:11.725150160Z" level=info msg="metadata content store policy set" policy=shared May 13 23:48:11.731494 containerd[1515]: time="2025-05-13T23:48:11.731455600Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 13 23:48:11.731648 containerd[1515]: time="2025-05-13T23:48:11.731630840Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732370960Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732434480Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732643880Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732676640Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732716920Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732785160Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732847440Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732903440Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732918640Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.732952120Z" level=info msg="loading plugin" 
id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.733124720Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.733155280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.733258320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 13 23:48:11.733444 containerd[1515]: time="2025-05-13T23:48:11.733276200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733290040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733305160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733324240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733340160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733368560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733383160Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733397680Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733772520Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733855720Z" level=info msg="Start snapshots syncer" May 13 23:48:11.734182 containerd[1515]: time="2025-05-13T23:48:11.733879120Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 13 23:48:11.734370 containerd[1515]: time="2025-05-13T23:48:11.734209120Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 13 23:48:11.734370 containerd[1515]: time="2025-05-13T23:48:11.734265520Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.737655320Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.737877680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.737914680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.737932840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.737948440Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.737966000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.737979320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.737995560Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.738030680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 13 23:48:11.739258 containerd[1515]: 
time="2025-05-13T23:48:11.738050120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.738064640Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.738109280Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.738129880Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 13 23:48:11.739258 containerd[1515]: time="2025-05-13T23:48:11.738140800Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738156000Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738170240Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738184000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738196920Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738277800Z" level=info msg="runtime interface created" May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738283840Z" level=info msg="created NRI interface" May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738297360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738314720Z" level=info msg="Connect containerd service" May 13 23:48:11.739559 containerd[1515]: time="2025-05-13T23:48:11.738378920Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 13 23:48:11.745321 containerd[1515]: time="2025-05-13T23:48:11.745235280Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 23:48:11.900258 systemd-networkd[1403]: eth0: Gained IPv6LL May 13 23:48:11.902488 systemd-timesyncd[1384]: Network configuration changed, trying to establish connection. May 13 23:48:11.909458 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 13 23:48:11.914480 systemd[1]: Reached target network-online.target - Network is Online. May 13 23:48:11.920796 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:48:11.925765 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... 
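For context, the long config="{...}" line a few entries above is containerd dumping its effective CRI plugin configuration at startup. A hedged sketch that pulls out a few of the settings visible in that dump; the literal below is a hand-trimmed excerpt of the logged JSON, not the full configuration file:

#!/usr/bin/env python3
"""Sketch: highlight a few CRI settings from the containerd config dump above."""
import json

cri_excerpt = json.loads("""
{
  "containerd": {
    "defaultRuntimeName": "runc",
    "runtimes": {
      "runc": {
        "runtimeType": "io.containerd.runc.v2",
        "options": {"SystemdCgroup": true}
      }
    }
  },
  "cni": {"binDir": "/opt/cni/bin", "confDir": "/etc/cni/net.d", "maxConfNum": 1},
  "enableSelinux": true,
  "maxContainerLogSize": 16384,
  "enableCDI": true,
  "cdiSpecDirs": ["/etc/cdi", "/var/run/cdi"]
}
""")

if __name__ == "__main__":
    runc = cri_excerpt["containerd"]["runtimes"]["runc"]
    print("default runtime:", cri_excerpt["containerd"]["defaultRuntimeName"])
    print("systemd cgroups:", runc["options"]["SystemdCgroup"])
    print("CNI conf dir:   ", cri_excerpt["cni"]["confDir"])
    print("SELinux enabled:", cri_excerpt["enableSelinux"])

The "failed to load cni during init" error that follows the dump is consistent with this: the log itself says no network config was found in /etc/cni/net.d yet, which is expected before any CNI plugin has been installed on the node.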
May 13 23:48:11.977257 containerd[1515]: time="2025-05-13T23:48:11.977199520Z" level=info msg="Start subscribing containerd event" May 13 23:48:11.977483 containerd[1515]: time="2025-05-13T23:48:11.977393480Z" level=info msg="Start recovering state" May 13 23:48:11.979003 containerd[1515]: time="2025-05-13T23:48:11.978970720Z" level=info msg="Start event monitor" May 13 23:48:11.979115 containerd[1515]: time="2025-05-13T23:48:11.979102680Z" level=info msg="Start cni network conf syncer for default" May 13 23:48:11.979162 containerd[1515]: time="2025-05-13T23:48:11.979150920Z" level=info msg="Start streaming server" May 13 23:48:11.979238 containerd[1515]: time="2025-05-13T23:48:11.979224480Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 13 23:48:11.979284 containerd[1515]: time="2025-05-13T23:48:11.979272840Z" level=info msg="runtime interface starting up..." May 13 23:48:11.979325 containerd[1515]: time="2025-05-13T23:48:11.979315480Z" level=info msg="starting plugins..." May 13 23:48:11.979378 containerd[1515]: time="2025-05-13T23:48:11.979366960Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 13 23:48:11.979647 containerd[1515]: time="2025-05-13T23:48:11.978082880Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 13 23:48:11.979931 containerd[1515]: time="2025-05-13T23:48:11.979909320Z" level=info msg=serving... address=/run/containerd/containerd.sock May 13 23:48:11.980279 containerd[1515]: time="2025-05-13T23:48:11.980263120Z" level=info msg="containerd successfully booted in 0.303127s" May 13 23:48:11.980357 systemd[1]: Started containerd.service - containerd container runtime. May 13 23:48:12.011635 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 13 23:48:12.034123 tar[1492]: linux-arm64/LICENSE May 13 23:48:12.034429 tar[1492]: linux-arm64/README.md May 13 23:48:12.061179 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 13 23:48:12.090590 systemd-networkd[1403]: eth1: Gained IPv6LL May 13 23:48:12.091024 systemd-timesyncd[1384]: Network configuration changed, trying to establish connection. May 13 23:48:12.638190 sshd_keygen[1512]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 13 23:48:12.671635 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 13 23:48:12.679073 systemd[1]: Starting issuegen.service - Generate /run/issue... May 13 23:48:12.705829 systemd[1]: issuegen.service: Deactivated successfully. May 13 23:48:12.707379 systemd[1]: Finished issuegen.service - Generate /run/issue. May 13 23:48:12.711659 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 13 23:48:12.730940 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 13 23:48:12.739608 systemd[1]: Started getty@tty1.service - Getty on tty1. May 13 23:48:12.744547 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. May 13 23:48:12.745482 systemd[1]: Reached target getty.target - Login Prompts. May 13 23:48:12.748592 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:48:12.750353 systemd[1]: Reached target multi-user.target - Multi-User System. May 13 23:48:12.753382 systemd[1]: Startup finished in 784ms (kernel) + 6.357s (initrd) + 4.752s (userspace) = 11.893s. 
May 13 23:48:12.773077 (kubelet)[1617]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:48:13.407794 kubelet[1617]: E0513 23:48:13.407733 1617 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:48:13.410777 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:48:13.411016 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:48:13.411688 systemd[1]: kubelet.service: Consumed 907ms CPU time, 237.7M memory peak. May 13 23:48:23.413959 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 13 23:48:23.417497 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:48:23.578918 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:48:23.594086 (kubelet)[1638]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:48:23.648234 kubelet[1638]: E0513 23:48:23.648154 1638 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:48:23.655519 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:48:23.655889 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:48:23.656660 systemd[1]: kubelet.service: Consumed 178ms CPU time, 94.5M memory peak. May 13 23:48:33.663812 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 13 23:48:33.666958 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:48:33.823040 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:48:33.835730 (kubelet)[1653]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:48:33.893909 kubelet[1653]: E0513 23:48:33.893867 1653 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:48:33.896287 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:48:33.896453 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:48:33.896883 systemd[1]: kubelet.service: Consumed 178ms CPU time, 93.3M memory peak. May 13 23:48:42.206176 systemd-timesyncd[1384]: Contacted time server 5.45.104.115:123 (2.flatcar.pool.ntp.org). May 13 23:48:42.206266 systemd-timesyncd[1384]: Initial clock synchronization to Tue 2025-05-13 23:48:42.267077 UTC. May 13 23:48:43.914388 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 13 23:48:43.917238 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
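For context, each kubelet attempt above exits immediately because /var/lib/kubelet/config.yaml does not exist; on a node like this that typically means the step that writes the kubelet configuration (e.g. a kubeadm join or equivalent provisioning) has not run yet, which is an inference rather than something the log states. systemd then reschedules the unit, and the "Scheduled restart job" lines land roughly every ten seconds. A minimal sketch of that cadence, using only the first three restart timestamps copied from the log (the suggestion that this matches a RestartSec=10-style unit setting is an assumption):

#!/usr/bin/env python3
"""Sketch: spacing of the kubelet restart attempts logged above."""
from datetime import datetime

# Timestamps copied from the "Scheduled restart job, restart counter is at N" lines.
restarts = [
    "23:48:23.413959",  # counter 1
    "23:48:33.663812",  # counter 2
    "23:48:43.914388",  # counter 3
]

def parse(ts: str) -> datetime:
    return datetime.strptime(ts, "%H:%M:%S.%f")

if __name__ == "__main__":
    times = [parse(t) for t in restarts]
    for earlier, later in zip(times, times[1:]):
        print(f"{(later - earlier).total_seconds():.3f} s between restarts")
    # prints ~10.25 s for each gap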
May 13 23:48:44.074043 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:48:44.086251 (kubelet)[1669]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:48:44.142081 kubelet[1669]: E0513 23:48:44.142004 1669 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:48:44.147240 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:48:44.147847 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:48:44.150720 systemd[1]: kubelet.service: Consumed 177ms CPU time, 93M memory peak. May 13 23:48:54.163910 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 13 23:48:54.167130 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:48:54.305702 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:48:54.318846 (kubelet)[1685]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:48:54.386332 kubelet[1685]: E0513 23:48:54.386264 1685 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:48:54.388775 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:48:54.389011 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:48:54.389512 systemd[1]: kubelet.service: Consumed 182ms CPU time, 93.4M memory peak. May 13 23:48:56.891212 update_engine[1487]: I20250513 23:48:56.890537 1487 update_attempter.cc:509] Updating boot flags... May 13 23:48:56.960514 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1703) May 13 23:48:57.029660 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1706) May 13 23:48:57.112561 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1706) May 13 23:49:04.414473 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. May 13 23:49:04.417063 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:04.567799 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 23:49:04.578079 (kubelet)[1723]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:04.626653 kubelet[1723]: E0513 23:49:04.626583 1723 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:04.629956 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:04.630198 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:04.631980 systemd[1]: kubelet.service: Consumed 169ms CPU time, 95M memory peak. May 13 23:49:14.664466 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. May 13 23:49:14.667602 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:14.808886 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:49:14.818034 (kubelet)[1739]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:14.872510 kubelet[1739]: E0513 23:49:14.872462 1739 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:14.876904 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:14.877180 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:14.877844 systemd[1]: kubelet.service: Consumed 166ms CPU time, 94.6M memory peak. May 13 23:49:24.913997 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. May 13 23:49:24.916302 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:25.082555 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:49:25.092803 (kubelet)[1756]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:25.144180 kubelet[1756]: E0513 23:49:25.144109 1756 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:25.146209 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:25.146428 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:25.147393 systemd[1]: kubelet.service: Consumed 176ms CPU time, 95M memory peak. May 13 23:49:35.164031 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. May 13 23:49:35.167698 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:35.330784 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 23:49:35.346039 (kubelet)[1771]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:35.398314 kubelet[1771]: E0513 23:49:35.398221 1771 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:35.400871 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:35.401271 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:35.402389 systemd[1]: kubelet.service: Consumed 178ms CPU time, 96.5M memory peak. May 13 23:49:45.413961 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. May 13 23:49:45.417686 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:45.587936 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:49:45.598506 (kubelet)[1787]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:45.662317 kubelet[1787]: E0513 23:49:45.662273 1787 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:45.666244 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:45.666434 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:45.666961 systemd[1]: kubelet.service: Consumed 188ms CPU time, 94.2M memory peak. May 13 23:49:51.911426 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 13 23:49:51.912988 systemd[1]: Started sshd@0-91.99.1.97:22-139.178.89.65:49726.service - OpenSSH per-connection server daemon (139.178.89.65:49726). May 13 23:49:52.952106 sshd[1796]: Accepted publickey for core from 139.178.89.65 port 49726 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:49:52.955211 sshd-session[1796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:49:52.971907 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 13 23:49:52.971950 systemd-logind[1485]: New session 1 of user core. May 13 23:49:52.974748 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 13 23:49:53.004664 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 13 23:49:53.007661 systemd[1]: Starting user@500.service - User Manager for UID 500... May 13 23:49:53.024034 (systemd)[1800]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 13 23:49:53.027374 systemd-logind[1485]: New session c1 of user core. May 13 23:49:53.173464 systemd[1800]: Queued start job for default target default.target. May 13 23:49:53.189672 systemd[1800]: Created slice app.slice - User Application Slice. May 13 23:49:53.189729 systemd[1800]: Reached target paths.target - Paths. May 13 23:49:53.189780 systemd[1800]: Reached target timers.target - Timers. 
May 13 23:49:53.191631 systemd[1800]: Starting dbus.socket - D-Bus User Message Bus Socket... May 13 23:49:53.207509 systemd[1800]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 13 23:49:53.208152 systemd[1800]: Reached target sockets.target - Sockets. May 13 23:49:53.208395 systemd[1800]: Reached target basic.target - Basic System. May 13 23:49:53.208704 systemd[1800]: Reached target default.target - Main User Target. May 13 23:49:53.208849 systemd[1800]: Startup finished in 173ms. May 13 23:49:53.209055 systemd[1]: Started user@500.service - User Manager for UID 500. May 13 23:49:53.221187 systemd[1]: Started session-1.scope - Session 1 of User core. May 13 23:49:53.924192 systemd[1]: Started sshd@1-91.99.1.97:22-139.178.89.65:49730.service - OpenSSH per-connection server daemon (139.178.89.65:49730). May 13 23:49:54.947314 sshd[1811]: Accepted publickey for core from 139.178.89.65 port 49730 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:49:54.950762 sshd-session[1811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:49:54.962070 systemd-logind[1485]: New session 2 of user core. May 13 23:49:54.968669 systemd[1]: Started session-2.scope - Session 2 of User core. May 13 23:49:55.629477 sshd[1813]: Connection closed by 139.178.89.65 port 49730 May 13 23:49:55.630707 sshd-session[1811]: pam_unix(sshd:session): session closed for user core May 13 23:49:55.635937 systemd[1]: sshd@1-91.99.1.97:22-139.178.89.65:49730.service: Deactivated successfully. May 13 23:49:55.638743 systemd[1]: session-2.scope: Deactivated successfully. May 13 23:49:55.639926 systemd-logind[1485]: Session 2 logged out. Waiting for processes to exit. May 13 23:49:55.641241 systemd-logind[1485]: Removed session 2. May 13 23:49:55.804637 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. May 13 23:49:55.806990 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:55.810729 systemd[1]: Started sshd@2-91.99.1.97:22-139.178.89.65:49736.service - OpenSSH per-connection server daemon (139.178.89.65:49736). May 13 23:49:55.976550 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:49:55.990126 (kubelet)[1829]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:56.043437 kubelet[1829]: E0513 23:49:56.042803 1829 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:56.045866 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:56.046176 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:56.046643 systemd[1]: kubelet.service: Consumed 180ms CPU time, 94.9M memory peak. May 13 23:49:56.806625 sshd[1820]: Accepted publickey for core from 139.178.89.65 port 49736 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:49:56.809626 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:49:56.815751 systemd-logind[1485]: New session 3 of user core. May 13 23:49:56.823880 systemd[1]: Started session-3.scope - Session 3 of User core. 
May 13 23:49:57.484337 sshd[1837]: Connection closed by 139.178.89.65 port 49736 May 13 23:49:57.485476 sshd-session[1820]: pam_unix(sshd:session): session closed for user core May 13 23:49:57.489863 systemd[1]: sshd@2-91.99.1.97:22-139.178.89.65:49736.service: Deactivated successfully. May 13 23:49:57.492509 systemd[1]: session-3.scope: Deactivated successfully. May 13 23:49:57.493322 systemd-logind[1485]: Session 3 logged out. Waiting for processes to exit. May 13 23:49:57.494759 systemd-logind[1485]: Removed session 3. May 13 23:49:57.661931 systemd[1]: Started sshd@3-91.99.1.97:22-139.178.89.65:52310.service - OpenSSH per-connection server daemon (139.178.89.65:52310). May 13 23:49:58.670732 sshd[1843]: Accepted publickey for core from 139.178.89.65 port 52310 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:49:58.672390 sshd-session[1843]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:49:58.679651 systemd-logind[1485]: New session 4 of user core. May 13 23:49:58.685963 systemd[1]: Started session-4.scope - Session 4 of User core. May 13 23:49:59.357667 sshd[1845]: Connection closed by 139.178.89.65 port 52310 May 13 23:49:59.358735 sshd-session[1843]: pam_unix(sshd:session): session closed for user core May 13 23:49:59.364244 systemd[1]: sshd@3-91.99.1.97:22-139.178.89.65:52310.service: Deactivated successfully. May 13 23:49:59.367949 systemd[1]: session-4.scope: Deactivated successfully. May 13 23:49:59.369156 systemd-logind[1485]: Session 4 logged out. Waiting for processes to exit. May 13 23:49:59.370478 systemd-logind[1485]: Removed session 4. May 13 23:49:59.531501 systemd[1]: Started sshd@4-91.99.1.97:22-139.178.89.65:52322.service - OpenSSH per-connection server daemon (139.178.89.65:52322). May 13 23:50:00.525343 sshd[1851]: Accepted publickey for core from 139.178.89.65 port 52322 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:00.527485 sshd-session[1851]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:00.536683 systemd-logind[1485]: New session 5 of user core. May 13 23:50:00.538593 systemd[1]: Started session-5.scope - Session 5 of User core. May 13 23:50:01.055021 sudo[1854]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 13 23:50:01.055342 sudo[1854]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:50:01.072902 sudo[1854]: pam_unix(sudo:session): session closed for user root May 13 23:50:01.233444 sshd[1853]: Connection closed by 139.178.89.65 port 52322 May 13 23:50:01.234585 sshd-session[1851]: pam_unix(sshd:session): session closed for user core May 13 23:50:01.239044 systemd[1]: sshd@4-91.99.1.97:22-139.178.89.65:52322.service: Deactivated successfully. May 13 23:50:01.241354 systemd[1]: session-5.scope: Deactivated successfully. May 13 23:50:01.243975 systemd-logind[1485]: Session 5 logged out. Waiting for processes to exit. May 13 23:50:01.245529 systemd-logind[1485]: Removed session 5. May 13 23:50:01.414455 systemd[1]: Started sshd@5-91.99.1.97:22-139.178.89.65:52338.service - OpenSSH per-connection server daemon (139.178.89.65:52338). 
May 13 23:50:02.440375 sshd[1860]: Accepted publickey for core from 139.178.89.65 port 52338 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:02.443038 sshd-session[1860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:02.452535 systemd-logind[1485]: New session 6 of user core. May 13 23:50:02.462844 systemd[1]: Started session-6.scope - Session 6 of User core. May 13 23:50:02.972984 sudo[1864]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 13 23:50:02.973358 sudo[1864]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:50:02.977600 sudo[1864]: pam_unix(sudo:session): session closed for user root May 13 23:50:02.983888 sudo[1863]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 13 23:50:02.984896 sudo[1863]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:50:02.997170 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 23:50:03.044229 augenrules[1886]: No rules May 13 23:50:03.046031 systemd[1]: audit-rules.service: Deactivated successfully. May 13 23:50:03.046252 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 23:50:03.048211 sudo[1863]: pam_unix(sudo:session): session closed for user root May 13 23:50:03.211970 sshd[1862]: Connection closed by 139.178.89.65 port 52338 May 13 23:50:03.211780 sshd-session[1860]: pam_unix(sshd:session): session closed for user core May 13 23:50:03.216546 systemd[1]: sshd@5-91.99.1.97:22-139.178.89.65:52338.service: Deactivated successfully. May 13 23:50:03.218386 systemd[1]: session-6.scope: Deactivated successfully. May 13 23:50:03.221168 systemd-logind[1485]: Session 6 logged out. Waiting for processes to exit. May 13 23:50:03.222385 systemd-logind[1485]: Removed session 6. May 13 23:50:03.380357 systemd[1]: Started sshd@6-91.99.1.97:22-139.178.89.65:52348.service - OpenSSH per-connection server daemon (139.178.89.65:52348). May 13 23:50:04.384039 sshd[1895]: Accepted publickey for core from 139.178.89.65 port 52348 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:04.385920 sshd-session[1895]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:04.396882 systemd-logind[1485]: New session 7 of user core. May 13 23:50:04.402862 systemd[1]: Started session-7.scope - Session 7 of User core. May 13 23:50:04.910101 sudo[1898]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 13 23:50:04.910396 sudo[1898]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:50:05.305033 systemd[1]: Starting docker.service - Docker Application Container Engine... May 13 23:50:05.319386 (dockerd)[1916]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 13 23:50:05.575597 dockerd[1916]: time="2025-05-13T23:50:05.573779907Z" level=info msg="Starting up" May 13 23:50:05.576756 dockerd[1916]: time="2025-05-13T23:50:05.576663461Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 13 23:50:05.641468 dockerd[1916]: time="2025-05-13T23:50:05.641377192Z" level=info msg="Loading containers: start." 
May 13 23:50:05.817469 kernel: Initializing XFRM netlink socket May 13 23:50:05.908775 systemd-networkd[1403]: docker0: Link UP May 13 23:50:05.979957 dockerd[1916]: time="2025-05-13T23:50:05.979883709Z" level=info msg="Loading containers: done." May 13 23:50:05.995388 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck1138356234-merged.mount: Deactivated successfully. May 13 23:50:05.998647 dockerd[1916]: time="2025-05-13T23:50:05.997968250Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 13 23:50:05.998647 dockerd[1916]: time="2025-05-13T23:50:05.998097773Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 13 23:50:05.998647 dockerd[1916]: time="2025-05-13T23:50:05.998346820Z" level=info msg="Daemon has completed initialization" May 13 23:50:06.042784 dockerd[1916]: time="2025-05-13T23:50:06.042690975Z" level=info msg="API listen on /run/docker.sock" May 13 23:50:06.043376 systemd[1]: Started docker.service - Docker Application Container Engine. May 13 23:50:06.163694 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. May 13 23:50:06.166644 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:06.310604 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:06.324074 (kubelet)[2121]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:50:06.393756 kubelet[2121]: E0513 23:50:06.393672 2121 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:50:06.396243 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:50:06.396378 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:50:06.396937 systemd[1]: kubelet.service: Consumed 172ms CPU time, 94.5M memory peak. May 13 23:50:07.229351 containerd[1515]: time="2025-05-13T23:50:07.228498539Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\"" May 13 23:50:07.937275 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2414360151.mount: Deactivated successfully. 
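Once dockerd logs "API listen on /run/docker.sock" above, the daemon can be reached over that Unix socket. Here is a short liveness-check sketch against it, assuming the standard Docker Engine /_ping endpoint; Python's http.client has no built-in Unix-socket support, so the connect() override below is the usual workaround.

#!/usr/bin/env python3
"""Sketch: ping the Docker API over the Unix socket mentioned in the log above."""
import http.client
import socket

DOCKER_SOCK = "/run/docker.sock"  # from "API listen on /run/docker.sock" above

class UnixHTTPConnection(http.client.HTTPConnection):
    """HTTPConnection that talks to a Unix socket instead of TCP."""

    def __init__(self, socket_path: str):
        super().__init__("localhost")
        self.socket_path = socket_path

    def connect(self):
        self.sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
        self.sock.connect(self.socket_path)

if __name__ == "__main__":
    conn = UnixHTTPConnection(DOCKER_SOCK)
    conn.request("GET", "/_ping")  # Docker Engine health endpoint, returns "OK" when the daemon is up
    resp = conn.getresponse()
    print(resp.status, resp.read().decode())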
May 13 23:50:10.109462 containerd[1515]: time="2025-05-13T23:50:10.109358809Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:10.111525 containerd[1515]: time="2025-05-13T23:50:10.111466459Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.30.12: active requests=0, bytes read=29794242" May 13 23:50:10.114456 containerd[1515]: time="2025-05-13T23:50:10.114346407Z" level=info msg="ImageCreate event name:\"sha256:afbe230ec4abc2c9e87f7fbe7814bde21dbe30f03252c8861c4ca9510cb43ec6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:10.118451 containerd[1515]: time="2025-05-13T23:50:10.118386824Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:10.120438 containerd[1515]: time="2025-05-13T23:50:10.119665694Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.30.12\" with image id \"sha256:afbe230ec4abc2c9e87f7fbe7814bde21dbe30f03252c8861c4ca9510cb43ec6\", repo tag \"registry.k8s.io/kube-apiserver:v1.30.12\", repo digest \"registry.k8s.io/kube-apiserver@sha256:4878682f7a044274d42399a6316ef452c5411aafd4ad99cc57de7235ca490e4e\", size \"29790950\" in 2.891116794s" May 13 23:50:10.120438 containerd[1515]: time="2025-05-13T23:50:10.119722095Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.12\" returns image reference \"sha256:afbe230ec4abc2c9e87f7fbe7814bde21dbe30f03252c8861c4ca9510cb43ec6\"" May 13 23:50:10.138758 containerd[1515]: time="2025-05-13T23:50:10.138365500Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\"" May 13 23:50:12.696982 containerd[1515]: time="2025-05-13T23:50:12.695659844Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:12.696982 containerd[1515]: time="2025-05-13T23:50:12.696905193Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.30.12: active requests=0, bytes read=26855570" May 13 23:50:12.698208 containerd[1515]: time="2025-05-13T23:50:12.698159583Z" level=info msg="ImageCreate event name:\"sha256:3df23260c56ff58d759f8a841c67846184e97ce81a269549ca8d14b36da14c14\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:12.703711 containerd[1515]: time="2025-05-13T23:50:12.703656831Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:12.704703 containerd[1515]: time="2025-05-13T23:50:12.704669334Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.30.12\" with image id \"sha256:3df23260c56ff58d759f8a841c67846184e97ce81a269549ca8d14b36da14c14\", repo tag \"registry.k8s.io/kube-controller-manager:v1.30.12\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:3a36711d0409d565b370a18d0c19339e93d4f1b1f2b3fd382eb31c714c463b74\", size \"28297111\" in 2.566241713s" May 13 23:50:12.704873 containerd[1515]: time="2025-05-13T23:50:12.704851418Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.12\" returns image reference \"sha256:3df23260c56ff58d759f8a841c67846184e97ce81a269549ca8d14b36da14c14\"" May 13 
23:50:12.726659 containerd[1515]: time="2025-05-13T23:50:12.726611205Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\"" May 13 23:50:14.321168 containerd[1515]: time="2025-05-13T23:50:14.321119217Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:14.322419 containerd[1515]: time="2025-05-13T23:50:14.322347125Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.30.12: active requests=0, bytes read=16263965" May 13 23:50:14.322732 containerd[1515]: time="2025-05-13T23:50:14.322705213Z" level=info msg="ImageCreate event name:\"sha256:fb0f5dac5fa74463b801d11598454c00462609b582d17052195012e5f682c2ba\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:14.326302 containerd[1515]: time="2025-05-13T23:50:14.326249494Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:14.328085 containerd[1515]: time="2025-05-13T23:50:14.328032374Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.30.12\" with image id \"sha256:fb0f5dac5fa74463b801d11598454c00462609b582d17052195012e5f682c2ba\", repo tag \"registry.k8s.io/kube-scheduler:v1.30.12\", repo digest \"registry.k8s.io/kube-scheduler@sha256:521c843d01025be7d4e246ddee8cde74556eb9813c606d6db9f0f03236f6d029\", size \"17705524\" in 1.601175004s" May 13 23:50:14.328085 containerd[1515]: time="2025-05-13T23:50:14.328077495Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.12\" returns image reference \"sha256:fb0f5dac5fa74463b801d11598454c00462609b582d17052195012e5f682c2ba\"" May 13 23:50:14.347282 containerd[1515]: time="2025-05-13T23:50:14.346910925Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\"" May 13 23:50:15.461213 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3223033426.mount: Deactivated successfully. 
May 13 23:50:16.064375 containerd[1515]: time="2025-05-13T23:50:16.063283469Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.30.12\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:16.064375 containerd[1515]: time="2025-05-13T23:50:16.064296251Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.30.12: active requests=0, bytes read=25775731" May 13 23:50:16.064981 containerd[1515]: time="2025-05-13T23:50:16.064820383Z" level=info msg="ImageCreate event name:\"sha256:b4250a9efcae16f8d20358e204a159844e2b7e854edad08aee8791774acbdaed\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:16.067299 containerd[1515]: time="2025-05-13T23:50:16.067258398Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:16.067794 containerd[1515]: time="2025-05-13T23:50:16.067751969Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.30.12\" with image id \"sha256:b4250a9efcae16f8d20358e204a159844e2b7e854edad08aee8791774acbdaed\", repo tag \"registry.k8s.io/kube-proxy:v1.30.12\", repo digest \"registry.k8s.io/kube-proxy@sha256:ea8c7d5392acf6b0c11ebba78301e1a6c2dc6abcd7544102ed578e49d1c82f15\", size \"25774724\" in 1.720785802s" May 13 23:50:16.067794 containerd[1515]: time="2025-05-13T23:50:16.067792169Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.12\" returns image reference \"sha256:b4250a9efcae16f8d20358e204a159844e2b7e854edad08aee8791774acbdaed\"" May 13 23:50:16.086174 containerd[1515]: time="2025-05-13T23:50:16.086124900Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" May 13 23:50:16.414760 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. May 13 23:50:16.416890 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:16.567187 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:16.578985 (kubelet)[2248]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:50:16.640104 kubelet[2248]: E0513 23:50:16.640052 2248 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:50:16.643015 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:50:16.643163 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:50:16.644609 systemd[1]: kubelet.service: Consumed 180ms CPU time, 94.4M memory peak. May 13 23:50:16.678908 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount387022380.mount: Deactivated successfully. 
May 13 23:50:17.458438 containerd[1515]: time="2025-05-13T23:50:17.457962689Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:17.459946 containerd[1515]: time="2025-05-13T23:50:17.459882612Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485461" May 13 23:50:17.461452 containerd[1515]: time="2025-05-13T23:50:17.461135440Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:17.464427 containerd[1515]: time="2025-05-13T23:50:17.463734817Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:17.465902 containerd[1515]: time="2025-05-13T23:50:17.465654580Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.379481119s" May 13 23:50:17.465902 containerd[1515]: time="2025-05-13T23:50:17.465717221Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" May 13 23:50:17.484034 containerd[1515]: time="2025-05-13T23:50:17.483957546Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" May 13 23:50:18.041496 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1798878295.mount: Deactivated successfully. 
May 13 23:50:18.051365 containerd[1515]: time="2025-05-13T23:50:18.051277733Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:18.053371 containerd[1515]: time="2025-05-13T23:50:18.053280137Z" level=info msg="stop pulling image registry.k8s.io/pause:3.9: active requests=0, bytes read=268841" May 13 23:50:18.054803 containerd[1515]: time="2025-05-13T23:50:18.054177357Z" level=info msg="ImageCreate event name:\"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:18.058265 containerd[1515]: time="2025-05-13T23:50:18.058197205Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:18.058821 containerd[1515]: time="2025-05-13T23:50:18.058775258Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.9\" with image id \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\", repo tag \"registry.k8s.io/pause:3.9\", repo digest \"registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097\", size \"268051\" in 574.445424ms" May 13 23:50:18.058821 containerd[1515]: time="2025-05-13T23:50:18.058814179Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\"" May 13 23:50:18.083380 containerd[1515]: time="2025-05-13T23:50:18.083333199Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" May 13 23:50:18.763812 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount647301528.mount: Deactivated successfully. May 13 23:50:22.332799 containerd[1515]: time="2025-05-13T23:50:22.332725732Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.12-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:22.334703 containerd[1515]: time="2025-05-13T23:50:22.334634688Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.12-0: active requests=0, bytes read=66191552" May 13 23:50:22.337360 containerd[1515]: time="2025-05-13T23:50:22.337234299Z" level=info msg="ImageCreate event name:\"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:22.343666 containerd[1515]: time="2025-05-13T23:50:22.342163396Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:22.343666 containerd[1515]: time="2025-05-13T23:50:22.343509816Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\", repo tag \"registry.k8s.io/etcd:3.5.12-0\", repo digest \"registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b\", size \"66189079\" in 4.260126653s" May 13 23:50:22.343666 containerd[1515]: time="2025-05-13T23:50:22.343554590Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\"" May 13 23:50:26.664258 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 13. 
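Each containerd pull above ends with a "Pulled image ... size ... in <duration>" record. The sketch below summarises those records from journal text; it assumes one entry per line (for example the output of journalctl -u containerd) and the escaped-quote msg="..." formatting visible above, and its built-in sample is copied from the etcd pull in this log.

#!/usr/bin/env python3
"""Sketch: summarise the containerd "Pulled image ... in <duration>" records above.

Assumes one journal entry per line (e.g. `journalctl -u containerd`) with the
escaped-quote msg="..." formatting visible in this log.
"""
import re
import sys

PULLED = re.compile(
    r'Pulled image \\"(?P<image>[^\\"]+)\\".*?'
    r'size \\"(?P<size>\d+)\\" in (?P<dur>[\d.]+)(?P<unit>ms|s)'
)

def summarise(lines):
    for line in lines:
        m = PULLED.search(line)
        if not m:
            continue
        secs = float(m["dur"]) * (0.001 if m["unit"] == "ms" else 1.0)
        mib = int(m["size"]) / (1024 * 1024)
        print(f'{m["image"]:<50} {mib:8.1f} MiB {secs:8.2f} s')

if __name__ == "__main__":
    # Sample copied from the etcd pull above; pipe real journal text on stdin instead.
    sample = (r'msg="Pulled image \"registry.k8s.io/etcd:3.5.12-0\" with image id '
              r'\"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\", '
              r'size \"66189079\" in 4.260126653s"')
    summarise([sample] if sys.stdin.isatty() else sys.stdin)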
May 13 23:50:26.667678 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:26.844368 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:26.854082 (kubelet)[2448]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:50:26.903413 kubelet[2448]: E0513 23:50:26.902013 2448 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:50:26.905199 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:50:26.905358 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:50:26.905708 systemd[1]: kubelet.service: Consumed 174ms CPU time, 94.5M memory peak. May 13 23:50:29.507661 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:29.507810 systemd[1]: kubelet.service: Consumed 174ms CPU time, 94.5M memory peak. May 13 23:50:29.510784 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:29.546304 systemd[1]: Reload requested from client PID 2462 ('systemctl') (unit session-7.scope)... May 13 23:50:29.546497 systemd[1]: Reloading... May 13 23:50:29.679435 zram_generator::config[2510]: No configuration found. May 13 23:50:29.792543 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:50:29.888093 systemd[1]: Reloading finished in 341 ms. May 13 23:50:29.946642 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 13 23:50:29.946736 systemd[1]: kubelet.service: Failed with result 'signal'. May 13 23:50:29.949560 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:29.949620 systemd[1]: kubelet.service: Consumed 108ms CPU time, 82.3M memory peak. May 13 23:50:29.955343 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:30.106323 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:30.117010 (kubelet)[2554]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 23:50:30.178144 kubelet[2554]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:50:30.178144 kubelet[2554]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 13 23:50:30.178144 kubelet[2554]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 13 23:50:30.178144 kubelet[2554]: I0513 23:50:30.177508 2554 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 23:50:30.899078 kubelet[2554]: I0513 23:50:30.899000 2554 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 13 23:50:30.899078 kubelet[2554]: I0513 23:50:30.899041 2554 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 23:50:30.899397 kubelet[2554]: I0513 23:50:30.899306 2554 server.go:927] "Client rotation is on, will bootstrap in background" May 13 23:50:30.919345 kubelet[2554]: E0513 23:50:30.919310 2554 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://91.99.1.97:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.920173 kubelet[2554]: I0513 23:50:30.919861 2554 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 23:50:30.932902 kubelet[2554]: I0513 23:50:30.932868 2554 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 13 23:50:30.933523 kubelet[2554]: I0513 23:50:30.933475 2554 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 23:50:30.933738 kubelet[2554]: I0513 23:50:30.933515 2554 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-cba8e36126","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 13 23:50:30.933832 kubelet[2554]: I0513 23:50:30.933792 2554 topology_manager.go:138] "Creating topology manager with none policy" May 13 23:50:30.933832 kubelet[2554]: I0513 23:50:30.933805 2554 container_manager_linux.go:301] "Creating device plugin manager" May 13 23:50:30.934046 kubelet[2554]: I0513 23:50:30.934029 2554 state_mem.go:36] "Initialized new in-memory state 
store" May 13 23:50:30.937433 kubelet[2554]: I0513 23:50:30.935334 2554 kubelet.go:400] "Attempting to sync node with API server" May 13 23:50:30.937433 kubelet[2554]: I0513 23:50:30.935366 2554 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 23:50:30.937433 kubelet[2554]: I0513 23:50:30.935677 2554 kubelet.go:312] "Adding apiserver pod source" May 13 23:50:30.937433 kubelet[2554]: I0513 23:50:30.935848 2554 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 23:50:30.937433 kubelet[2554]: W0513 23:50:30.936168 2554 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.99.1.97:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-cba8e36126&limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.937433 kubelet[2554]: E0513 23:50:30.936240 2554 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://91.99.1.97:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-cba8e36126&limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.937433 kubelet[2554]: W0513 23:50:30.936844 2554 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://91.99.1.97:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.937433 kubelet[2554]: E0513 23:50:30.936886 2554 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://91.99.1.97:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.937914 kubelet[2554]: I0513 23:50:30.937880 2554 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 13 23:50:30.938294 kubelet[2554]: I0513 23:50:30.938252 2554 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 23:50:30.938427 kubelet[2554]: W0513 23:50:30.938383 2554 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 13 23:50:30.939330 kubelet[2554]: I0513 23:50:30.939286 2554 server.go:1264] "Started kubelet" May 13 23:50:30.944372 kubelet[2554]: I0513 23:50:30.944332 2554 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 23:50:30.949684 kubelet[2554]: E0513 23:50:30.949484 2554 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.1.97:6443/api/v1/namespaces/default/events\": dial tcp 91.99.1.97:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-n-cba8e36126.183f3b29bfb7f707 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-cba8e36126,UID:ci-4284-0-0-n-cba8e36126,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-cba8e36126,},FirstTimestamp:2025-05-13 23:50:30.939244295 +0000 UTC m=+0.817363164,LastTimestamp:2025-05-13 23:50:30.939244295 +0000 UTC m=+0.817363164,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-cba8e36126,}" May 13 23:50:30.953567 kubelet[2554]: I0513 23:50:30.952463 2554 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 23:50:30.953567 kubelet[2554]: I0513 23:50:30.953540 2554 server.go:455] "Adding debug handlers to kubelet server" May 13 23:50:30.954544 kubelet[2554]: I0513 23:50:30.954478 2554 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 23:50:30.954744 kubelet[2554]: I0513 23:50:30.954717 2554 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 23:50:30.955388 kubelet[2554]: I0513 23:50:30.955354 2554 volume_manager.go:291] "Starting Kubelet Volume Manager" May 13 23:50:30.955491 kubelet[2554]: I0513 23:50:30.955471 2554 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 13 23:50:30.956571 kubelet[2554]: I0513 23:50:30.956543 2554 reconciler.go:26] "Reconciler: start to sync state" May 13 23:50:30.956973 kubelet[2554]: W0513 23:50:30.956921 2554 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.99.1.97:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.957058 kubelet[2554]: E0513 23:50:30.956978 2554 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://91.99.1.97:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.958895 kubelet[2554]: E0513 23:50:30.958847 2554 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.1.97:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-cba8e36126?timeout=10s\": dial tcp 91.99.1.97:6443: connect: connection refused" interval="200ms" May 13 23:50:30.959118 kubelet[2554]: E0513 23:50:30.959099 2554 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 23:50:30.959577 kubelet[2554]: I0513 23:50:30.959556 2554 factory.go:221] Registration of the systemd container factory successfully May 13 23:50:30.959735 kubelet[2554]: I0513 23:50:30.959717 2554 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 23:50:30.961559 kubelet[2554]: I0513 23:50:30.961537 2554 factory.go:221] Registration of the containerd container factory successfully May 13 23:50:30.972705 kubelet[2554]: I0513 23:50:30.972641 2554 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 23:50:30.973867 kubelet[2554]: I0513 23:50:30.973791 2554 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 13 23:50:30.973998 kubelet[2554]: I0513 23:50:30.973958 2554 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 23:50:30.973998 kubelet[2554]: I0513 23:50:30.973983 2554 kubelet.go:2337] "Starting kubelet main sync loop" May 13 23:50:30.974340 kubelet[2554]: E0513 23:50:30.974282 2554 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 23:50:30.991111 kubelet[2554]: W0513 23:50:30.990954 2554 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.99.1.97:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.991111 kubelet[2554]: E0513 23:50:30.991033 2554 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://91.99.1.97:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:30.999482 kubelet[2554]: I0513 23:50:30.999452 2554 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 23:50:30.999482 kubelet[2554]: I0513 23:50:30.999475 2554 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 23:50:30.999633 kubelet[2554]: I0513 23:50:30.999499 2554 state_mem.go:36] "Initialized new in-memory state store" May 13 23:50:31.001140 kubelet[2554]: I0513 23:50:31.001108 2554 policy_none.go:49] "None policy: Start" May 13 23:50:31.001974 kubelet[2554]: I0513 23:50:31.001929 2554 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 23:50:31.002059 kubelet[2554]: I0513 23:50:31.001988 2554 state_mem.go:35] "Initializing new in-memory state store" May 13 23:50:31.010924 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 13 23:50:31.024868 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 13 23:50:31.029702 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
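The repeated "dial tcp 91.99.1.97:6443: connect: connection refused" errors above are expected at this point: the kubelet is trying to reach its own kube-apiserver before the static control-plane pods (admitted just below) are running. A minimal reachability probe for that endpoint, with host and port taken from the errors above:

#!/usr/bin/env python3
"""Sketch: check whether the API server endpoint from the connection-refused errors above
accepts TCP connections yet."""
import socket
import sys

HOST, PORT = "91.99.1.97", 6443  # endpoint taken from the errors above

def probe(host: str, port: int, timeout: float = 3.0) -> str:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "open"
    except OSError as exc:  # ConnectionRefusedError, timeouts, no route, ...
        return f"not reachable ({exc})"

if __name__ == "__main__":
    status = probe(HOST, PORT)
    print(f"{HOST}:{PORT} {status}")
    sys.exit(0 if status == "open" else 1)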
May 13 23:50:31.045879 kubelet[2554]: I0513 23:50:31.045814 2554 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 23:50:31.046298 kubelet[2554]: I0513 23:50:31.046106 2554 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 23:50:31.046298 kubelet[2554]: I0513 23:50:31.046239 2554 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 23:50:31.051942 kubelet[2554]: E0513 23:50:31.051883 2554 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-n-cba8e36126\" not found" May 13 23:50:31.057778 kubelet[2554]: I0513 23:50:31.057682 2554 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:31.058322 kubelet[2554]: E0513 23:50:31.058238 2554 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://91.99.1.97:6443/api/v1/nodes\": dial tcp 91.99.1.97:6443: connect: connection refused" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:31.074544 kubelet[2554]: I0513 23:50:31.074488 2554 topology_manager.go:215] "Topology Admit Handler" podUID="07d24e43dd0b2b37042db4176361c11e" podNamespace="kube-system" podName="kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.078155 kubelet[2554]: I0513 23:50:31.077372 2554 topology_manager.go:215] "Topology Admit Handler" podUID="ed7a87ea6244797ff2fc56a962da64cf" podNamespace="kube-system" podName="kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.081345 kubelet[2554]: I0513 23:50:31.081289 2554 topology_manager.go:215] "Topology Admit Handler" podUID="acec66ca0d7270f739e3685c8d1d745f" podNamespace="kube-system" podName="kube-scheduler-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.092194 systemd[1]: Created slice kubepods-burstable-pod07d24e43dd0b2b37042db4176361c11e.slice - libcontainer container kubepods-burstable-pod07d24e43dd0b2b37042db4176361c11e.slice. May 13 23:50:31.118234 systemd[1]: Created slice kubepods-burstable-poded7a87ea6244797ff2fc56a962da64cf.slice - libcontainer container kubepods-burstable-poded7a87ea6244797ff2fc56a962da64cf.slice. May 13 23:50:31.132348 systemd[1]: Created slice kubepods-burstable-podacec66ca0d7270f739e3685c8d1d745f.slice - libcontainer container kubepods-burstable-podacec66ca0d7270f739e3685c8d1d745f.slice. 
May 13 23:50:31.158579 kubelet[2554]: I0513 23:50:31.158011 2554 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/07d24e43dd0b2b37042db4176361c11e-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-cba8e36126\" (UID: \"07d24e43dd0b2b37042db4176361c11e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.158579 kubelet[2554]: I0513 23:50:31.158069 2554 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/07d24e43dd0b2b37042db4176361c11e-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-cba8e36126\" (UID: \"07d24e43dd0b2b37042db4176361c11e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.158579 kubelet[2554]: I0513 23:50:31.158101 2554 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/07d24e43dd0b2b37042db4176361c11e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-cba8e36126\" (UID: \"07d24e43dd0b2b37042db4176361c11e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.158579 kubelet[2554]: I0513 23:50:31.158136 2554 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.158579 kubelet[2554]: I0513 23:50:31.158170 2554 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.158884 kubelet[2554]: I0513 23:50:31.158211 2554 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.158884 kubelet[2554]: I0513 23:50:31.158243 2554 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/acec66ca0d7270f739e3685c8d1d745f-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-cba8e36126\" (UID: \"acec66ca0d7270f739e3685c8d1d745f\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.158884 kubelet[2554]: I0513 23:50:31.158272 2554 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.158884 kubelet[2554]: I0513 23:50:31.158304 2554 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:31.159725 kubelet[2554]: E0513 23:50:31.159620 2554 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.1.97:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-cba8e36126?timeout=10s\": dial tcp 91.99.1.97:6443: connect: connection refused" interval="400ms" May 13 23:50:31.261257 kubelet[2554]: I0513 23:50:31.261216 2554 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:31.261924 kubelet[2554]: E0513 23:50:31.261776 2554 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://91.99.1.97:6443/api/v1/nodes\": dial tcp 91.99.1.97:6443: connect: connection refused" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:31.413824 containerd[1515]: time="2025-05-13T23:50:31.413122982Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-cba8e36126,Uid:07d24e43dd0b2b37042db4176361c11e,Namespace:kube-system,Attempt:0,}" May 13 23:50:31.431358 containerd[1515]: time="2025-05-13T23:50:31.430169650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-cba8e36126,Uid:ed7a87ea6244797ff2fc56a962da64cf,Namespace:kube-system,Attempt:0,}" May 13 23:50:31.437075 containerd[1515]: time="2025-05-13T23:50:31.437031632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-cba8e36126,Uid:acec66ca0d7270f739e3685c8d1d745f,Namespace:kube-system,Attempt:0,}" May 13 23:50:31.561038 kubelet[2554]: E0513 23:50:31.560957 2554 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.1.97:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-cba8e36126?timeout=10s\": dial tcp 91.99.1.97:6443: connect: connection refused" interval="800ms" May 13 23:50:31.667517 kubelet[2554]: I0513 23:50:31.667259 2554 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:31.667732 kubelet[2554]: E0513 23:50:31.667689 2554 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://91.99.1.97:6443/api/v1/nodes\": dial tcp 91.99.1.97:6443: connect: connection refused" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:31.971296 kubelet[2554]: W0513 23:50:31.971117 2554 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.99.1.97:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:31.971296 kubelet[2554]: E0513 23:50:31.971177 2554 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://91.99.1.97:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:31.990350 kubelet[2554]: W0513 23:50:31.990062 2554 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.99.1.97:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-cba8e36126&limit=500&resourceVersion=0": 
dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:31.990350 kubelet[2554]: E0513 23:50:31.990341 2554 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://91.99.1.97:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-cba8e36126&limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:32.087368 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3864366271.mount: Deactivated successfully. May 13 23:50:32.099944 containerd[1515]: time="2025-05-13T23:50:32.099877176Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:50:32.102037 containerd[1515]: time="2025-05-13T23:50:32.101978165Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:50:32.104213 containerd[1515]: time="2025-05-13T23:50:32.104147170Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" May 13 23:50:32.106873 containerd[1515]: time="2025-05-13T23:50:32.106815415Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 13 23:50:32.113391 containerd[1515]: time="2025-05-13T23:50:32.112706881Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:50:32.113929 containerd[1515]: time="2025-05-13T23:50:32.113873764Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" May 13 23:50:32.114790 containerd[1515]: time="2025-05-13T23:50:32.114757137Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:50:32.118422 containerd[1515]: time="2025-05-13T23:50:32.118359129Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:50:32.119094 containerd[1515]: time="2025-05-13T23:50:32.119054698Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 699.488557ms" May 13 23:50:32.124273 containerd[1515]: time="2025-05-13T23:50:32.124230630Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 684.62628ms" May 13 23:50:32.127632 containerd[1515]: time="2025-05-13T23:50:32.127568118Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id 
\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 691.472798ms" May 13 23:50:32.167643 containerd[1515]: time="2025-05-13T23:50:32.167515105Z" level=info msg="connecting to shim 76e1955205d61236f06a2b963a6b94c294944f8b9d2a7aff7eaf31d8418cd9bd" address="unix:///run/containerd/s/9b668c9c0b6d988cb3da52ee44153cbeba15a7fdf95e348619f87e86db5b4ffd" namespace=k8s.io protocol=ttrpc version=3 May 13 23:50:32.177219 containerd[1515]: time="2025-05-13T23:50:32.177079700Z" level=info msg="connecting to shim 9a49ff9e4ea3574acdd0a5ecb6c3ff9d1d2b6d669facc36a8d386faa1e9923c8" address="unix:///run/containerd/s/07ffe93fd1843398689df55cb1d49f204c644b9bf66fd3f4ff215c20e040e46e" namespace=k8s.io protocol=ttrpc version=3 May 13 23:50:32.181790 containerd[1515]: time="2025-05-13T23:50:32.181220382Z" level=info msg="connecting to shim f2cfe3c36a29abc928a8ed7b9676947183d99701ca59355816ec2f2b236d6a0c" address="unix:///run/containerd/s/86e565f142014ac4648a1ac4a0cda0a5278e310aa2c44a94f17d05dd38d54277" namespace=k8s.io protocol=ttrpc version=3 May 13 23:50:32.207811 systemd[1]: Started cri-containerd-76e1955205d61236f06a2b963a6b94c294944f8b9d2a7aff7eaf31d8418cd9bd.scope - libcontainer container 76e1955205d61236f06a2b963a6b94c294944f8b9d2a7aff7eaf31d8418cd9bd. May 13 23:50:32.222596 systemd[1]: Started cri-containerd-9a49ff9e4ea3574acdd0a5ecb6c3ff9d1d2b6d669facc36a8d386faa1e9923c8.scope - libcontainer container 9a49ff9e4ea3574acdd0a5ecb6c3ff9d1d2b6d669facc36a8d386faa1e9923c8. May 13 23:50:32.226756 systemd[1]: Started cri-containerd-f2cfe3c36a29abc928a8ed7b9676947183d99701ca59355816ec2f2b236d6a0c.scope - libcontainer container f2cfe3c36a29abc928a8ed7b9676947183d99701ca59355816ec2f2b236d6a0c. 
May 13 23:50:32.293257 containerd[1515]: time="2025-05-13T23:50:32.293011877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-cba8e36126,Uid:acec66ca0d7270f739e3685c8d1d745f,Namespace:kube-system,Attempt:0,} returns sandbox id \"76e1955205d61236f06a2b963a6b94c294944f8b9d2a7aff7eaf31d8418cd9bd\"" May 13 23:50:32.295034 containerd[1515]: time="2025-05-13T23:50:32.294927180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-cba8e36126,Uid:07d24e43dd0b2b37042db4176361c11e,Namespace:kube-system,Attempt:0,} returns sandbox id \"f2cfe3c36a29abc928a8ed7b9676947183d99701ca59355816ec2f2b236d6a0c\"" May 13 23:50:32.301029 containerd[1515]: time="2025-05-13T23:50:32.300834810Z" level=info msg="CreateContainer within sandbox \"76e1955205d61236f06a2b963a6b94c294944f8b9d2a7aff7eaf31d8418cd9bd\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 13 23:50:32.301655 containerd[1515]: time="2025-05-13T23:50:32.301512734Z" level=info msg="CreateContainer within sandbox \"f2cfe3c36a29abc928a8ed7b9676947183d99701ca59355816ec2f2b236d6a0c\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 13 23:50:32.312281 containerd[1515]: time="2025-05-13T23:50:32.312231808Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-cba8e36126,Uid:ed7a87ea6244797ff2fc56a962da64cf,Namespace:kube-system,Attempt:0,} returns sandbox id \"9a49ff9e4ea3574acdd0a5ecb6c3ff9d1d2b6d669facc36a8d386faa1e9923c8\"" May 13 23:50:32.317009 containerd[1515]: time="2025-05-13T23:50:32.316812117Z" level=info msg="CreateContainer within sandbox \"9a49ff9e4ea3574acdd0a5ecb6c3ff9d1d2b6d669facc36a8d386faa1e9923c8\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 13 23:50:32.319606 containerd[1515]: time="2025-05-13T23:50:32.319554981Z" level=info msg="Container a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642: CDI devices from CRI Config.CDIDevices: []" May 13 23:50:32.324331 containerd[1515]: time="2025-05-13T23:50:32.323600280Z" level=info msg="Container 552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76: CDI devices from CRI Config.CDIDevices: []" May 13 23:50:32.337840 containerd[1515]: time="2025-05-13T23:50:32.337793114Z" level=info msg="CreateContainer within sandbox \"76e1955205d61236f06a2b963a6b94c294944f8b9d2a7aff7eaf31d8418cd9bd\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76\"" May 13 23:50:32.338911 containerd[1515]: time="2025-05-13T23:50:32.338875897Z" level=info msg="StartContainer for \"552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76\"" May 13 23:50:32.340259 containerd[1515]: time="2025-05-13T23:50:32.340224463Z" level=info msg="Container 83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5: CDI devices from CRI Config.CDIDevices: []" May 13 23:50:32.341186 containerd[1515]: time="2025-05-13T23:50:32.341137364Z" level=info msg="connecting to shim 552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76" address="unix:///run/containerd/s/9b668c9c0b6d988cb3da52ee44153cbeba15a7fdf95e348619f87e86db5b4ffd" protocol=ttrpc version=3 May 13 23:50:32.346249 containerd[1515]: time="2025-05-13T23:50:32.346184985Z" level=info msg="CreateContainer within sandbox \"f2cfe3c36a29abc928a8ed7b9676947183d99701ca59355816ec2f2b236d6a0c\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns 
container id \"a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642\"" May 13 23:50:32.347337 containerd[1515]: time="2025-05-13T23:50:32.347016107Z" level=info msg="StartContainer for \"a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642\"" May 13 23:50:32.348259 containerd[1515]: time="2025-05-13T23:50:32.348225039Z" level=info msg="CreateContainer within sandbox \"9a49ff9e4ea3574acdd0a5ecb6c3ff9d1d2b6d669facc36a8d386faa1e9923c8\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5\"" May 13 23:50:32.350925 containerd[1515]: time="2025-05-13T23:50:32.350143183Z" level=info msg="connecting to shim a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642" address="unix:///run/containerd/s/86e565f142014ac4648a1ac4a0cda0a5278e310aa2c44a94f17d05dd38d54277" protocol=ttrpc version=3 May 13 23:50:32.351757 containerd[1515]: time="2025-05-13T23:50:32.351715684Z" level=info msg="StartContainer for \"83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5\"" May 13 23:50:32.361613 kubelet[2554]: E0513 23:50:32.361532 2554 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.1.97:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-cba8e36126?timeout=10s\": dial tcp 91.99.1.97:6443: connect: connection refused" interval="1.6s" May 13 23:50:32.365659 systemd[1]: Started cri-containerd-552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76.scope - libcontainer container 552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76. May 13 23:50:32.372656 containerd[1515]: time="2025-05-13T23:50:32.372255215Z" level=info msg="connecting to shim 83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5" address="unix:///run/containerd/s/07ffe93fd1843398689df55cb1d49f204c644b9bf66fd3f4ff215c20e040e46e" protocol=ttrpc version=3 May 13 23:50:32.388588 systemd[1]: Started cri-containerd-a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642.scope - libcontainer container a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642. May 13 23:50:32.402860 kubelet[2554]: W0513 23:50:32.401678 2554 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://91.99.1.97:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:32.402860 kubelet[2554]: E0513 23:50:32.402027 2554 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://91.99.1.97:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:32.406625 systemd[1]: Started cri-containerd-83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5.scope - libcontainer container 83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5. 
May 13 23:50:32.461433 containerd[1515]: time="2025-05-13T23:50:32.461365780Z" level=info msg="StartContainer for \"552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76\" returns successfully" May 13 23:50:32.475332 kubelet[2554]: W0513 23:50:32.474542 2554 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.99.1.97:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:32.475332 kubelet[2554]: E0513 23:50:32.474678 2554 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://91.99.1.97:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.1.97:6443: connect: connection refused May 13 23:50:32.480683 containerd[1515]: time="2025-05-13T23:50:32.480542021Z" level=info msg="StartContainer for \"83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5\" returns successfully" May 13 23:50:32.486851 kubelet[2554]: I0513 23:50:32.486812 2554 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:32.488494 kubelet[2554]: E0513 23:50:32.487382 2554 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://91.99.1.97:6443/api/v1/nodes\": dial tcp 91.99.1.97:6443: connect: connection refused" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:32.520821 containerd[1515]: time="2025-05-13T23:50:32.520780839Z" level=info msg="StartContainer for \"a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642\" returns successfully" May 13 23:50:34.092072 kubelet[2554]: I0513 23:50:34.091045 2554 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:35.346808 kubelet[2554]: E0513 23:50:35.346743 2554 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-n-cba8e36126\" not found" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:35.483909 kubelet[2554]: I0513 23:50:35.483851 2554 kubelet_node_status.go:76] "Successfully registered node" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:35.938928 kubelet[2554]: I0513 23:50:35.938875 2554 apiserver.go:52] "Watching apiserver" May 13 23:50:35.956661 kubelet[2554]: I0513 23:50:35.956609 2554 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 13 23:50:38.068193 systemd[1]: Reload requested from client PID 2827 ('systemctl') (unit session-7.scope)... May 13 23:50:38.068587 systemd[1]: Reloading... May 13 23:50:38.211498 zram_generator::config[2872]: No configuration found. May 13 23:50:38.341981 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:50:38.446648 systemd[1]: Reloading finished in 377 ms. May 13 23:50:38.476565 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:38.477506 kubelet[2554]: I0513 23:50:38.477185 2554 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 23:50:38.494173 systemd[1]: kubelet.service: Deactivated successfully. May 13 23:50:38.494593 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
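With the apiserver container started, the earlier "Failed to ensure lease exists, will retry" and the node-registration retries resolve: the node registers at 23:50:35 and the kubelet then maintains a Lease named after the node in the kube-node-lease namespace, the same object the retried URL pointed at. A short sketch of reading that lease with client-go, assuming an illustrative admin kubeconfig path:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumption: an admin kubeconfig on the node; the path is illustrative.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}
	// Lease name matches the node name, in kube-node-lease, exactly as in the
	// URL the kubelet was retrying above.
	lease, err := cs.CoordinationV1().Leases("kube-node-lease").
		Get(context.TODO(), "ci-4284-0-0-n-cba8e36126", metav1.GetOptions{})
	if err != nil {
		fmt.Println("lease not present yet:", err)
		return
	}
	fmt.Println("lease", lease.Name, "last renewed", lease.Spec.RenewTime)
}
```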
May 13 23:50:38.494687 systemd[1]: kubelet.service: Consumed 1.306s CPU time, 113.3M memory peak. May 13 23:50:38.497672 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:38.655022 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:38.667331 (kubelet)[2916]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 23:50:38.735427 kubelet[2916]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:50:38.735427 kubelet[2916]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 13 23:50:38.735427 kubelet[2916]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:50:38.735427 kubelet[2916]: I0513 23:50:38.733959 2916 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 23:50:38.744500 kubelet[2916]: I0513 23:50:38.744463 2916 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" May 13 23:50:38.744500 kubelet[2916]: I0513 23:50:38.744492 2916 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 23:50:38.744721 kubelet[2916]: I0513 23:50:38.744706 2916 server.go:927] "Client rotation is on, will bootstrap in background" May 13 23:50:38.747185 kubelet[2916]: I0513 23:50:38.747149 2916 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 13 23:50:38.749014 kubelet[2916]: I0513 23:50:38.748807 2916 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 23:50:38.756039 kubelet[2916]: I0513 23:50:38.756009 2916 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 13 23:50:38.756267 kubelet[2916]: I0513 23:50:38.756213 2916 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 23:50:38.756609 kubelet[2916]: I0513 23:50:38.756257 2916 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-cba8e36126","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} May 13 23:50:38.756717 kubelet[2916]: I0513 23:50:38.756621 2916 topology_manager.go:138] "Creating topology manager with none policy" May 13 23:50:38.756717 kubelet[2916]: I0513 23:50:38.756634 2916 container_manager_linux.go:301] "Creating device plugin manager" May 13 23:50:38.756717 kubelet[2916]: I0513 23:50:38.756674 2916 state_mem.go:36] "Initialized new in-memory state store" May 13 23:50:38.756819 kubelet[2916]: I0513 23:50:38.756802 2916 kubelet.go:400] "Attempting to sync node with API server" May 13 23:50:38.756819 kubelet[2916]: I0513 23:50:38.756817 2916 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 23:50:38.756877 kubelet[2916]: I0513 23:50:38.756849 2916 kubelet.go:312] "Adding apiserver pod source" May 13 23:50:38.756877 kubelet[2916]: I0513 23:50:38.756867 2916 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 23:50:38.759799 kubelet[2916]: I0513 23:50:38.759770 2916 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 13 23:50:38.760218 kubelet[2916]: I0513 23:50:38.759987 2916 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 23:50:38.760538 kubelet[2916]: I0513 23:50:38.760514 2916 server.go:1264] "Started kubelet" May 13 23:50:38.762921 kubelet[2916]: I0513 23:50:38.762558 2916 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 23:50:38.772215 kubelet[2916]: I0513 23:50:38.772162 2916 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 23:50:38.777453 kubelet[2916]: I0513 23:50:38.777425 2916 server.go:455] "Adding debug 
handlers to kubelet server" May 13 23:50:38.781987 kubelet[2916]: I0513 23:50:38.781913 2916 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 23:50:38.782721 kubelet[2916]: I0513 23:50:38.782697 2916 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 23:50:38.787951 kubelet[2916]: I0513 23:50:38.787922 2916 volume_manager.go:291] "Starting Kubelet Volume Manager" May 13 23:50:38.795764 kubelet[2916]: I0513 23:50:38.794948 2916 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 13 23:50:38.795764 kubelet[2916]: I0513 23:50:38.795153 2916 reconciler.go:26] "Reconciler: start to sync state" May 13 23:50:38.808846 kubelet[2916]: E0513 23:50:38.808814 2916 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 23:50:38.809733 kubelet[2916]: I0513 23:50:38.809689 2916 factory.go:221] Registration of the systemd container factory successfully May 13 23:50:38.810008 kubelet[2916]: I0513 23:50:38.809945 2916 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 23:50:38.814783 kubelet[2916]: I0513 23:50:38.814644 2916 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 23:50:38.816520 kubelet[2916]: I0513 23:50:38.816106 2916 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 13 23:50:38.816520 kubelet[2916]: I0513 23:50:38.816148 2916 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 23:50:38.816520 kubelet[2916]: I0513 23:50:38.816165 2916 kubelet.go:2337] "Starting kubelet main sync loop" May 13 23:50:38.816520 kubelet[2916]: E0513 23:50:38.816227 2916 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 23:50:38.822200 kubelet[2916]: I0513 23:50:38.821864 2916 factory.go:221] Registration of the containerd container factory successfully May 13 23:50:38.877197 kubelet[2916]: I0513 23:50:38.877170 2916 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 23:50:38.878021 kubelet[2916]: I0513 23:50:38.877428 2916 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 23:50:38.878021 kubelet[2916]: I0513 23:50:38.877455 2916 state_mem.go:36] "Initialized new in-memory state store" May 13 23:50:38.878021 kubelet[2916]: I0513 23:50:38.877641 2916 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 13 23:50:38.878021 kubelet[2916]: I0513 23:50:38.877656 2916 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 13 23:50:38.878021 kubelet[2916]: I0513 23:50:38.877673 2916 policy_none.go:49] "None policy: Start" May 13 23:50:38.878865 kubelet[2916]: I0513 23:50:38.878843 2916 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 23:50:38.879045 kubelet[2916]: I0513 23:50:38.879034 2916 state_mem.go:35] "Initializing new in-memory state store" May 13 23:50:38.879493 kubelet[2916]: I0513 23:50:38.879389 2916 state_mem.go:75] "Updated machine memory state" May 13 23:50:38.885966 kubelet[2916]: I0513 23:50:38.885946 2916 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 
23:50:38.886273 kubelet[2916]: I0513 23:50:38.886239 2916 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 23:50:38.887832 kubelet[2916]: I0513 23:50:38.887809 2916 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 23:50:38.902093 kubelet[2916]: I0513 23:50:38.901937 2916 kubelet_node_status.go:73] "Attempting to register node" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:38.916787 kubelet[2916]: I0513 23:50:38.916641 2916 topology_manager.go:215] "Topology Admit Handler" podUID="07d24e43dd0b2b37042db4176361c11e" podNamespace="kube-system" podName="kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.916915 kubelet[2916]: I0513 23:50:38.916875 2916 topology_manager.go:215] "Topology Admit Handler" podUID="ed7a87ea6244797ff2fc56a962da64cf" podNamespace="kube-system" podName="kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.916956 kubelet[2916]: I0513 23:50:38.916927 2916 topology_manager.go:215] "Topology Admit Handler" podUID="acec66ca0d7270f739e3685c8d1d745f" podNamespace="kube-system" podName="kube-scheduler-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.936660 kubelet[2916]: I0513 23:50:38.936386 2916 kubelet_node_status.go:112] "Node was previously registered" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:38.936660 kubelet[2916]: I0513 23:50:38.936505 2916 kubelet_node_status.go:76] "Successfully registered node" node="ci-4284-0-0-n-cba8e36126" May 13 23:50:38.996970 kubelet[2916]: I0513 23:50:38.996921 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.997689 kubelet[2916]: I0513 23:50:38.997185 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/07d24e43dd0b2b37042db4176361c11e-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-cba8e36126\" (UID: \"07d24e43dd0b2b37042db4176361c11e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.997689 kubelet[2916]: I0513 23:50:38.997239 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/07d24e43dd0b2b37042db4176361c11e-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-cba8e36126\" (UID: \"07d24e43dd0b2b37042db4176361c11e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.997689 kubelet[2916]: I0513 23:50:38.997275 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.997689 kubelet[2916]: I0513 23:50:38.997311 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: 
\"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.997689 kubelet[2916]: I0513 23:50:38.997343 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/07d24e43dd0b2b37042db4176361c11e-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-cba8e36126\" (UID: \"07d24e43dd0b2b37042db4176361c11e\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.998039 kubelet[2916]: I0513 23:50:38.997381 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.998039 kubelet[2916]: I0513 23:50:38.997459 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/ed7a87ea6244797ff2fc56a962da64cf-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-cba8e36126\" (UID: \"ed7a87ea6244797ff2fc56a962da64cf\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" May 13 23:50:38.998039 kubelet[2916]: I0513 23:50:38.997645 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/acec66ca0d7270f739e3685c8d1d745f-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-cba8e36126\" (UID: \"acec66ca0d7270f739e3685c8d1d745f\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-cba8e36126" May 13 23:50:39.757698 kubelet[2916]: I0513 23:50:39.757613 2916 apiserver.go:52] "Watching apiserver" May 13 23:50:39.796302 kubelet[2916]: I0513 23:50:39.796203 2916 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 13 23:50:39.884649 kubelet[2916]: E0513 23:50:39.884086 2916 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-4284-0-0-n-cba8e36126\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-cba8e36126" May 13 23:50:39.892332 kubelet[2916]: I0513 23:50:39.892234 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-n-cba8e36126" podStartSLOduration=1.8922142 podStartE2EDuration="1.8922142s" podCreationTimestamp="2025-05-13 23:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:50:39.891521418 +0000 UTC m=+1.218838520" watchObservedRunningTime="2025-05-13 23:50:39.8922142 +0000 UTC m=+1.219531262" May 13 23:50:39.939805 kubelet[2916]: I0513 23:50:39.939556 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-n-cba8e36126" podStartSLOduration=1.939535086 podStartE2EDuration="1.939535086s" podCreationTimestamp="2025-05-13 23:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:50:39.90970586 +0000 UTC m=+1.237022922" watchObservedRunningTime="2025-05-13 23:50:39.939535086 +0000 UTC m=+1.266852148" May 13 23:50:39.939805 kubelet[2916]: I0513 23:50:39.939687 2916 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-cba8e36126" podStartSLOduration=1.939680596 podStartE2EDuration="1.939680596s" podCreationTimestamp="2025-05-13 23:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:50:39.939644829 +0000 UTC m=+1.266961891" watchObservedRunningTime="2025-05-13 23:50:39.939680596 +0000 UTC m=+1.266997658" May 13 23:50:44.423167 sudo[1898]: pam_unix(sudo:session): session closed for user root May 13 23:50:44.584625 sshd[1897]: Connection closed by 139.178.89.65 port 52348 May 13 23:50:44.585435 sshd-session[1895]: pam_unix(sshd:session): session closed for user core May 13 23:50:44.590167 systemd[1]: sshd@6-91.99.1.97:22-139.178.89.65:52348.service: Deactivated successfully. May 13 23:50:44.593783 systemd[1]: session-7.scope: Deactivated successfully. May 13 23:50:44.594285 systemd[1]: session-7.scope: Consumed 9.102s CPU time, 248M memory peak. May 13 23:50:44.596477 systemd-logind[1485]: Session 7 logged out. Waiting for processes to exit. May 13 23:50:44.597702 systemd-logind[1485]: Removed session 7. May 13 23:50:52.035420 kubelet[2916]: I0513 23:50:52.035378 2916 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 13 23:50:52.037572 kubelet[2916]: I0513 23:50:52.036913 2916 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 13 23:50:52.037611 containerd[1515]: time="2025-05-13T23:50:52.036172808Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 13 23:50:52.974841 kubelet[2916]: I0513 23:50:52.974790 2916 topology_manager.go:215] "Topology Admit Handler" podUID="ed061130-c2f2-486c-8944-ad9f3e009e8b" podNamespace="kube-system" podName="kube-proxy-7jrsx" May 13 23:50:52.986144 systemd[1]: Created slice kubepods-besteffort-poded061130_c2f2_486c_8944_ad9f3e009e8b.slice - libcontainer container kubepods-besteffort-poded061130_c2f2_486c_8944_ad9f3e009e8b.slice. 
May 13 23:50:52.995374 kubelet[2916]: I0513 23:50:52.995274 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/ed061130-c2f2-486c-8944-ad9f3e009e8b-kube-proxy\") pod \"kube-proxy-7jrsx\" (UID: \"ed061130-c2f2-486c-8944-ad9f3e009e8b\") " pod="kube-system/kube-proxy-7jrsx" May 13 23:50:52.995374 kubelet[2916]: I0513 23:50:52.995337 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ed061130-c2f2-486c-8944-ad9f3e009e8b-xtables-lock\") pod \"kube-proxy-7jrsx\" (UID: \"ed061130-c2f2-486c-8944-ad9f3e009e8b\") " pod="kube-system/kube-proxy-7jrsx" May 13 23:50:52.995374 kubelet[2916]: I0513 23:50:52.995360 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed061130-c2f2-486c-8944-ad9f3e009e8b-lib-modules\") pod \"kube-proxy-7jrsx\" (UID: \"ed061130-c2f2-486c-8944-ad9f3e009e8b\") " pod="kube-system/kube-proxy-7jrsx" May 13 23:50:52.995374 kubelet[2916]: I0513 23:50:52.995381 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82z9x\" (UniqueName: \"kubernetes.io/projected/ed061130-c2f2-486c-8944-ad9f3e009e8b-kube-api-access-82z9x\") pod \"kube-proxy-7jrsx\" (UID: \"ed061130-c2f2-486c-8944-ad9f3e009e8b\") " pod="kube-system/kube-proxy-7jrsx" May 13 23:50:53.161629 kubelet[2916]: I0513 23:50:53.161549 2916 topology_manager.go:215] "Topology Admit Handler" podUID="a63746f9-45de-4479-afa8-0dfdaad58e4d" podNamespace="tigera-operator" podName="tigera-operator-797db67f8-5gvc9" May 13 23:50:53.174549 systemd[1]: Created slice kubepods-besteffort-poda63746f9_45de_4479_afa8_0dfdaad58e4d.slice - libcontainer container kubepods-besteffort-poda63746f9_45de_4479_afa8_0dfdaad58e4d.slice. 
May 13 23:50:53.196526 kubelet[2916]: I0513 23:50:53.196473 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a63746f9-45de-4479-afa8-0dfdaad58e4d-var-lib-calico\") pod \"tigera-operator-797db67f8-5gvc9\" (UID: \"a63746f9-45de-4479-afa8-0dfdaad58e4d\") " pod="tigera-operator/tigera-operator-797db67f8-5gvc9" May 13 23:50:53.196816 kubelet[2916]: I0513 23:50:53.196630 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plskg\" (UniqueName: \"kubernetes.io/projected/a63746f9-45de-4479-afa8-0dfdaad58e4d-kube-api-access-plskg\") pod \"tigera-operator-797db67f8-5gvc9\" (UID: \"a63746f9-45de-4479-afa8-0dfdaad58e4d\") " pod="tigera-operator/tigera-operator-797db67f8-5gvc9" May 13 23:50:53.298973 containerd[1515]: time="2025-05-13T23:50:53.298795002Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7jrsx,Uid:ed061130-c2f2-486c-8944-ad9f3e009e8b,Namespace:kube-system,Attempt:0,}" May 13 23:50:53.334631 containerd[1515]: time="2025-05-13T23:50:53.333931784Z" level=info msg="connecting to shim 02699cd442c0ba3701471692d544466fc8d7aa3269e4da8671b47b07d7070c24" address="unix:///run/containerd/s/5cfa0ae7b5d1c6caeb5c3bd7756040cd7c732655c09ebe4042868d7352359a5b" namespace=k8s.io protocol=ttrpc version=3 May 13 23:50:53.370648 systemd[1]: Started cri-containerd-02699cd442c0ba3701471692d544466fc8d7aa3269e4da8671b47b07d7070c24.scope - libcontainer container 02699cd442c0ba3701471692d544466fc8d7aa3269e4da8671b47b07d7070c24. May 13 23:50:53.398215 containerd[1515]: time="2025-05-13T23:50:53.398147194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-7jrsx,Uid:ed061130-c2f2-486c-8944-ad9f3e009e8b,Namespace:kube-system,Attempt:0,} returns sandbox id \"02699cd442c0ba3701471692d544466fc8d7aa3269e4da8671b47b07d7070c24\"" May 13 23:50:53.403343 containerd[1515]: time="2025-05-13T23:50:53.402184523Z" level=info msg="CreateContainer within sandbox \"02699cd442c0ba3701471692d544466fc8d7aa3269e4da8671b47b07d7070c24\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 13 23:50:53.414783 containerd[1515]: time="2025-05-13T23:50:53.414744578Z" level=info msg="Container a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695: CDI devices from CRI Config.CDIDevices: []" May 13 23:50:53.429945 containerd[1515]: time="2025-05-13T23:50:53.429859219Z" level=info msg="CreateContainer within sandbox \"02699cd442c0ba3701471692d544466fc8d7aa3269e4da8671b47b07d7070c24\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695\"" May 13 23:50:53.432592 containerd[1515]: time="2025-05-13T23:50:53.431021514Z" level=info msg="StartContainer for \"a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695\"" May 13 23:50:53.433275 containerd[1515]: time="2025-05-13T23:50:53.433235808Z" level=info msg="connecting to shim a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695" address="unix:///run/containerd/s/5cfa0ae7b5d1c6caeb5c3bd7756040cd7c732655c09ebe4042868d7352359a5b" protocol=ttrpc version=3 May 13 23:50:53.464765 systemd[1]: Started cri-containerd-a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695.scope - libcontainer container a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695. 
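The "connecting to shim ... namespace=k8s.io" entries above use the containerd namespace reserved for CRI-managed pods. A sketch, assuming the containerd 1.x Go client module path (containerd 2.x relocates the package under a v2 module), of listing those containers and their task status directly against the containerd socket:

```go
package main

import (
	"context"
	"fmt"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Assumption: default containerd socket and the 1.x client module path.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	// CRI-managed containers live in the "k8s.io" namespace seen in the log.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	containers, err := client.Containers(ctx)
	if err != nil {
		panic(err)
	}
	for _, c := range containers {
		task, err := c.Task(ctx, nil)
		if err != nil {
			fmt.Println(c.ID(), "no running task")
			continue
		}
		status, err := task.Status(ctx)
		if err != nil {
			panic(err)
		}
		fmt.Println(c.ID(), status.Status)
	}
}
```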
May 13 23:50:53.481751 containerd[1515]: time="2025-05-13T23:50:53.481193765Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-5gvc9,Uid:a63746f9-45de-4479-afa8-0dfdaad58e4d,Namespace:tigera-operator,Attempt:0,}" May 13 23:50:53.516529 containerd[1515]: time="2025-05-13T23:50:53.516328306Z" level=info msg="StartContainer for \"a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695\" returns successfully" May 13 23:50:53.522156 containerd[1515]: time="2025-05-13T23:50:53.521632947Z" level=info msg="connecting to shim 4dad1fa3b28cc13e1234ef2da62fe9d5b86ef3cee156dd1ddb970836ff19943b" address="unix:///run/containerd/s/f162030d94b44f362e4e203e74b9b519217927c58fdccbe2c4c667e0d284a622" namespace=k8s.io protocol=ttrpc version=3 May 13 23:50:53.551646 systemd[1]: Started cri-containerd-4dad1fa3b28cc13e1234ef2da62fe9d5b86ef3cee156dd1ddb970836ff19943b.scope - libcontainer container 4dad1fa3b28cc13e1234ef2da62fe9d5b86ef3cee156dd1ddb970836ff19943b. May 13 23:50:53.600728 containerd[1515]: time="2025-05-13T23:50:53.600652510Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-797db67f8-5gvc9,Uid:a63746f9-45de-4479-afa8-0dfdaad58e4d,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4dad1fa3b28cc13e1234ef2da62fe9d5b86ef3cee156dd1ddb970836ff19943b\"" May 13 23:50:53.604200 containerd[1515]: time="2025-05-13T23:50:53.603937446Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 13 23:50:55.173960 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount367520580.mount: Deactivated successfully. May 13 23:50:55.534433 containerd[1515]: time="2025-05-13T23:50:55.534267083Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:55.535480 containerd[1515]: time="2025-05-13T23:50:55.535392366Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=19323084" May 13 23:50:55.538313 containerd[1515]: time="2025-05-13T23:50:55.536170479Z" level=info msg="ImageCreate event name:\"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:55.538726 containerd[1515]: time="2025-05-13T23:50:55.538688564Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:55.539473 containerd[1515]: time="2025-05-13T23:50:55.539396067Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"19319079\" in 1.935414094s" May 13 23:50:55.539612 containerd[1515]: time="2025-05-13T23:50:55.539594336Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\"" May 13 23:50:55.544024 containerd[1515]: time="2025-05-13T23:50:55.543969610Z" level=info msg="CreateContainer within sandbox \"4dad1fa3b28cc13e1234ef2da62fe9d5b86ef3cee156dd1ddb970836ff19943b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 13 23:50:55.554459 containerd[1515]: time="2025-05-13T23:50:55.552649068Z" 
level=info msg="Container 03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430: CDI devices from CRI Config.CDIDevices: []" May 13 23:50:55.568649 containerd[1515]: time="2025-05-13T23:50:55.568391110Z" level=info msg="CreateContainer within sandbox \"4dad1fa3b28cc13e1234ef2da62fe9d5b86ef3cee156dd1ddb970836ff19943b\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430\"" May 13 23:50:55.571935 containerd[1515]: time="2025-05-13T23:50:55.571575092Z" level=info msg="StartContainer for \"03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430\"" May 13 23:50:55.572962 containerd[1515]: time="2025-05-13T23:50:55.572907965Z" level=info msg="connecting to shim 03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430" address="unix:///run/containerd/s/f162030d94b44f362e4e203e74b9b519217927c58fdccbe2c4c667e0d284a622" protocol=ttrpc version=3 May 13 23:50:55.599613 systemd[1]: Started cri-containerd-03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430.scope - libcontainer container 03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430. May 13 23:50:55.651532 containerd[1515]: time="2025-05-13T23:50:55.651432030Z" level=info msg="StartContainer for \"03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430\" returns successfully" May 13 23:50:55.925525 kubelet[2916]: I0513 23:50:55.925416 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-7jrsx" podStartSLOduration=3.925366624 podStartE2EDuration="3.925366624s" podCreationTimestamp="2025-05-13 23:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:50:53.915942725 +0000 UTC m=+15.243259827" watchObservedRunningTime="2025-05-13 23:50:55.925366624 +0000 UTC m=+17.252683726" May 13 23:50:55.927492 kubelet[2916]: I0513 23:50:55.925668 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-797db67f8-5gvc9" podStartSLOduration=0.987391792 podStartE2EDuration="2.925656146s" podCreationTimestamp="2025-05-13 23:50:53 +0000 UTC" firstStartedPulling="2025-05-13 23:50:53.602789793 +0000 UTC m=+14.930106815" lastFinishedPulling="2025-05-13 23:50:55.541054107 +0000 UTC m=+16.868371169" observedRunningTime="2025-05-13 23:50:55.923285442 +0000 UTC m=+17.250602504" watchObservedRunningTime="2025-05-13 23:50:55.925656146 +0000 UTC m=+17.252973248" May 13 23:51:00.105755 kubelet[2916]: I0513 23:51:00.105677 2916 topology_manager.go:215] "Topology Admit Handler" podUID="5bf8c565-319e-40f9-8b90-11c92e4e0e85" podNamespace="calico-system" podName="calico-typha-ff85b86b5-vwq9l" May 13 23:51:00.117315 systemd[1]: Created slice kubepods-besteffort-pod5bf8c565_319e_40f9_8b90_11c92e4e0e85.slice - libcontainer container kubepods-besteffort-pod5bf8c565_319e_40f9_8b90_11c92e4e0e85.slice. 
May 13 23:51:00.148067 kubelet[2916]: I0513 23:51:00.147188 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/5bf8c565-319e-40f9-8b90-11c92e4e0e85-typha-certs\") pod \"calico-typha-ff85b86b5-vwq9l\" (UID: \"5bf8c565-319e-40f9-8b90-11c92e4e0e85\") " pod="calico-system/calico-typha-ff85b86b5-vwq9l" May 13 23:51:00.148067 kubelet[2916]: I0513 23:51:00.147234 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txdf\" (UniqueName: \"kubernetes.io/projected/5bf8c565-319e-40f9-8b90-11c92e4e0e85-kube-api-access-4txdf\") pod \"calico-typha-ff85b86b5-vwq9l\" (UID: \"5bf8c565-319e-40f9-8b90-11c92e4e0e85\") " pod="calico-system/calico-typha-ff85b86b5-vwq9l" May 13 23:51:00.148067 kubelet[2916]: I0513 23:51:00.148015 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5bf8c565-319e-40f9-8b90-11c92e4e0e85-tigera-ca-bundle\") pod \"calico-typha-ff85b86b5-vwq9l\" (UID: \"5bf8c565-319e-40f9-8b90-11c92e4e0e85\") " pod="calico-system/calico-typha-ff85b86b5-vwq9l" May 13 23:51:00.328590 kubelet[2916]: I0513 23:51:00.327175 2916 topology_manager.go:215] "Topology Admit Handler" podUID="6a9ba8ac-b078-4b58-af66-448e71d3c56b" podNamespace="calico-system" podName="calico-node-9d6d5" May 13 23:51:00.341112 systemd[1]: Created slice kubepods-besteffort-pod6a9ba8ac_b078_4b58_af66_448e71d3c56b.slice - libcontainer container kubepods-besteffort-pod6a9ba8ac_b078_4b58_af66_448e71d3c56b.slice. May 13 23:51:00.349844 kubelet[2916]: I0513 23:51:00.349791 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-cni-net-dir\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.349844 kubelet[2916]: I0513 23:51:00.349838 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-flexvol-driver-host\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350014 kubelet[2916]: I0513 23:51:00.349861 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ck66\" (UniqueName: \"kubernetes.io/projected/6a9ba8ac-b078-4b58-af66-448e71d3c56b-kube-api-access-9ck66\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350014 kubelet[2916]: I0513 23:51:00.349882 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-var-lib-calico\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350014 kubelet[2916]: I0513 23:51:00.349898 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-cni-bin-dir\") pod \"calico-node-9d6d5\" (UID: 
\"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350014 kubelet[2916]: I0513 23:51:00.349917 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-xtables-lock\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350014 kubelet[2916]: I0513 23:51:00.349933 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-policysync\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350117 kubelet[2916]: I0513 23:51:00.349950 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-var-run-calico\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350117 kubelet[2916]: I0513 23:51:00.349966 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a9ba8ac-b078-4b58-af66-448e71d3c56b-tigera-ca-bundle\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350117 kubelet[2916]: I0513 23:51:00.349984 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/6a9ba8ac-b078-4b58-af66-448e71d3c56b-node-certs\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350117 kubelet[2916]: I0513 23:51:00.350005 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-lib-modules\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.350117 kubelet[2916]: I0513 23:51:00.350022 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/6a9ba8ac-b078-4b58-af66-448e71d3c56b-cni-log-dir\") pod \"calico-node-9d6d5\" (UID: \"6a9ba8ac-b078-4b58-af66-448e71d3c56b\") " pod="calico-system/calico-node-9d6d5" May 13 23:51:00.426918 containerd[1515]: time="2025-05-13T23:51:00.426502865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-ff85b86b5-vwq9l,Uid:5bf8c565-319e-40f9-8b90-11c92e4e0e85,Namespace:calico-system,Attempt:0,}" May 13 23:51:00.452109 kubelet[2916]: E0513 23:51:00.452050 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.452109 kubelet[2916]: W0513 23:51:00.452085 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.452109 kubelet[2916]: E0513 23:51:00.452116 2916 plugins.go:730] "Error dynamically 
probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.456447 kubelet[2916]: E0513 23:51:00.454529 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.456447 kubelet[2916]: W0513 23:51:00.454568 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.456447 kubelet[2916]: E0513 23:51:00.454600 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.456447 kubelet[2916]: E0513 23:51:00.454848 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.456447 kubelet[2916]: W0513 23:51:00.454860 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.456447 kubelet[2916]: E0513 23:51:00.454874 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.458530 kubelet[2916]: E0513 23:51:00.457932 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.458530 kubelet[2916]: W0513 23:51:00.457955 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.458530 kubelet[2916]: E0513 23:51:00.457977 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.460918 kubelet[2916]: E0513 23:51:00.459745 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.460918 kubelet[2916]: W0513 23:51:00.459865 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.460918 kubelet[2916]: E0513 23:51:00.460696 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.461662 kubelet[2916]: E0513 23:51:00.461633 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.461884 kubelet[2916]: W0513 23:51:00.461867 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.462583 kubelet[2916]: E0513 23:51:00.462522 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.462852 kubelet[2916]: E0513 23:51:00.462693 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.462852 kubelet[2916]: W0513 23:51:00.462707 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.463814 kubelet[2916]: E0513 23:51:00.463759 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.464358 kubelet[2916]: E0513 23:51:00.464131 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.464358 kubelet[2916]: W0513 23:51:00.464146 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.464735 kubelet[2916]: E0513 23:51:00.464552 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.465437 kubelet[2916]: E0513 23:51:00.465295 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.465437 kubelet[2916]: W0513 23:51:00.465329 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.465856 kubelet[2916]: E0513 23:51:00.465734 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.466817 kubelet[2916]: E0513 23:51:00.466699 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.466817 kubelet[2916]: W0513 23:51:00.466719 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.468410 kubelet[2916]: E0513 23:51:00.468165 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.468410 kubelet[2916]: W0513 23:51:00.468183 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.468410 kubelet[2916]: E0513 23:51:00.468212 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.469605 kubelet[2916]: E0513 23:51:00.469284 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.469871 kubelet[2916]: E0513 23:51:00.469701 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.469871 kubelet[2916]: W0513 23:51:00.469714 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.469871 kubelet[2916]: E0513 23:51:00.469728 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.471001 kubelet[2916]: E0513 23:51:00.470868 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.471001 kubelet[2916]: W0513 23:51:00.470885 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.471001 kubelet[2916]: E0513 23:51:00.470914 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.485808 kubelet[2916]: E0513 23:51:00.485563 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.485808 kubelet[2916]: W0513 23:51:00.485595 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.485808 kubelet[2916]: E0513 23:51:00.485627 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.488121 containerd[1515]: time="2025-05-13T23:51:00.486274059Z" level=info msg="connecting to shim 37e0626f129003369ed3c089cd45dc0cc86077364abebc8ab952b4b4e49c4def" address="unix:///run/containerd/s/d9efcfe45805ac62fe99f14efc815fcbed4afc0d878d15c8b65b83c5b0feec2d" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:00.488273 kubelet[2916]: E0513 23:51:00.488006 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.488273 kubelet[2916]: W0513 23:51:00.488037 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.488273 kubelet[2916]: E0513 23:51:00.488061 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.535349 systemd[1]: Started cri-containerd-37e0626f129003369ed3c089cd45dc0cc86077364abebc8ab952b4b4e49c4def.scope - libcontainer container 37e0626f129003369ed3c089cd45dc0cc86077364abebc8ab952b4b4e49c4def. 
May 13 23:51:00.539431 kubelet[2916]: I0513 23:51:00.539370 2916 topology_manager.go:215] "Topology Admit Handler" podUID="61909eec-3ecf-4969-befe-a7397a9a89d4" podNamespace="calico-system" podName="csi-node-driver-dxllc" May 13 23:51:00.539695 kubelet[2916]: E0513 23:51:00.539661 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dxllc" podUID="61909eec-3ecf-4969-befe-a7397a9a89d4" May 13 23:51:00.550585 kubelet[2916]: E0513 23:51:00.550550 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.550924 kubelet[2916]: W0513 23:51:00.550787 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.550924 kubelet[2916]: E0513 23:51:00.550816 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.551368 kubelet[2916]: E0513 23:51:00.551321 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.551622 kubelet[2916]: W0513 23:51:00.551526 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.551622 kubelet[2916]: E0513 23:51:00.551581 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.552228 kubelet[2916]: E0513 23:51:00.552093 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.552228 kubelet[2916]: W0513 23:51:00.552109 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.552228 kubelet[2916]: E0513 23:51:00.552129 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.552718 kubelet[2916]: E0513 23:51:00.552542 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.552718 kubelet[2916]: W0513 23:51:00.552556 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.552718 kubelet[2916]: E0513 23:51:00.552570 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.553370 kubelet[2916]: E0513 23:51:00.553128 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.553370 kubelet[2916]: W0513 23:51:00.553142 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.553370 kubelet[2916]: E0513 23:51:00.553156 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.553830 kubelet[2916]: E0513 23:51:00.553723 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.553830 kubelet[2916]: W0513 23:51:00.553740 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.553830 kubelet[2916]: E0513 23:51:00.553757 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.555634 kubelet[2916]: E0513 23:51:00.555501 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.555634 kubelet[2916]: W0513 23:51:00.555519 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.555634 kubelet[2916]: E0513 23:51:00.555533 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.556025 kubelet[2916]: E0513 23:51:00.556009 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.556264 kubelet[2916]: W0513 23:51:00.556112 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.556264 kubelet[2916]: E0513 23:51:00.556132 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.556640 kubelet[2916]: E0513 23:51:00.556619 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.556837 kubelet[2916]: W0513 23:51:00.556717 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.556837 kubelet[2916]: E0513 23:51:00.556736 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.557407 kubelet[2916]: E0513 23:51:00.557201 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.557407 kubelet[2916]: W0513 23:51:00.557216 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.557407 kubelet[2916]: E0513 23:51:00.557229 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.560102 kubelet[2916]: E0513 23:51:00.559764 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.560102 kubelet[2916]: W0513 23:51:00.559783 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.560102 kubelet[2916]: E0513 23:51:00.559994 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.561647 kubelet[2916]: E0513 23:51:00.561491 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.561647 kubelet[2916]: W0513 23:51:00.561511 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.561647 kubelet[2916]: E0513 23:51:00.561527 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.562088 kubelet[2916]: E0513 23:51:00.561968 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.562088 kubelet[2916]: W0513 23:51:00.561988 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.562088 kubelet[2916]: E0513 23:51:00.562001 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.562362 kubelet[2916]: E0513 23:51:00.562346 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.562530 kubelet[2916]: W0513 23:51:00.562437 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.562530 kubelet[2916]: E0513 23:51:00.562456 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.563348 kubelet[2916]: E0513 23:51:00.563200 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.563348 kubelet[2916]: W0513 23:51:00.563214 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.563348 kubelet[2916]: E0513 23:51:00.563239 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.563843 kubelet[2916]: E0513 23:51:00.563555 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.563959 kubelet[2916]: W0513 23:51:00.563934 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.564032 kubelet[2916]: E0513 23:51:00.564019 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.565147 kubelet[2916]: E0513 23:51:00.565123 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.565372 kubelet[2916]: W0513 23:51:00.565225 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.565372 kubelet[2916]: E0513 23:51:00.565245 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.565745 kubelet[2916]: E0513 23:51:00.565634 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.565745 kubelet[2916]: W0513 23:51:00.565649 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.565745 kubelet[2916]: E0513 23:51:00.565662 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.566081 kubelet[2916]: E0513 23:51:00.565977 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.566081 kubelet[2916]: W0513 23:51:00.565992 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.566081 kubelet[2916]: E0513 23:51:00.566004 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.567421 kubelet[2916]: E0513 23:51:00.566303 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.567421 kubelet[2916]: W0513 23:51:00.566358 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.567421 kubelet[2916]: E0513 23:51:00.566372 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.567983 kubelet[2916]: E0513 23:51:00.567853 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.567983 kubelet[2916]: W0513 23:51:00.567867 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.567983 kubelet[2916]: E0513 23:51:00.567879 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.567983 kubelet[2916]: I0513 23:51:00.567906 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/61909eec-3ecf-4969-befe-a7397a9a89d4-socket-dir\") pod \"csi-node-driver-dxllc\" (UID: \"61909eec-3ecf-4969-befe-a7397a9a89d4\") " pod="calico-system/csi-node-driver-dxllc" May 13 23:51:00.568626 kubelet[2916]: E0513 23:51:00.568605 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.568798 kubelet[2916]: W0513 23:51:00.568707 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.568798 kubelet[2916]: E0513 23:51:00.568733 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.568798 kubelet[2916]: I0513 23:51:00.568759 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/61909eec-3ecf-4969-befe-a7397a9a89d4-kubelet-dir\") pod \"csi-node-driver-dxllc\" (UID: \"61909eec-3ecf-4969-befe-a7397a9a89d4\") " pod="calico-system/csi-node-driver-dxllc" May 13 23:51:00.569073 kubelet[2916]: E0513 23:51:00.569047 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.569073 kubelet[2916]: W0513 23:51:00.569067 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.569149 kubelet[2916]: E0513 23:51:00.569086 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.569330 kubelet[2916]: E0513 23:51:00.569303 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.569330 kubelet[2916]: W0513 23:51:00.569328 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.569382 kubelet[2916]: E0513 23:51:00.569343 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.570860 kubelet[2916]: E0513 23:51:00.570826 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.570860 kubelet[2916]: W0513 23:51:00.570851 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.570860 kubelet[2916]: E0513 23:51:00.570869 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.570991 kubelet[2916]: I0513 23:51:00.570898 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkt8x\" (UniqueName: \"kubernetes.io/projected/61909eec-3ecf-4969-befe-a7397a9a89d4-kube-api-access-mkt8x\") pod \"csi-node-driver-dxllc\" (UID: \"61909eec-3ecf-4969-befe-a7397a9a89d4\") " pod="calico-system/csi-node-driver-dxllc" May 13 23:51:00.571362 kubelet[2916]: E0513 23:51:00.571224 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.571362 kubelet[2916]: W0513 23:51:00.571356 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.571838 kubelet[2916]: E0513 23:51:00.571803 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.571902 kubelet[2916]: I0513 23:51:00.571845 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/61909eec-3ecf-4969-befe-a7397a9a89d4-varrun\") pod \"csi-node-driver-dxllc\" (UID: \"61909eec-3ecf-4969-befe-a7397a9a89d4\") " pod="calico-system/csi-node-driver-dxllc" May 13 23:51:00.572207 kubelet[2916]: E0513 23:51:00.572178 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.572207 kubelet[2916]: W0513 23:51:00.572197 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.572350 kubelet[2916]: E0513 23:51:00.572304 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.572982 kubelet[2916]: E0513 23:51:00.572950 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.572982 kubelet[2916]: W0513 23:51:00.572969 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.573430 kubelet[2916]: E0513 23:51:00.573385 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.573914 kubelet[2916]: E0513 23:51:00.573890 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.573914 kubelet[2916]: W0513 23:51:00.573908 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.574464 kubelet[2916]: E0513 23:51:00.574023 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.574464 kubelet[2916]: I0513 23:51:00.574057 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/61909eec-3ecf-4969-befe-a7397a9a89d4-registration-dir\") pod \"csi-node-driver-dxllc\" (UID: \"61909eec-3ecf-4969-befe-a7397a9a89d4\") " pod="calico-system/csi-node-driver-dxllc" May 13 23:51:00.575513 kubelet[2916]: E0513 23:51:00.575483 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.575513 kubelet[2916]: W0513 23:51:00.575503 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.575684 kubelet[2916]: E0513 23:51:00.575615 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.576016 kubelet[2916]: E0513 23:51:00.575901 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.576016 kubelet[2916]: W0513 23:51:00.575916 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.576016 kubelet[2916]: E0513 23:51:00.575928 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.576516 kubelet[2916]: E0513 23:51:00.576488 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.576516 kubelet[2916]: W0513 23:51:00.576510 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.576616 kubelet[2916]: E0513 23:51:00.576526 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.577535 kubelet[2916]: E0513 23:51:00.577498 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.577535 kubelet[2916]: W0513 23:51:00.577525 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.577631 kubelet[2916]: E0513 23:51:00.577540 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.577906 kubelet[2916]: E0513 23:51:00.577880 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.577906 kubelet[2916]: W0513 23:51:00.577899 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.577986 kubelet[2916]: E0513 23:51:00.577910 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.578577 kubelet[2916]: E0513 23:51:00.578551 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.578577 kubelet[2916]: W0513 23:51:00.578571 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.578666 kubelet[2916]: E0513 23:51:00.578583 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.622671 containerd[1515]: time="2025-05-13T23:51:00.622324981Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-ff85b86b5-vwq9l,Uid:5bf8c565-319e-40f9-8b90-11c92e4e0e85,Namespace:calico-system,Attempt:0,} returns sandbox id \"37e0626f129003369ed3c089cd45dc0cc86077364abebc8ab952b4b4e49c4def\"" May 13 23:51:00.629384 containerd[1515]: time="2025-05-13T23:51:00.629305701Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 13 23:51:00.646876 containerd[1515]: time="2025-05-13T23:51:00.646291458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9d6d5,Uid:6a9ba8ac-b078-4b58-af66-448e71d3c56b,Namespace:calico-system,Attempt:0,}" May 13 23:51:00.680599 kubelet[2916]: E0513 23:51:00.679187 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.680599 kubelet[2916]: W0513 23:51:00.679221 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.680599 kubelet[2916]: E0513 23:51:00.679247 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.681412 kubelet[2916]: E0513 23:51:00.681036 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.681412 kubelet[2916]: W0513 23:51:00.681084 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.681412 kubelet[2916]: E0513 23:51:00.681121 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.681853 kubelet[2916]: E0513 23:51:00.681680 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.681853 kubelet[2916]: W0513 23:51:00.681698 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.681853 kubelet[2916]: E0513 23:51:00.681724 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.685224 kubelet[2916]: E0513 23:51:00.684561 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.685442 kubelet[2916]: W0513 23:51:00.684598 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.685679 kubelet[2916]: E0513 23:51:00.685504 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.687013 kubelet[2916]: E0513 23:51:00.686185 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.687013 kubelet[2916]: W0513 23:51:00.686961 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.687432 kubelet[2916]: E0513 23:51:00.686991 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.687726 kubelet[2916]: E0513 23:51:00.687586 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.687726 kubelet[2916]: W0513 23:51:00.687620 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.687726 kubelet[2916]: E0513 23:51:00.687636 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.688510 kubelet[2916]: E0513 23:51:00.688032 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.688510 kubelet[2916]: W0513 23:51:00.688047 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.688804 kubelet[2916]: E0513 23:51:00.688601 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.691066 kubelet[2916]: E0513 23:51:00.689567 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.691066 kubelet[2916]: W0513 23:51:00.689584 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.691066 kubelet[2916]: E0513 23:51:00.690889 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.691383 kubelet[2916]: E0513 23:51:00.691260 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.691383 kubelet[2916]: W0513 23:51:00.691297 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.691505 kubelet[2916]: E0513 23:51:00.691444 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.692010 kubelet[2916]: E0513 23:51:00.691789 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.692010 kubelet[2916]: W0513 23:51:00.691803 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.692010 kubelet[2916]: E0513 23:51:00.691925 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.692482 kubelet[2916]: E0513 23:51:00.692300 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.692482 kubelet[2916]: W0513 23:51:00.692315 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.693050 kubelet[2916]: E0513 23:51:00.692835 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.693050 kubelet[2916]: W0513 23:51:00.692866 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.693374 kubelet[2916]: E0513 23:51:00.693197 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.693374 kubelet[2916]: W0513 23:51:00.693209 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.693374 kubelet[2916]: E0513 23:51:00.693225 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.693374 kubelet[2916]: E0513 23:51:00.693259 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.693374 kubelet[2916]: E0513 23:51:00.693297 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.693982 kubelet[2916]: E0513 23:51:00.693687 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.693982 kubelet[2916]: W0513 23:51:00.693702 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.693982 kubelet[2916]: E0513 23:51:00.693934 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.695150 kubelet[2916]: E0513 23:51:00.694914 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.695150 kubelet[2916]: W0513 23:51:00.694933 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.695150 kubelet[2916]: E0513 23:51:00.695037 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.695817 kubelet[2916]: E0513 23:51:00.695439 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.695817 kubelet[2916]: W0513 23:51:00.695454 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.695817 kubelet[2916]: E0513 23:51:00.695540 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.696341 kubelet[2916]: E0513 23:51:00.696098 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.696341 kubelet[2916]: W0513 23:51:00.696111 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.696645 kubelet[2916]: E0513 23:51:00.696527 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.696645 kubelet[2916]: W0513 23:51:00.696538 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.696912 kubelet[2916]: E0513 23:51:00.696838 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.696912 kubelet[2916]: W0513 23:51:00.696852 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.696912 kubelet[2916]: E0513 23:51:00.696849 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.696996 kubelet[2916]: E0513 23:51:00.696839 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.696996 kubelet[2916]: E0513 23:51:00.696970 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.697255 kubelet[2916]: E0513 23:51:00.697140 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.697255 kubelet[2916]: W0513 23:51:00.697152 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.697255 kubelet[2916]: E0513 23:51:00.697174 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.697738 kubelet[2916]: E0513 23:51:00.697573 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.697738 kubelet[2916]: W0513 23:51:00.697587 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.697738 kubelet[2916]: E0513 23:51:00.697602 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.698465 kubelet[2916]: E0513 23:51:00.698141 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.698465 kubelet[2916]: W0513 23:51:00.698155 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.698465 kubelet[2916]: E0513 23:51:00.698168 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.700859 kubelet[2916]: E0513 23:51:00.700827 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.700859 kubelet[2916]: W0513 23:51:00.700846 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.701087 kubelet[2916]: E0513 23:51:00.700868 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.701548 kubelet[2916]: E0513 23:51:00.701513 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.701548 kubelet[2916]: W0513 23:51:00.701546 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.701649 kubelet[2916]: E0513 23:51:00.701563 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:00.701911 kubelet[2916]: E0513 23:51:00.701891 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.701911 kubelet[2916]: W0513 23:51:00.701907 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.701984 kubelet[2916]: E0513 23:51:00.701926 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.707612 containerd[1515]: time="2025-05-13T23:51:00.707262210Z" level=info msg="connecting to shim 946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b" address="unix:///run/containerd/s/60510d1865c6e20831be72dc660c8f89140861b3678f5203df25ae308b6b4ab7" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:00.724779 kubelet[2916]: E0513 23:51:00.724684 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:00.724779 kubelet[2916]: W0513 23:51:00.724712 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:00.724779 kubelet[2916]: E0513 23:51:00.724736 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:00.741600 systemd[1]: Started cri-containerd-946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b.scope - libcontainer container 946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b. 
May 13 23:51:00.775964 containerd[1515]: time="2025-05-13T23:51:00.775207841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-9d6d5,Uid:6a9ba8ac-b078-4b58-af66-448e71d3c56b,Namespace:calico-system,Attempt:0,} returns sandbox id \"946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b\"" May 13 23:51:01.818106 kubelet[2916]: E0513 23:51:01.817884 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dxllc" podUID="61909eec-3ecf-4969-befe-a7397a9a89d4" May 13 23:51:03.183629 containerd[1515]: time="2025-05-13T23:51:03.183538805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:03.185902 containerd[1515]: time="2025-05-13T23:51:03.185833931Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=28370571" May 13 23:51:03.187119 containerd[1515]: time="2025-05-13T23:51:03.186647993Z" level=info msg="ImageCreate event name:\"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:03.190515 containerd[1515]: time="2025-05-13T23:51:03.190161551Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:03.192186 containerd[1515]: time="2025-05-13T23:51:03.192051107Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"29739745\" in 2.562669516s" May 13 23:51:03.192186 containerd[1515]: time="2025-05-13T23:51:03.192091112Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\"" May 13 23:51:03.194439 containerd[1515]: time="2025-05-13T23:51:03.193742237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 13 23:51:03.210777 containerd[1515]: time="2025-05-13T23:51:03.210718755Z" level=info msg="CreateContainer within sandbox \"37e0626f129003369ed3c089cd45dc0cc86077364abebc8ab952b4b4e49c4def\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 23:51:03.223366 containerd[1515]: time="2025-05-13T23:51:03.221634797Z" level=info msg="Container dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:03.234757 containerd[1515]: time="2025-05-13T23:51:03.234698266Z" level=info msg="CreateContainer within sandbox \"37e0626f129003369ed3c089cd45dc0cc86077364abebc8ab952b4b4e49c4def\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031\"" May 13 23:51:03.235789 containerd[1515]: time="2025-05-13T23:51:03.235743677Z" level=info msg="StartContainer for \"dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031\"" May 13 23:51:03.238090 containerd[1515]: 
time="2025-05-13T23:51:03.238026361Z" level=info msg="connecting to shim dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031" address="unix:///run/containerd/s/d9efcfe45805ac62fe99f14efc815fcbed4afc0d878d15c8b65b83c5b0feec2d" protocol=ttrpc version=3 May 13 23:51:03.282690 systemd[1]: Started cri-containerd-dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031.scope - libcontainer container dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031. May 13 23:51:03.342168 containerd[1515]: time="2025-05-13T23:51:03.341888637Z" level=info msg="StartContainer for \"dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031\" returns successfully" May 13 23:51:03.816789 kubelet[2916]: E0513 23:51:03.816647 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dxllc" podUID="61909eec-3ecf-4969-befe-a7397a9a89d4" May 13 23:51:03.958189 kubelet[2916]: I0513 23:51:03.957694 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-ff85b86b5-vwq9l" podStartSLOduration=1.391823443 podStartE2EDuration="3.957676651s" podCreationTimestamp="2025-05-13 23:51:00 +0000 UTC" firstStartedPulling="2025-05-13 23:51:00.627149497 +0000 UTC m=+21.954466559" lastFinishedPulling="2025-05-13 23:51:03.193002705 +0000 UTC m=+24.520319767" observedRunningTime="2025-05-13 23:51:03.957226314 +0000 UTC m=+25.284543416" watchObservedRunningTime="2025-05-13 23:51:03.957676651 +0000 UTC m=+25.284993713" May 13 23:51:03.989881 kubelet[2916]: E0513 23:51:03.989844 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.989881 kubelet[2916]: W0513 23:51:03.989871 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.990385 kubelet[2916]: E0513 23:51:03.989893 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.990385 kubelet[2916]: E0513 23:51:03.990152 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.990385 kubelet[2916]: W0513 23:51:03.990164 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.990385 kubelet[2916]: E0513 23:51:03.990176 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:03.990385 kubelet[2916]: E0513 23:51:03.990353 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.990385 kubelet[2916]: W0513 23:51:03.990362 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.990385 kubelet[2916]: E0513 23:51:03.990381 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.990612 kubelet[2916]: E0513 23:51:03.990581 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.990612 kubelet[2916]: W0513 23:51:03.990590 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.990612 kubelet[2916]: E0513 23:51:03.990600 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.991077 kubelet[2916]: E0513 23:51:03.991053 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.991077 kubelet[2916]: W0513 23:51:03.991076 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.991196 kubelet[2916]: E0513 23:51:03.991088 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.991377 kubelet[2916]: E0513 23:51:03.991341 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.991377 kubelet[2916]: W0513 23:51:03.991356 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.991377 kubelet[2916]: E0513 23:51:03.991368 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.991611 kubelet[2916]: E0513 23:51:03.991547 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.991611 kubelet[2916]: W0513 23:51:03.991555 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.991611 kubelet[2916]: E0513 23:51:03.991564 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:03.991788 kubelet[2916]: E0513 23:51:03.991720 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.991788 kubelet[2916]: W0513 23:51:03.991727 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.991788 kubelet[2916]: E0513 23:51:03.991736 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.991968 kubelet[2916]: E0513 23:51:03.991903 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.991968 kubelet[2916]: W0513 23:51:03.991911 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.991968 kubelet[2916]: E0513 23:51:03.991920 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.992114 kubelet[2916]: E0513 23:51:03.992060 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.992114 kubelet[2916]: W0513 23:51:03.992067 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.992114 kubelet[2916]: E0513 23:51:03.992075 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.992251 kubelet[2916]: E0513 23:51:03.992222 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.992251 kubelet[2916]: W0513 23:51:03.992229 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.992251 kubelet[2916]: E0513 23:51:03.992239 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.992462 kubelet[2916]: E0513 23:51:03.992394 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.992462 kubelet[2916]: W0513 23:51:03.992456 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.992462 kubelet[2916]: E0513 23:51:03.992465 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:03.992653 kubelet[2916]: E0513 23:51:03.992634 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.992653 kubelet[2916]: W0513 23:51:03.992647 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.992653 kubelet[2916]: E0513 23:51:03.992655 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.992940 kubelet[2916]: E0513 23:51:03.992859 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.992940 kubelet[2916]: W0513 23:51:03.992869 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.992940 kubelet[2916]: E0513 23:51:03.992878 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:03.993504 kubelet[2916]: E0513 23:51:03.993035 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:03.993504 kubelet[2916]: W0513 23:51:03.993043 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:03.993504 kubelet[2916]: E0513 23:51:03.993052 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.016958 kubelet[2916]: E0513 23:51:04.016917 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.016958 kubelet[2916]: W0513 23:51:04.016951 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.016958 kubelet[2916]: E0513 23:51:04.016977 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.018188 kubelet[2916]: E0513 23:51:04.017919 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.018188 kubelet[2916]: W0513 23:51:04.017950 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.018188 kubelet[2916]: E0513 23:51:04.017982 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:04.018511 kubelet[2916]: E0513 23:51:04.018358 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.018511 kubelet[2916]: W0513 23:51:04.018372 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.018511 kubelet[2916]: E0513 23:51:04.018455 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.019328 kubelet[2916]: E0513 23:51:04.019301 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.019328 kubelet[2916]: W0513 23:51:04.019326 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.019469 kubelet[2916]: E0513 23:51:04.019352 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.019618 kubelet[2916]: E0513 23:51:04.019601 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.019618 kubelet[2916]: W0513 23:51:04.019617 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.019815 kubelet[2916]: E0513 23:51:04.019692 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.019980 kubelet[2916]: E0513 23:51:04.019933 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.019980 kubelet[2916]: W0513 23:51:04.019955 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.020641 kubelet[2916]: E0513 23:51:04.020546 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.020836 kubelet[2916]: E0513 23:51:04.020818 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.020836 kubelet[2916]: W0513 23:51:04.020835 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.021073 kubelet[2916]: E0513 23:51:04.020954 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:04.021180 kubelet[2916]: E0513 23:51:04.021163 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.021217 kubelet[2916]: W0513 23:51:04.021180 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.021432 kubelet[2916]: E0513 23:51:04.021252 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.021597 kubelet[2916]: E0513 23:51:04.021583 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.021646 kubelet[2916]: W0513 23:51:04.021596 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.021959 kubelet[2916]: E0513 23:51:04.021925 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.022167 kubelet[2916]: E0513 23:51:04.022152 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.022167 kubelet[2916]: W0513 23:51:04.022166 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.022237 kubelet[2916]: E0513 23:51:04.022180 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.022351 kubelet[2916]: E0513 23:51:04.022338 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.022387 kubelet[2916]: W0513 23:51:04.022351 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.022387 kubelet[2916]: E0513 23:51:04.022369 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.022614 kubelet[2916]: E0513 23:51:04.022601 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.022614 kubelet[2916]: W0513 23:51:04.022613 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.022684 kubelet[2916]: E0513 23:51:04.022628 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:04.022848 kubelet[2916]: E0513 23:51:04.022828 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.022848 kubelet[2916]: W0513 23:51:04.022840 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.022933 kubelet[2916]: E0513 23:51:04.022862 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.023060 kubelet[2916]: E0513 23:51:04.023021 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.023060 kubelet[2916]: W0513 23:51:04.023036 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.023060 kubelet[2916]: E0513 23:51:04.023047 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.023484 kubelet[2916]: E0513 23:51:04.023468 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.023484 kubelet[2916]: W0513 23:51:04.023484 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.023561 kubelet[2916]: E0513 23:51:04.023505 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.023866 kubelet[2916]: E0513 23:51:04.023849 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.023866 kubelet[2916]: W0513 23:51:04.023864 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.023940 kubelet[2916]: E0513 23:51:04.023878 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.024262 kubelet[2916]: E0513 23:51:04.024201 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.024262 kubelet[2916]: W0513 23:51:04.024216 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.024262 kubelet[2916]: E0513 23:51:04.024229 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:04.025534 kubelet[2916]: E0513 23:51:04.025505 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:04.025534 kubelet[2916]: W0513 23:51:04.025524 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:04.025620 kubelet[2916]: E0513 23:51:04.025539 2916 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:04.615143 containerd[1515]: time="2025-05-13T23:51:04.614846322Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:04.616771 containerd[1515]: time="2025-05-13T23:51:04.616384431Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5122903" May 13 23:51:04.619067 containerd[1515]: time="2025-05-13T23:51:04.618377675Z" level=info msg="ImageCreate event name:\"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:04.624437 containerd[1515]: time="2025-05-13T23:51:04.624352887Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:04.626372 containerd[1515]: time="2025-05-13T23:51:04.625980047Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6492045\" in 1.432189723s" May 13 23:51:04.626372 containerd[1515]: time="2025-05-13T23:51:04.626025452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\"" May 13 23:51:04.630074 containerd[1515]: time="2025-05-13T23:51:04.629998539Z" level=info msg="CreateContainer within sandbox \"946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 23:51:04.641654 containerd[1515]: time="2025-05-13T23:51:04.640459301Z" level=info msg="Container cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:04.654291 containerd[1515]: time="2025-05-13T23:51:04.654213107Z" level=info msg="CreateContainer within sandbox \"946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e\"" May 13 23:51:04.655730 containerd[1515]: time="2025-05-13T23:51:04.655311521Z" level=info msg="StartContainer for \"cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e\"" May 13 23:51:04.657669 containerd[1515]: time="2025-05-13T23:51:04.657619844Z" level=info msg="connecting to 
shim cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e" address="unix:///run/containerd/s/60510d1865c6e20831be72dc660c8f89140861b3678f5203df25ae308b6b4ab7" protocol=ttrpc version=3 May 13 23:51:04.683872 systemd[1]: Started cri-containerd-cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e.scope - libcontainer container cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e. May 13 23:51:04.746785 containerd[1515]: time="2025-05-13T23:51:04.746619351Z" level=info msg="StartContainer for \"cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e\" returns successfully" May 13 23:51:04.768865 systemd[1]: cri-containerd-cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e.scope: Deactivated successfully. May 13 23:51:04.775858 containerd[1515]: time="2025-05-13T23:51:04.775596102Z" level=info msg="received exit event container_id:\"cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e\" id:\"cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e\" pid:3565 exited_at:{seconds:1747180264 nanos:775039554}" May 13 23:51:04.777067 containerd[1515]: time="2025-05-13T23:51:04.776759805Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e\" id:\"cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e\" pid:3565 exited_at:{seconds:1747180264 nanos:775039554}" May 13 23:51:04.798917 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e-rootfs.mount: Deactivated successfully. May 13 23:51:04.944396 kubelet[2916]: I0513 23:51:04.944250 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:04.946694 containerd[1515]: time="2025-05-13T23:51:04.946596979Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 13 23:51:05.817856 kubelet[2916]: E0513 23:51:05.817662 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dxllc" podUID="61909eec-3ecf-4969-befe-a7397a9a89d4" May 13 23:51:07.816950 kubelet[2916]: E0513 23:51:07.816704 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-dxllc" podUID="61909eec-3ecf-4969-befe-a7397a9a89d4" May 13 23:51:08.492669 containerd[1515]: time="2025-05-13T23:51:08.492602283Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:08.494012 containerd[1515]: time="2025-05-13T23:51:08.493933876Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=91256270" May 13 23:51:08.494600 containerd[1515]: time="2025-05-13T23:51:08.494527544Z" level=info msg="ImageCreate event name:\"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:08.497264 containerd[1515]: time="2025-05-13T23:51:08.497162365Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:08.498293 containerd[1515]: time="2025-05-13T23:51:08.497763674Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"92625452\" in 3.55112649s" May 13 23:51:08.498293 containerd[1515]: time="2025-05-13T23:51:08.497805399Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\"" May 13 23:51:08.501786 containerd[1515]: time="2025-05-13T23:51:08.501751891Z" level=info msg="CreateContainer within sandbox \"946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 23:51:08.512055 containerd[1515]: time="2025-05-13T23:51:08.511933416Z" level=info msg="Container e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:08.523668 containerd[1515]: time="2025-05-13T23:51:08.523605952Z" level=info msg="CreateContainer within sandbox \"946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d\"" May 13 23:51:08.524344 containerd[1515]: time="2025-05-13T23:51:08.524254346Z" level=info msg="StartContainer for \"e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d\"" May 13 23:51:08.527656 containerd[1515]: time="2025-05-13T23:51:08.527595249Z" level=info msg="connecting to shim e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d" address="unix:///run/containerd/s/60510d1865c6e20831be72dc660c8f89140861b3678f5203df25ae308b6b4ab7" protocol=ttrpc version=3 May 13 23:51:08.561639 systemd[1]: Started cri-containerd-e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d.scope - libcontainer container e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d. May 13 23:51:08.629858 containerd[1515]: time="2025-05-13T23:51:08.629706017Z" level=info msg="StartContainer for \"e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d\" returns successfully" May 13 23:51:09.191332 containerd[1515]: time="2025-05-13T23:51:09.191156410Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 23:51:09.195520 systemd[1]: cri-containerd-e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d.scope: Deactivated successfully. May 13 23:51:09.196426 systemd[1]: cri-containerd-e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d.scope: Consumed 523ms CPU time, 169.4M memory peak, 150.3M written to disk. 
May 13 23:51:09.200453 containerd[1515]: time="2025-05-13T23:51:09.200391690Z" level=info msg="received exit event container_id:\"e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d\" id:\"e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d\" pid:3623 exited_at:{seconds:1747180269 nanos:200130421}" May 13 23:51:09.201861 containerd[1515]: time="2025-05-13T23:51:09.200758251Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d\" id:\"e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d\" pid:3623 exited_at:{seconds:1747180269 nanos:200130421}" May 13 23:51:09.229646 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d-rootfs.mount: Deactivated successfully. May 13 23:51:09.251858 kubelet[2916]: I0513 23:51:09.251824 2916 kubelet_node_status.go:497] "Fast updating node status as it just became ready" May 13 23:51:09.297316 kubelet[2916]: I0513 23:51:09.297246 2916 topology_manager.go:215] "Topology Admit Handler" podUID="320deada-65e9-4ed1-bb7a-4b617ad6a953" podNamespace="kube-system" podName="coredns-7db6d8ff4d-2mlnb" May 13 23:51:09.313846 kubelet[2916]: I0513 23:51:09.313779 2916 topology_manager.go:215] "Topology Admit Handler" podUID="67f78161-b863-4d99-b7fa-721f3c1dd240" podNamespace="calico-system" podName="calico-kube-controllers-c4dbd9494-hmzc4" May 13 23:51:09.317356 kubelet[2916]: I0513 23:51:09.313986 2916 topology_manager.go:215] "Topology Admit Handler" podUID="af984b4c-8125-4744-82af-9465e063a376" podNamespace="kube-system" podName="coredns-7db6d8ff4d-5w474" May 13 23:51:09.317356 kubelet[2916]: I0513 23:51:09.314103 2916 topology_manager.go:215] "Topology Admit Handler" podUID="24eddc54-32b6-4824-b8f8-0cb6ae8aff36" podNamespace="calico-apiserver" podName="calico-apiserver-59bf95bdb4-hsqx4" May 13 23:51:09.317356 kubelet[2916]: I0513 23:51:09.314631 2916 topology_manager.go:215] "Topology Admit Handler" podUID="f25819bc-7144-44b1-bf9b-16a1da05bd16" podNamespace="calico-apiserver" podName="calico-apiserver-59bf95bdb4-rmmk8" May 13 23:51:09.323113 systemd[1]: Created slice kubepods-burstable-pod320deada_65e9_4ed1_bb7a_4b617ad6a953.slice - libcontainer container kubepods-burstable-pod320deada_65e9_4ed1_bb7a_4b617ad6a953.slice. May 13 23:51:09.357024 systemd[1]: Created slice kubepods-besteffort-pod24eddc54_32b6_4824_b8f8_0cb6ae8aff36.slice - libcontainer container kubepods-besteffort-pod24eddc54_32b6_4824_b8f8_0cb6ae8aff36.slice. 
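The exit events above report the termination time as a raw Unix timestamp (exited_at:{seconds:1747180269 nanos:200130421}) while the rest of the log uses wall-clock time. A quick, illustrative Go conversion shows the two agree:

// epoch-check.go - convert the exited_at value from the TaskExit event
// into a readable UTC time for cross-checking against the log timestamps.
package main

import (
	"fmt"
	"time"
)

func main() {
	exitedAt := time.Unix(1747180269, 200130421).UTC()
	fmt.Println(exitedAt.Format("2006-01-02T15:04:05.000000000Z07:00"))
	// Prints 2025-05-13T23:51:09.200130421Z, matching the
	// "received exit event" entry logged at 23:51:09.200 above.
}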
May 13 23:51:09.357526 kubelet[2916]: I0513 23:51:09.357395 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2v62\" (UniqueName: \"kubernetes.io/projected/af984b4c-8125-4744-82af-9465e063a376-kube-api-access-z2v62\") pod \"coredns-7db6d8ff4d-5w474\" (UID: \"af984b4c-8125-4744-82af-9465e063a376\") " pod="kube-system/coredns-7db6d8ff4d-5w474" May 13 23:51:09.357526 kubelet[2916]: I0513 23:51:09.357475 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/f25819bc-7144-44b1-bf9b-16a1da05bd16-calico-apiserver-certs\") pod \"calico-apiserver-59bf95bdb4-rmmk8\" (UID: \"f25819bc-7144-44b1-bf9b-16a1da05bd16\") " pod="calico-apiserver/calico-apiserver-59bf95bdb4-rmmk8" May 13 23:51:09.357526 kubelet[2916]: I0513 23:51:09.357504 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f55fq\" (UniqueName: \"kubernetes.io/projected/320deada-65e9-4ed1-bb7a-4b617ad6a953-kube-api-access-f55fq\") pod \"coredns-7db6d8ff4d-2mlnb\" (UID: \"320deada-65e9-4ed1-bb7a-4b617ad6a953\") " pod="kube-system/coredns-7db6d8ff4d-2mlnb" May 13 23:51:09.358152 kubelet[2916]: I0513 23:51:09.357532 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-826h9\" (UniqueName: \"kubernetes.io/projected/67f78161-b863-4d99-b7fa-721f3c1dd240-kube-api-access-826h9\") pod \"calico-kube-controllers-c4dbd9494-hmzc4\" (UID: \"67f78161-b863-4d99-b7fa-721f3c1dd240\") " pod="calico-system/calico-kube-controllers-c4dbd9494-hmzc4" May 13 23:51:09.358152 kubelet[2916]: I0513 23:51:09.357562 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/67f78161-b863-4d99-b7fa-721f3c1dd240-tigera-ca-bundle\") pod \"calico-kube-controllers-c4dbd9494-hmzc4\" (UID: \"67f78161-b863-4d99-b7fa-721f3c1dd240\") " pod="calico-system/calico-kube-controllers-c4dbd9494-hmzc4" May 13 23:51:09.358152 kubelet[2916]: I0513 23:51:09.357579 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc2gw\" (UniqueName: \"kubernetes.io/projected/f25819bc-7144-44b1-bf9b-16a1da05bd16-kube-api-access-sc2gw\") pod \"calico-apiserver-59bf95bdb4-rmmk8\" (UID: \"f25819bc-7144-44b1-bf9b-16a1da05bd16\") " pod="calico-apiserver/calico-apiserver-59bf95bdb4-rmmk8" May 13 23:51:09.358152 kubelet[2916]: I0513 23:51:09.357597 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/24eddc54-32b6-4824-b8f8-0cb6ae8aff36-calico-apiserver-certs\") pod \"calico-apiserver-59bf95bdb4-hsqx4\" (UID: \"24eddc54-32b6-4824-b8f8-0cb6ae8aff36\") " pod="calico-apiserver/calico-apiserver-59bf95bdb4-hsqx4" May 13 23:51:09.358152 kubelet[2916]: I0513 23:51:09.357614 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/af984b4c-8125-4744-82af-9465e063a376-config-volume\") pod \"coredns-7db6d8ff4d-5w474\" (UID: \"af984b4c-8125-4744-82af-9465e063a376\") " pod="kube-system/coredns-7db6d8ff4d-5w474" May 13 23:51:09.358305 kubelet[2916]: I0513 23:51:09.357637 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-mt7r6\" (UniqueName: \"kubernetes.io/projected/24eddc54-32b6-4824-b8f8-0cb6ae8aff36-kube-api-access-mt7r6\") pod \"calico-apiserver-59bf95bdb4-hsqx4\" (UID: \"24eddc54-32b6-4824-b8f8-0cb6ae8aff36\") " pod="calico-apiserver/calico-apiserver-59bf95bdb4-hsqx4" May 13 23:51:09.358305 kubelet[2916]: I0513 23:51:09.357651 2916 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/320deada-65e9-4ed1-bb7a-4b617ad6a953-config-volume\") pod \"coredns-7db6d8ff4d-2mlnb\" (UID: \"320deada-65e9-4ed1-bb7a-4b617ad6a953\") " pod="kube-system/coredns-7db6d8ff4d-2mlnb" May 13 23:51:09.370739 systemd[1]: Created slice kubepods-besteffort-pod67f78161_b863_4d99_b7fa_721f3c1dd240.slice - libcontainer container kubepods-besteffort-pod67f78161_b863_4d99_b7fa_721f3c1dd240.slice. May 13 23:51:09.382551 systemd[1]: Created slice kubepods-besteffort-podf25819bc_7144_44b1_bf9b_16a1da05bd16.slice - libcontainer container kubepods-besteffort-podf25819bc_7144_44b1_bf9b_16a1da05bd16.slice. May 13 23:51:09.392442 systemd[1]: Created slice kubepods-burstable-podaf984b4c_8125_4744_82af_9465e063a376.slice - libcontainer container kubepods-burstable-podaf984b4c_8125_4744_82af_9465e063a376.slice. May 13 23:51:09.634863 containerd[1515]: time="2025-05-13T23:51:09.634163413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-2mlnb,Uid:320deada-65e9-4ed1-bb7a-4b617ad6a953,Namespace:kube-system,Attempt:0,}" May 13 23:51:09.668820 containerd[1515]: time="2025-05-13T23:51:09.668491598Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59bf95bdb4-hsqx4,Uid:24eddc54-32b6-4824-b8f8-0cb6ae8aff36,Namespace:calico-apiserver,Attempt:0,}" May 13 23:51:09.679956 containerd[1515]: time="2025-05-13T23:51:09.679898122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c4dbd9494-hmzc4,Uid:67f78161-b863-4d99-b7fa-721f3c1dd240,Namespace:calico-system,Attempt:0,}" May 13 23:51:09.691829 containerd[1515]: time="2025-05-13T23:51:09.691021535Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59bf95bdb4-rmmk8,Uid:f25819bc-7144-44b1-bf9b-16a1da05bd16,Namespace:calico-apiserver,Attempt:0,}" May 13 23:51:09.701730 containerd[1515]: time="2025-05-13T23:51:09.701690536Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-5w474,Uid:af984b4c-8125-4744-82af-9465e063a376,Namespace:kube-system,Attempt:0,}" May 13 23:51:09.826873 systemd[1]: Created slice kubepods-besteffort-pod61909eec_3ecf_4969_befe_a7397a9a89d4.slice - libcontainer container kubepods-besteffort-pod61909eec_3ecf_4969_befe_a7397a9a89d4.slice. 
May 13 23:51:09.831439 containerd[1515]: time="2025-05-13T23:51:09.831366938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dxllc,Uid:61909eec-3ecf-4969-befe-a7397a9a89d4,Namespace:calico-system,Attempt:0,}" May 13 23:51:09.836108 containerd[1515]: time="2025-05-13T23:51:09.836062907Z" level=error msg="Failed to destroy network for sandbox \"403ad7b5f03049f59e9e88e560132fd4989e1cafb1b716c221c88795b670dec3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.838419 containerd[1515]: time="2025-05-13T23:51:09.838245672Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-2mlnb,Uid:320deada-65e9-4ed1-bb7a-4b617ad6a953,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"403ad7b5f03049f59e9e88e560132fd4989e1cafb1b716c221c88795b670dec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.840148 kubelet[2916]: E0513 23:51:09.838937 2916 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"403ad7b5f03049f59e9e88e560132fd4989e1cafb1b716c221c88795b670dec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.840148 kubelet[2916]: E0513 23:51:09.839016 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"403ad7b5f03049f59e9e88e560132fd4989e1cafb1b716c221c88795b670dec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-2mlnb" May 13 23:51:09.840148 kubelet[2916]: E0513 23:51:09.839036 2916 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"403ad7b5f03049f59e9e88e560132fd4989e1cafb1b716c221c88795b670dec3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-2mlnb" May 13 23:51:09.840366 kubelet[2916]: E0513 23:51:09.839080 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-2mlnb_kube-system(320deada-65e9-4ed1-bb7a-4b617ad6a953)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-2mlnb_kube-system(320deada-65e9-4ed1-bb7a-4b617ad6a953)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"403ad7b5f03049f59e9e88e560132fd4989e1cafb1b716c221c88795b670dec3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-2mlnb" podUID="320deada-65e9-4ed1-bb7a-4b617ad6a953" May 13 23:51:09.854163 containerd[1515]: time="2025-05-13T23:51:09.854078495Z" level=error msg="Failed to destroy 
network for sandbox \"8d639113938cd667b8151081c84aeeb138a7931ff74c037d1730b81daf00fbe9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.859423 containerd[1515]: time="2025-05-13T23:51:09.859292642Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59bf95bdb4-hsqx4,Uid:24eddc54-32b6-4824-b8f8-0cb6ae8aff36,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d639113938cd667b8151081c84aeeb138a7931ff74c037d1730b81daf00fbe9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.861538 kubelet[2916]: E0513 23:51:09.859671 2916 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d639113938cd667b8151081c84aeeb138a7931ff74c037d1730b81daf00fbe9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.861538 kubelet[2916]: E0513 23:51:09.859730 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d639113938cd667b8151081c84aeeb138a7931ff74c037d1730b81daf00fbe9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59bf95bdb4-hsqx4" May 13 23:51:09.861538 kubelet[2916]: E0513 23:51:09.859750 2916 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8d639113938cd667b8151081c84aeeb138a7931ff74c037d1730b81daf00fbe9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59bf95bdb4-hsqx4" May 13 23:51:09.861978 kubelet[2916]: E0513 23:51:09.859809 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59bf95bdb4-hsqx4_calico-apiserver(24eddc54-32b6-4824-b8f8-0cb6ae8aff36)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59bf95bdb4-hsqx4_calico-apiserver(24eddc54-32b6-4824-b8f8-0cb6ae8aff36)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8d639113938cd667b8151081c84aeeb138a7931ff74c037d1730b81daf00fbe9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59bf95bdb4-hsqx4" podUID="24eddc54-32b6-4824-b8f8-0cb6ae8aff36" May 13 23:51:09.881678 containerd[1515]: time="2025-05-13T23:51:09.881616316Z" level=error msg="Failed to destroy network for sandbox \"f90f5ef5c04fd58c6dd530721644f6cec8e226da596a14f97ce2f9908c4638b5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 13 23:51:09.882593 kubelet[2916]: I0513 23:51:09.882545 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:09.890296 containerd[1515]: time="2025-05-13T23:51:09.889686265Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c4dbd9494-hmzc4,Uid:67f78161-b863-4d99-b7fa-721f3c1dd240,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f90f5ef5c04fd58c6dd530721644f6cec8e226da596a14f97ce2f9908c4638b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.891700 kubelet[2916]: E0513 23:51:09.891471 2916 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f90f5ef5c04fd58c6dd530721644f6cec8e226da596a14f97ce2f9908c4638b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.892656 kubelet[2916]: E0513 23:51:09.892607 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f90f5ef5c04fd58c6dd530721644f6cec8e226da596a14f97ce2f9908c4638b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-c4dbd9494-hmzc4" May 13 23:51:09.892656 kubelet[2916]: E0513 23:51:09.892648 2916 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f90f5ef5c04fd58c6dd530721644f6cec8e226da596a14f97ce2f9908c4638b5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-c4dbd9494-hmzc4" May 13 23:51:09.892926 kubelet[2916]: E0513 23:51:09.892688 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-c4dbd9494-hmzc4_calico-system(67f78161-b863-4d99-b7fa-721f3c1dd240)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-c4dbd9494-hmzc4_calico-system(67f78161-b863-4d99-b7fa-721f3c1dd240)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f90f5ef5c04fd58c6dd530721644f6cec8e226da596a14f97ce2f9908c4638b5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-c4dbd9494-hmzc4" podUID="67f78161-b863-4d99-b7fa-721f3c1dd240" May 13 23:51:09.897830 containerd[1515]: time="2025-05-13T23:51:09.897588114Z" level=error msg="Failed to destroy network for sandbox \"1b6281e4ed72fcdb051d3dbdd9f621f29c839f810392dac17b3073729a1d272d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.900901 containerd[1515]: time="2025-05-13T23:51:09.900609935Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-5w474,Uid:af984b4c-8125-4744-82af-9465e063a376,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b6281e4ed72fcdb051d3dbdd9f621f29c839f810392dac17b3073729a1d272d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.901560 kubelet[2916]: E0513 23:51:09.901530 2916 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b6281e4ed72fcdb051d3dbdd9f621f29c839f810392dac17b3073729a1d272d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.901560 kubelet[2916]: E0513 23:51:09.901615 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b6281e4ed72fcdb051d3dbdd9f621f29c839f810392dac17b3073729a1d272d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-5w474" May 13 23:51:09.901560 kubelet[2916]: E0513 23:51:09.901639 2916 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1b6281e4ed72fcdb051d3dbdd9f621f29c839f810392dac17b3073729a1d272d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-5w474" May 13 23:51:09.902921 kubelet[2916]: E0513 23:51:09.901867 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-5w474_kube-system(af984b4c-8125-4744-82af-9465e063a376)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-5w474_kube-system(af984b4c-8125-4744-82af-9465e063a376)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1b6281e4ed72fcdb051d3dbdd9f621f29c839f810392dac17b3073729a1d272d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-5w474" podUID="af984b4c-8125-4744-82af-9465e063a376" May 13 23:51:09.911975 containerd[1515]: time="2025-05-13T23:51:09.911849840Z" level=error msg="Failed to destroy network for sandbox \"b81ce272063dd619d2060467590c20413f64a723f43d9418a174173fe43d4cb5\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.914732 containerd[1515]: time="2025-05-13T23:51:09.914550544Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59bf95bdb4-rmmk8,Uid:f25819bc-7144-44b1-bf9b-16a1da05bd16,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b81ce272063dd619d2060467590c20413f64a723f43d9418a174173fe43d4cb5\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.915313 kubelet[2916]: E0513 23:51:09.915042 2916 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b81ce272063dd619d2060467590c20413f64a723f43d9418a174173fe43d4cb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.915313 kubelet[2916]: E0513 23:51:09.915107 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b81ce272063dd619d2060467590c20413f64a723f43d9418a174173fe43d4cb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59bf95bdb4-rmmk8" May 13 23:51:09.915313 kubelet[2916]: E0513 23:51:09.915134 2916 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b81ce272063dd619d2060467590c20413f64a723f43d9418a174173fe43d4cb5\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-59bf95bdb4-rmmk8" May 13 23:51:09.916568 kubelet[2916]: E0513 23:51:09.915173 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-59bf95bdb4-rmmk8_calico-apiserver(f25819bc-7144-44b1-bf9b-16a1da05bd16)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-59bf95bdb4-rmmk8_calico-apiserver(f25819bc-7144-44b1-bf9b-16a1da05bd16)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b81ce272063dd619d2060467590c20413f64a723f43d9418a174173fe43d4cb5\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-59bf95bdb4-rmmk8" podUID="f25819bc-7144-44b1-bf9b-16a1da05bd16" May 13 23:51:09.951168 containerd[1515]: time="2025-05-13T23:51:09.949768750Z" level=error msg="Failed to destroy network for sandbox \"49c10075e7faeecfb96511ae5505be8c084b18b11d9990b07497459ea5165961\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.954746 containerd[1515]: time="2025-05-13T23:51:09.954608535Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dxllc,Uid:61909eec-3ecf-4969-befe-a7397a9a89d4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"49c10075e7faeecfb96511ae5505be8c084b18b11d9990b07497459ea5165961\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.955507 kubelet[2916]: E0513 23:51:09.955208 2916 remote_runtime.go:193] "RunPodSandbox from runtime service failed" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49c10075e7faeecfb96511ae5505be8c084b18b11d9990b07497459ea5165961\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:09.955507 kubelet[2916]: E0513 23:51:09.955271 2916 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49c10075e7faeecfb96511ae5505be8c084b18b11d9990b07497459ea5165961\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dxllc" May 13 23:51:09.955507 kubelet[2916]: E0513 23:51:09.955299 2916 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49c10075e7faeecfb96511ae5505be8c084b18b11d9990b07497459ea5165961\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-dxllc" May 13 23:51:09.955681 kubelet[2916]: E0513 23:51:09.955466 2916 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-dxllc_calico-system(61909eec-3ecf-4969-befe-a7397a9a89d4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-dxllc_calico-system(61909eec-3ecf-4969-befe-a7397a9a89d4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"49c10075e7faeecfb96511ae5505be8c084b18b11d9990b07497459ea5165961\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-dxllc" podUID="61909eec-3ecf-4969-befe-a7397a9a89d4" May 13 23:51:09.972131 containerd[1515]: time="2025-05-13T23:51:09.971882680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 23:51:10.515425 systemd[1]: run-netns-cni\x2d44cff0ab\x2d7402\x2d4a7c\x2d564e\x2d2e35cd243c63.mount: Deactivated successfully. May 13 23:51:10.515606 systemd[1]: run-netns-cni\x2da8f1cc24\x2d4d7e\x2d4afa\x2d3bc2\x2db72996265d20.mount: Deactivated successfully. May 13 23:51:10.515690 systemd[1]: run-netns-cni\x2d8cca4307\x2d4694\x2d5592\x2dd59e\x2d3a971979d57a.mount: Deactivated successfully. May 13 23:51:10.515768 systemd[1]: run-netns-cni\x2dd8ca0d5e\x2d0625\x2d4c96\x2d023e\x2dd9c80a91d5b3.mount: Deactivated successfully. May 13 23:51:15.286163 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3871972112.mount: Deactivated successfully. 
May 13 23:51:15.323652 containerd[1515]: time="2025-05-13T23:51:15.323569670Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:15.325190 containerd[1515]: time="2025-05-13T23:51:15.325099667Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=138981893" May 13 23:51:15.326257 containerd[1515]: time="2025-05-13T23:51:15.326194739Z" level=info msg="ImageCreate event name:\"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:15.328607 containerd[1515]: time="2025-05-13T23:51:15.328550820Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:15.329890 containerd[1515]: time="2025-05-13T23:51:15.329795948Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"138981755\" in 5.357869703s" May 13 23:51:15.329890 containerd[1515]: time="2025-05-13T23:51:15.329869676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\"" May 13 23:51:15.351682 containerd[1515]: time="2025-05-13T23:51:15.351625107Z" level=info msg="CreateContainer within sandbox \"946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 23:51:15.366594 containerd[1515]: time="2025-05-13T23:51:15.364844103Z" level=info msg="Container 6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:15.377946 containerd[1515]: time="2025-05-13T23:51:15.377896762Z" level=info msg="CreateContainer within sandbox \"946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\"" May 13 23:51:15.379200 containerd[1515]: time="2025-05-13T23:51:15.379152130Z" level=info msg="StartContainer for \"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\"" May 13 23:51:15.381074 containerd[1515]: time="2025-05-13T23:51:15.381034323Z" level=info msg="connecting to shim 6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632" address="unix:///run/containerd/s/60510d1865c6e20831be72dc660c8f89140861b3678f5203df25ae308b6b4ab7" protocol=ttrpc version=3 May 13 23:51:15.401584 systemd[1]: Started cri-containerd-6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632.scope - libcontainer container 6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632. May 13 23:51:15.461125 containerd[1515]: time="2025-05-13T23:51:15.460726217Z" level=info msg="StartContainer for \"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" returns successfully" May 13 23:51:15.577071 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 13 23:51:15.577600 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. May 13 23:51:16.012085 kubelet[2916]: I0513 23:51:16.012011 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-9d6d5" podStartSLOduration=1.458856572 podStartE2EDuration="16.011993461s" podCreationTimestamp="2025-05-13 23:51:00 +0000 UTC" firstStartedPulling="2025-05-13 23:51:00.777589635 +0000 UTC m=+22.104906697" lastFinishedPulling="2025-05-13 23:51:15.330726524 +0000 UTC m=+36.658043586" observedRunningTime="2025-05-13 23:51:16.008886507 +0000 UTC m=+37.336203569" watchObservedRunningTime="2025-05-13 23:51:16.011993461 +0000 UTC m=+37.339310523" May 13 23:51:16.989637 kubelet[2916]: I0513 23:51:16.989602 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:17.406448 kernel: bpftool[4031]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 13 23:51:17.613331 systemd-networkd[1403]: vxlan.calico: Link UP May 13 23:51:17.613337 systemd-networkd[1403]: vxlan.calico: Gained carrier May 13 23:51:19.356189 systemd-networkd[1403]: vxlan.calico: Gained IPv6LL May 13 23:51:20.818829 containerd[1515]: time="2025-05-13T23:51:20.818662573Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59bf95bdb4-hsqx4,Uid:24eddc54-32b6-4824-b8f8-0cb6ae8aff36,Namespace:calico-apiserver,Attempt:0,}" May 13 23:51:20.820438 containerd[1515]: time="2025-05-13T23:51:20.819658348Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dxllc,Uid:61909eec-3ecf-4969-befe-a7397a9a89d4,Namespace:calico-system,Attempt:0,}" May 13 23:51:21.086811 systemd-networkd[1403]: cali8a82efc8fad: Link UP May 13 23:51:21.089127 systemd-networkd[1403]: cali8a82efc8fad: Gained carrier May 13 23:51:21.118825 containerd[1515]: 2025-05-13 23:51:20.932 [INFO][4114] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0 calico-apiserver-59bf95bdb4- calico-apiserver 24eddc54-32b6-4824-b8f8-0cb6ae8aff36 685 0 2025-05-13 23:50:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59bf95bdb4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-cba8e36126 calico-apiserver-59bf95bdb4-hsqx4 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8a82efc8fad [] []}} ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-hsqx4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-" May 13 23:51:21.118825 containerd[1515]: 2025-05-13 23:51:20.932 [INFO][4114] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-hsqx4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" May 13 23:51:21.118825 containerd[1515]: 2025-05-13 23:51:20.993 [INFO][4128] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" HandleID="k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" 
Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.021 [INFO][4128] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" HandleID="k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003bacb0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-cba8e36126", "pod":"calico-apiserver-59bf95bdb4-hsqx4", "timestamp":"2025-05-13 23:51:20.993881195 +0000 UTC"}, Hostname:"ci-4284-0-0-n-cba8e36126", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.022 [INFO][4128] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.022 [INFO][4128] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.022 [INFO][4128] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-cba8e36126' May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.026 [INFO][4128] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.037 [INFO][4128] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.044 [INFO][4128] ipam/ipam.go 489: Trying affinity for 192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.048 [INFO][4128] ipam/ipam.go 155: Attempting to load block cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.119878 containerd[1515]: 2025-05-13 23:51:21.052 [INFO][4128] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.120087 containerd[1515]: 2025-05-13 23:51:21.052 [INFO][4128] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.120087 containerd[1515]: 2025-05-13 23:51:21.055 [INFO][4128] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9 May 13 23:51:21.120087 containerd[1515]: 2025-05-13 23:51:21.062 [INFO][4128] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.120087 containerd[1515]: 2025-05-13 23:51:21.072 [INFO][4128] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.25.65/26] block=192.168.25.64/26 handle="k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.120087 containerd[1515]: 2025-05-13 23:51:21.072 [INFO][4128] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: 
[192.168.25.65/26] handle="k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.120087 containerd[1515]: 2025-05-13 23:51:21.072 [INFO][4128] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:51:21.120087 containerd[1515]: 2025-05-13 23:51:21.072 [INFO][4128] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.25.65/26] IPv6=[] ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" HandleID="k8s-pod-network.82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" May 13 23:51:21.120219 containerd[1515]: 2025-05-13 23:51:21.076 [INFO][4114] cni-plugin/k8s.go 386: Populated endpoint ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-hsqx4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0", GenerateName:"calico-apiserver-59bf95bdb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"24eddc54-32b6-4824-b8f8-0cb6ae8aff36", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 50, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59bf95bdb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"", Pod:"calico-apiserver-59bf95bdb4-hsqx4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8a82efc8fad", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:21.120270 containerd[1515]: 2025-05-13 23:51:21.077 [INFO][4114] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.25.65/32] ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-hsqx4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" May 13 23:51:21.120270 containerd[1515]: 2025-05-13 23:51:21.077 [INFO][4114] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8a82efc8fad ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-hsqx4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" May 13 23:51:21.120270 containerd[1515]: 2025-05-13 23:51:21.080 [INFO][4114] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-hsqx4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" May 13 23:51:21.120331 containerd[1515]: 2025-05-13 23:51:21.080 [INFO][4114] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-hsqx4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0", GenerateName:"calico-apiserver-59bf95bdb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"24eddc54-32b6-4824-b8f8-0cb6ae8aff36", ResourceVersion:"685", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 50, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59bf95bdb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9", Pod:"calico-apiserver-59bf95bdb4-hsqx4", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8a82efc8fad", MAC:"d6:a8:78:05:8d:03", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:21.124469 containerd[1515]: 2025-05-13 23:51:21.111 [INFO][4114] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-hsqx4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--hsqx4-eth0" May 13 23:51:21.171956 systemd-networkd[1403]: calib2d26313190: Link UP May 13 23:51:21.172870 systemd-networkd[1403]: calib2d26313190: Gained carrier May 13 23:51:21.202142 containerd[1515]: 2025-05-13 23:51:20.932 [INFO][4106] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0 csi-node-driver- calico-system 61909eec-3ecf-4969-befe-a7397a9a89d4 607 0 2025-05-13 23:51:00 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:55b7b4b9d k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-n-cba8e36126 csi-node-driver-dxllc eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib2d26313190 [] []}} 
ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Namespace="calico-system" Pod="csi-node-driver-dxllc" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-" May 13 23:51:21.202142 containerd[1515]: 2025-05-13 23:51:20.932 [INFO][4106] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Namespace="calico-system" Pod="csi-node-driver-dxllc" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" May 13 23:51:21.202142 containerd[1515]: 2025-05-13 23:51:20.993 [INFO][4133] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" HandleID="k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Workload="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.028 [INFO][4133] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" HandleID="k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Workload="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003310d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-cba8e36126", "pod":"csi-node-driver-dxllc", "timestamp":"2025-05-13 23:51:20.993585526 +0000 UTC"}, Hostname:"ci-4284-0-0-n-cba8e36126", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.028 [INFO][4133] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.072 [INFO][4133] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.072 [INFO][4133] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-cba8e36126' May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.080 [INFO][4133] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.101 [INFO][4133] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.118 [INFO][4133] ipam/ipam.go 489: Trying affinity for 192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.126 [INFO][4133] ipam/ipam.go 155: Attempting to load block cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202706 containerd[1515]: 2025-05-13 23:51:21.132 [INFO][4133] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202926 containerd[1515]: 2025-05-13 23:51:21.132 [INFO][4133] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202926 containerd[1515]: 2025-05-13 23:51:21.137 [INFO][4133] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e May 13 23:51:21.202926 containerd[1515]: 2025-05-13 23:51:21.147 [INFO][4133] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202926 containerd[1515]: 2025-05-13 23:51:21.159 [INFO][4133] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.25.66/26] block=192.168.25.64/26 handle="k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202926 containerd[1515]: 2025-05-13 23:51:21.159 [INFO][4133] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.25.66/26] handle="k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:21.202926 containerd[1515]: 2025-05-13 23:51:21.159 [INFO][4133] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
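Both sandboxes are addressed out of the same affine block, 192.168.25.64/26, which the IPAM entries show is pinned to this host ("Trying affinity ... Affinity is confirmed"). A /26 holds 64 addresses (192.168.25.64-192.168.25.127), and handing them out from the first usable address gives exactly the .65 and .66 just claimed, plus the .67-.69 claimed further down. A quick check of that arithmetic with the standard library (Calico's allocator also tracks handles and block affinity, which this ignores):

    # Address arithmetic for the affine block in the IPAM entries above.
    import ipaddress

    block = ipaddress.ip_network("192.168.25.64/26")
    print(block.num_addresses, block.network_address, block.broadcast_address)
    # 64 192.168.25.64 192.168.25.127

    # First usable addresses, in the order a sequential allocator would hand them out.
    print([str(ip) for ip in list(block.hosts())[:5]])
    # ['192.168.25.65', '192.168.25.66', '192.168.25.67', '192.168.25.68', '192.168.25.69']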
May 13 23:51:21.202926 containerd[1515]: 2025-05-13 23:51:21.159 [INFO][4133] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.25.66/26] IPv6=[] ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" HandleID="k8s-pod-network.e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Workload="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" May 13 23:51:21.203073 containerd[1515]: 2025-05-13 23:51:21.164 [INFO][4106] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Namespace="calico-system" Pod="csi-node-driver-dxllc" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"61909eec-3ecf-4969-befe-a7397a9a89d4", ResourceVersion:"607", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"", Pod:"csi-node-driver-dxllc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.25.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib2d26313190", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:21.203123 containerd[1515]: 2025-05-13 23:51:21.164 [INFO][4106] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.25.66/32] ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Namespace="calico-system" Pod="csi-node-driver-dxllc" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" May 13 23:51:21.203123 containerd[1515]: 2025-05-13 23:51:21.165 [INFO][4106] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib2d26313190 ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Namespace="calico-system" Pod="csi-node-driver-dxllc" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" May 13 23:51:21.203123 containerd[1515]: 2025-05-13 23:51:21.171 [INFO][4106] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Namespace="calico-system" Pod="csi-node-driver-dxllc" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" May 13 23:51:21.203182 containerd[1515]: 2025-05-13 23:51:21.171 [INFO][4106] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Namespace="calico-system" Pod="csi-node-driver-dxllc" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"61909eec-3ecf-4969-befe-a7397a9a89d4", ResourceVersion:"607", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"55b7b4b9d", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e", Pod:"csi-node-driver-dxllc", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.25.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib2d26313190", MAC:"ba:f0:0d:dc:dd:41", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:21.203230 containerd[1515]: 2025-05-13 23:51:21.198 [INFO][4106] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" Namespace="calico-system" Pod="csi-node-driver-dxllc" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-csi--node--driver--dxllc-eth0" May 13 23:51:21.231516 containerd[1515]: time="2025-05-13T23:51:21.230889507Z" level=info msg="connecting to shim 82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9" address="unix:///run/containerd/s/d634909037434b5489765c383661552552981e2906d9d243251441c2c4eec2ac" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:21.290329 containerd[1515]: time="2025-05-13T23:51:21.290144573Z" level=info msg="connecting to shim e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e" address="unix:///run/containerd/s/d33fd5f6ba3f14755a008308c61ac2d7c03a7515943371dc956357468e8491fc" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:21.291611 systemd[1]: Started cri-containerd-82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9.scope - libcontainer container 82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9. May 13 23:51:21.326766 systemd[1]: Started cri-containerd-e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e.scope - libcontainer container e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e. 
May 13 23:51:21.366317 containerd[1515]: time="2025-05-13T23:51:21.366147218Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59bf95bdb4-hsqx4,Uid:24eddc54-32b6-4824-b8f8-0cb6ae8aff36,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9\"" May 13 23:51:21.371604 containerd[1515]: time="2025-05-13T23:51:21.371307465Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 23:51:21.392234 containerd[1515]: time="2025-05-13T23:51:21.392085423Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-dxllc,Uid:61909eec-3ecf-4969-befe-a7397a9a89d4,Namespace:calico-system,Attempt:0,} returns sandbox id \"e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e\"" May 13 23:51:21.818218 containerd[1515]: time="2025-05-13T23:51:21.817886644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-5w474,Uid:af984b4c-8125-4744-82af-9465e063a376,Namespace:kube-system,Attempt:0,}" May 13 23:51:21.818825 containerd[1515]: time="2025-05-13T23:51:21.818464339Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-2mlnb,Uid:320deada-65e9-4ed1-bb7a-4b617ad6a953,Namespace:kube-system,Attempt:0,}" May 13 23:51:22.038988 systemd-networkd[1403]: cali1455a268d83: Link UP May 13 23:51:22.040327 systemd-networkd[1403]: cali1455a268d83: Gained carrier May 13 23:51:22.069575 containerd[1515]: 2025-05-13 23:51:21.892 [INFO][4269] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0 coredns-7db6d8ff4d- kube-system 320deada-65e9-4ed1-bb7a-4b617ad6a953 679 0 2025-05-13 23:50:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-cba8e36126 coredns-7db6d8ff4d-2mlnb eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1455a268d83 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2mlnb" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-" May 13 23:51:22.069575 containerd[1515]: 2025-05-13 23:51:21.892 [INFO][4269] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2mlnb" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" May 13 23:51:22.069575 containerd[1515]: 2025-05-13 23:51:21.947 [INFO][4291] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" HandleID="k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Workload="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:21.971 [INFO][4291] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" HandleID="k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Workload="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, 
Num6:0, HandleID:(*string)(0x4000319ab0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-cba8e36126", "pod":"coredns-7db6d8ff4d-2mlnb", "timestamp":"2025-05-13 23:51:21.947263561 +0000 UTC"}, Hostname:"ci-4284-0-0-n-cba8e36126", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:21.971 [INFO][4291] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:21.971 [INFO][4291] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:21.971 [INFO][4291] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-cba8e36126' May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:21.978 [INFO][4291] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:21.987 [INFO][4291] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:21.995 [INFO][4291] ipam/ipam.go 489: Trying affinity for 192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:21.999 [INFO][4291] ipam/ipam.go 155: Attempting to load block cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070177 containerd[1515]: 2025-05-13 23:51:22.003 [INFO][4291] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070376 containerd[1515]: 2025-05-13 23:51:22.003 [INFO][4291] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070376 containerd[1515]: 2025-05-13 23:51:22.006 [INFO][4291] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c May 13 23:51:22.070376 containerd[1515]: 2025-05-13 23:51:22.011 [INFO][4291] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070376 containerd[1515]: 2025-05-13 23:51:22.022 [INFO][4291] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.25.67/26] block=192.168.25.64/26 handle="k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070376 containerd[1515]: 2025-05-13 23:51:22.022 [INFO][4291] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.25.67/26] handle="k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.070376 containerd[1515]: 2025-05-13 23:51:22.022 [INFO][4291] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:51:22.070376 containerd[1515]: 2025-05-13 23:51:22.022 [INFO][4291] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.25.67/26] IPv6=[] ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" HandleID="k8s-pod-network.316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Workload="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" May 13 23:51:22.070606 containerd[1515]: 2025-05-13 23:51:22.027 [INFO][4269] cni-plugin/k8s.go 386: Populated endpoint ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2mlnb" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"320deada-65e9-4ed1-bb7a-4b617ad6a953", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 50, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"", Pod:"coredns-7db6d8ff4d-2mlnb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1455a268d83", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:22.070606 containerd[1515]: 2025-05-13 23:51:22.028 [INFO][4269] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.25.67/32] ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2mlnb" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" May 13 23:51:22.070606 containerd[1515]: 2025-05-13 23:51:22.029 [INFO][4269] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1455a268d83 ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2mlnb" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" May 13 23:51:22.070606 containerd[1515]: 2025-05-13 23:51:22.039 [INFO][4269] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2mlnb" 
WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" May 13 23:51:22.070606 containerd[1515]: 2025-05-13 23:51:22.040 [INFO][4269] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2mlnb" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"320deada-65e9-4ed1-bb7a-4b617ad6a953", ResourceVersion:"679", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 50, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c", Pod:"coredns-7db6d8ff4d-2mlnb", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1455a268d83", MAC:"16:cc:4c:e9:e2:f9", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:22.070606 containerd[1515]: 2025-05-13 23:51:22.064 [INFO][4269] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" Namespace="kube-system" Pod="coredns-7db6d8ff4d-2mlnb" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--2mlnb-eth0" May 13 23:51:22.110085 systemd-networkd[1403]: cali9d2a263f751: Link UP May 13 23:51:22.111994 systemd-networkd[1403]: cali9d2a263f751: Gained carrier May 13 23:51:22.133129 containerd[1515]: time="2025-05-13T23:51:22.132615952Z" level=info msg="connecting to shim 316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c" address="unix:///run/containerd/s/adf7fe1deeb5bb827017a9554e1df7e11125aacf3d8aecb2f283578346e02ff6" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:21.908 [INFO][4265] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0 coredns-7db6d8ff4d- kube-system af984b4c-8125-4744-82af-9465e063a376 687 0 2025-05-13 23:50:53 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d 
projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-cba8e36126 coredns-7db6d8ff4d-5w474 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9d2a263f751 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5w474" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:21.909 [INFO][4265] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5w474" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:21.961 [INFO][4296] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" HandleID="k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Workload="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:21.985 [INFO][4296] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" HandleID="k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Workload="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400031b700), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-cba8e36126", "pod":"coredns-7db6d8ff4d-5w474", "timestamp":"2025-05-13 23:51:21.961207955 +0000 UTC"}, Hostname:"ci-4284-0-0-n-cba8e36126", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:21.986 [INFO][4296] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.023 [INFO][4296] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.023 [INFO][4296] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-cba8e36126' May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.030 [INFO][4296] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.048 [INFO][4296] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.059 [INFO][4296] ipam/ipam.go 489: Trying affinity for 192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.063 [INFO][4296] ipam/ipam.go 155: Attempting to load block cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.072 [INFO][4296] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.072 [INFO][4296] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.077 [INFO][4296] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2 May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.085 [INFO][4296] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.096 [INFO][4296] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.25.68/26] block=192.168.25.64/26 handle="k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.096 [INFO][4296] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.25.68/26] handle="k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.096 [INFO][4296] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:51:22.158372 containerd[1515]: 2025-05-13 23:51:22.096 [INFO][4296] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.25.68/26] IPv6=[] ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" HandleID="k8s-pod-network.c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Workload="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" May 13 23:51:22.159643 containerd[1515]: 2025-05-13 23:51:22.103 [INFO][4265] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5w474" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"af984b4c-8125-4744-82af-9465e063a376", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 50, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"", Pod:"coredns-7db6d8ff4d-5w474", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d2a263f751", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:22.159643 containerd[1515]: 2025-05-13 23:51:22.104 [INFO][4265] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.25.68/32] ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5w474" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" May 13 23:51:22.159643 containerd[1515]: 2025-05-13 23:51:22.104 [INFO][4265] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9d2a263f751 ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5w474" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" May 13 23:51:22.159643 containerd[1515]: 2025-05-13 23:51:22.112 [INFO][4265] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5w474" 
WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" May 13 23:51:22.159643 containerd[1515]: 2025-05-13 23:51:22.113 [INFO][4265] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5w474" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"af984b4c-8125-4744-82af-9465e063a376", ResourceVersion:"687", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 50, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2", Pod:"coredns-7db6d8ff4d-5w474", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.25.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9d2a263f751", MAC:"f6:81:a6:f5:1f:ff", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:22.159643 containerd[1515]: 2025-05-13 23:51:22.144 [INFO][4265] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" Namespace="kube-system" Pod="coredns-7db6d8ff4d-5w474" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-coredns--7db6d8ff4d--5w474-eth0" May 13 23:51:22.203627 containerd[1515]: time="2025-05-13T23:51:22.203511468Z" level=info msg="connecting to shim c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2" address="unix:///run/containerd/s/9d0fffc0215162c2d166426e53e84c8165c88947102ac453f01b5ad6aa73b340" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:22.206601 systemd[1]: Started cri-containerd-316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c.scope - libcontainer container 316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c. May 13 23:51:22.235650 systemd[1]: Started cri-containerd-c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2.scope - libcontainer container c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2. 
May 13 23:51:22.303250 containerd[1515]: time="2025-05-13T23:51:22.303003204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-2mlnb,Uid:320deada-65e9-4ed1-bb7a-4b617ad6a953,Namespace:kube-system,Attempt:0,} returns sandbox id \"316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c\"" May 13 23:51:22.308061 containerd[1515]: time="2025-05-13T23:51:22.307239158Z" level=info msg="CreateContainer within sandbox \"316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:51:22.329682 containerd[1515]: time="2025-05-13T23:51:22.329536313Z" level=info msg="Container 768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:22.339725 containerd[1515]: time="2025-05-13T23:51:22.339647853Z" level=info msg="CreateContainer within sandbox \"316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc\"" May 13 23:51:22.340619 containerd[1515]: time="2025-05-13T23:51:22.340578420Z" level=info msg="StartContainer for \"768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc\"" May 13 23:51:22.343969 containerd[1515]: time="2025-05-13T23:51:22.343925131Z" level=info msg="connecting to shim 768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc" address="unix:///run/containerd/s/adf7fe1deeb5bb827017a9554e1df7e11125aacf3d8aecb2f283578346e02ff6" protocol=ttrpc version=3 May 13 23:51:22.378624 systemd[1]: Started cri-containerd-768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc.scope - libcontainer container 768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc. 
May 13 23:51:22.387743 containerd[1515]: time="2025-05-13T23:51:22.387690683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-5w474,Uid:af984b4c-8125-4744-82af-9465e063a376,Namespace:kube-system,Attempt:0,} returns sandbox id \"c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2\"" May 13 23:51:22.392660 containerd[1515]: time="2025-05-13T23:51:22.392561936Z" level=info msg="CreateContainer within sandbox \"c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:51:22.415808 containerd[1515]: time="2025-05-13T23:51:22.415738893Z" level=info msg="Container bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:22.430765 containerd[1515]: time="2025-05-13T23:51:22.430683163Z" level=info msg="CreateContainer within sandbox \"c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a\"" May 13 23:51:22.433532 containerd[1515]: time="2025-05-13T23:51:22.433067105Z" level=info msg="StartContainer for \"bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a\"" May 13 23:51:22.437306 containerd[1515]: time="2025-05-13T23:51:22.437262015Z" level=info msg="connecting to shim bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a" address="unix:///run/containerd/s/9d0fffc0215162c2d166426e53e84c8165c88947102ac453f01b5ad6aa73b340" protocol=ttrpc version=3 May 13 23:51:22.441991 containerd[1515]: time="2025-05-13T23:51:22.441877884Z" level=info msg="StartContainer for \"768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc\" returns successfully" May 13 23:51:22.462682 systemd[1]: Started cri-containerd-bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a.scope - libcontainer container bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a. 
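With both coredns containers being created and started at the same time, the containerd entries for them interleave. Grouping lines by the 64-hex-character container ID they mention makes each CreateContainer -> "connecting to shim" -> StartContainer -> "returns successfully" sequence readable on its own; a small helper for reading a log like this one (purely illustrative, not part of containerd or kubelet):

    # Group containerd journal lines by the 64-hex-character container IDs they
    # mention, so each container's create/start lifecycle reads in order.
    import re
    from collections import defaultdict

    CONTAINER_ID = re.compile(r"\b[0-9a-f]{64}\b")

    def group_by_container(lines):
        events = defaultdict(list)
        for line in lines:
            for cid in set(CONTAINER_ID.findall(line)):
                events[cid].append(line)
        return events

    if __name__ == "__main__":
        sample = ['msg="StartContainer for '
                  '\\"768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc\\" '
                  'returns successfully"']
        for cid, evts in group_by_container(sample).items():
            print(cid[:12], len(evts))   # 768d0b2397de 1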
May 13 23:51:22.513445 containerd[1515]: time="2025-05-13T23:51:22.513390698Z" level=info msg="StartContainer for \"bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a\" returns successfully" May 13 23:51:22.876224 systemd-networkd[1403]: cali8a82efc8fad: Gained IPv6LL May 13 23:51:22.876899 systemd-networkd[1403]: calib2d26313190: Gained IPv6LL May 13 23:51:23.080797 kubelet[2916]: I0513 23:51:23.079780 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-2mlnb" podStartSLOduration=30.079762536 podStartE2EDuration="30.079762536s" podCreationTimestamp="2025-05-13 23:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:51:23.079730813 +0000 UTC m=+44.407047875" watchObservedRunningTime="2025-05-13 23:51:23.079762536 +0000 UTC m=+44.407079598" May 13 23:51:23.080797 kubelet[2916]: I0513 23:51:23.080068 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-5w474" podStartSLOduration=30.080062083 podStartE2EDuration="30.080062083s" podCreationTimestamp="2025-05-13 23:50:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:51:23.060924005 +0000 UTC m=+44.388241108" watchObservedRunningTime="2025-05-13 23:51:23.080062083 +0000 UTC m=+44.407379145" May 13 23:51:23.196753 systemd-networkd[1403]: cali9d2a263f751: Gained IPv6LL May 13 23:51:23.708364 systemd-networkd[1403]: cali1455a268d83: Gained IPv6LL May 13 23:51:23.819547 containerd[1515]: time="2025-05-13T23:51:23.819210846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c4dbd9494-hmzc4,Uid:67f78161-b863-4d99-b7fa-721f3c1dd240,Namespace:calico-system,Attempt:0,}" May 13 23:51:23.821445 containerd[1515]: time="2025-05-13T23:51:23.821372525Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59bf95bdb4-rmmk8,Uid:f25819bc-7144-44b1-bf9b-16a1da05bd16,Namespace:calico-apiserver,Attempt:0,}" May 13 23:51:24.175333 systemd-networkd[1403]: cali25856214951: Link UP May 13 23:51:24.176815 systemd-networkd[1403]: cali25856214951: Gained carrier May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:23.923 [INFO][4504] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0 calico-kube-controllers-c4dbd9494- calico-system 67f78161-b863-4d99-b7fa-721f3c1dd240 686 0 2025-05-13 23:51:00 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:c4dbd9494 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-cba8e36126 calico-kube-controllers-c4dbd9494-hmzc4 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali25856214951 [] []}} ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Namespace="calico-system" Pod="calico-kube-controllers-c4dbd9494-hmzc4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:23.923 [INFO][4504] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s 
ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Namespace="calico-system" Pod="calico-kube-controllers-c4dbd9494-hmzc4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:23.999 [INFO][4529] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" HandleID="k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.033 [INFO][4529] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" HandleID="k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003187a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-cba8e36126", "pod":"calico-kube-controllers-c4dbd9494-hmzc4", "timestamp":"2025-05-13 23:51:23.999329548 +0000 UTC"}, Hostname:"ci-4284-0-0-n-cba8e36126", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.033 [INFO][4529] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.033 [INFO][4529] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.033 [INFO][4529] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-cba8e36126' May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.045 [INFO][4529] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.068 [INFO][4529] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.087 [INFO][4529] ipam/ipam.go 489: Trying affinity for 192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.094 [INFO][4529] ipam/ipam.go 155: Attempting to load block cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.109 [INFO][4529] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.109 [INFO][4529] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.114 [INFO][4529] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885 May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.142 [INFO][4529] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.161 [INFO][4529] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.25.69/26] block=192.168.25.64/26 handle="k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.161 [INFO][4529] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.25.69/26] handle="k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.161 [INFO][4529] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:51:24.210254 containerd[1515]: 2025-05-13 23:51:24.161 [INFO][4529] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.25.69/26] IPv6=[] ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" HandleID="k8s-pod-network.4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" May 13 23:51:24.211038 containerd[1515]: 2025-05-13 23:51:24.167 [INFO][4504] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Namespace="calico-system" Pod="calico-kube-controllers-c4dbd9494-hmzc4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0", GenerateName:"calico-kube-controllers-c4dbd9494-", Namespace:"calico-system", SelfLink:"", UID:"67f78161-b863-4d99-b7fa-721f3c1dd240", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c4dbd9494", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"", Pod:"calico-kube-controllers-c4dbd9494-hmzc4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.25.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali25856214951", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:24.211038 containerd[1515]: 2025-05-13 23:51:24.167 [INFO][4504] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.25.69/32] ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Namespace="calico-system" Pod="calico-kube-controllers-c4dbd9494-hmzc4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" May 13 23:51:24.211038 containerd[1515]: 2025-05-13 23:51:24.167 [INFO][4504] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali25856214951 ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Namespace="calico-system" Pod="calico-kube-controllers-c4dbd9494-hmzc4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" May 13 23:51:24.211038 containerd[1515]: 2025-05-13 23:51:24.175 [INFO][4504] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Namespace="calico-system" Pod="calico-kube-controllers-c4dbd9494-hmzc4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" May 13 23:51:24.211038 
containerd[1515]: 2025-05-13 23:51:24.177 [INFO][4504] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Namespace="calico-system" Pod="calico-kube-controllers-c4dbd9494-hmzc4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0", GenerateName:"calico-kube-controllers-c4dbd9494-", Namespace:"calico-system", SelfLink:"", UID:"67f78161-b863-4d99-b7fa-721f3c1dd240", ResourceVersion:"686", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"c4dbd9494", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885", Pod:"calico-kube-controllers-c4dbd9494-hmzc4", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.25.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali25856214951", MAC:"86:66:91:32:4e:85", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:24.211038 containerd[1515]: 2025-05-13 23:51:24.197 [INFO][4504] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" Namespace="calico-system" Pod="calico-kube-controllers-c4dbd9494-hmzc4" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--kube--controllers--c4dbd9494--hmzc4-eth0" May 13 23:51:24.276109 systemd-networkd[1403]: cali096624859ef: Link UP May 13 23:51:24.278277 systemd-networkd[1403]: cali096624859ef: Gained carrier May 13 23:51:24.306075 containerd[1515]: time="2025-05-13T23:51:24.305475391Z" level=info msg="connecting to shim 4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885" address="unix:///run/containerd/s/3d9baa01cffd6ab5824aa0cd753dc155c11cab5d25ed5d0c0dff422ff1bf4a73" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:23.929 [INFO][4513] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0 calico-apiserver-59bf95bdb4- calico-apiserver f25819bc-7144-44b1-bf9b-16a1da05bd16 684 0 2025-05-13 23:50:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:59bf95bdb4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-cba8e36126 calico-apiserver-59bf95bdb4-rmmk8 eth0 
calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali096624859ef [] []}} ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-rmmk8" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:23.929 [INFO][4513] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-rmmk8" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.008 [INFO][4534] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" HandleID="k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.049 [INFO][4534] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" HandleID="k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400031b770), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-cba8e36126", "pod":"calico-apiserver-59bf95bdb4-rmmk8", "timestamp":"2025-05-13 23:51:24.005298331 +0000 UTC"}, Hostname:"ci-4284-0-0-n-cba8e36126", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.049 [INFO][4534] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.162 [INFO][4534] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.162 [INFO][4534] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-cba8e36126' May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.173 [INFO][4534] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.193 [INFO][4534] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.208 [INFO][4534] ipam/ipam.go 489: Trying affinity for 192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.216 [INFO][4534] ipam/ipam.go 155: Attempting to load block cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.224 [INFO][4534] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.25.64/26 host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.225 [INFO][4534] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.25.64/26 handle="k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.230 [INFO][4534] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.242 [INFO][4534] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.25.64/26 handle="k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.259 [INFO][4534] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.25.70/26] block=192.168.25.64/26 handle="k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.259 [INFO][4534] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.25.70/26] handle="k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" host="ci-4284-0-0-n-cba8e36126" May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.259 [INFO][4534] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 13 23:51:24.325004 containerd[1515]: 2025-05-13 23:51:24.259 [INFO][4534] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.25.70/26] IPv6=[] ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" HandleID="k8s-pod-network.23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Workload="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" May 13 23:51:24.325988 containerd[1515]: 2025-05-13 23:51:24.266 [INFO][4513] cni-plugin/k8s.go 386: Populated endpoint ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-rmmk8" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0", GenerateName:"calico-apiserver-59bf95bdb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"f25819bc-7144-44b1-bf9b-16a1da05bd16", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 50, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59bf95bdb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"", Pod:"calico-apiserver-59bf95bdb4-rmmk8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali096624859ef", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:24.325988 containerd[1515]: 2025-05-13 23:51:24.266 [INFO][4513] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.25.70/32] ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-rmmk8" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" May 13 23:51:24.325988 containerd[1515]: 2025-05-13 23:51:24.266 [INFO][4513] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali096624859ef ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-rmmk8" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" May 13 23:51:24.325988 containerd[1515]: 2025-05-13 23:51:24.287 [INFO][4513] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-rmmk8" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" May 13 23:51:24.325988 containerd[1515]: 2025-05-13 23:51:24.295 [INFO][4513] cni-plugin/k8s.go 414: 
Added Mac, interface name, and active container ID to endpoint ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-rmmk8" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0", GenerateName:"calico-apiserver-59bf95bdb4-", Namespace:"calico-apiserver", SelfLink:"", UID:"f25819bc-7144-44b1-bf9b-16a1da05bd16", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 50, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"59bf95bdb4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-cba8e36126", ContainerID:"23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d", Pod:"calico-apiserver-59bf95bdb4-rmmk8", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.25.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali096624859ef", MAC:"5a:38:17:91:0e:70", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:24.325988 containerd[1515]: 2025-05-13 23:51:24.317 [INFO][4513] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" Namespace="calico-apiserver" Pod="calico-apiserver-59bf95bdb4-rmmk8" WorkloadEndpoint="ci--4284--0--0--n--cba8e36126-k8s-calico--apiserver--59bf95bdb4--rmmk8-eth0" May 13 23:51:24.401902 systemd[1]: Started cri-containerd-4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885.scope - libcontainer container 4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885. May 13 23:51:24.417395 containerd[1515]: time="2025-05-13T23:51:24.417346055Z" level=info msg="connecting to shim 23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d" address="unix:///run/containerd/s/74f966357e1d856e38f4760c0bd80f2d335083e128ed4bf07973bead5978f146" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:24.465675 systemd[1]: Started cri-containerd-23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d.scope - libcontainer container 23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d. 
May 13 23:51:24.506239 containerd[1515]: time="2025-05-13T23:51:24.506066180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-c4dbd9494-hmzc4,Uid:67f78161-b863-4d99-b7fa-721f3c1dd240,Namespace:calico-system,Attempt:0,} returns sandbox id \"4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885\"" May 13 23:51:24.566864 containerd[1515]: time="2025-05-13T23:51:24.566769565Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-59bf95bdb4-rmmk8,Uid:f25819bc-7144-44b1-bf9b-16a1da05bd16,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d\"" May 13 23:51:24.678316 containerd[1515]: time="2025-05-13T23:51:24.678237953Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:24.679474 containerd[1515]: time="2025-05-13T23:51:24.679364855Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=40247603" May 13 23:51:24.681461 containerd[1515]: time="2025-05-13T23:51:24.680952479Z" level=info msg="ImageCreate event name:\"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:24.687178 containerd[1515]: time="2025-05-13T23:51:24.687133280Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:24.688516 containerd[1515]: time="2025-05-13T23:51:24.688457360Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 3.31709657s" May 13 23:51:24.688672 containerd[1515]: time="2025-05-13T23:51:24.688518925Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\"" May 13 23:51:24.691380 containerd[1515]: time="2025-05-13T23:51:24.690803292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 13 23:51:24.694014 containerd[1515]: time="2025-05-13T23:51:24.693961379Z" level=info msg="CreateContainer within sandbox \"82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 23:51:24.702580 containerd[1515]: time="2025-05-13T23:51:24.702485512Z" level=info msg="Container c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:24.724949 containerd[1515]: time="2025-05-13T23:51:24.724724808Z" level=info msg="CreateContainer within sandbox \"82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d\"" May 13 23:51:24.726917 containerd[1515]: time="2025-05-13T23:51:24.726383559Z" level=info msg="StartContainer for \"c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d\"" May 13 23:51:24.727914 containerd[1515]: 
time="2025-05-13T23:51:24.727880135Z" level=info msg="connecting to shim c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d" address="unix:///run/containerd/s/d634909037434b5489765c383661552552981e2906d9d243251441c2c4eec2ac" protocol=ttrpc version=3 May 13 23:51:24.767774 systemd[1]: Started cri-containerd-c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d.scope - libcontainer container c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d. May 13 23:51:24.841088 containerd[1515]: time="2025-05-13T23:51:24.840843338Z" level=info msg="StartContainer for \"c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d\" returns successfully" May 13 23:51:25.082615 kubelet[2916]: I0513 23:51:25.082030 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59bf95bdb4-hsqx4" podStartSLOduration=22.762926885 podStartE2EDuration="26.082013636s" podCreationTimestamp="2025-05-13 23:50:59 +0000 UTC" firstStartedPulling="2025-05-13 23:51:21.37105244 +0000 UTC m=+42.698369503" lastFinishedPulling="2025-05-13 23:51:24.690139192 +0000 UTC m=+46.017456254" observedRunningTime="2025-05-13 23:51:25.080726841 +0000 UTC m=+46.408043943" watchObservedRunningTime="2025-05-13 23:51:25.082013636 +0000 UTC m=+46.409330658" May 13 23:51:26.076418 systemd-networkd[1403]: cali096624859ef: Gained IPv6LL May 13 23:51:26.104142 kubelet[2916]: I0513 23:51:26.104093 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:26.139821 systemd-networkd[1403]: cali25856214951: Gained IPv6LL May 13 23:51:26.264581 containerd[1515]: time="2025-05-13T23:51:26.264529251Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:26.266166 containerd[1515]: time="2025-05-13T23:51:26.266103231Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7474935" May 13 23:51:26.267354 containerd[1515]: time="2025-05-13T23:51:26.267315378Z" level=info msg="ImageCreate event name:\"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:26.271211 containerd[1515]: time="2025-05-13T23:51:26.271165839Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:26.271911 containerd[1515]: time="2025-05-13T23:51:26.271874461Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"8844117\" in 1.581022445s" May 13 23:51:26.271966 containerd[1515]: time="2025-05-13T23:51:26.271914745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\"" May 13 23:51:26.273863 containerd[1515]: time="2025-05-13T23:51:26.273816753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 13 23:51:26.275480 containerd[1515]: time="2025-05-13T23:51:26.275374211Z" level=info msg="CreateContainer within sandbox 
\"e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 13 23:51:26.295841 containerd[1515]: time="2025-05-13T23:51:26.295794337Z" level=info msg="Container 659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:26.299482 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount765255759.mount: Deactivated successfully. May 13 23:51:26.318489 containerd[1515]: time="2025-05-13T23:51:26.317676633Z" level=info msg="CreateContainer within sandbox \"e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60\"" May 13 23:51:26.322653 containerd[1515]: time="2025-05-13T23:51:26.320630575Z" level=info msg="StartContainer for \"659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60\"" May 13 23:51:26.325588 containerd[1515]: time="2025-05-13T23:51:26.323787534Z" level=info msg="connecting to shim 659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60" address="unix:///run/containerd/s/d33fd5f6ba3f14755a008308c61ac2d7c03a7515943371dc956357468e8491fc" protocol=ttrpc version=3 May 13 23:51:26.355748 systemd[1]: Started cri-containerd-659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60.scope - libcontainer container 659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60. May 13 23:51:26.438536 containerd[1515]: time="2025-05-13T23:51:26.438493642Z" level=info msg="StartContainer for \"659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60\" returns successfully" May 13 23:51:29.667207 kubelet[2916]: I0513 23:51:29.667162 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:29.820908 containerd[1515]: time="2025-05-13T23:51:29.820255427Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"9837bf56ee8a5916c7eefadf24a69e907aae12d7431db694c91eb3a0b43da751\" pid:4751 exited_at:{seconds:1747180289 nanos:819525365}" May 13 23:51:30.076124 containerd[1515]: time="2025-05-13T23:51:30.075456112Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"64379cc2ce8812cb83de0db769248a1a03fd2c86b0420519fd3dbb7985078fdb\" pid:4780 exited_at:{seconds:1747180290 nanos:73828094}" May 13 23:51:30.419478 containerd[1515]: time="2025-05-13T23:51:30.418811547Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:30.419478 containerd[1515]: time="2025-05-13T23:51:30.419455041Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=32554116" May 13 23:51:30.420878 containerd[1515]: time="2025-05-13T23:51:30.420823237Z" level=info msg="ImageCreate event name:\"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:30.423320 containerd[1515]: time="2025-05-13T23:51:30.423262323Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:30.424233 
containerd[1515]: time="2025-05-13T23:51:30.424192721Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"33923266\" in 4.150333165s" May 13 23:51:30.424233 containerd[1515]: time="2025-05-13T23:51:30.424231085Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\"" May 13 23:51:30.427025 containerd[1515]: time="2025-05-13T23:51:30.426970516Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 23:51:30.451856 containerd[1515]: time="2025-05-13T23:51:30.451805773Z" level=info msg="CreateContainer within sandbox \"4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 13 23:51:30.477508 containerd[1515]: time="2025-05-13T23:51:30.477394574Z" level=info msg="Container 9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:30.490438 containerd[1515]: time="2025-05-13T23:51:30.490272742Z" level=info msg="CreateContainer within sandbox \"4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\"" May 13 23:51:30.491682 containerd[1515]: time="2025-05-13T23:51:30.491346952Z" level=info msg="StartContainer for \"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\"" May 13 23:51:30.493816 containerd[1515]: time="2025-05-13T23:51:30.493656948Z" level=info msg="connecting to shim 9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1" address="unix:///run/containerd/s/3d9baa01cffd6ab5824aa0cd753dc155c11cab5d25ed5d0c0dff422ff1bf4a73" protocol=ttrpc version=3 May 13 23:51:30.529750 systemd[1]: Started cri-containerd-9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1.scope - libcontainer container 9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1. 
May 13 23:51:30.610953 containerd[1515]: time="2025-05-13T23:51:30.610911049Z" level=info msg="StartContainer for \"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" returns successfully" May 13 23:51:30.837136 containerd[1515]: time="2025-05-13T23:51:30.836205555Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:30.838795 containerd[1515]: time="2025-05-13T23:51:30.838661202Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 13 23:51:30.840449 containerd[1515]: time="2025-05-13T23:51:30.840385228Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 413.198854ms" May 13 23:51:30.840449 containerd[1515]: time="2025-05-13T23:51:30.840451673Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\"" May 13 23:51:30.842477 containerd[1515]: time="2025-05-13T23:51:30.842316351Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 13 23:51:30.849283 containerd[1515]: time="2025-05-13T23:51:30.847138878Z" level=info msg="CreateContainer within sandbox \"23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 23:51:30.858247 containerd[1515]: time="2025-05-13T23:51:30.858198052Z" level=info msg="Container 7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:30.878786 containerd[1515]: time="2025-05-13T23:51:30.878309270Z" level=info msg="CreateContainer within sandbox \"23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d\"" May 13 23:51:30.882898 containerd[1515]: time="2025-05-13T23:51:30.882832092Z" level=info msg="StartContainer for \"7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d\"" May 13 23:51:30.884614 containerd[1515]: time="2025-05-13T23:51:30.884535916Z" level=info msg="connecting to shim 7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d" address="unix:///run/containerd/s/74f966357e1d856e38f4760c0bd80f2d335083e128ed4bf07973bead5978f146" protocol=ttrpc version=3 May 13 23:51:30.917669 systemd[1]: Started cri-containerd-7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d.scope - libcontainer container 7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d. 
May 13 23:51:31.006081 containerd[1515]: time="2025-05-13T23:51:31.005817194Z" level=info msg="StartContainer for \"7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d\" returns successfully" May 13 23:51:31.188582 kubelet[2916]: I0513 23:51:31.187306 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-c4dbd9494-hmzc4" podStartSLOduration=25.272417432 podStartE2EDuration="31.18728783s" podCreationTimestamp="2025-05-13 23:51:00 +0000 UTC" firstStartedPulling="2025-05-13 23:51:24.510569349 +0000 UTC m=+45.837886411" lastFinishedPulling="2025-05-13 23:51:30.425439747 +0000 UTC m=+51.752756809" observedRunningTime="2025-05-13 23:51:31.181989388 +0000 UTC m=+52.509306450" watchObservedRunningTime="2025-05-13 23:51:31.18728783 +0000 UTC m=+52.514604892" May 13 23:51:31.213803 kubelet[2916]: I0513 23:51:31.213086 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-59bf95bdb4-rmmk8" podStartSLOduration=25.940125832 podStartE2EDuration="32.213061623s" podCreationTimestamp="2025-05-13 23:50:59 +0000 UTC" firstStartedPulling="2025-05-13 23:51:24.569075854 +0000 UTC m=+45.896392916" lastFinishedPulling="2025-05-13 23:51:30.842011645 +0000 UTC m=+52.169328707" observedRunningTime="2025-05-13 23:51:31.211265753 +0000 UTC m=+52.538582815" watchObservedRunningTime="2025-05-13 23:51:31.213061623 +0000 UTC m=+52.540378685" May 13 23:51:31.271484 containerd[1515]: time="2025-05-13T23:51:31.270132989Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"3ceffc5ef5513088870b0e6847f268713e9cf40385b0e26a3c9985af069e4e33\" pid:4869 exited_at:{seconds:1747180291 nanos:252229054}" May 13 23:51:32.564978 containerd[1515]: time="2025-05-13T23:51:32.564832375Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:32.567054 containerd[1515]: time="2025-05-13T23:51:32.566834901Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13124299" May 13 23:51:32.569430 containerd[1515]: time="2025-05-13T23:51:32.568591206Z" level=info msg="ImageCreate event name:\"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:32.572037 containerd[1515]: time="2025-05-13T23:51:32.571979486Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:32.575051 containerd[1515]: time="2025-05-13T23:51:32.572755710Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"14493433\" in 1.730394955s" May 13 23:51:32.575051 containerd[1515]: time="2025-05-13T23:51:32.572799353Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference 
\"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\"" May 13 23:51:32.579194 containerd[1515]: time="2025-05-13T23:51:32.578782648Z" level=info msg="CreateContainer within sandbox \"e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 13 23:51:32.592142 containerd[1515]: time="2025-05-13T23:51:32.592061025Z" level=info msg="Container 82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:32.601972 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount301346464.mount: Deactivated successfully. May 13 23:51:32.609966 containerd[1515]: time="2025-05-13T23:51:32.609805251Z" level=info msg="CreateContainer within sandbox \"e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033\"" May 13 23:51:32.610816 containerd[1515]: time="2025-05-13T23:51:32.610720486Z" level=info msg="StartContainer for \"82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033\"" May 13 23:51:32.612319 containerd[1515]: time="2025-05-13T23:51:32.612285856Z" level=info msg="connecting to shim 82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033" address="unix:///run/containerd/s/d33fd5f6ba3f14755a008308c61ac2d7c03a7515943371dc956357468e8491fc" protocol=ttrpc version=3 May 13 23:51:32.672607 systemd[1]: Started cri-containerd-82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033.scope - libcontainer container 82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033. May 13 23:51:32.757119 containerd[1515]: time="2025-05-13T23:51:32.757068698Z" level=info msg="StartContainer for \"82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033\" returns successfully" May 13 23:51:32.935309 kubelet[2916]: I0513 23:51:32.935270 2916 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 13 23:51:32.944302 kubelet[2916]: I0513 23:51:32.944240 2916 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 13 23:51:33.190989 kubelet[2916]: I0513 23:51:33.190701 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-dxllc" podStartSLOduration=22.009775407 podStartE2EDuration="33.190676397s" podCreationTimestamp="2025-05-13 23:51:00 +0000 UTC" firstStartedPulling="2025-05-13 23:51:21.395170034 +0000 UTC m=+42.722487096" lastFinishedPulling="2025-05-13 23:51:32.576071024 +0000 UTC m=+53.903388086" observedRunningTime="2025-05-13 23:51:33.187633749 +0000 UTC m=+54.514950811" watchObservedRunningTime="2025-05-13 23:51:33.190676397 +0000 UTC m=+54.517993499" May 13 23:51:35.962474 kubelet[2916]: I0513 23:51:35.961888 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:39.818829 containerd[1515]: time="2025-05-13T23:51:39.818690935Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"22f7e66792994344bb86851346d6e8d1c2b51011097639a86a9645b06e70e33f\" pid:4949 exited_at:{seconds:1747180299 nanos:817175059}" May 13 23:51:59.737028 containerd[1515]: 
time="2025-05-13T23:51:59.736652003Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"9f6f0c4a0493b19c96e895d43ef06814e80249938cb8b2969a47f305ea511800\" pid:4982 exited_at:{seconds:1747180319 nanos:736295580}" May 13 23:52:08.367060 containerd[1515]: time="2025-05-13T23:52:08.367012415Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"1e191b61f553a17f22a10da3e8e4fa24e9a6a204dfdbdfd870713528f823f94f\" pid:5009 exited_at:{seconds:1747180328 nanos:366604029}" May 13 23:52:09.714034 containerd[1515]: time="2025-05-13T23:52:09.713753714Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"699f4adb746a1a15ecde9174a7b95e152a3c3a3eb79a609c5e898ba3165a87e7\" pid:5031 exited_at:{seconds:1747180329 nanos:713255762}" May 13 23:52:29.735529 containerd[1515]: time="2025-05-13T23:52:29.735312632Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"9924504cbec966c0175134f6b10abdb98fe038802e1e044e267360eb672ea223\" pid:5057 exited_at:{seconds:1747180349 nanos:734949784}" May 13 23:52:39.714633 containerd[1515]: time="2025-05-13T23:52:39.714521199Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"ee4b1857770ea0fd4a0041a44ee4532389881b1589e0a8f4031acaf357977200\" pid:5088 exited_at:{seconds:1747180359 nanos:714145341}" May 13 23:52:59.738483 containerd[1515]: time="2025-05-13T23:52:59.738348755Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"9da90745ddd841525042efb7f5cd0e1e39b33cc8519e0275adb162dd2e19b131\" pid:5134 exited_at:{seconds:1747180379 nanos:737877568}" May 13 23:53:08.364335 containerd[1515]: time="2025-05-13T23:53:08.364276770Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"1fbe8d1a8e3b44d0808a3b812fff1d9b356209dc302f3c83d00fa5c272c7c082\" pid:5158 exited_at:{seconds:1747180388 nanos:363498004}" May 13 23:53:09.717921 containerd[1515]: time="2025-05-13T23:53:09.717839743Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"934d0e057c5989800b28ac3e9d266e8a8a554b434ba4b5e98f4408fe328819b4\" pid:5180 exited_at:{seconds:1747180389 nanos:717217841}" May 13 23:53:29.737093 containerd[1515]: time="2025-05-13T23:53:29.737029039Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"0e7aa743470207cfbfda51535dcde3c93dff236534b4f92919f92a47f439dd8d\" pid:5209 exited_at:{seconds:1747180409 nanos:734804174}" May 13 23:53:39.714894 containerd[1515]: time="2025-05-13T23:53:39.714850251Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"e2741ca036e9b7ced2c94a3a3af195db57af5f4953a39969cf6f2623309ebc73\" pid:5236 exited_at:{seconds:1747180419 nanos:714546346}" May 13 23:53:59.735765 containerd[1515]: time="2025-05-13T23:53:59.735715120Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"af1d030f9353da366352094dc8b7fdf9ba574c62b81d01a3df3fc2582342d323\" pid:5267 exited_at:{seconds:1747180439 nanos:735196375}" May 13 23:54:08.381713 containerd[1515]: time="2025-05-13T23:54:08.381600126Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"4885e9f7cbca16d07d9ec2e0cf89ad9768c14f3baf83b7af6c6487f2bccb3187\" pid:5292 exited_at:{seconds:1747180448 nanos:380488309}" May 13 23:54:09.723145 containerd[1515]: time="2025-05-13T23:54:09.722875045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"cfa79afc543a44fd4651f6b700f775842592726e0215d9e5c3bcba5c97d62952\" pid:5313 exited_at:{seconds:1747180449 nanos:722665010}" May 13 23:54:29.751921 containerd[1515]: time="2025-05-13T23:54:29.751009786Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"edfa3f7be8dc3dae30553fc5d8379843009e62f57bc010075fa21b920a5c01e5\" pid:5353 exited_at:{seconds:1747180469 nanos:749966754}" May 13 23:54:39.717654 containerd[1515]: time="2025-05-13T23:54:39.717604963Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"22ef3a28e192a052209a669e3adf4300a49f73d89b13986bd33a3b20a854a697\" pid:5380 exited_at:{seconds:1747180479 nanos:717207764}" May 13 23:54:54.959533 update_engine[1487]: I20250513 23:54:54.959396 1487 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 13 23:54:54.959533 update_engine[1487]: I20250513 23:54:54.959506 1487 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 13 23:54:54.962051 update_engine[1487]: I20250513 23:54:54.959879 1487 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 13 23:54:54.962051 update_engine[1487]: I20250513 23:54:54.960850 1487 omaha_request_params.cc:62] Current group set to alpha May 13 23:54:54.962051 update_engine[1487]: I20250513 23:54:54.961979 1487 update_attempter.cc:499] Already updated boot flags. Skipping. May 13 23:54:54.962051 update_engine[1487]: I20250513 23:54:54.962011 1487 update_attempter.cc:643] Scheduling an action processor start. 
May 13 23:54:54.962051 update_engine[1487]: I20250513 23:54:54.962033 1487 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 13 23:54:54.969351 update_engine[1487]: I20250513 23:54:54.963845 1487 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 13 23:54:54.969351 update_engine[1487]: I20250513 23:54:54.963994 1487 omaha_request_action.cc:271] Posting an Omaha request to disabled May 13 23:54:54.969351 update_engine[1487]: I20250513 23:54:54.964015 1487 omaha_request_action.cc:272] Request: May 13 23:54:54.969351 update_engine[1487]: May 13 23:54:54.969351 update_engine[1487]: May 13 23:54:54.969351 update_engine[1487]: May 13 23:54:54.969351 update_engine[1487]: May 13 23:54:54.969351 update_engine[1487]: May 13 23:54:54.969351 update_engine[1487]: May 13 23:54:54.969351 update_engine[1487]: May 13 23:54:54.969351 update_engine[1487]: May 13 23:54:54.969351 update_engine[1487]: I20250513 23:54:54.964025 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:54:54.969703 locksmithd[1523]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 13 23:54:54.971137 update_engine[1487]: I20250513 23:54:54.970649 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:54:54.971137 update_engine[1487]: I20250513 23:54:54.971074 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 13 23:54:54.973461 update_engine[1487]: E20250513 23:54:54.973323 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:54:54.973553 update_engine[1487]: I20250513 23:54:54.973467 1487 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 13 23:54:59.749113 containerd[1515]: time="2025-05-13T23:54:59.749002644Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"e756c15923ebf8e3822233f0eb6e3ea8608b6e4061dbd4b3e6b417f43d21e417\" pid:5404 exited_at:{seconds:1747180499 nanos:748536922}" May 13 23:55:04.869554 update_engine[1487]: I20250513 23:55:04.869247 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:55:04.870020 update_engine[1487]: I20250513 23:55:04.869589 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:55:04.870108 update_engine[1487]: I20250513 23:55:04.870049 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:55:04.873118 update_engine[1487]: E20250513 23:55:04.873030 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:55:04.873315 update_engine[1487]: I20250513 23:55:04.873170 1487 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 13 23:55:08.362797 containerd[1515]: time="2025-05-13T23:55:08.362752465Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"bc0fb353d2dd7c492164485d3c84b0108bf5d882a6c3287aed1ca28c783fb33e\" pid:5428 exited_at:{seconds:1747180508 nanos:362237541}" May 13 23:55:09.721794 containerd[1515]: time="2025-05-13T23:55:09.721743563Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"a5f4a1da1affac117548bce797a6222283ef7dc327b31f54d43ac5ec89721a1b\" pid:5451 exited_at:{seconds:1747180509 nanos:721156678}" May 13 23:55:14.864209 update_engine[1487]: I20250513 23:55:14.864089 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:55:14.864854 update_engine[1487]: I20250513 23:55:14.864360 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:55:14.864854 update_engine[1487]: I20250513 23:55:14.864670 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 13 23:55:14.866117 update_engine[1487]: E20250513 23:55:14.866029 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:55:14.866117 update_engine[1487]: I20250513 23:55:14.866119 1487 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 13 23:55:20.941116 systemd[1]: Started sshd@7-91.99.1.97:22-139.178.89.65:48658.service - OpenSSH per-connection server daemon (139.178.89.65:48658). May 13 23:55:21.976206 sshd[5464]: Accepted publickey for core from 139.178.89.65 port 48658 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:55:21.980278 sshd-session[5464]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:21.988105 systemd-logind[1485]: New session 8 of user core. May 13 23:55:21.991061 systemd[1]: Started session-8.scope - Session 8 of User core. May 13 23:55:22.770109 sshd[5466]: Connection closed by 139.178.89.65 port 48658 May 13 23:55:22.770734 sshd-session[5464]: pam_unix(sshd:session): session closed for user core May 13 23:55:22.776805 systemd[1]: sshd@7-91.99.1.97:22-139.178.89.65:48658.service: Deactivated successfully. May 13 23:55:22.779800 systemd[1]: session-8.scope: Deactivated successfully. May 13 23:55:22.781185 systemd-logind[1485]: Session 8 logged out. Waiting for processes to exit. May 13 23:55:22.783361 systemd-logind[1485]: Removed session 8. May 13 23:55:24.862349 update_engine[1487]: I20250513 23:55:24.861342 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:55:24.862349 update_engine[1487]: I20250513 23:55:24.861771 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:55:24.862349 update_engine[1487]: I20250513 23:55:24.862143 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:55:24.863457 update_engine[1487]: E20250513 23:55:24.863333 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:55:24.863580 update_engine[1487]: I20250513 23:55:24.863466 1487 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 13 23:55:24.863580 update_engine[1487]: I20250513 23:55:24.863488 1487 omaha_request_action.cc:617] Omaha request response: May 13 23:55:24.863680 update_engine[1487]: E20250513 23:55:24.863620 1487 omaha_request_action.cc:636] Omaha request network transfer failed. May 13 23:55:24.863680 update_engine[1487]: I20250513 23:55:24.863651 1487 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 13 23:55:24.863680 update_engine[1487]: I20250513 23:55:24.863662 1487 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:55:24.863680 update_engine[1487]: I20250513 23:55:24.863669 1487 update_attempter.cc:306] Processing Done. May 13 23:55:24.864417 update_engine[1487]: E20250513 23:55:24.863689 1487 update_attempter.cc:619] Update failed. May 13 23:55:24.864417 update_engine[1487]: I20250513 23:55:24.863700 1487 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 13 23:55:24.864417 update_engine[1487]: I20250513 23:55:24.863709 1487 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 13 23:55:24.864417 update_engine[1487]: I20250513 23:55:24.863718 1487 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. May 13 23:55:24.864417 update_engine[1487]: I20250513 23:55:24.863817 1487 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 13 23:55:24.864417 update_engine[1487]: I20250513 23:55:24.863857 1487 omaha_request_action.cc:271] Posting an Omaha request to disabled May 13 23:55:24.864417 update_engine[1487]: I20250513 23:55:24.863868 1487 omaha_request_action.cc:272] Request: May 13 23:55:24.864417 update_engine[1487]: May 13 23:55:24.864417 update_engine[1487]: May 13 23:55:24.864417 update_engine[1487]: May 13 23:55:24.864417 update_engine[1487]: May 13 23:55:24.864417 update_engine[1487]: May 13 23:55:24.864417 update_engine[1487]: May 13 23:55:24.864417 update_engine[1487]: I20250513 23:55:24.863878 1487 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:55:24.864417 update_engine[1487]: I20250513 23:55:24.864161 1487 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:55:24.865845 update_engine[1487]: I20250513 23:55:24.864519 1487 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:55:24.866364 locksmithd[1523]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 13 23:55:24.866882 update_engine[1487]: E20250513 23:55:24.865984 1487 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:55:24.866882 update_engine[1487]: I20250513 23:55:24.866057 1487 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 13 23:55:24.866882 update_engine[1487]: I20250513 23:55:24.866070 1487 omaha_request_action.cc:617] Omaha request response: May 13 23:55:24.866882 update_engine[1487]: I20250513 23:55:24.866077 1487 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:55:24.866882 update_engine[1487]: I20250513 23:55:24.866082 1487 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:55:24.866882 update_engine[1487]: I20250513 23:55:24.866088 1487 update_attempter.cc:306] Processing Done. May 13 23:55:24.866882 update_engine[1487]: I20250513 23:55:24.866096 1487 update_attempter.cc:310] Error event sent. May 13 23:55:24.866882 update_engine[1487]: I20250513 23:55:24.866107 1487 update_check_scheduler.cc:74] Next update check in 43m33s May 13 23:55:24.867111 locksmithd[1523]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 13 23:55:27.942608 systemd[1]: Started sshd@8-91.99.1.97:22-139.178.89.65:58634.service - OpenSSH per-connection server daemon (139.178.89.65:58634). May 13 23:55:28.951653 sshd[5484]: Accepted publickey for core from 139.178.89.65 port 58634 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:55:28.955891 sshd-session[5484]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:28.967503 systemd-logind[1485]: New session 9 of user core. May 13 23:55:28.972683 systemd[1]: Started session-9.scope - Session 9 of User core. May 13 23:55:29.728153 sshd[5486]: Connection closed by 139.178.89.65 port 58634 May 13 23:55:29.729920 sshd-session[5484]: pam_unix(sshd:session): session closed for user core May 13 23:55:29.738819 systemd[1]: sshd@8-91.99.1.97:22-139.178.89.65:58634.service: Deactivated successfully. May 13 23:55:29.744752 systemd[1]: session-9.scope: Deactivated successfully. May 13 23:55:29.746449 systemd-logind[1485]: Session 9 logged out. Waiting for processes to exit. May 13 23:55:29.749541 containerd[1515]: time="2025-05-13T23:55:29.749227244Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"3a137a2c31d204bfa13aad13403cc66a12c8667069633b12ec219159027b17d1\" pid:5507 exited_at:{seconds:1747180529 nanos:748592436}" May 13 23:55:29.751571 systemd-logind[1485]: Removed session 9. 
May 13 23:55:32.294141 containerd[1515]: time="2025-05-13T23:55:32.293992738Z" level=warning msg="container event discarded" container=76e1955205d61236f06a2b963a6b94c294944f8b9d2a7aff7eaf31d8418cd9bd type=CONTAINER_CREATED_EVENT May 13 23:55:32.305793 containerd[1515]: time="2025-05-13T23:55:32.305382451Z" level=warning msg="container event discarded" container=76e1955205d61236f06a2b963a6b94c294944f8b9d2a7aff7eaf31d8418cd9bd type=CONTAINER_STARTED_EVENT May 13 23:55:32.305793 containerd[1515]: time="2025-05-13T23:55:32.305521253Z" level=warning msg="container event discarded" container=f2cfe3c36a29abc928a8ed7b9676947183d99701ca59355816ec2f2b236d6a0c type=CONTAINER_CREATED_EVENT May 13 23:55:32.305793 containerd[1515]: time="2025-05-13T23:55:32.305544933Z" level=warning msg="container event discarded" container=f2cfe3c36a29abc928a8ed7b9676947183d99701ca59355816ec2f2b236d6a0c type=CONTAINER_STARTED_EVENT May 13 23:55:32.322826 containerd[1515]: time="2025-05-13T23:55:32.322735884Z" level=warning msg="container event discarded" container=9a49ff9e4ea3574acdd0a5ecb6c3ff9d1d2b6d669facc36a8d386faa1e9923c8 type=CONTAINER_CREATED_EVENT May 13 23:55:32.322826 containerd[1515]: time="2025-05-13T23:55:32.322803845Z" level=warning msg="container event discarded" container=9a49ff9e4ea3574acdd0a5ecb6c3ff9d1d2b6d669facc36a8d386faa1e9923c8 type=CONTAINER_STARTED_EVENT May 13 23:55:32.347390 containerd[1515]: time="2025-05-13T23:55:32.347278973Z" level=warning msg="container event discarded" container=552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76 type=CONTAINER_CREATED_EVENT May 13 23:55:32.347390 containerd[1515]: time="2025-05-13T23:55:32.347349174Z" level=warning msg="container event discarded" container=a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642 type=CONTAINER_CREATED_EVENT May 13 23:55:32.358601 containerd[1515]: time="2025-05-13T23:55:32.358524324Z" level=warning msg="container event discarded" container=83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5 type=CONTAINER_CREATED_EVENT May 13 23:55:32.469095 containerd[1515]: time="2025-05-13T23:55:32.468966485Z" level=warning msg="container event discarded" container=552e7ac7c59e545e79555ea5759a9f14d9ac99e24ff272a4809a6058f2895c76 type=CONTAINER_STARTED_EVENT May 13 23:55:32.486504 containerd[1515]: time="2025-05-13T23:55:32.486306397Z" level=warning msg="container event discarded" container=83127b837c28633feaf17d517c8a4528671adae3a4df81e40bd57c0d77c0c8e5 type=CONTAINER_STARTED_EVENT May 13 23:55:32.528039 containerd[1515]: time="2025-05-13T23:55:32.527879155Z" level=warning msg="container event discarded" container=a2923d255f43b17ecb984ca2fd1e92d3f3bb0581321b9c53f8f02e4611a7a642 type=CONTAINER_STARTED_EVENT May 13 23:55:34.910290 systemd[1]: Started sshd@9-91.99.1.97:22-139.178.89.65:58648.service - OpenSSH per-connection server daemon (139.178.89.65:58648). May 13 23:55:35.938577 sshd[5523]: Accepted publickey for core from 139.178.89.65 port 58648 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:55:35.940548 sshd-session[5523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:35.946796 systemd-logind[1485]: New session 10 of user core. May 13 23:55:35.950625 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 13 23:55:36.743537 sshd[5525]: Connection closed by 139.178.89.65 port 58648 May 13 23:55:36.745111 sshd-session[5523]: pam_unix(sshd:session): session closed for user core May 13 23:55:36.751467 systemd[1]: sshd@9-91.99.1.97:22-139.178.89.65:58648.service: Deactivated successfully. May 13 23:55:36.756052 systemd[1]: session-10.scope: Deactivated successfully. May 13 23:55:36.757926 systemd-logind[1485]: Session 10 logged out. Waiting for processes to exit. May 13 23:55:36.759655 systemd-logind[1485]: Removed session 10. May 13 23:55:36.922550 systemd[1]: Started sshd@10-91.99.1.97:22-139.178.89.65:42390.service - OpenSSH per-connection server daemon (139.178.89.65:42390). May 13 23:55:37.952374 sshd[5538]: Accepted publickey for core from 139.178.89.65 port 42390 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:55:37.954883 sshd-session[5538]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:37.961567 systemd-logind[1485]: New session 11 of user core. May 13 23:55:37.972119 systemd[1]: Started session-11.scope - Session 11 of User core. May 13 23:55:38.790309 sshd[5540]: Connection closed by 139.178.89.65 port 42390 May 13 23:55:38.790301 sshd-session[5538]: pam_unix(sshd:session): session closed for user core May 13 23:55:38.795328 systemd[1]: sshd@10-91.99.1.97:22-139.178.89.65:42390.service: Deactivated successfully. May 13 23:55:38.798831 systemd[1]: session-11.scope: Deactivated successfully. May 13 23:55:38.802653 systemd-logind[1485]: Session 11 logged out. Waiting for processes to exit. May 13 23:55:38.804325 systemd-logind[1485]: Removed session 11. May 13 23:55:38.963668 systemd[1]: Started sshd@11-91.99.1.97:22-139.178.89.65:42400.service - OpenSSH per-connection server daemon (139.178.89.65:42400). May 13 23:55:39.730459 containerd[1515]: time="2025-05-13T23:55:39.730310936Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"bf810eb59410c269f0c3c779130bb699519f7da3536aa48276b6acaf5066620b\" pid:5568 exited_at:{seconds:1747180539 nanos:729937451}" May 13 23:55:39.975085 sshd[5552]: Accepted publickey for core from 139.178.89.65 port 42400 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:55:39.977891 sshd-session[5552]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:39.983818 systemd-logind[1485]: New session 12 of user core. May 13 23:55:39.989738 systemd[1]: Started session-12.scope - Session 12 of User core. May 13 23:55:40.753583 sshd[5578]: Connection closed by 139.178.89.65 port 42400 May 13 23:55:40.755148 sshd-session[5552]: pam_unix(sshd:session): session closed for user core May 13 23:55:40.758645 systemd[1]: sshd@11-91.99.1.97:22-139.178.89.65:42400.service: Deactivated successfully. May 13 23:55:40.763818 systemd[1]: session-12.scope: Deactivated successfully. May 13 23:55:40.766876 systemd-logind[1485]: Session 12 logged out. Waiting for processes to exit. May 13 23:55:40.768336 systemd-logind[1485]: Removed session 12. May 13 23:55:45.927704 systemd[1]: Started sshd@12-91.99.1.97:22-139.178.89.65:42404.service - OpenSSH per-connection server daemon (139.178.89.65:42404). 
May 13 23:55:46.938054 sshd[5590]: Accepted publickey for core from 139.178.89.65 port 42404 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:55:46.940090 sshd-session[5590]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:46.947449 systemd-logind[1485]: New session 13 of user core. May 13 23:55:46.952645 systemd[1]: Started session-13.scope - Session 13 of User core. May 13 23:55:47.699487 sshd[5592]: Connection closed by 139.178.89.65 port 42404 May 13 23:55:47.700491 sshd-session[5590]: pam_unix(sshd:session): session closed for user core May 13 23:55:47.706079 systemd[1]: sshd@12-91.99.1.97:22-139.178.89.65:42404.service: Deactivated successfully. May 13 23:55:47.710369 systemd[1]: session-13.scope: Deactivated successfully. May 13 23:55:47.713675 systemd-logind[1485]: Session 13 logged out. Waiting for processes to exit. May 13 23:55:47.716287 systemd-logind[1485]: Removed session 13. May 13 23:55:52.877350 systemd[1]: Started sshd@13-91.99.1.97:22-139.178.89.65:57760.service - OpenSSH per-connection server daemon (139.178.89.65:57760). May 13 23:55:53.408477 containerd[1515]: time="2025-05-13T23:55:53.408192517Z" level=warning msg="container event discarded" container=02699cd442c0ba3701471692d544466fc8d7aa3269e4da8671b47b07d7070c24 type=CONTAINER_CREATED_EVENT May 13 23:55:53.408477 containerd[1515]: time="2025-05-13T23:55:53.408265278Z" level=warning msg="container event discarded" container=02699cd442c0ba3701471692d544466fc8d7aa3269e4da8671b47b07d7070c24 type=CONTAINER_STARTED_EVENT May 13 23:55:53.438739 containerd[1515]: time="2025-05-13T23:55:53.438605312Z" level=warning msg="container event discarded" container=a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695 type=CONTAINER_CREATED_EVENT May 13 23:55:53.527785 containerd[1515]: time="2025-05-13T23:55:53.520709622Z" level=warning msg="container event discarded" container=a50af5f0bcf634fb8b6633c7988e506108ec0a059707d2d079693507213ab695 type=CONTAINER_STARTED_EVENT May 13 23:55:53.611774 containerd[1515]: time="2025-05-13T23:55:53.611690883Z" level=warning msg="container event discarded" container=4dad1fa3b28cc13e1234ef2da62fe9d5b86ef3cee156dd1ddb970836ff19943b type=CONTAINER_CREATED_EVENT May 13 23:55:53.611774 containerd[1515]: time="2025-05-13T23:55:53.611745764Z" level=warning msg="container event discarded" container=4dad1fa3b28cc13e1234ef2da62fe9d5b86ef3cee156dd1ddb970836ff19943b type=CONTAINER_STARTED_EVENT May 13 23:55:53.895237 sshd[5605]: Accepted publickey for core from 139.178.89.65 port 57760 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:55:53.898483 sshd-session[5605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:55:53.908543 systemd-logind[1485]: New session 14 of user core. May 13 23:55:53.923609 systemd[1]: Started session-14.scope - Session 14 of User core. May 13 23:55:54.683433 sshd[5609]: Connection closed by 139.178.89.65 port 57760 May 13 23:55:54.682675 sshd-session[5605]: pam_unix(sshd:session): session closed for user core May 13 23:55:54.686959 systemd[1]: sshd@13-91.99.1.97:22-139.178.89.65:57760.service: Deactivated successfully. May 13 23:55:54.690891 systemd[1]: session-14.scope: Deactivated successfully. May 13 23:55:54.693212 systemd-logind[1485]: Session 14 logged out. Waiting for processes to exit. May 13 23:55:54.694462 systemd-logind[1485]: Removed session 14. 
May 13 23:55:55.578260 containerd[1515]: time="2025-05-13T23:55:55.578154765Z" level=warning msg="container event discarded" container=03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430 type=CONTAINER_CREATED_EVENT May 13 23:55:55.660002 containerd[1515]: time="2025-05-13T23:55:55.658446428Z" level=warning msg="container event discarded" container=03db3bc78ed087c2d2c2e113706c70934bc599d4d06b4e3553772c469f07e430 type=CONTAINER_STARTED_EVENT May 13 23:55:59.737589 containerd[1515]: time="2025-05-13T23:55:59.737479602Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"c126cd5fccbf0eb4cd6691162129e1e02da90661d2486746d698270b4f9ae03c\" pid:5650 exited_at:{seconds:1747180559 nanos:736586346}" May 13 23:55:59.858362 systemd[1]: Started sshd@14-91.99.1.97:22-139.178.89.65:53734.service - OpenSSH per-connection server daemon (139.178.89.65:53734). May 13 23:56:00.632645 containerd[1515]: time="2025-05-13T23:56:00.632454044Z" level=warning msg="container event discarded" container=37e0626f129003369ed3c089cd45dc0cc86077364abebc8ab952b4b4e49c4def type=CONTAINER_CREATED_EVENT May 13 23:56:00.632645 containerd[1515]: time="2025-05-13T23:56:00.632507445Z" level=warning msg="container event discarded" container=37e0626f129003369ed3c089cd45dc0cc86077364abebc8ab952b4b4e49c4def type=CONTAINER_STARTED_EVENT May 13 23:56:00.785864 containerd[1515]: time="2025-05-13T23:56:00.785718784Z" level=warning msg="container event discarded" container=946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b type=CONTAINER_CREATED_EVENT May 13 23:56:00.785864 containerd[1515]: time="2025-05-13T23:56:00.785776825Z" level=warning msg="container event discarded" container=946f03493c6dc8f7f97547a29e4ac96167dee0786d964740d9074576144ee34b type=CONTAINER_STARTED_EVENT May 13 23:56:00.874484 sshd[5663]: Accepted publickey for core from 139.178.89.65 port 53734 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:56:00.877116 sshd-session[5663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:56:00.887769 systemd-logind[1485]: New session 15 of user core. May 13 23:56:00.895993 systemd[1]: Started session-15.scope - Session 15 of User core. May 13 23:56:01.653421 sshd[5665]: Connection closed by 139.178.89.65 port 53734 May 13 23:56:01.654255 sshd-session[5663]: pam_unix(sshd:session): session closed for user core May 13 23:56:01.659778 systemd[1]: sshd@14-91.99.1.97:22-139.178.89.65:53734.service: Deactivated successfully. May 13 23:56:01.662144 systemd[1]: session-15.scope: Deactivated successfully. May 13 23:56:01.665603 systemd-logind[1485]: Session 15 logged out. Waiting for processes to exit. May 13 23:56:01.667971 systemd-logind[1485]: Removed session 15. May 13 23:56:01.823000 systemd[1]: Started sshd@15-91.99.1.97:22-139.178.89.65:53736.service - OpenSSH per-connection server daemon (139.178.89.65:53736). May 13 23:56:02.832965 sshd[5677]: Accepted publickey for core from 139.178.89.65 port 53736 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:56:02.835589 sshd-session[5677]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:56:02.842836 systemd-logind[1485]: New session 16 of user core. May 13 23:56:02.848640 systemd[1]: Started session-16.scope - Session 16 of User core. 
May 13 23:56:03.245138 containerd[1515]: time="2025-05-13T23:56:03.244755081Z" level=warning msg="container event discarded" container=dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031 type=CONTAINER_CREATED_EVENT May 13 23:56:03.349992 containerd[1515]: time="2025-05-13T23:56:03.349887679Z" level=warning msg="container event discarded" container=dbd01f55dcf3fa6ffbb746dde0370d5bfe8ad3877ef75fa584add80a07b9e031 type=CONTAINER_STARTED_EVENT May 13 23:56:03.732091 sshd[5679]: Connection closed by 139.178.89.65 port 53736 May 13 23:56:03.732740 sshd-session[5677]: pam_unix(sshd:session): session closed for user core May 13 23:56:03.737655 systemd[1]: sshd@15-91.99.1.97:22-139.178.89.65:53736.service: Deactivated successfully. May 13 23:56:03.737991 systemd-logind[1485]: Session 16 logged out. Waiting for processes to exit. May 13 23:56:03.740945 systemd[1]: session-16.scope: Deactivated successfully. May 13 23:56:03.744359 systemd-logind[1485]: Removed session 16. May 13 23:56:03.901836 systemd[1]: Started sshd@16-91.99.1.97:22-139.178.89.65:53750.service - OpenSSH per-connection server daemon (139.178.89.65:53750). May 13 23:56:04.663773 containerd[1515]: time="2025-05-13T23:56:04.663665928Z" level=warning msg="container event discarded" container=cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e type=CONTAINER_CREATED_EVENT May 13 23:56:04.753558 containerd[1515]: time="2025-05-13T23:56:04.753363455Z" level=warning msg="container event discarded" container=cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e type=CONTAINER_STARTED_EVENT May 13 23:56:04.903987 sshd[5688]: Accepted publickey for core from 139.178.89.65 port 53750 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:56:04.907019 sshd-session[5688]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:56:04.913834 systemd-logind[1485]: New session 17 of user core. May 13 23:56:04.917835 systemd[1]: Started session-17.scope - Session 17 of User core. May 13 23:56:04.927730 containerd[1515]: time="2025-05-13T23:56:04.927621616Z" level=warning msg="container event discarded" container=cd8285cbdb2dfa6d65ef0fb31e06249ca4083eece3eca5e368ab39c71b80257e type=CONTAINER_STOPPED_EVENT May 13 23:56:07.728154 sshd[5690]: Connection closed by 139.178.89.65 port 53750 May 13 23:56:07.729255 sshd-session[5688]: pam_unix(sshd:session): session closed for user core May 13 23:56:07.735464 systemd[1]: sshd@16-91.99.1.97:22-139.178.89.65:53750.service: Deactivated successfully. May 13 23:56:07.738127 systemd[1]: session-17.scope: Deactivated successfully. May 13 23:56:07.738748 systemd[1]: session-17.scope: Consumed 604ms CPU time, 69.1M memory peak. May 13 23:56:07.739744 systemd-logind[1485]: Session 17 logged out. Waiting for processes to exit. May 13 23:56:07.742389 systemd-logind[1485]: Removed session 17. May 13 23:56:07.911190 systemd[1]: Started sshd@17-91.99.1.97:22-139.178.89.65:53422.service - OpenSSH per-connection server daemon (139.178.89.65:53422). 
May 13 23:56:08.365705 containerd[1515]: time="2025-05-13T23:56:08.365665545Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"a85c7d7a38cab8028fdf77752092920e1f7c63b49691130c733b7c430ed04390\" pid:5723 exited_at:{seconds:1747180568 nanos:365200176}" May 13 23:56:08.533061 containerd[1515]: time="2025-05-13T23:56:08.532972615Z" level=warning msg="container event discarded" container=e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d type=CONTAINER_CREATED_EVENT May 13 23:56:08.639546 containerd[1515]: time="2025-05-13T23:56:08.639319417Z" level=warning msg="container event discarded" container=e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d type=CONTAINER_STARTED_EVENT May 13 23:56:08.955283 sshd[5707]: Accepted publickey for core from 139.178.89.65 port 53422 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:56:08.958749 sshd-session[5707]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:56:08.970343 systemd-logind[1485]: New session 18 of user core. May 13 23:56:08.975640 systemd[1]: Started session-18.scope - Session 18 of User core. May 13 23:56:09.384357 containerd[1515]: time="2025-05-13T23:56:09.384194843Z" level=warning msg="container event discarded" container=e5b8d67f90eec10c2b465641a5ddafaa17641814886706ef9649870b8060928d type=CONTAINER_STOPPED_EVENT May 13 23:56:09.728863 containerd[1515]: time="2025-05-13T23:56:09.728732928Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1\" id:\"b5c395bdc96799903520ea2cd72efa995efd72d03c8183a3225ef899cb3e05b7\" pid:5751 exited_at:{seconds:1747180569 nanos:728179917}" May 13 23:56:09.927328 sshd[5732]: Connection closed by 139.178.89.65 port 53422 May 13 23:56:09.928733 sshd-session[5707]: pam_unix(sshd:session): session closed for user core May 13 23:56:09.933378 systemd[1]: sshd@17-91.99.1.97:22-139.178.89.65:53422.service: Deactivated successfully. May 13 23:56:09.936859 systemd[1]: session-18.scope: Deactivated successfully. May 13 23:56:09.938674 systemd-logind[1485]: Session 18 logged out. Waiting for processes to exit. May 13 23:56:09.939835 systemd-logind[1485]: Removed session 18. May 13 23:56:10.101617 systemd[1]: Started sshd@18-91.99.1.97:22-139.178.89.65:53432.service - OpenSSH per-connection server daemon (139.178.89.65:53432). May 13 23:56:11.110023 sshd[5764]: Accepted publickey for core from 139.178.89.65 port 53432 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:56:11.112483 sshd-session[5764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:56:11.120198 systemd-logind[1485]: New session 19 of user core. May 13 23:56:11.124613 systemd[1]: Started session-19.scope - Session 19 of User core. May 13 23:56:11.878016 sshd[5766]: Connection closed by 139.178.89.65 port 53432 May 13 23:56:11.879201 sshd-session[5764]: pam_unix(sshd:session): session closed for user core May 13 23:56:11.883884 systemd[1]: sshd@18-91.99.1.97:22-139.178.89.65:53432.service: Deactivated successfully. May 13 23:56:11.886334 systemd[1]: session-19.scope: Deactivated successfully. May 13 23:56:11.889630 systemd-logind[1485]: Session 19 logged out. Waiting for processes to exit. May 13 23:56:11.891650 systemd-logind[1485]: Removed session 19. 
May 13 23:56:15.387645 containerd[1515]: time="2025-05-13T23:56:15.387528299Z" level=warning msg="container event discarded" container=6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632 type=CONTAINER_CREATED_EVENT May 13 23:56:15.467476 containerd[1515]: time="2025-05-13T23:56:15.467355981Z" level=warning msg="container event discarded" container=6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632 type=CONTAINER_STARTED_EVENT May 13 23:56:17.056672 systemd[1]: Started sshd@19-91.99.1.97:22-139.178.89.65:35450.service - OpenSSH per-connection server daemon (139.178.89.65:35450). May 13 23:56:18.079471 sshd[5783]: Accepted publickey for core from 139.178.89.65 port 35450 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:56:18.081823 sshd-session[5783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:56:18.087688 systemd-logind[1485]: New session 20 of user core. May 13 23:56:18.094727 systemd[1]: Started session-20.scope - Session 20 of User core. May 13 23:56:18.862443 sshd[5785]: Connection closed by 139.178.89.65 port 35450 May 13 23:56:18.863549 sshd-session[5783]: pam_unix(sshd:session): session closed for user core May 13 23:56:18.869285 systemd[1]: sshd@19-91.99.1.97:22-139.178.89.65:35450.service: Deactivated successfully. May 13 23:56:18.871858 systemd[1]: session-20.scope: Deactivated successfully. May 13 23:56:18.874328 systemd-logind[1485]: Session 20 logged out. Waiting for processes to exit. May 13 23:56:18.875891 systemd-logind[1485]: Removed session 20. May 13 23:56:21.376429 containerd[1515]: time="2025-05-13T23:56:21.376280480Z" level=warning msg="container event discarded" container=82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9 type=CONTAINER_CREATED_EVENT May 13 23:56:21.376429 containerd[1515]: time="2025-05-13T23:56:21.376357162Z" level=warning msg="container event discarded" container=82330f81334b068898c2fa0b3601827e31d0f6ef794bb1c3cd797955b0f804a9 type=CONTAINER_STARTED_EVENT May 13 23:56:21.402722 containerd[1515]: time="2025-05-13T23:56:21.402639011Z" level=warning msg="container event discarded" container=e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e type=CONTAINER_CREATED_EVENT May 13 23:56:21.402722 containerd[1515]: time="2025-05-13T23:56:21.402724293Z" level=warning msg="container event discarded" container=e6df9f84214acb3f3f8fcccd3fa80d22ad29ee939103f182785640e112d9005e type=CONTAINER_STARTED_EVENT May 13 23:56:22.313605 containerd[1515]: time="2025-05-13T23:56:22.313491705Z" level=warning msg="container event discarded" container=316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c type=CONTAINER_CREATED_EVENT May 13 23:56:22.313605 containerd[1515]: time="2025-05-13T23:56:22.313595307Z" level=warning msg="container event discarded" container=316aca57eac91f43c7f8a53bec6d148949862a3d6dfae24e59268ded0093648c type=CONTAINER_STARTED_EVENT May 13 23:56:22.349168 containerd[1515]: time="2025-05-13T23:56:22.349057384Z" level=warning msg="container event discarded" container=768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc type=CONTAINER_CREATED_EVENT May 13 23:56:22.397425 containerd[1515]: time="2025-05-13T23:56:22.397328161Z" level=warning msg="container event discarded" container=c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2 type=CONTAINER_CREATED_EVENT May 13 23:56:22.397425 containerd[1515]: time="2025-05-13T23:56:22.397432203Z" level=warning msg="container event discarded" 
container=c88df53653c340258487d4c13b0d8f891833f19b4267a16e9b7981979928f9d2 type=CONTAINER_STARTED_EVENT May 13 23:56:22.438251 containerd[1515]: time="2025-05-13T23:56:22.438113386Z" level=warning msg="container event discarded" container=bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a type=CONTAINER_CREATED_EVENT May 13 23:56:22.449564 containerd[1515]: time="2025-05-13T23:56:22.449455975Z" level=warning msg="container event discarded" container=768d0b2397de8324b673d98725cba11b864bc006fd29040fd7dc115227974abc type=CONTAINER_STARTED_EVENT May 13 23:56:22.520060 containerd[1515]: time="2025-05-13T23:56:22.519980482Z" level=warning msg="container event discarded" container=bd02ff5b879a1f493bf7007e62d3c35e7d33d9eb9005ce5378b3cc427dcb392a type=CONTAINER_STARTED_EVENT May 13 23:56:24.039520 systemd[1]: Started sshd@20-91.99.1.97:22-139.178.89.65:35464.service - OpenSSH per-connection server daemon (139.178.89.65:35464). May 13 23:56:24.516998 containerd[1515]: time="2025-05-13T23:56:24.516921623Z" level=warning msg="container event discarded" container=4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885 type=CONTAINER_CREATED_EVENT May 13 23:56:24.518330 containerd[1515]: time="2025-05-13T23:56:24.518124568Z" level=warning msg="container event discarded" container=4585f55f8a8d71791ccc9addb3dd6ac56bf78bab50dc6c32d1155dfea9658885 type=CONTAINER_STARTED_EVENT May 13 23:56:24.577827 containerd[1515]: time="2025-05-13T23:56:24.577671383Z" level=warning msg="container event discarded" container=23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d type=CONTAINER_CREATED_EVENT May 13 23:56:24.577827 containerd[1515]: time="2025-05-13T23:56:24.577745545Z" level=warning msg="container event discarded" container=23da35459384e486723d2143e47afee950d8dc80e277c34307d162253dd6414d type=CONTAINER_STARTED_EVENT May 13 23:56:24.733594 containerd[1515]: time="2025-05-13T23:56:24.733493363Z" level=warning msg="container event discarded" container=c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d type=CONTAINER_CREATED_EVENT May 13 23:56:24.849035 containerd[1515]: time="2025-05-13T23:56:24.848839237Z" level=warning msg="container event discarded" container=c7bc7aa19970939e74e5ba1f141beb81d6fea5509774940edeb7cddc4fbff64d type=CONTAINER_STARTED_EVENT May 13 23:56:25.080817 sshd[5800]: Accepted publickey for core from 139.178.89.65 port 35464 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:56:25.083194 sshd-session[5800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:56:25.088876 systemd-logind[1485]: New session 21 of user core. May 13 23:56:25.093604 systemd[1]: Started session-21.scope - Session 21 of User core. May 13 23:56:25.862397 sshd[5802]: Connection closed by 139.178.89.65 port 35464 May 13 23:56:25.863572 sshd-session[5800]: pam_unix(sshd:session): session closed for user core May 13 23:56:25.870828 systemd[1]: sshd@20-91.99.1.97:22-139.178.89.65:35464.service: Deactivated successfully. May 13 23:56:25.871029 systemd-logind[1485]: Session 21 logged out. Waiting for processes to exit. May 13 23:56:25.873177 systemd[1]: session-21.scope: Deactivated successfully. May 13 23:56:25.876714 systemd-logind[1485]: Removed session 21. 
May 13 23:56:26.327604 containerd[1515]: time="2025-05-13T23:56:26.327108908Z" level=warning msg="container event discarded" container=659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60 type=CONTAINER_CREATED_EVENT May 13 23:56:26.444589 containerd[1515]: time="2025-05-13T23:56:26.444523684Z" level=warning msg="container event discarded" container=659d5184440c7d9c71dab1ca594c210ad76f7a8c56eeab2e2930e38cdce0ed60 type=CONTAINER_STARTED_EVENT May 13 23:56:29.729108 containerd[1515]: time="2025-05-13T23:56:29.728923903Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e216e5e38dfeaed7e9085644f5c11b62a72e72b15f966089d9009ebbc63b632\" id:\"d674c79f2e4740bdac87a4acb349c23a62841a0565c4c05c9f1a0cf3b250e409\" pid:5825 exited_at:{seconds:1747180589 nanos:728560095}" May 13 23:56:30.499981 containerd[1515]: time="2025-05-13T23:56:30.499473830Z" level=warning msg="container event discarded" container=9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1 type=CONTAINER_CREATED_EVENT May 13 23:56:30.620344 containerd[1515]: time="2025-05-13T23:56:30.620218634Z" level=warning msg="container event discarded" container=9792211b68a7146313562872544b61aada9a793ff21785d51cf55ea980c7f7e1 type=CONTAINER_STARTED_EVENT May 13 23:56:30.887431 containerd[1515]: time="2025-05-13T23:56:30.887229216Z" level=warning msg="container event discarded" container=7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d type=CONTAINER_CREATED_EVENT May 13 23:56:31.014866 containerd[1515]: time="2025-05-13T23:56:31.014704002Z" level=warning msg="container event discarded" container=7e81c402feaaf70fb3e88300df64e7a3a648b6bbe17e500b24366f933325646d type=CONTAINER_STARTED_EVENT May 13 23:56:32.618633 containerd[1515]: time="2025-05-13T23:56:32.618516824Z" level=warning msg="container event discarded" container=82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033 type=CONTAINER_CREATED_EVENT May 13 23:56:32.764174 containerd[1515]: time="2025-05-13T23:56:32.764083010Z" level=warning msg="container event discarded" container=82cfd5d6a9eca1665a1a5f3f340c67ff7b5957dcaef199d249b383f3e8492033 type=CONTAINER_STARTED_EVENT