May 13 23:48:32.888675 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
May 13 23:48:32.888701 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue May 13 22:16:18 -00 2025
May 13 23:48:32.888712 kernel: KASLR enabled
May 13 23:48:32.888718 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
May 13 23:48:32.888724 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
May 13 23:48:32.888730 kernel: random: crng init done
May 13 23:48:32.888737 kernel: secureboot: Secure boot disabled
May 13 23:48:32.888743 kernel: ACPI: Early table checksum verification disabled
May 13 23:48:32.888749 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
May 13 23:48:32.888757 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
May 13 23:48:32.888764 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888770 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888776 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888782 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888790 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888798 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888805 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888811 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888818 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 13 23:48:32.888824 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
May 13 23:48:32.888830 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
May 13 23:48:32.888837 kernel: NUMA: Failed to initialise from firmware
May 13 23:48:32.888843 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
May 13 23:48:32.888850 kernel: NUMA: NODE_DATA [mem 0x13966f800-0x139674fff]
May 13 23:48:32.888856 kernel: Zone ranges:
May 13 23:48:32.888864 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
May 13 23:48:32.888870 kernel: DMA32 empty
May 13 23:48:32.888877 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
May 13 23:48:32.888883 kernel: Movable zone start for each node
May 13 23:48:32.888890 kernel: Early memory node ranges
May 13 23:48:32.888896 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
May 13 23:48:32.888902 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
May 13 23:48:32.888909 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
May 13 23:48:32.888915 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
May 13 23:48:32.888922 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
May 13 23:48:32.888928 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
May 13 23:48:32.888935 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
May 13 23:48:32.888942 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
May 13 23:48:32.888949 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
May 13 23:48:32.888955 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
May 13 23:48:32.888965 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
May 13 23:48:32.889019 kernel: psci: probing for conduit method from ACPI.
May 13 23:48:32.889028 kernel: psci: PSCIv1.1 detected in firmware.
May 13 23:48:32.889037 kernel: psci: Using standard PSCI v0.2 function IDs
May 13 23:48:32.889062 kernel: psci: Trusted OS migration not required
May 13 23:48:32.889069 kernel: psci: SMC Calling Convention v1.1
May 13 23:48:32.889076 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
May 13 23:48:32.889083 kernel: percpu: Embedded 31 pages/cpu s86632 r8192 d32152 u126976
May 13 23:48:32.889090 kernel: pcpu-alloc: s86632 r8192 d32152 u126976 alloc=31*4096
May 13 23:48:32.889097 kernel: pcpu-alloc: [0] 0 [0] 1
May 13 23:48:32.889104 kernel: Detected PIPT I-cache on CPU0
May 13 23:48:32.889110 kernel: CPU features: detected: GIC system register CPU interface
May 13 23:48:32.889117 kernel: CPU features: detected: Hardware dirty bit management
May 13 23:48:32.889127 kernel: CPU features: detected: Spectre-v4
May 13 23:48:32.889134 kernel: CPU features: detected: Spectre-BHB
May 13 23:48:32.889141 kernel: CPU features: kernel page table isolation forced ON by KASLR
May 13 23:48:32.889147 kernel: CPU features: detected: Kernel page table isolation (KPTI)
May 13 23:48:32.889154 kernel: CPU features: detected: ARM erratum 1418040
May 13 23:48:32.889161 kernel: CPU features: detected: SSBS not fully self-synchronizing
May 13 23:48:32.889168 kernel: alternatives: applying boot alternatives
May 13 23:48:32.889176 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=3174b2682629aa8ad4069807ed6fd62c10f62266ee1e150a1104f2a2fb6489b5
May 13 23:48:32.889184 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 13 23:48:32.889191 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 13 23:48:32.889198 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 13 23:48:32.889206 kernel: Fallback order for Node 0: 0
May 13 23:48:32.889213 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1008000
May 13 23:48:32.889220 kernel: Policy zone: Normal
May 13 23:48:32.889226 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 13 23:48:32.889233 kernel: software IO TLB: area num 2.
May 13 23:48:32.889240 kernel: software IO TLB: mapped [mem 0x00000000fbfff000-0x00000000fffff000] (64MB)
May 13 23:48:32.889247 kernel: Memory: 3883704K/4096000K available (10368K kernel code, 2186K rwdata, 8100K rodata, 38464K init, 897K bss, 212296K reserved, 0K cma-reserved)
May 13 23:48:32.889254 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
May 13 23:48:32.889261 kernel: rcu: Preemptible hierarchical RCU implementation.
May 13 23:48:32.889269 kernel: rcu: RCU event tracing is enabled.
May 13 23:48:32.889276 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
May 13 23:48:32.889283 kernel: Trampoline variant of Tasks RCU enabled.
May 13 23:48:32.889292 kernel: Tracing variant of Tasks RCU enabled.
May 13 23:48:32.889299 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 13 23:48:32.889305 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
May 13 23:48:32.889313 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
May 13 23:48:32.889319 kernel: GICv3: 256 SPIs implemented
May 13 23:48:32.889326 kernel: GICv3: 0 Extended SPIs implemented
May 13 23:48:32.889333 kernel: Root IRQ handler: gic_handle_irq
May 13 23:48:32.889339 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
May 13 23:48:32.889347 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
May 13 23:48:32.889353 kernel: ITS [mem 0x08080000-0x0809ffff]
May 13 23:48:32.889360 kernel: ITS@0x0000000008080000: allocated 8192 Devices @1000c0000 (indirect, esz 8, psz 64K, shr 1)
May 13 23:48:32.889369 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @1000d0000 (flat, esz 8, psz 64K, shr 1)
May 13 23:48:32.889376 kernel: GICv3: using LPI property table @0x00000001000e0000
May 13 23:48:32.889383 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000001000f0000
May 13 23:48:32.889390 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 13 23:48:32.889397 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 13 23:48:32.889404 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
May 13 23:48:32.889411 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
May 13 23:48:32.889418 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
May 13 23:48:32.889425 kernel: Console: colour dummy device 80x25
May 13 23:48:32.889432 kernel: ACPI: Core revision 20230628
May 13 23:48:32.889442 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
May 13 23:48:32.889450 kernel: pid_max: default: 32768 minimum: 301
May 13 23:48:32.889457 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
May 13 23:48:32.889464 kernel: landlock: Up and running.
May 13 23:48:32.889472 kernel: SELinux: Initializing.
May 13 23:48:32.889478 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 23:48:32.889486 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 13 23:48:32.889493 kernel: ACPI PPTT: PPTT table found, but unable to locate core 1 (1)
May 13 23:48:32.889500 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 13 23:48:32.889508 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 13 23:48:32.889520 kernel: rcu: Hierarchical SRCU implementation.
May 13 23:48:32.889527 kernel: rcu: Max phase no-delay instances is 400.
May 13 23:48:32.889534 kernel: Platform MSI: ITS@0x8080000 domain created
May 13 23:48:32.889541 kernel: PCI/MSI: ITS@0x8080000 domain created
May 13 23:48:32.889549 kernel: Remapping and enabling EFI services.
May 13 23:48:32.889556 kernel: smp: Bringing up secondary CPUs ...
May 13 23:48:32.889563 kernel: Detected PIPT I-cache on CPU1
May 13 23:48:32.889571 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
May 13 23:48:32.889578 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100100000
May 13 23:48:32.889586 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
May 13 23:48:32.889594 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
May 13 23:48:32.889606 kernel: smp: Brought up 1 node, 2 CPUs
May 13 23:48:32.889615 kernel: SMP: Total of 2 processors activated.
May 13 23:48:32.889622 kernel: CPU features: detected: 32-bit EL0 Support
May 13 23:48:32.889630 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
May 13 23:48:32.889637 kernel: CPU features: detected: Common not Private translations
May 13 23:48:32.889645 kernel: CPU features: detected: CRC32 instructions
May 13 23:48:32.889652 kernel: CPU features: detected: Enhanced Virtualization Traps
May 13 23:48:32.889660 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
May 13 23:48:32.889669 kernel: CPU features: detected: LSE atomic instructions
May 13 23:48:32.889677 kernel: CPU features: detected: Privileged Access Never
May 13 23:48:32.889684 kernel: CPU features: detected: RAS Extension Support
May 13 23:48:32.889691 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
May 13 23:48:32.889699 kernel: CPU: All CPU(s) started at EL1
May 13 23:48:32.889706 kernel: alternatives: applying system-wide alternatives
May 13 23:48:32.889713 kernel: devtmpfs: initialized
May 13 23:48:32.889723 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 13 23:48:32.889730 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
May 13 23:48:32.889738 kernel: pinctrl core: initialized pinctrl subsystem
May 13 23:48:32.889745 kernel: SMBIOS 3.0.0 present.
May 13 23:48:32.889752 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
May 13 23:48:32.889760 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 13 23:48:32.891057 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
May 13 23:48:32.891090 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
May 13 23:48:32.891099 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
May 13 23:48:32.891117 kernel: audit: initializing netlink subsys (disabled)
May 13 23:48:32.891125 kernel: audit: type=2000 audit(0.013:1): state=initialized audit_enabled=0 res=1
May 13 23:48:32.891133 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 13 23:48:32.891140 kernel: cpuidle: using governor menu
May 13 23:48:32.891150 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
May 13 23:48:32.891158 kernel: ASID allocator initialised with 32768 entries
May 13 23:48:32.891168 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 13 23:48:32.891176 kernel: Serial: AMBA PL011 UART driver
May 13 23:48:32.891183 kernel: Modules: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
May 13 23:48:32.891192 kernel: Modules: 0 pages in range for non-PLT usage
May 13 23:48:32.891200 kernel: Modules: 509232 pages in range for PLT usage
May 13 23:48:32.891209 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 13 23:48:32.891218 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
May 13 23:48:32.891227 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
May 13 23:48:32.891237 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
May 13 23:48:32.891245 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 13 23:48:32.891253 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
May 13 23:48:32.891260 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
May 13 23:48:32.891270 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
May 13 23:48:32.891277 kernel: ACPI: Added _OSI(Module Device)
May 13 23:48:32.891284 kernel: ACPI: Added _OSI(Processor Device)
May 13 23:48:32.891292 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 13 23:48:32.891299 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 13 23:48:32.891306 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 13 23:48:32.891314 kernel: ACPI: Interpreter enabled
May 13 23:48:32.891321 kernel: ACPI: Using GIC for interrupt routing
May 13 23:48:32.891329 kernel: ACPI: MCFG table detected, 1 entries
May 13 23:48:32.891338 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
May 13 23:48:32.891346 kernel: printk: console [ttyAMA0] enabled
May 13 23:48:32.891353 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 13 23:48:32.891523 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
May 13 23:48:32.891601 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
May 13 23:48:32.891669 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
May 13 23:48:32.891737 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
May 13 23:48:32.891806 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
May 13 23:48:32.891816 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
May 13 23:48:32.891824 kernel: PCI host bridge to bus 0000:00
May 13 23:48:32.891999 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
May 13 23:48:32.892088 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
May 13 23:48:32.892148 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
May 13 23:48:32.892207 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 13 23:48:32.892289 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000
May 13 23:48:32.892370 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000
May 13 23:48:32.892436 kernel: pci 0000:00:01.0: reg 0x14: [mem 0x11289000-0x11289fff]
May 13 23:48:32.892501 kernel: pci 0000:00:01.0: reg 0x20: [mem 0x8000600000-0x8000603fff 64bit pref]
May 13 23:48:32.892580 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.892646 kernel: pci 0000:00:02.0: reg 0x10: [mem 0x11288000-0x11288fff]
May 13 23:48:32.892720 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.892789 kernel: pci 0000:00:02.1: reg 0x10: [mem 0x11287000-0x11287fff]
May 13 23:48:32.892864 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.892929 kernel: pci 0000:00:02.2: reg 0x10: [mem 0x11286000-0x11286fff]
May 13 23:48:32.895328 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.895425 kernel: pci 0000:00:02.3: reg 0x10: [mem 0x11285000-0x11285fff]
May 13 23:48:32.895498 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.895570 kernel: pci 0000:00:02.4: reg 0x10: [mem 0x11284000-0x11284fff]
May 13 23:48:32.895642 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.895706 kernel: pci 0000:00:02.5: reg 0x10: [mem 0x11283000-0x11283fff]
May 13 23:48:32.895777 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.895841 kernel: pci 0000:00:02.6: reg 0x10: [mem 0x11282000-0x11282fff]
May 13 23:48:32.895911 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.896006 kernel: pci 0000:00:02.7: reg 0x10: [mem 0x11281000-0x11281fff]
May 13 23:48:32.896113 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400
May 13 23:48:32.896184 kernel: pci 0000:00:03.0: reg 0x10: [mem 0x11280000-0x11280fff]
May 13 23:48:32.896256 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002
May 13 23:48:32.896322 kernel: pci 0000:00:04.0: reg 0x10: [io 0x0000-0x0007]
May 13 23:48:32.896416 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000
May 13 23:48:32.896503 kernel: pci 0000:01:00.0: reg 0x14: [mem 0x11000000-0x11000fff]
May 13 23:48:32.896574 kernel: pci 0000:01:00.0: reg 0x20: [mem 0x8000000000-0x8000003fff 64bit pref]
May 13 23:48:32.896652 kernel: pci 0000:01:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
May 13 23:48:32.896727 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330
May 13 23:48:32.896795 kernel: pci 0000:02:00.0: reg 0x10: [mem 0x10e00000-0x10e03fff 64bit]
May 13 23:48:32.896869 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000
May 13 23:48:32.896937 kernel: pci 0000:03:00.0: reg 0x14: [mem 0x10c00000-0x10c00fff]
May 13 23:48:32.897023 kernel: pci 0000:03:00.0: reg 0x20: [mem 0x8000100000-0x8000103fff 64bit pref]
May 13 23:48:32.897123 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00
May 13 23:48:32.897194 kernel: pci 0000:04:00.0: reg 0x20: [mem 0x8000200000-0x8000203fff 64bit pref]
May 13 23:48:32.897271 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00
May 13 23:48:32.897338 kernel: pci 0000:05:00.0: reg 0x14: [mem 0x10800000-0x10800fff]
May 13 23:48:32.897410 kernel: pci 0000:05:00.0: reg 0x20: [mem 0x8000300000-0x8000303fff 64bit pref]
May 13 23:48:32.897487 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000
May 13 23:48:32.897555 kernel: pci 0000:06:00.0: reg 0x14: [mem 0x10600000-0x10600fff]
May 13 23:48:32.897621 kernel: pci 0000:06:00.0: reg 0x20: [mem 0x8000400000-0x8000403fff 64bit pref]
May 13 23:48:32.897696 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000
May 13 23:48:32.897763 kernel: pci 0000:07:00.0: reg 0x14: [mem 0x10400000-0x10400fff]
May 13 23:48:32.897830 kernel: pci 0000:07:00.0: reg 0x20: [mem 0x8000500000-0x8000503fff 64bit pref]
May 13 23:48:32.897897 kernel: pci 0000:07:00.0: reg 0x30: [mem 0xfff80000-0xffffffff pref]
May 13 23:48:32.897966 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
May 13 23:48:32.898103 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
May 13 23:48:32.898177 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
May 13 23:48:32.898246 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
May 13 23:48:32.898312 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
May 13 23:48:32.898377 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
May 13 23:48:32.898450 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
May 13 23:48:32.898514 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
May 13 23:48:32.898576 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
May 13 23:48:32.898643 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
May 13 23:48:32.898716 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
May 13 23:48:32.898783 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
May 13 23:48:32.898852 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
May 13 23:48:32.898916 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
May 13 23:48:32.900151 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
May 13 23:48:32.900263 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
May 13 23:48:32.900331 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
May 13 23:48:32.900396 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
May 13 23:48:32.900464 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
May 13 23:48:32.900528 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
May 13 23:48:32.900592 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
May 13 23:48:32.900668 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
May 13 23:48:32.900732 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
May 13 23:48:32.900797 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
May 13 23:48:32.900866 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
May 13 23:48:32.900932 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
May 13 23:48:32.901171 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
May 13 23:48:32.901251 kernel: pci 0000:00:02.0: BAR 14: assigned [mem 0x10000000-0x101fffff]
May 13 23:48:32.901315 kernel: pci 0000:00:02.0: BAR 15: assigned [mem 0x8000000000-0x80001fffff 64bit pref]
May 13 23:48:32.901386 kernel: pci 0000:00:02.1: BAR 14: assigned [mem 0x10200000-0x103fffff]
May 13 23:48:32.901449 kernel: pci 0000:00:02.1: BAR 15: assigned [mem 0x8000200000-0x80003fffff 64bit pref]
May 13 23:48:32.901515 kernel: pci 0000:00:02.2: BAR 14: assigned [mem 0x10400000-0x105fffff]
May 13 23:48:32.901579 kernel: pci 0000:00:02.2: BAR 15: assigned [mem 0x8000400000-0x80005fffff 64bit pref]
May 13 23:48:32.901644 kernel: pci 0000:00:02.3: BAR 14: assigned [mem 0x10600000-0x107fffff]
May 13 23:48:32.901712 kernel: pci 0000:00:02.3: BAR 15: assigned [mem 0x8000600000-0x80007fffff 64bit pref]
May 13 23:48:32.901785 kernel: pci 0000:00:02.4: BAR 14: assigned [mem 0x10800000-0x109fffff]
May 13 23:48:32.901848 kernel: pci 0000:00:02.4: BAR 15: assigned [mem 0x8000800000-0x80009fffff 64bit pref]
May 13 23:48:32.901913 kernel: pci 0000:00:02.5: BAR 14: assigned [mem 0x10a00000-0x10bfffff]
May 13 23:48:32.901992 kernel: pci 0000:00:02.5: BAR 15: assigned [mem 0x8000a00000-0x8000bfffff 64bit pref]
May 13 23:48:32.902077 kernel: pci 0000:00:02.6: BAR 14: assigned [mem 0x10c00000-0x10dfffff]
May 13 23:48:32.902145 kernel: pci 0000:00:02.6: BAR 15: assigned [mem 0x8000c00000-0x8000dfffff 64bit pref]
May 13 23:48:32.902214 kernel: pci 0000:00:02.7: BAR 14: assigned [mem 0x10e00000-0x10ffffff]
May 13 23:48:32.902283 kernel: pci 0000:00:02.7: BAR 15: assigned [mem 0x8000e00000-0x8000ffffff 64bit pref]
May 13 23:48:32.902347 kernel: pci 0000:00:03.0: BAR 14: assigned [mem 0x11000000-0x111fffff]
May 13 23:48:32.902411 kernel: pci 0000:00:03.0: BAR 15: assigned [mem 0x8001000000-0x80011fffff 64bit pref]
May 13 23:48:32.902505 kernel: pci 0000:00:01.0: BAR 4: assigned [mem 0x8001200000-0x8001203fff 64bit pref]
May 13 23:48:32.902577 kernel: pci 0000:00:01.0: BAR 1: assigned [mem 0x11200000-0x11200fff]
May 13 23:48:32.902642 kernel: pci 0000:00:02.0: BAR 0: assigned [mem 0x11201000-0x11201fff]
May 13 23:48:32.902706 kernel: pci 0000:00:02.0: BAR 13: assigned [io 0x1000-0x1fff]
May 13 23:48:32.902772 kernel: pci 0000:00:02.1: BAR 0: assigned [mem 0x11202000-0x11202fff]
May 13 23:48:32.902838 kernel: pci 0000:00:02.1: BAR 13: assigned [io 0x2000-0x2fff]
May 13 23:48:32.905021 kernel: pci 0000:00:02.2: BAR 0: assigned [mem 0x11203000-0x11203fff]
May 13 23:48:32.905171 kernel: pci 0000:00:02.2: BAR 13: assigned [io 0x3000-0x3fff]
May 13 23:48:32.905245 kernel: pci 0000:00:02.3: BAR 0: assigned [mem 0x11204000-0x11204fff]
May 13 23:48:32.905325 kernel: pci 0000:00:02.3: BAR 13: assigned [io 0x4000-0x4fff]
May 13 23:48:32.905407 kernel: pci 0000:00:02.4: BAR 0: assigned [mem 0x11205000-0x11205fff]
May 13 23:48:32.905475 kernel: pci 0000:00:02.4: BAR 13: assigned [io 0x5000-0x5fff]
May 13 23:48:32.905540 kernel: pci 0000:00:02.5: BAR 0: assigned [mem 0x11206000-0x11206fff]
May 13 23:48:32.905613 kernel: pci 0000:00:02.5: BAR 13: assigned [io 0x6000-0x6fff]
May 13 23:48:32.905679 kernel: pci 0000:00:02.6: BAR 0: assigned [mem 0x11207000-0x11207fff]
May 13 23:48:32.905742 kernel: pci 0000:00:02.6: BAR 13: assigned [io 0x7000-0x7fff]
May 13 23:48:32.905809 kernel: pci 0000:00:02.7: BAR 0: assigned [mem 0x11208000-0x11208fff]
May 13 23:48:32.905873 kernel: pci 0000:00:02.7: BAR 13: assigned [io 0x8000-0x8fff]
May 13 23:48:32.905940 kernel: pci 0000:00:03.0: BAR 0: assigned [mem 0x11209000-0x11209fff]
May 13 23:48:32.906125 kernel: pci 0000:00:03.0: BAR 13: assigned [io 0x9000-0x9fff]
May 13 23:48:32.906204 kernel: pci 0000:00:04.0: BAR 0: assigned [io 0xa000-0xa007]
May 13 23:48:32.906283 kernel: pci 0000:01:00.0: BAR 6: assigned [mem 0x10000000-0x1007ffff pref]
May 13 23:48:32.906349 kernel: pci 0000:01:00.0: BAR 4: assigned [mem 0x8000000000-0x8000003fff 64bit pref]
May 13 23:48:32.906414 kernel: pci 0000:01:00.0: BAR 1: assigned [mem 0x10080000-0x10080fff]
May 13 23:48:32.906479 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
May 13 23:48:32.906543 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
May 13 23:48:32.906604 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
May 13 23:48:32.906667 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
May 13 23:48:32.906740 kernel: pci 0000:02:00.0: BAR 0: assigned [mem 0x10200000-0x10203fff 64bit]
May 13 23:48:32.906806 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
May 13 23:48:32.906871 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
May 13 23:48:32.906933 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
May 13 23:48:32.908084 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
May 13 23:48:32.908179 kernel: pci 0000:03:00.0: BAR 4: assigned [mem 0x8000400000-0x8000403fff 64bit pref]
May 13 23:48:32.908246 kernel: pci 0000:03:00.0: BAR 1: assigned [mem 0x10400000-0x10400fff]
May 13 23:48:32.908310 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
May 13 23:48:32.908372 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
May 13 23:48:32.908433 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
May 13 23:48:32.908494 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
May 13 23:48:32.908567 kernel: pci 0000:04:00.0: BAR 4: assigned [mem 0x8000600000-0x8000603fff 64bit pref]
May 13 23:48:32.908634 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
May 13 23:48:32.908700 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
May 13 23:48:32.908763 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
May 13 23:48:32.908824 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
May 13 23:48:32.908896 kernel: pci 0000:05:00.0: BAR 4: assigned [mem 0x8000800000-0x8000803fff 64bit pref]
May 13 23:48:32.908962 kernel: pci 0000:05:00.0: BAR 1: assigned [mem 0x10800000-0x10800fff]
May 13 23:48:32.909254 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
May 13 23:48:32.909325 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
May 13 23:48:32.909387 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
May 13 23:48:32.909454 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
May 13 23:48:32.909525 kernel: pci 0000:06:00.0: BAR 4: assigned [mem 0x8000a00000-0x8000a03fff 64bit pref]
May 13 23:48:32.909590 kernel: pci 0000:06:00.0: BAR 1: assigned [mem 0x10a00000-0x10a00fff]
May 13 23:48:32.909654 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
May 13 23:48:32.909716 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
May 13 23:48:32.909778 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
May 13 23:48:32.909843 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
May 13 23:48:32.909913 kernel: pci 0000:07:00.0: BAR 6: assigned [mem 0x10c00000-0x10c7ffff pref]
May 13 23:48:32.909999 kernel: pci 0000:07:00.0: BAR 4: assigned [mem 0x8000c00000-0x8000c03fff 64bit pref]
May 13 23:48:32.910123 kernel: pci 0000:07:00.0: BAR 1: assigned [mem 0x10c80000-0x10c80fff]
May 13 23:48:32.910195 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
May 13 23:48:32.910260 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
May 13 23:48:32.910325 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
May 13 23:48:32.910388 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
May 13 23:48:32.910454 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
May 13 23:48:32.910521 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
May 13 23:48:32.910584 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
May 13 23:48:32.910649 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
May 13 23:48:32.910717 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
May 13 23:48:32.910780 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
May 13 23:48:32.910845 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
May 13 23:48:32.910908 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
May 13 23:48:32.911681 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
May 13 23:48:32.911794 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
May 13 23:48:32.911855 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
May 13 23:48:32.911931 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
May 13 23:48:32.912012 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
May 13 23:48:32.912136 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
May 13 23:48:32.912213 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
May 13 23:48:32.912279 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
May 13 23:48:32.912356 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
May 13 23:48:32.912425 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
May 13 23:48:32.912485 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
May 13 23:48:32.912543 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
May 13 23:48:32.912611 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
May 13 23:48:32.912683 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
May 13 23:48:32.912746 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
May 13 23:48:32.912819 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
May 13 23:48:32.912879 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
May 13 23:48:32.912941 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
May 13 23:48:32.915191 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
May 13 23:48:32.915275 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
May 13 23:48:32.915336 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
May 13 23:48:32.915404 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
May 13 23:48:32.915464 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff]
May 13 23:48:32.915521 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref]
May 13 23:48:32.915588 kernel: pci_bus 0000:08: resource 0 [io 
0x8000-0x8fff] May 13 23:48:32.915657 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] May 13 23:48:32.915718 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] May 13 23:48:32.915789 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] May 13 23:48:32.915850 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] May 13 23:48:32.915910 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] May 13 23:48:32.915920 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 May 13 23:48:32.915928 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 May 13 23:48:32.915937 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 May 13 23:48:32.915947 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 May 13 23:48:32.915954 kernel: iommu: Default domain type: Translated May 13 23:48:32.915962 kernel: iommu: DMA domain TLB invalidation policy: strict mode May 13 23:48:32.915969 kernel: efivars: Registered efivars operations May 13 23:48:32.916113 kernel: vgaarb: loaded May 13 23:48:32.916121 kernel: clocksource: Switched to clocksource arch_sys_counter May 13 23:48:32.916129 kernel: VFS: Disk quotas dquot_6.6.0 May 13 23:48:32.916136 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 13 23:48:32.916144 kernel: pnp: PnP ACPI init May 13 23:48:32.916243 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved May 13 23:48:32.916255 kernel: pnp: PnP ACPI: found 1 devices May 13 23:48:32.916263 kernel: NET: Registered PF_INET protocol family May 13 23:48:32.916270 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 13 23:48:32.916278 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 13 23:48:32.916285 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 13 23:48:32.916293 kernel: TCP established hash table entries: 32768 
(order: 6, 262144 bytes, linear) May 13 23:48:32.916300 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 13 23:48:32.916311 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 13 23:48:32.916318 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 13 23:48:32.916326 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 13 23:48:32.916333 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 13 23:48:32.916409 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) May 13 23:48:32.916420 kernel: PCI: CLS 0 bytes, default 64 May 13 23:48:32.916427 kernel: kvm [1]: HYP mode not available May 13 23:48:32.916435 kernel: Initialise system trusted keyrings May 13 23:48:32.916442 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 13 23:48:32.916451 kernel: Key type asymmetric registered May 13 23:48:32.916459 kernel: Asymmetric key parser 'x509' registered May 13 23:48:32.916466 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 13 23:48:32.916474 kernel: io scheduler mq-deadline registered May 13 23:48:32.916481 kernel: io scheduler kyber registered May 13 23:48:32.916488 kernel: io scheduler bfq registered May 13 23:48:32.916496 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 May 13 23:48:32.916564 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 May 13 23:48:32.916632 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 May 13 23:48:32.916696 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:32.916763 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 May 13 23:48:32.916827 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 May 13 23:48:32.916890 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- 
LLActRep+ May 13 23:48:32.916956 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 May 13 23:48:32.917056 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 May 13 23:48:32.917125 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:32.917193 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 May 13 23:48:32.917260 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 May 13 23:48:32.917324 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:32.917391 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 May 13 23:48:32.917460 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 May 13 23:48:32.917525 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:32.917592 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 May 13 23:48:32.917657 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 May 13 23:48:32.917721 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:32.917788 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 May 13 23:48:32.917855 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 May 13 23:48:32.917918 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:32.919127 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 May 13 23:48:32.919230 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 May 13 23:48:32.919297 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 
23:48:32.919308 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 May 13 23:48:32.919374 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 May 13 23:48:32.919453 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 May 13 23:48:32.919530 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ May 13 23:48:32.919540 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 May 13 23:48:32.919550 kernel: ACPI: button: Power Button [PWRB] May 13 23:48:32.919562 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 May 13 23:48:32.919651 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) May 13 23:48:32.919743 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) May 13 23:48:32.919758 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 13 23:48:32.919767 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 May 13 23:48:32.919833 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) May 13 23:48:32.919843 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A May 13 23:48:32.919850 kernel: thunder_xcv, ver 1.0 May 13 23:48:32.919858 kernel: thunder_bgx, ver 1.0 May 13 23:48:32.919865 kernel: nicpf, ver 1.0 May 13 23:48:32.919873 kernel: nicvf, ver 1.0 May 13 23:48:32.919951 kernel: rtc-efi rtc-efi.0: registered as rtc0 May 13 23:48:32.920174 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-05-13T23:48:32 UTC (1747180112) May 13 23:48:32.920191 kernel: hid: raw HID events driver (C) Jiri Kosina May 13 23:48:32.920199 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 counters available May 13 23:48:32.920206 kernel: watchdog: Delayed init of the lockup detector failed: -19 May 13 23:48:32.920214 kernel: watchdog: Hard watchdog permanently disabled May 13 23:48:32.920222 kernel: NET: Registered PF_INET6 protocol family May 13 23:48:32.920229 kernel: Segment 
Routing with IPv6 May 13 23:48:32.920236 kernel: In-situ OAM (IOAM) with IPv6 May 13 23:48:32.920249 kernel: NET: Registered PF_PACKET protocol family May 13 23:48:32.920256 kernel: Key type dns_resolver registered May 13 23:48:32.920263 kernel: registered taskstats version 1 May 13 23:48:32.920271 kernel: Loading compiled-in X.509 certificates May 13 23:48:32.920278 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 568a15bbab977599d8f910f319ba50c03c8a57bd' May 13 23:48:32.920286 kernel: Key type .fscrypt registered May 13 23:48:32.920293 kernel: Key type fscrypt-provisioning registered May 13 23:48:32.920300 kernel: ima: No TPM chip found, activating TPM-bypass! May 13 23:48:32.920308 kernel: ima: Allocated hash algorithm: sha1 May 13 23:48:32.920317 kernel: ima: No architecture policies found May 13 23:48:32.920325 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) May 13 23:48:32.920332 kernel: clk: Disabling unused clocks May 13 23:48:32.920339 kernel: Freeing unused kernel memory: 38464K May 13 23:48:32.920347 kernel: Run /init as init process May 13 23:48:32.920354 kernel: with arguments: May 13 23:48:32.920361 kernel: /init May 13 23:48:32.920368 kernel: with environment: May 13 23:48:32.920375 kernel: HOME=/ May 13 23:48:32.920384 kernel: TERM=linux May 13 23:48:32.920391 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 13 23:48:32.920400 systemd[1]: Successfully made /usr/ read-only. May 13 23:48:32.920411 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 23:48:32.920420 systemd[1]: Detected virtualization kvm. May 13 23:48:32.920428 systemd[1]: Detected architecture arm64. 
May 13 23:48:32.920435 systemd[1]: Running in initrd.
May 13 23:48:32.920445 systemd[1]: No hostname configured, using default hostname.
May 13 23:48:32.920453 systemd[1]: Hostname set to .
May 13 23:48:32.920460 systemd[1]: Initializing machine ID from VM UUID.
May 13 23:48:32.920468 systemd[1]: Queued start job for default target initrd.target.
May 13 23:48:32.920476 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 13 23:48:32.920484 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 13 23:48:32.920493 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 13 23:48:32.920502 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 23:48:32.920511 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 13 23:48:32.920520 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 13 23:48:32.920529 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 13 23:48:32.920537 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 13 23:48:32.920545 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:48:32.920553 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 23:48:32.920562 systemd[1]: Reached target paths.target - Path Units.
May 13 23:48:32.920571 systemd[1]: Reached target slices.target - Slice Units.
May 13 23:48:32.920580 systemd[1]: Reached target swap.target - Swaps.
May 13 23:48:32.920588 systemd[1]: Reached target timers.target - Timer Units.
May 13 23:48:32.920596 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 13 23:48:32.920604 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 13 23:48:32.920612 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 13 23:48:32.920620 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 13 23:48:32.920628 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:48:32.920636 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 23:48:32.920645 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:48:32.920654 systemd[1]: Reached target sockets.target - Socket Units.
May 13 23:48:32.920662 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 13 23:48:32.920669 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 23:48:32.920677 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 13 23:48:32.920685 systemd[1]: Starting systemd-fsck-usr.service...
May 13 23:48:32.920693 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 23:48:32.920701 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 23:48:32.920711 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:48:32.920719 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 13 23:48:32.920753 systemd-journald[236]: Collecting audit messages is disabled.
May 13 23:48:32.920774 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:48:32.920785 systemd[1]: Finished systemd-fsck-usr.service.
May 13 23:48:32.920794 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 13 23:48:32.920802 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 13 23:48:32.920809 kernel: Bridge firewalling registered
May 13 23:48:32.920817 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 23:48:32.920827 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 23:48:32.920835 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:48:32.920845 systemd-journald[236]: Journal started
May 13 23:48:32.920863 systemd-journald[236]: Runtime Journal (/run/log/journal/7944bf04907c47a6a9efb5595ba2fe6e) is 8M, max 76.6M, 68.6M free.
May 13 23:48:32.879451 systemd-modules-load[238]: Inserted module 'overlay'
May 13 23:48:32.903020 systemd-modules-load[238]: Inserted module 'br_netfilter'
May 13 23:48:32.923751 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 23:48:32.934618 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 13 23:48:32.942181 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:48:32.946180 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 23:48:32.949244 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 23:48:32.950158 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 23:48:32.969579 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:48:32.970658 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:48:32.977234 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 13 23:48:32.992395 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:48:32.997664 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 13 23:48:33.020998 dracut-cmdline[274]: dracut-dracut-053
May 13 23:48:33.022169 dracut-cmdline[274]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=3174b2682629aa8ad4069807ed6fd62c10f62266ee1e150a1104f2a2fb6489b5
May 13 23:48:33.026496 systemd-resolved[268]: Positive Trust Anchors:
May 13 23:48:33.026506 systemd-resolved[268]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 13 23:48:33.026538 systemd-resolved[268]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 13 23:48:33.036988 systemd-resolved[268]: Defaulting to hostname 'linux'.
May 13 23:48:33.038537 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 13 23:48:33.039293 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 13 23:48:33.111081 kernel: SCSI subsystem initialized
May 13 23:48:33.115004 kernel: Loading iSCSI transport class v2.0-870.
May 13 23:48:33.123067 kernel: iscsi: registered transport (tcp)
May 13 23:48:33.138048 kernel: iscsi: registered transport (qla4xxx)
May 13 23:48:33.138113 kernel: QLogic iSCSI HBA Driver
May 13 23:48:33.190445 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 13 23:48:33.192739 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 13 23:48:33.228492 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 13 23:48:33.228566 kernel: device-mapper: uevent: version 1.0.3
May 13 23:48:33.228583 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
May 13 23:48:33.285091 kernel: raid6: neonx8 gen() 14040 MB/s
May 13 23:48:33.302063 kernel: raid6: neonx4 gen() 14526 MB/s
May 13 23:48:33.319066 kernel: raid6: neonx2 gen() 12186 MB/s
May 13 23:48:33.336030 kernel: raid6: neonx1 gen() 10211 MB/s
May 13 23:48:33.353050 kernel: raid6: int64x8 gen() 6544 MB/s
May 13 23:48:33.370058 kernel: raid6: int64x4 gen() 6684 MB/s
May 13 23:48:33.387054 kernel: raid6: int64x2 gen() 5794 MB/s
May 13 23:48:33.404057 kernel: raid6: int64x1 gen() 4574 MB/s
May 13 23:48:33.404154 kernel: raid6: using algorithm neonx4 gen() 14526 MB/s
May 13 23:48:33.421071 kernel: raid6: .... xor() 12019 MB/s, rmw enabled
May 13 23:48:33.421160 kernel: raid6: using neon recovery algorithm
May 13 23:48:33.426165 kernel: xor: measuring software checksum speed
May 13 23:48:33.426258 kernel: 8regs : 21647 MB/sec
May 13 23:48:33.426280 kernel: 32regs : 21687 MB/sec
May 13 23:48:33.426320 kernel: arm64_neon : 27823 MB/sec
May 13 23:48:33.427017 kernel: xor: using function: arm64_neon (27823 MB/sec)
May 13 23:48:33.486371 kernel: Btrfs loaded, zoned=no, fsverity=no
May 13 23:48:33.501646 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 13 23:48:33.503936 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 23:48:33.535165 systemd-udevd[455]: Using default interface naming scheme 'v255'.
May 13 23:48:33.539225 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 23:48:33.543851 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 13 23:48:33.578615 dracut-pre-trigger[463]: rd.md=0: removing MD RAID activation
May 13 23:48:33.618007 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 13 23:48:33.622658 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 23:48:33.683056 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:48:33.688190 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 13 23:48:33.716101 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 13 23:48:33.720331 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 13 23:48:33.722586 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:48:33.724268 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 23:48:33.727289 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 13 23:48:33.756595 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 13 23:48:33.791002 kernel: ACPI: bus type USB registered
May 13 23:48:33.792017 kernel: usbcore: registered new interface driver usbfs
May 13 23:48:33.792076 kernel: usbcore: registered new interface driver hub
May 13 23:48:33.792995 kernel: usbcore: registered new device driver usb
May 13 23:48:33.814539 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
May 13 23:48:33.814774 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
May 13 23:48:33.819821 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
May 13 23:48:33.823229 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
May 13 23:48:33.823410 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
May 13 23:48:33.823857 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
May 13 23:48:33.825999 kernel: hub 1-0:1.0: USB hub found
May 13 23:48:33.826193 kernel: hub 1-0:1.0: 4 ports detected
May 13 23:48:33.829015 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
May 13 23:48:33.829409 kernel: hub 2-0:1.0: USB hub found
May 13 23:48:33.833378 kernel: hub 2-0:1.0: 4 ports detected
May 13 23:48:33.846119 kernel: scsi host0: Virtio SCSI HBA
May 13 23:48:33.850479 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
May 13 23:48:33.851055 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
May 13 23:48:33.855005 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 13 23:48:33.855715 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:48:33.858696 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:48:33.859337 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:48:33.859501 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:48:33.861888 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:48:33.864230 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:48:33.882998 kernel: sr 0:0:0:0: Power-on or device reset occurred
May 13 23:48:33.884233 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray
May 13 23:48:33.884414 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
May 13 23:48:33.885994 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
May 13 23:48:33.888350 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:48:33.893414 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 13 23:48:33.910785 kernel: sd 0:0:0:1: Power-on or device reset occurred
May 13 23:48:33.911012 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
May 13 23:48:33.911173 kernel: sd 0:0:0:1: [sda] Write Protect is off
May 13 23:48:33.911267 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
May 13 23:48:33.912247 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
May 13 23:48:33.920991 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
May 13 23:48:33.921050 kernel: GPT:17805311 != 80003071
May 13 23:48:33.921062 kernel: GPT:Alternate GPT header not at the end of the disk.
May 13 23:48:33.921071 kernel: GPT:17805311 != 80003071
May 13 23:48:33.921080 kernel: GPT: Use GNU Parted to correct GPT errors.
May 13 23:48:33.922067 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 13 23:48:33.924995 kernel: sd 0:0:0:1: [sda] Attached SCSI disk
May 13 23:48:33.933294 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 13 23:48:33.975999 kernel: BTRFS: device fsid ee830c17-a93d-4109-bd12-3fec8ef6763d devid 1 transid 41 /dev/sda3 scanned by (udev-worker) (511)
May 13 23:48:33.977013 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by (udev-worker) (514)
May 13 23:48:33.994786 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM.
May 13 23:48:34.006179 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT.
May 13 23:48:34.016730 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
May 13 23:48:34.024086 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A.
May 13 23:48:34.024743 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A.
May 13 23:48:34.030051 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 13 23:48:34.047891 disk-uuid[574]: Primary Header is updated.
May 13 23:48:34.047891 disk-uuid[574]: Secondary Entries is updated.
May 13 23:48:34.047891 disk-uuid[574]: Secondary Header is updated.
May 13 23:48:34.054077 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 13 23:48:34.072025 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
May 13 23:48:34.206714 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
May 13 23:48:34.206793 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
May 13 23:48:34.207124 kernel: usbcore: registered new interface driver usbhid
May 13 23:48:34.207501 kernel: usbhid: USB HID core driver
May 13 23:48:34.313028 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
May 13 23:48:34.443006 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
May 13 23:48:34.498064 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
May 13 23:48:35.069874 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9
May 13 23:48:35.069940 disk-uuid[575]: The operation has completed successfully.
May 13 23:48:35.135602 systemd[1]: disk-uuid.service: Deactivated successfully.
May 13 23:48:35.136920 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 13 23:48:35.167808 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 13 23:48:35.185952 sh[589]: Success
May 13 23:48:35.200156 kernel: device-mapper: verity: sha256 using implementation "sha256-ce"
May 13 23:48:35.262340 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 13 23:48:35.267253 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 13 23:48:35.277606 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 13 23:48:35.297003 kernel: BTRFS info (device dm-0): first mount of filesystem ee830c17-a93d-4109-bd12-3fec8ef6763d
May 13 23:48:35.297077 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
May 13 23:48:35.297105 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
May 13 23:48:35.297120 kernel: BTRFS info (device dm-0): disabling log replay at mount time
May 13 23:48:35.297845 kernel: BTRFS info (device dm-0): using free space tree
May 13 23:48:35.306045 kernel: BTRFS info (device dm-0): enabling ssd optimizations
May 13 23:48:35.307937 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 13 23:48:35.309392 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
May 13 23:48:35.311157 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 13 23:48:35.315210 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 13 23:48:35.345469 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db
May 13 23:48:35.345528 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
May 13 23:48:35.345540 kernel: BTRFS info (device sda6): using free space tree
May 13 23:48:35.352010 kernel: BTRFS info (device sda6): enabling ssd optimizations
May 13 23:48:35.352090 kernel: BTRFS info (device sda6): auto enabling async discard
May 13 23:48:35.358255 kernel: BTRFS info (device sda6): last unmount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db
May 13 23:48:35.363871 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 13 23:48:35.369215 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 13 23:48:35.473401 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 13 23:48:35.477160 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 13 23:48:35.481495 ignition[690]: Ignition 2.20.0
May 13 23:48:35.481511 ignition[690]: Stage: fetch-offline
May 13 23:48:35.481543 ignition[690]: no configs at "/usr/lib/ignition/base.d"
May 13 23:48:35.481551 ignition[690]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
May 13 23:48:35.481704 ignition[690]: parsed url from cmdline: ""
May 13 23:48:35.481707 ignition[690]: no config URL provided
May 13 23:48:35.481711 ignition[690]: reading system config file "/usr/lib/ignition/user.ign"
May 13 23:48:35.481717 ignition[690]: no config at "/usr/lib/ignition/user.ign"
May 13 23:48:35.487385 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 13 23:48:35.481722 ignition[690]: failed to fetch config: resource requires networking
May 13 23:48:35.481914 ignition[690]: Ignition finished successfully
May 13 23:48:35.509778 systemd-networkd[773]: lo: Link UP
May 13 23:48:35.509792 systemd-networkd[773]: lo: Gained carrier
May 13 23:48:35.511615 systemd-networkd[773]: Enumeration completed
May 13 23:48:35.511818 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 13 23:48:35.512890 systemd[1]: Reached target network.target - Network.
May 13 23:48:35.513921 systemd-networkd[773]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:48:35.513924 systemd-networkd[773]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 23:48:35.514637 systemd-networkd[773]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:48:35.514640 systemd-networkd[773]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 23:48:35.515324 systemd-networkd[773]: eth0: Link UP May 13 23:48:35.515327 systemd-networkd[773]: eth0: Gained carrier May 13 23:48:35.515336 systemd-networkd[773]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:35.519295 systemd-networkd[773]: eth1: Link UP May 13 23:48:35.519298 systemd-networkd[773]: eth1: Gained carrier May 13 23:48:35.519308 systemd-networkd[773]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 13 23:48:35.520020 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... May 13 23:48:35.544092 systemd-networkd[773]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1 May 13 23:48:35.552085 ignition[778]: Ignition 2.20.0 May 13 23:48:35.552693 ignition[778]: Stage: fetch May 13 23:48:35.553350 ignition[778]: no configs at "/usr/lib/ignition/base.d" May 13 23:48:35.553364 ignition[778]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:35.553475 ignition[778]: parsed url from cmdline: "" May 13 23:48:35.553478 ignition[778]: no config URL provided May 13 23:48:35.553483 ignition[778]: reading system config file "/usr/lib/ignition/user.ign" May 13 23:48:35.553491 ignition[778]: no config at "/usr/lib/ignition/user.ign" May 13 23:48:35.553582 ignition[778]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 May 13 23:48:35.554469 ignition[778]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable May 13 23:48:35.568103 systemd-networkd[773]: eth0: DHCPv4 address 188.245.195.87/32, gateway 172.31.1.1 acquired from 172.31.1.1 May 13 23:48:35.754583 ignition[778]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2 May 13 23:48:35.764584 ignition[778]: GET result: OK May 13 23:48:35.764744 ignition[778]: parsing config with SHA512: 
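The fetch stage above fails on attempt #1 ("network is unreachable") and succeeds on attempt #2 once DHCP has configured the interfaces. A retry loop of that shape can be sketched as follows (attempt count and delays are illustrative assumptions, not Ignition's actual tunables):

```python
import time

def fetch_with_retry(fetch, attempts=5, base_delay=0.2):
    """Call `fetch(attempt)` until it succeeds or attempts run out,
    sleeping between tries. Mirrors the log's attempt #1 failure /
    attempt #2 success against the metadata endpoint."""
    last_err = None
    for attempt in range(1, attempts + 1):
        try:
            return fetch(attempt)
        except OSError as err:
            last_err = err
            time.sleep(base_delay * attempt)  # linear backoff sketch
    raise last_err
```

The ~200 ms gap between the two GET attempts in the log is consistent with a short fixed retry delay while systemd-networkd finishes bringing up eth0.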
a354773131d32585737fe2ef5e38b796289575f6e261718f701dd5075c61d23ae08ddf00418ed1d2669760d1df3b7322919b47c45aa0fc91b9ae6db1e325f9a8 May 13 23:48:35.771330 unknown[778]: fetched base config from "system" May 13 23:48:35.771339 unknown[778]: fetched base config from "system" May 13 23:48:35.771742 ignition[778]: fetch: fetch complete May 13 23:48:35.771353 unknown[778]: fetched user config from "hetzner" May 13 23:48:35.771748 ignition[778]: fetch: fetch passed May 13 23:48:35.774278 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 13 23:48:35.771799 ignition[778]: Ignition finished successfully May 13 23:48:35.776209 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 13 23:48:35.801826 ignition[786]: Ignition 2.20.0 May 13 23:48:35.801837 ignition[786]: Stage: kargs May 13 23:48:35.802046 ignition[786]: no configs at "/usr/lib/ignition/base.d" May 13 23:48:35.802056 ignition[786]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:35.803063 ignition[786]: kargs: kargs passed May 13 23:48:35.803129 ignition[786]: Ignition finished successfully May 13 23:48:35.806800 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 13 23:48:35.809179 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 13 23:48:35.833492 ignition[792]: Ignition 2.20.0 May 13 23:48:35.833502 ignition[792]: Stage: disks May 13 23:48:35.833676 ignition[792]: no configs at "/usr/lib/ignition/base.d" May 13 23:48:35.833685 ignition[792]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:35.835679 ignition[792]: disks: disks passed May 13 23:48:35.835742 ignition[792]: Ignition finished successfully May 13 23:48:35.837741 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 13 23:48:35.838886 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 13 23:48:35.839710 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
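The "parsing config with SHA512:" line logs a digest of the fetched config before it is merged. Computing such a digest is straightforward with the standard library (this is a sketch of the digest itself, not of Ignition's merge logic):

```python
import hashlib

def config_digest(config: bytes) -> str:
    """Return the SHA512 hex digest of a fetched config payload, the
    same 128-hex-character form the Ignition log prints."""
    return hashlib.sha512(config).hexdigest()
```

Logging the digest lets a provisioning run be matched against a known-good config after the fact without storing the (possibly secret-bearing) userdata itself.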
May 13 23:48:35.840867 systemd[1]: Reached target local-fs.target - Local File Systems. May 13 23:48:35.842092 systemd[1]: Reached target sysinit.target - System Initialization. May 13 23:48:35.843433 systemd[1]: Reached target basic.target - Basic System. May 13 23:48:35.846163 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 13 23:48:35.883290 systemd-fsck[800]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks May 13 23:48:35.890068 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 13 23:48:35.892193 systemd[1]: Mounting sysroot.mount - /sysroot... May 13 23:48:35.954343 kernel: EXT4-fs (sda9): mounted filesystem 9f8d74e6-c079-469f-823a-18a62077a2c7 r/w with ordered data mode. Quota mode: none. May 13 23:48:35.955469 systemd[1]: Mounted sysroot.mount - /sysroot. May 13 23:48:35.956605 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 13 23:48:35.959657 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:48:35.964136 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 13 23:48:35.966201 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 13 23:48:35.969091 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 13 23:48:35.969134 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 13 23:48:35.977485 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 13 23:48:35.981995 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
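The systemd-fsck line above is the standard e2fsck one-line summary: filesystem label, state, then used/total inodes and blocks. Pulling the numbers out for monitoring can be sketched like this:

```python
import re

# Assumed shape: "ROOT: clean, 14/1628000 files, 120691/1617920 blocks"
_SUMMARY = re.compile(
    r"(?P<label>\S+): (?P<state>\w+), "
    r"(?P<files_used>\d+)/(?P<files_total>\d+) files, "
    r"(?P<blocks_used>\d+)/(?P<blocks_total>\d+) blocks"
)

def parse_fsck_summary(line: str) -> dict:
    """Parse an e2fsck summary line into its label, state, and counters."""
    m = _SUMMARY.match(line)
    if m is None:
        raise ValueError(f"unrecognized summary: {line!r}")
    out = m.groupdict()
    for key in ("files_used", "files_total", "blocks_used", "blocks_total"):
        out[key] = int(out[key])
    return out
```

For the log's ROOT filesystem this gives roughly 7% of blocks in use, i.e. a freshly provisioned root partition.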
May 13 23:48:35.991483 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (808) May 13 23:48:35.991536 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:48:35.992048 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 13 23:48:35.993007 kernel: BTRFS info (device sda6): using free space tree May 13 23:48:35.997997 kernel: BTRFS info (device sda6): enabling ssd optimizations May 13 23:48:35.998061 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:48:36.003292 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 13 23:48:36.029899 coreos-metadata[810]: May 13 23:48:36.029 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 May 13 23:48:36.031788 coreos-metadata[810]: May 13 23:48:36.031 INFO Fetch successful May 13 23:48:36.034660 coreos-metadata[810]: May 13 23:48:36.033 INFO wrote hostname ci-4284-0-0-n-40578dffbd to /sysroot/etc/hostname May 13 23:48:36.035809 initrd-setup-root[835]: cut: /sysroot/etc/passwd: No such file or directory May 13 23:48:36.038086 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 13 23:48:36.043836 initrd-setup-root[843]: cut: /sysroot/etc/group: No such file or directory May 13 23:48:36.049740 initrd-setup-root[850]: cut: /sysroot/etc/shadow: No such file or directory May 13 23:48:36.055771 initrd-setup-root[857]: cut: /sysroot/etc/gshadow: No such file or directory May 13 23:48:36.151786 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 13 23:48:36.153738 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 13 23:48:36.155135 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
May 13 23:48:36.177151 kernel: BTRFS info (device sda6): last unmount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:48:36.199939 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 13 23:48:36.207777 ignition[925]: INFO : Ignition 2.20.0 May 13 23:48:36.208938 ignition[925]: INFO : Stage: mount May 13 23:48:36.208938 ignition[925]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:48:36.208938 ignition[925]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:36.211290 ignition[925]: INFO : mount: mount passed May 13 23:48:36.212430 ignition[925]: INFO : Ignition finished successfully May 13 23:48:36.213301 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 13 23:48:36.215716 systemd[1]: Starting ignition-files.service - Ignition (files)... May 13 23:48:36.294798 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 13 23:48:36.297705 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 13 23:48:36.327039 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/sda6 scanned by mount (936) May 13 23:48:36.329501 kernel: BTRFS info (device sda6): first mount of filesystem e7b30525-8b14-4004-ad68-68a99b3959db May 13 23:48:36.329554 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm May 13 23:48:36.329587 kernel: BTRFS info (device sda6): using free space tree May 13 23:48:36.333150 kernel: BTRFS info (device sda6): enabling ssd optimizations May 13 23:48:36.333231 kernel: BTRFS info (device sda6): auto enabling async discard May 13 23:48:36.336709 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 13 23:48:36.365941 ignition[953]: INFO : Ignition 2.20.0 May 13 23:48:36.365941 ignition[953]: INFO : Stage: files May 13 23:48:36.367721 ignition[953]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:48:36.367721 ignition[953]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:36.367721 ignition[953]: DEBUG : files: compiled without relabeling support, skipping May 13 23:48:36.370208 ignition[953]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 13 23:48:36.370208 ignition[953]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 13 23:48:36.372153 ignition[953]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 13 23:48:36.372153 ignition[953]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 13 23:48:36.373943 ignition[953]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 13 23:48:36.372263 unknown[953]: wrote ssh authorized keys file for user: core May 13 23:48:36.375375 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" May 13 23:48:36.375375 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 May 13 23:48:36.464300 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 13 23:48:36.757879 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" May 13 23:48:36.759113 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 13 23:48:36.759113 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 13 
23:48:36.759113 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 13 23:48:36.759113 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 13 23:48:36.759113 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:48:36.759113 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 13 23:48:36.759113 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:48:36.759113 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 13 23:48:36.768284 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:48:36.768284 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 13 23:48:36.768284 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" May 13 23:48:36.768284 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" May 13 23:48:36.768284 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" May 13 23:48:36.768284 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.31.0-arm64.raw: attempt #1 May 13 23:48:36.954241 systemd-networkd[773]: eth0: Gained IPv6LL May 13 23:48:37.274403 systemd-networkd[773]: eth1: Gained IPv6LL May 13 23:48:37.350218 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 13 23:48:38.521776 ignition[953]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.0-arm64.raw" May 13 23:48:38.521776 ignition[953]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 13 23:48:38.527461 ignition[953]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:48:38.527461 ignition[953]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 13 23:48:38.527461 ignition[953]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 13 23:48:38.527461 ignition[953]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 13 23:48:38.527461 ignition[953]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 13 23:48:38.527461 ignition[953]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" May 13 23:48:38.527461 ignition[953]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 13 23:48:38.527461 ignition[953]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" May 13 23:48:38.527461 ignition[953]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" May 13 23:48:38.527461 
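The drop-in written above lands at the conventional systemd location: a `<unit>.d/` directory beside the unit files, containing the drop-in fragment. Building that path (relative to the `/sysroot` prefix Ignition uses in the initrd) can be sketched as:

```python
from pathlib import PurePosixPath

def dropin_path(root: str, unit: str, name: str) -> PurePosixPath:
    """Return the path for a systemd drop-in fragment for `unit`,
    e.g. /sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf.
    The etc/systemd/system prefix matches the log; helper name is my own."""
    return PurePosixPath(root) / "etc/systemd/system" / f"{unit}.d" / name
```

Drop-ins override or extend individual directives of the shipped unit without replacing the whole file, which is why the metadata service here gets a `00-custom-metadata.conf` rather than a rewritten `coreos-metadata.service`.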
ignition[953]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" May 13 23:48:38.527461 ignition[953]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" May 13 23:48:38.527461 ignition[953]: INFO : files: files passed May 13 23:48:38.527461 ignition[953]: INFO : Ignition finished successfully May 13 23:48:38.527348 systemd[1]: Finished ignition-files.service - Ignition (files). May 13 23:48:38.533630 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 13 23:48:38.536220 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 13 23:48:38.556250 systemd[1]: ignition-quench.service: Deactivated successfully. May 13 23:48:38.556711 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 13 23:48:38.564548 initrd-setup-root-after-ignition[983]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 23:48:38.564548 initrd-setup-root-after-ignition[983]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 13 23:48:38.568083 initrd-setup-root-after-ignition[987]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 13 23:48:38.572049 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 23:48:38.573346 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 13 23:48:38.575361 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 13 23:48:38.618909 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 13 23:48:38.619178 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 13 23:48:38.622111 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
May 13 23:48:38.623570 systemd[1]: Reached target initrd.target - Initrd Default Target. May 13 23:48:38.625268 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 13 23:48:38.627193 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 13 23:48:38.659167 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 23:48:38.663575 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 13 23:48:38.691611 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 13 23:48:38.692438 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 13 23:48:38.693212 systemd[1]: Stopped target timers.target - Timer Units. May 13 23:48:38.694413 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 13 23:48:38.694547 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 13 23:48:38.697149 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 13 23:48:38.697765 systemd[1]: Stopped target basic.target - Basic System. May 13 23:48:38.698774 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 13 23:48:38.699937 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 13 23:48:38.701197 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 13 23:48:38.702142 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 13 23:48:38.703183 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 13 23:48:38.704421 systemd[1]: Stopped target sysinit.target - System Initialization. May 13 23:48:38.705535 systemd[1]: Stopped target local-fs.target - Local File Systems. May 13 23:48:38.706779 systemd[1]: Stopped target swap.target - Swaps. 
May 13 23:48:38.707817 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 13 23:48:38.707947 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 13 23:48:38.709401 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 13 23:48:38.710112 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 13 23:48:38.711079 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 13 23:48:38.711157 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 23:48:38.712203 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 13 23:48:38.712335 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 13 23:48:38.713832 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 13 23:48:38.713955 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 13 23:48:38.715361 systemd[1]: ignition-files.service: Deactivated successfully. May 13 23:48:38.715471 systemd[1]: Stopped ignition-files.service - Ignition (files). May 13 23:48:38.716655 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. May 13 23:48:38.716760 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 13 23:48:38.720736 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 13 23:48:38.722458 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 13 23:48:38.726481 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 13 23:48:38.726649 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 13 23:48:38.729502 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. May 13 23:48:38.729623 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. 
May 13 23:48:38.738061 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 13 23:48:38.740393 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 13 23:48:38.747520 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 13 23:48:38.751531 systemd[1]: sysroot-boot.service: Deactivated successfully. May 13 23:48:38.751650 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 13 23:48:38.756916 ignition[1007]: INFO : Ignition 2.20.0 May 13 23:48:38.756916 ignition[1007]: INFO : Stage: umount May 13 23:48:38.756916 ignition[1007]: INFO : no configs at "/usr/lib/ignition/base.d" May 13 23:48:38.756916 ignition[1007]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" May 13 23:48:38.756916 ignition[1007]: INFO : umount: umount passed May 13 23:48:38.756916 ignition[1007]: INFO : Ignition finished successfully May 13 23:48:38.760947 systemd[1]: ignition-mount.service: Deactivated successfully. May 13 23:48:38.761215 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 13 23:48:38.762816 systemd[1]: ignition-disks.service: Deactivated successfully. May 13 23:48:38.762902 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 13 23:48:38.764358 systemd[1]: ignition-kargs.service: Deactivated successfully. May 13 23:48:38.764447 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 13 23:48:38.765142 systemd[1]: ignition-fetch.service: Deactivated successfully. May 13 23:48:38.765190 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 13 23:48:38.765746 systemd[1]: Stopped target network.target - Network. May 13 23:48:38.766293 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 13 23:48:38.766359 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 13 23:48:38.767390 systemd[1]: Stopped target paths.target - Path Units. 
May 13 23:48:38.768364 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 13 23:48:38.772103 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 23:48:38.774221 systemd[1]: Stopped target slices.target - Slice Units. May 13 23:48:38.775567 systemd[1]: Stopped target sockets.target - Socket Units. May 13 23:48:38.776648 systemd[1]: iscsid.socket: Deactivated successfully. May 13 23:48:38.776714 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 13 23:48:38.777965 systemd[1]: iscsiuio.socket: Deactivated successfully. May 13 23:48:38.778074 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 13 23:48:38.779033 systemd[1]: ignition-setup.service: Deactivated successfully. May 13 23:48:38.779101 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 13 23:48:38.779997 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 13 23:48:38.780060 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 13 23:48:38.780915 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 13 23:48:38.780964 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 13 23:48:38.782020 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 13 23:48:38.782912 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 13 23:48:38.787059 systemd[1]: systemd-resolved.service: Deactivated successfully. May 13 23:48:38.787239 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 13 23:48:38.791244 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 13 23:48:38.791532 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 13 23:48:38.791572 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. 
May 13 23:48:38.794388 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 13 23:48:38.798273 systemd[1]: systemd-networkd.service: Deactivated successfully. May 13 23:48:38.799042 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 13 23:48:38.802267 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 13 23:48:38.802419 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 13 23:48:38.802462 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 13 23:48:38.806153 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 13 23:48:38.807910 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 13 23:48:38.808937 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 13 23:48:38.811284 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 13 23:48:38.812143 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 13 23:48:38.815234 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 13 23:48:38.815329 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 13 23:48:38.816583 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 13 23:48:38.819644 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 13 23:48:38.839119 systemd[1]: systemd-udevd.service: Deactivated successfully. May 13 23:48:38.839305 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 13 23:48:38.842219 systemd[1]: network-cleanup.service: Deactivated successfully. May 13 23:48:38.842338 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 13 23:48:38.844484 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. 
May 13 23:48:38.844563 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 13 23:48:38.845577 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 13 23:48:38.845615 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 13 23:48:38.846650 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 13 23:48:38.846702 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 13 23:48:38.848167 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 13 23:48:38.848233 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 13 23:48:38.849621 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 13 23:48:38.849675 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 13 23:48:38.852160 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 13 23:48:38.854040 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 13 23:48:38.854106 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 13 23:48:38.857609 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. May 13 23:48:38.857664 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 13 23:48:38.858663 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 13 23:48:38.858723 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 13 23:48:38.859694 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 13 23:48:38.859752 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 13 23:48:38.873253 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 13 23:48:38.873411 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. 
May 13 23:48:38.875034 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 13 23:48:38.877463 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 13 23:48:38.900421 systemd[1]: Switching root. May 13 23:48:38.934372 systemd-journald[236]: Journal stopped May 13 23:48:39.924261 systemd-journald[236]: Received SIGTERM from PID 1 (systemd). May 13 23:48:39.924346 kernel: SELinux: policy capability network_peer_controls=1 May 13 23:48:39.924363 kernel: SELinux: policy capability open_perms=1 May 13 23:48:39.924373 kernel: SELinux: policy capability extended_socket_class=1 May 13 23:48:39.924391 kernel: SELinux: policy capability always_check_network=0 May 13 23:48:39.924400 kernel: SELinux: policy capability cgroup_seclabel=1 May 13 23:48:39.924410 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 13 23:48:39.924419 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 13 23:48:39.924428 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 13 23:48:39.924437 kernel: audit: type=1403 audit(1747180119.097:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 13 23:48:39.924448 systemd[1]: Successfully loaded SELinux policy in 36.466ms. May 13 23:48:39.924472 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 11.025ms. May 13 23:48:39.924485 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 13 23:48:39.924496 systemd[1]: Detected virtualization kvm. May 13 23:48:39.924506 systemd[1]: Detected architecture arm64. May 13 23:48:39.924516 systemd[1]: Detected first boot. May 13 23:48:39.924527 systemd[1]: Hostname set to . May 13 23:48:39.924536 systemd[1]: Initializing machine ID from VM UUID. 
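The systemd banner above encodes compile-time features as `+NAME` (enabled) and `-NAME` (disabled) tokens. Splitting that banner into two sets is a one-pass scan (a convenience sketch; the token format is as shown in the log):

```python
def parse_features(banner: str) -> tuple[set, set]:
    """Split a systemd feature banner like '(+PAM +AUDIT -APPARMOR ...)'
    into (enabled, disabled) sets of feature names."""
    enabled, disabled = set(), set()
    for token in banner.split():
        token = token.strip("()")  # the banner is parenthesized in the log
        if token.startswith("+"):
            enabled.add(token[1:])
        elif token.startswith("-"):
            disabled.add(token[1:])
    return enabled, disabled
```

For the banner in this log, that would put e.g. `SELINUX` and `TPM2` in the enabled set and `APPARMOR` and `FIDO2` in the disabled set, which matches the SELinux policy load that follows.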
May 13 23:48:39.924546 zram_generator::config[1052]: No configuration found. May 13 23:48:39.924557 kernel: NET: Registered PF_VSOCK protocol family May 13 23:48:39.924569 systemd[1]: Populated /etc with preset unit settings. May 13 23:48:39.924580 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 13 23:48:39.924590 systemd[1]: initrd-switch-root.service: Deactivated successfully. May 13 23:48:39.924599 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 13 23:48:39.924610 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 13 23:48:39.924621 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 13 23:48:39.924631 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 13 23:48:39.924641 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 13 23:48:39.924651 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 13 23:48:39.924663 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 13 23:48:39.924673 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 13 23:48:39.924683 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 13 23:48:39.924694 systemd[1]: Created slice user.slice - User and Session Slice. May 13 23:48:39.924705 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 13 23:48:39.924715 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 13 23:48:39.924730 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 13 23:48:39.924740 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. 
May 13 23:48:39.924752 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 13 23:48:39.924763 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 13 23:48:39.924773 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
May 13 23:48:39.924784 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 13 23:48:39.924794 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 13 23:48:39.924804 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 13 23:48:39.924816 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 13 23:48:39.924826 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 13 23:48:39.924836 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 13 23:48:39.924850 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 13 23:48:39.924861 systemd[1]: Reached target slices.target - Slice Units.
May 13 23:48:39.924871 systemd[1]: Reached target swap.target - Swaps.
May 13 23:48:39.924882 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 13 23:48:39.924892 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 13 23:48:39.924902 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 13 23:48:39.924912 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 13 23:48:39.924924 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 13 23:48:39.924934 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 13 23:48:39.924946 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 13 23:48:39.924958 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 13 23:48:39.924983 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 13 23:48:39.925009 systemd[1]: Mounting media.mount - External Media Directory...
May 13 23:48:39.925024 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 13 23:48:39.925035 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 13 23:48:39.925045 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 13 23:48:39.925056 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 13 23:48:39.925066 systemd[1]: Reached target machines.target - Containers.
May 13 23:48:39.925076 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 13 23:48:39.925087 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:48:39.925097 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 13 23:48:39.925109 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 13 23:48:39.925119 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:48:39.925130 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 13 23:48:39.925151 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:48:39.925171 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 13 23:48:39.925184 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 23:48:39.925195 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 13 23:48:39.925205 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 13 23:48:39.925226 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 13 23:48:39.925246 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 13 23:48:39.925256 systemd[1]: Stopped systemd-fsck-usr.service.
May 13 23:48:39.925267 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:48:39.925277 systemd[1]: Starting systemd-journald.service - Journal Service...
May 13 23:48:39.925287 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 13 23:48:39.925298 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 13 23:48:39.925308 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 13 23:48:39.925318 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 13 23:48:39.925331 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 13 23:48:39.925342 systemd[1]: verity-setup.service: Deactivated successfully.
May 13 23:48:39.925352 systemd[1]: Stopped verity-setup.service.
May 13 23:48:39.925362 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 13 23:48:39.925373 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 13 23:48:39.925386 systemd[1]: Mounted media.mount - External Media Directory.
May 13 23:48:39.925396 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 13 23:48:39.925406 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 13 23:48:39.925418 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 13 23:48:39.925428 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 13 23:48:39.925440 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 13 23:48:39.925450 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 13 23:48:39.925461 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:48:39.925471 kernel: loop: module loaded
May 13 23:48:39.925481 kernel: fuse: init (API version 7.39)
May 13 23:48:39.925491 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:48:39.925501 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:48:39.925512 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:48:39.925523 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 13 23:48:39.925535 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 13 23:48:39.925545 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 23:48:39.925555 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 23:48:39.925565 kernel: ACPI: bus type drm_connector registered
May 13 23:48:39.925575 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 13 23:48:39.925586 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 13 23:48:39.925596 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 13 23:48:39.925606 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 13 23:48:39.925620 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 13 23:48:39.925630 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 13 23:48:39.925640 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 13 23:48:39.925687 systemd-journald[1118]: Collecting audit messages is disabled.
May 13 23:48:39.925714 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 13 23:48:39.925727 systemd-journald[1118]: Journal started
May 13 23:48:39.925749 systemd-journald[1118]: Runtime Journal (/run/log/journal/7944bf04907c47a6a9efb5595ba2fe6e) is 8M, max 76.6M, 68.6M free.
May 13 23:48:39.639935 systemd[1]: Queued start job for default target multi-user.target.
May 13 23:48:39.653287 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
May 13 23:48:39.654163 systemd[1]: systemd-journald.service: Deactivated successfully.
May 13 23:48:39.932066 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 13 23:48:39.932133 systemd[1]: Reached target local-fs.target - Local File Systems.
May 13 23:48:39.936657 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 13 23:48:39.946009 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 13 23:48:39.950614 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 13 23:48:39.950677 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:48:39.957315 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 13 23:48:39.961026 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 23:48:39.962301 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 13 23:48:39.966908 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 13 23:48:39.976136 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 13 23:48:39.979896 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 13 23:48:39.989014 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 13 23:48:39.993050 systemd[1]: Started systemd-journald.service - Journal Service.
May 13 23:48:39.995015 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 13 23:48:40.000603 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 13 23:48:40.001681 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 13 23:48:40.004277 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 13 23:48:40.005712 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 13 23:48:40.011676 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 13 23:48:40.034030 kernel: loop0: detected capacity change from 0 to 126448
May 13 23:48:40.035608 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 13 23:48:40.046285 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 13 23:48:40.057823 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 13 23:48:40.061415 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 13 23:48:40.077369 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 13 23:48:40.086385 systemd-tmpfiles[1156]: ACLs are not supported, ignoring.
May 13 23:48:40.086402 systemd-tmpfiles[1156]: ACLs are not supported, ignoring.
May 13 23:48:40.094354 systemd-journald[1118]: Time spent on flushing to /var/log/journal/7944bf04907c47a6a9efb5595ba2fe6e is 46.925ms for 1147 entries.
May 13 23:48:40.094354 systemd-journald[1118]: System Journal (/var/log/journal/7944bf04907c47a6a9efb5595ba2fe6e) is 8M, max 584.8M, 576.8M free.
May 13 23:48:40.167144 systemd-journald[1118]: Received client request to flush runtime journal.
May 13 23:48:40.167198 kernel: loop1: detected capacity change from 0 to 8
May 13 23:48:40.167211 kernel: loop2: detected capacity change from 0 to 103832
May 13 23:48:40.106121 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 13 23:48:40.109930 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 13 23:48:40.124294 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 13 23:48:40.153126 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 13 23:48:40.157258 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
May 13 23:48:40.172748 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 13 23:48:40.197572 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 13 23:48:40.201850 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 13 23:48:40.205120 kernel: loop3: detected capacity change from 0 to 189592
May 13 23:48:40.210117 udevadm[1194]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
May 13 23:48:40.248917 systemd-tmpfiles[1199]: ACLs are not supported, ignoring.
May 13 23:48:40.248967 systemd-tmpfiles[1199]: ACLs are not supported, ignoring.
May 13 23:48:40.256029 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 13 23:48:40.262056 kernel: loop4: detected capacity change from 0 to 126448
May 13 23:48:40.277136 kernel: loop5: detected capacity change from 0 to 8
May 13 23:48:40.281550 kernel: loop6: detected capacity change from 0 to 103832
May 13 23:48:40.298011 kernel: loop7: detected capacity change from 0 to 189592
May 13 23:48:40.328657 (sd-merge)[1203]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
May 13 23:48:40.329202 (sd-merge)[1203]: Merged extensions into '/usr'.
May 13 23:48:40.337125 systemd[1]: Reload requested from client PID 1155 ('systemd-sysext') (unit systemd-sysext.service)...
May 13 23:48:40.337191 systemd[1]: Reloading...
May 13 23:48:40.457599 zram_generator::config[1230]: No configuration found.
May 13 23:48:40.574051 ldconfig[1148]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 13 23:48:40.622421 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:48:40.683505 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 13 23:48:40.684141 systemd[1]: Reloading finished in 346 ms.
May 13 23:48:40.700059 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 13 23:48:40.701309 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 13 23:48:40.715251 systemd[1]: Starting ensure-sysext.service...
May 13 23:48:40.722137 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 13 23:48:40.745076 systemd[1]: Reload requested from client PID 1269 ('systemctl') (unit ensure-sysext.service)...
May 13 23:48:40.745206 systemd[1]: Reloading...
May 13 23:48:40.760558 systemd-tmpfiles[1270]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 13 23:48:40.760765 systemd-tmpfiles[1270]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 13 23:48:40.764689 systemd-tmpfiles[1270]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 13 23:48:40.767249 systemd-tmpfiles[1270]: ACLs are not supported, ignoring.
May 13 23:48:40.767314 systemd-tmpfiles[1270]: ACLs are not supported, ignoring.
May 13 23:48:40.771715 systemd-tmpfiles[1270]: Detected autofs mount point /boot during canonicalization of boot.
May 13 23:48:40.773239 systemd-tmpfiles[1270]: Skipping /boot
May 13 23:48:40.795086 systemd-tmpfiles[1270]: Detected autofs mount point /boot during canonicalization of boot.
May 13 23:48:40.795102 systemd-tmpfiles[1270]: Skipping /boot
May 13 23:48:40.839019 zram_generator::config[1296]: No configuration found.
May 13 23:48:40.964337 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:48:41.026359 systemd[1]: Reloading finished in 280 ms.
May 13 23:48:41.040172 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 13 23:48:41.047296 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 13 23:48:41.063141 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 13 23:48:41.068131 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 13 23:48:41.073111 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 13 23:48:41.077722 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 13 23:48:41.083380 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 13 23:48:41.088316 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 13 23:48:41.096781 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:48:41.101038 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:48:41.108658 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:48:41.119019 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 23:48:41.120871 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:48:41.121053 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:48:41.129904 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 13 23:48:41.136248 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:48:41.136462 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:48:41.136598 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:48:41.137742 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 13 23:48:41.140500 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:48:41.142168 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:48:41.145766 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 23:48:41.151440 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 13 23:48:41.156019 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 13 23:48:41.163350 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:48:41.163525 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:48:41.174317 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:48:41.176490 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:48:41.180420 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 13 23:48:41.190478 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:48:41.191286 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:48:41.191410 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:48:41.192264 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 13 23:48:41.194601 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 23:48:41.194772 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 23:48:41.196900 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 13 23:48:41.205076 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 13 23:48:41.215539 systemd[1]: Finished ensure-sysext.service.
May 13 23:48:41.224901 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 13 23:48:41.234283 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:48:41.236645 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:48:41.239368 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 13 23:48:41.240348 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 13 23:48:41.241044 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 13 23:48:41.243410 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:48:41.243579 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:48:41.245896 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 23:48:41.247093 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 13 23:48:41.254305 systemd-udevd[1342]: Using default interface naming scheme 'v255'.
May 13 23:48:41.260811 augenrules[1388]: No rules
May 13 23:48:41.262010 systemd[1]: audit-rules.service: Deactivated successfully.
May 13 23:48:41.264108 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 13 23:48:41.288126 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 13 23:48:41.294206 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 13 23:48:41.387108 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 13 23:48:41.387939 systemd[1]: Reached target time-set.target - System Time Set.
May 13 23:48:41.391113 systemd-resolved[1341]: Positive Trust Anchors:
May 13 23:48:41.391145 systemd-resolved[1341]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 13 23:48:41.391184 systemd-resolved[1341]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 13 23:48:41.403271 systemd-resolved[1341]: Using system hostname 'ci-4284-0-0-n-40578dffbd'.
May 13 23:48:41.404943 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 13 23:48:41.405686 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 13 23:48:41.435784 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
May 13 23:48:41.464086 systemd-networkd[1400]: lo: Link UP
May 13 23:48:41.464100 systemd-networkd[1400]: lo: Gained carrier
May 13 23:48:41.465113 systemd-networkd[1400]: Enumeration completed
May 13 23:48:41.465234 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 13 23:48:41.465931 systemd[1]: Reached target network.target - Network.
May 13 23:48:41.470080 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 13 23:48:41.474161 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 13 23:48:41.494016 systemd-networkd[1400]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:48:41.494028 systemd-networkd[1400]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 23:48:41.494924 systemd-networkd[1400]: eth1: Link UP
May 13 23:48:41.494932 systemd-networkd[1400]: eth1: Gained carrier
May 13 23:48:41.494953 systemd-networkd[1400]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:48:41.501306 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 13 23:48:41.513005 kernel: mousedev: PS/2 mouse device common for all mice
May 13 23:48:41.522164 systemd-networkd[1400]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
May 13 23:48:41.523060 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
May 13 23:48:41.546672 systemd-networkd[1400]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:48:41.546691 systemd-networkd[1400]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 13 23:48:41.547826 systemd-networkd[1400]: eth0: Link UP
May 13 23:48:41.547837 systemd-networkd[1400]: eth0: Gained carrier
May 13 23:48:41.547843 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
May 13 23:48:41.547856 systemd-networkd[1400]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 13 23:48:41.552584 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
May 13 23:48:41.582017 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1409)
May 13 23:48:41.605337 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped.
May 13 23:48:41.605466 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 13 23:48:41.608705 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 13 23:48:41.611166 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 13 23:48:41.614728 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 13 23:48:41.615796 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 13 23:48:41.615841 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 13 23:48:41.615863 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 13 23:48:41.627073 systemd-networkd[1400]: eth0: DHCPv4 address 188.245.195.87/32, gateway 172.31.1.1 acquired from 172.31.1.1
May 13 23:48:41.627452 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
May 13 23:48:41.628690 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
May 13 23:48:41.631560 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 13 23:48:41.631750 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 13 23:48:41.637065 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 13 23:48:41.637250 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 13 23:48:41.646581 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 13 23:48:41.649089 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 13 23:48:41.653481 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
May 13 23:48:41.659549 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 13 23:48:41.660423 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 13 23:48:41.660488 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 13 23:48:41.681685 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:48:41.696029 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0
May 13 23:48:41.697039 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 13 23:48:41.698506 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
May 13 23:48:41.698569 kernel: [drm] features: -context_init
May 13 23:48:41.700050 kernel: [drm] number of scanouts: 1
May 13 23:48:41.700286 kernel: [drm] number of cap sets: 0
May 13 23:48:41.705039 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:01.0 on minor 0
May 13 23:48:41.710313 kernel: Console: switching to colour frame buffer device 160x50
May 13 23:48:41.724019 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device
May 13 23:48:41.723903 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 13 23:48:41.726038 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:48:41.728672 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 13 23:48:41.732616 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 13 23:48:41.806060 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 13 23:48:41.832717 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
May 13 23:48:41.838752 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
May 13 23:48:41.861935 lvm[1464]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
May 13 23:48:41.887447 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
May 13 23:48:41.890953 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 13 23:48:41.892492 systemd[1]: Reached target sysinit.target - System Initialization.
May 13 23:48:41.893241 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 13 23:48:41.893949 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 13 23:48:41.894860 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 13 23:48:41.895645 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 13 23:48:41.896489 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 13 23:48:41.897729 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 13 23:48:41.897773 systemd[1]: Reached target paths.target - Path Units.
May 13 23:48:41.898298 systemd[1]: Reached target timers.target - Timer Units.
May 13 23:48:41.901462 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 13 23:48:41.903608 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 13 23:48:41.907463 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 13 23:48:41.908350 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 13 23:48:41.909453 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 13 23:48:41.918656 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 13 23:48:41.920658 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 13 23:48:41.924842 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
May 13 23:48:41.926680 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 13 23:48:41.928460 systemd[1]: Reached target sockets.target - Socket Units.
May 13 23:48:41.929302 systemd[1]: Reached target basic.target - Basic System.
May 13 23:48:41.930160 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 13 23:48:41.930204 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 13 23:48:41.933115 systemd[1]: Starting containerd.service - containerd container runtime...
May 13 23:48:41.940145 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 13 23:48:41.942449 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 13 23:48:41.946286 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 13 23:48:41.951332 lvm[1468]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
May 13 23:48:41.952489 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 13 23:48:41.953124 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 13 23:48:41.961259 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 13 23:48:41.966314 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 13 23:48:41.973016 jq[1472]: false
May 13 23:48:41.974234 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent.
May 13 23:48:41.981380 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 13 23:48:41.991221 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 13 23:48:41.997905 systemd[1]: Starting systemd-logind.service - User Login Management...
May 13 23:48:41.999878 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 13 23:48:42.000482 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 13 23:48:42.005276 systemd[1]: Starting update-engine.service - Update Engine...
May 13 23:48:42.010026 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 13 23:48:42.014219 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
May 13 23:48:42.016118 extend-filesystems[1473]: Found loop4
May 13 23:48:42.019696 extend-filesystems[1473]: Found loop5
May 13 23:48:42.019696 extend-filesystems[1473]: Found loop6
May 13 23:48:42.019696 extend-filesystems[1473]: Found loop7
May 13 23:48:42.019696 extend-filesystems[1473]: Found sda
May 13 23:48:42.019696 extend-filesystems[1473]: Found sda1
May 13 23:48:42.019696 extend-filesystems[1473]: Found sda2
May 13 23:48:42.019696 extend-filesystems[1473]: Found sda3
May 13 23:48:42.019696 extend-filesystems[1473]: Found usr
May 13 23:48:42.019696 extend-filesystems[1473]: Found sda4
May 13 23:48:42.019696 extend-filesystems[1473]: Found sda6
May 13 23:48:42.019696 extend-filesystems[1473]: Found sda7
May 13 23:48:42.019696 extend-filesystems[1473]: Found sda9
May 13 23:48:42.019696 extend-filesystems[1473]: Checking size of /dev/sda9
May 13 23:48:42.020795 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 13 23:48:42.025820 dbus-daemon[1471]: [system] SELinux support is enabled
May 13 23:48:42.021021 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 13 23:48:42.026843 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 13 23:48:42.045290 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 13 23:48:42.045353 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 13 23:48:42.054649 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 13 23:48:42.054673 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 13 23:48:42.067579 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 13 23:48:42.067899 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 13 23:48:42.074426 tar[1489]: linux-arm64/helm
May 13 23:48:42.084239 jq[1485]: true
May 13 23:48:42.084516 coreos-metadata[1470]: May 13 23:48:42.082 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1
May 13 23:48:42.085937 extend-filesystems[1473]: Resized partition /dev/sda9
May 13 23:48:42.093994 coreos-metadata[1470]: May 13 23:48:42.093 INFO Fetch successful
May 13 23:48:42.093994 coreos-metadata[1470]: May 13 23:48:42.093 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1
May 13 23:48:42.093994 coreos-metadata[1470]: May 13 23:48:42.093 INFO Fetch successful
May 13 23:48:42.096824 extend-filesystems[1510]: resize2fs 1.47.2 (1-Jan-2025)
May 13 23:48:42.098624 systemd[1]: motdgen.service: Deactivated successfully.
May 13 23:48:42.098951 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 13 23:48:42.104016 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks
May 13 23:48:42.137531 jq[1509]: true
May 13 23:48:42.143448 update_engine[1483]: I20250513 23:48:42.141419 1483 main.cc:92] Flatcar Update Engine starting
May 13 23:48:42.143467 (ntainerd)[1507]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 13 23:48:42.158259 update_engine[1483]: I20250513 23:48:42.156913 1483 update_check_scheduler.cc:74] Next update check in 4m48s
May 13 23:48:42.159404 systemd[1]: Started update-engine.service - Update Engine.
May 13 23:48:42.197171 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 13 23:48:42.232358 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1399)
May 13 23:48:42.245526 systemd-logind[1482]: New seat seat0.
May 13 23:48:42.256358 systemd-logind[1482]: Watching system buttons on /dev/input/event0 (Power Button)
May 13 23:48:42.256384 systemd-logind[1482]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
May 13 23:48:42.266016 systemd[1]: Started systemd-logind.service - User Login Management.
May 13 23:48:42.313491 kernel: EXT4-fs (sda9): resized filesystem to 9393147
May 13 23:48:42.316176 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
May 13 23:48:42.318482 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 13 23:48:42.341444 extend-filesystems[1510]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required
May 13 23:48:42.341444 extend-filesystems[1510]: old_desc_blocks = 1, new_desc_blocks = 5
May 13 23:48:42.341444 extend-filesystems[1510]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long.
May 13 23:48:42.344745 extend-filesystems[1473]: Resized filesystem in /dev/sda9
May 13 23:48:42.344745 extend-filesystems[1473]: Found sr0
May 13 23:48:42.346582 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 13 23:48:42.346962 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 13 23:48:42.363407 bash[1540]: Updated "/home/core/.ssh/authorized_keys"
May 13 23:48:42.367488 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 13 23:48:42.377906 systemd[1]: Starting sshkeys.service...
May 13 23:48:42.411346 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
May 13 23:48:42.419107 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
May 13 23:48:42.445965 locksmithd[1521]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 13 23:48:42.479399 coreos-metadata[1551]: May 13 23:48:42.479 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1
May 13 23:48:42.480902 coreos-metadata[1551]: May 13 23:48:42.480 INFO Fetch successful
May 13 23:48:42.483444 unknown[1551]: wrote ssh authorized keys file for user: core
May 13 23:48:42.530071 update-ssh-keys[1559]: Updated "/home/core/.ssh/authorized_keys"
May 13 23:48:42.531265 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys).
May 13 23:48:42.534665 systemd[1]: Finished sshkeys.service.
May 13 23:48:42.597204 containerd[1507]: time="2025-05-13T23:48:42Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 13 23:48:42.598288 containerd[1507]: time="2025-05-13T23:48:42.598239000Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1
May 13 23:48:42.631133 containerd[1507]: time="2025-05-13T23:48:42.631056360Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.32µs"
May 13 23:48:42.631133 containerd[1507]: time="2025-05-13T23:48:42.631118040Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 13 23:48:42.631261 containerd[1507]: time="2025-05-13T23:48:42.631148400Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 13 23:48:42.631428 containerd[1507]: time="2025-05-13T23:48:42.631384000Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 13 23:48:42.631428 containerd[1507]: time="2025-05-13T23:48:42.631419360Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 13 23:48:42.631529 containerd[1507]: time="2025-05-13T23:48:42.631463560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 13 23:48:42.631573 containerd[1507]: time="2025-05-13T23:48:42.631547400Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 13 23:48:42.631597 containerd[1507]: time="2025-05-13T23:48:42.631571960Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 13 23:48:42.632284 containerd[1507]: time="2025-05-13T23:48:42.632229320Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 13 23:48:42.633101 containerd[1507]: time="2025-05-13T23:48:42.632352280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 13 23:48:42.633101 containerd[1507]: time="2025-05-13T23:48:42.632386600Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 13 23:48:42.633101 containerd[1507]: time="2025-05-13T23:48:42.632398880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 13 23:48:42.633101 containerd[1507]: time="2025-05-13T23:48:42.632544560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 13 23:48:42.633101 containerd[1507]: time="2025-05-13T23:48:42.632792880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 13 23:48:42.633101 containerd[1507]: time="2025-05-13T23:48:42.632831680Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 13 23:48:42.633101 containerd[1507]: time="2025-05-13T23:48:42.632845120Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 13 23:48:42.634537 containerd[1507]: time="2025-05-13T23:48:42.634491320Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 13 23:48:42.635291 containerd[1507]: time="2025-05-13T23:48:42.635246360Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 13 23:48:42.635587 containerd[1507]: time="2025-05-13T23:48:42.635558200Z" level=info msg="metadata content store policy set" policy=shared
May 13 23:48:42.643797 containerd[1507]: time="2025-05-13T23:48:42.643748360Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 13 23:48:42.644107 containerd[1507]: time="2025-05-13T23:48:42.644080400Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 13 23:48:42.644230 containerd[1507]: time="2025-05-13T23:48:42.644211240Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 13 23:48:42.644311 containerd[1507]: time="2025-05-13T23:48:42.644293600Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 13 23:48:42.644373 containerd[1507]: time="2025-05-13T23:48:42.644358040Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 13 23:48:42.644434 containerd[1507]: time="2025-05-13T23:48:42.644418640Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 13 23:48:42.644501 containerd[1507]: time="2025-05-13T23:48:42.644485520Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 13 23:48:42.644623 containerd[1507]: time="2025-05-13T23:48:42.644548200Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 13 23:48:42.644888 containerd[1507]: time="2025-05-13T23:48:42.644569760Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.644951040Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645033040Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645091640Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645292960Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645335360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645353320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645371240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645388160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645409280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645423960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645439560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645459280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645477080Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645489240Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 13 23:48:42.646050 containerd[1507]: time="2025-05-13T23:48:42.645787040Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 13 23:48:42.646354 containerd[1507]: time="2025-05-13T23:48:42.645811400Z" level=info msg="Start snapshots syncer"
May 13 23:48:42.646354 containerd[1507]: time="2025-05-13T23:48:42.645843480Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 13 23:48:42.648641 containerd[1507]: time="2025-05-13T23:48:42.648587680Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
May 13 23:48:42.650093 containerd[1507]: time="2025-05-13T23:48:42.650053800Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
May 13 23:48:42.650192 containerd[1507]: time="2025-05-13T23:48:42.650172360Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
May 13 23:48:42.650376 containerd[1507]: time="2025-05-13T23:48:42.650352520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
May 13 23:48:42.650437 containerd[1507]: time="2025-05-13T23:48:42.650389480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
May 13 23:48:42.650437 containerd[1507]: time="2025-05-13T23:48:42.650404400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
May 13 23:48:42.650437 containerd[1507]: time="2025-05-13T23:48:42.650416040Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
May 13 23:48:42.650437 containerd[1507]: time="2025-05-13T23:48:42.650429960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
May 13 23:48:42.650507 containerd[1507]: time="2025-05-13T23:48:42.650440880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
May 13 23:48:42.650507 containerd[1507]: time="2025-05-13T23:48:42.650459720Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
May 13 23:48:42.650507 containerd[1507]: time="2025-05-13T23:48:42.650492520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
May 13 23:48:42.650558 containerd[1507]: time="2025-05-13T23:48:42.650507440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
May 13 23:48:42.650558 containerd[1507]: time="2025-05-13T23:48:42.650518680Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
May 13 23:48:42.650591 containerd[1507]: time="2025-05-13T23:48:42.650556680Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 13 23:48:42.650591 containerd[1507]: time="2025-05-13T23:48:42.650573080Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 13 23:48:42.650591 containerd[1507]: time="2025-05-13T23:48:42.650582640Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 13 23:48:42.650644 containerd[1507]: time="2025-05-13T23:48:42.650593440Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 13 23:48:42.650644 containerd[1507]: time="2025-05-13T23:48:42.650601680Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
May 13 23:48:42.650644 containerd[1507]: time="2025-05-13T23:48:42.650620680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
May 13 23:48:42.650699 containerd[1507]: time="2025-05-13T23:48:42.650644000Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
May 13 23:48:42.650852 containerd[1507]: time="2025-05-13T23:48:42.650799760Z" level=info msg="runtime interface created"
May 13 23:48:42.650852 containerd[1507]: time="2025-05-13T23:48:42.650814600Z" level=info msg="created NRI interface"
May 13 23:48:42.650852 containerd[1507]: time="2025-05-13T23:48:42.650837600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
May 13 23:48:42.650919 containerd[1507]: time="2025-05-13T23:48:42.650857840Z" level=info msg="Connect containerd service"
May 13 23:48:42.650948 containerd[1507]: time="2025-05-13T23:48:42.650918640Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this"
May 13 23:48:42.657416 containerd[1507]: time="2025-05-13T23:48:42.657344400Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
May 13 23:48:42.726727 sshd_keygen[1516]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 13 23:48:42.776384 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 13 23:48:42.781434 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 13 23:48:42.811411 systemd[1]: issuegen.service: Deactivated successfully.
May 13 23:48:42.811901 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 13 23:48:42.815489 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 13 23:48:42.838065 containerd[1507]: time="2025-05-13T23:48:42.837963400Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc
May 13 23:48:42.838065 containerd[1507]: time="2025-05-13T23:48:42.838066680Z" level=info msg=serving... address=/run/containerd/containerd.sock
May 13 23:48:42.838178 containerd[1507]: time="2025-05-13T23:48:42.838107920Z" level=info msg="Start subscribing containerd event"
May 13 23:48:42.838178 containerd[1507]: time="2025-05-13T23:48:42.838149600Z" level=info msg="Start recovering state"
May 13 23:48:42.838253 containerd[1507]: time="2025-05-13T23:48:42.838232240Z" level=info msg="Start event monitor"
May 13 23:48:42.838280 containerd[1507]: time="2025-05-13T23:48:42.838254640Z" level=info msg="Start cni network conf syncer for default"
May 13 23:48:42.838280 containerd[1507]: time="2025-05-13T23:48:42.838267000Z" level=info msg="Start streaming server"
May 13 23:48:42.838280 containerd[1507]: time="2025-05-13T23:48:42.838275480Z" level=info msg="Registered namespace \"k8s.io\" with NRI"
May 13 23:48:42.838354 containerd[1507]: time="2025-05-13T23:48:42.838283880Z" level=info msg="runtime interface starting up..."
May 13 23:48:42.838354 containerd[1507]: time="2025-05-13T23:48:42.838289920Z" level=info msg="starting plugins..."
May 13 23:48:42.838354 containerd[1507]: time="2025-05-13T23:48:42.838304280Z" level=info msg="Synchronizing NRI (plugin) with current runtime state"
May 13 23:48:42.840014 containerd[1507]: time="2025-05-13T23:48:42.838425400Z" level=info msg="containerd successfully booted in 0.241754s"
May 13 23:48:42.838529 systemd[1]: Started containerd.service - containerd container runtime.
May 13 23:48:42.841392 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 13 23:48:42.845471 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 13 23:48:42.849587 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0.
May 13 23:48:42.850495 systemd[1]: Reached target getty.target - Login Prompts.
May 13 23:48:42.898207 tar[1489]: linux-arm64/LICENSE
May 13 23:48:42.899070 tar[1489]: linux-arm64/README.md
May 13 23:48:42.920650 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin.
May 13 23:48:42.970373 systemd-networkd[1400]: eth1: Gained IPv6LL
May 13 23:48:42.971397 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
May 13 23:48:42.975444 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured.
May 13 23:48:42.976770 systemd[1]: Reached target network-online.target - Network is Online.
May 13 23:48:42.979888 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:48:42.983423 systemd[1]: Starting nvidia.service - NVIDIA Configure Service...
May 13 23:48:43.011924 systemd[1]: Finished nvidia.service - NVIDIA Configure Service.
May 13 23:48:43.354494 systemd-networkd[1400]: eth0: Gained IPv6LL
May 13 23:48:43.355100 systemd-timesyncd[1376]: Network configuration changed, trying to establish connection.
May 13 23:48:43.713366 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:48:43.715733 systemd[1]: Reached target multi-user.target - Multi-User System.
May 13 23:48:43.721619 systemd[1]: Startup finished in 777ms (kernel) + 6.416s (initrd) + 4.660s (userspace) = 11.854s.
May 13 23:48:43.725473 (kubelet)[1613]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:48:44.313636 kubelet[1613]: E0513 23:48:44.313555 1613 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:48:44.316468 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:48:44.316634 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:48:44.317808 systemd[1]: kubelet.service: Consumed 878ms CPU time, 232M memory peak.
May 13 23:48:54.567415 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 13 23:48:54.570755 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:48:54.714723 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:48:54.730382 (kubelet)[1631]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:48:54.788265 kubelet[1631]: E0513 23:48:54.788196 1631 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:48:54.792534 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:48:54.792842 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:48:54.793340 systemd[1]: kubelet.service: Consumed 179ms CPU time, 94.1M memory peak.
May 13 23:49:05.044504 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 13 23:49:05.049426 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:49:05.213113 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:49:05.225536 (kubelet)[1646]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 13 23:49:05.277777 kubelet[1646]: E0513 23:49:05.277649 1646 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 13 23:49:05.279807 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 13 23:49:05.279999 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 13 23:49:05.280321 systemd[1]: kubelet.service: Consumed 168ms CPU time, 95M memory peak.
May 13 23:49:13.624334 systemd-timesyncd[1376]: Contacted time server 31.209.85.242:123 (2.flatcar.pool.ntp.org).
May 13 23:49:13.624440 systemd-timesyncd[1376]: Initial clock synchronization to Tue 2025-05-13 23:49:13.378087 UTC.
May 13 23:49:15.530880 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 13 23:49:15.533070 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:49:15.702873 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:49:15.713673 (kubelet)[1661]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:15.757799 kubelet[1661]: E0513 23:49:15.757330 1661 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:15.761705 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:15.762052 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:15.762764 systemd[1]: kubelet.service: Consumed 167ms CPU time, 96.4M memory peak. May 13 23:49:26.012663 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 13 23:49:26.015097 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:26.162488 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:49:26.173389 (kubelet)[1675]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:26.222783 kubelet[1675]: E0513 23:49:26.222718 1675 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:26.225601 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:26.225816 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:26.226434 systemd[1]: kubelet.service: Consumed 171ms CPU time, 96.2M memory peak. 
May 13 23:49:27.904674 update_engine[1483]: I20250513 23:49:27.904031 1483 update_attempter.cc:509] Updating boot flags... May 13 23:49:27.958118 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1692) May 13 23:49:28.052010 kernel: BTRFS warning: duplicate device /dev/sda3 devid 1 generation 41 scanned by (udev-worker) (1694) May 13 23:49:36.469482 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. May 13 23:49:36.475840 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:36.624073 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:49:36.639939 (kubelet)[1709]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:36.687100 kubelet[1709]: E0513 23:49:36.686936 1709 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:36.689478 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:36.689639 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:36.690133 systemd[1]: kubelet.service: Consumed 165ms CPU time, 96.6M memory peak. May 13 23:49:46.719361 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. May 13 23:49:46.723263 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:46.864482 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 23:49:46.875534 (kubelet)[1724]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:46.919055 kubelet[1724]: E0513 23:49:46.918123 1724 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:46.921030 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:46.921224 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:46.921853 systemd[1]: kubelet.service: Consumed 156ms CPU time, 94.5M memory peak. May 13 23:49:56.969074 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. May 13 23:49:56.972332 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:49:57.132735 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:49:57.143903 (kubelet)[1739]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:49:57.183997 kubelet[1739]: E0513 23:49:57.183452 1739 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:49:57.186648 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:49:57.187035 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:49:57.188281 systemd[1]: kubelet.service: Consumed 169ms CPU time, 93.5M memory peak. 
May 13 23:50:07.218754 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 8. May 13 23:50:07.221063 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:07.378025 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:07.392921 (kubelet)[1753]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:50:07.446019 kubelet[1753]: E0513 23:50:07.445879 1753 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:50:07.451518 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:50:07.453450 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:50:07.454425 systemd[1]: kubelet.service: Consumed 179ms CPU time, 96.5M memory peak. May 13 23:50:17.469073 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 9. May 13 23:50:17.472593 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:17.624571 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 13 23:50:17.636646 (kubelet)[1768]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:50:17.700430 kubelet[1768]: E0513 23:50:17.700340 1768 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:50:17.702603 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:50:17.702907 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:50:17.703584 systemd[1]: kubelet.service: Consumed 181ms CPU time, 94.4M memory peak. May 13 23:50:24.071248 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 13 23:50:24.073836 systemd[1]: Started sshd@0-188.245.195.87:22-139.178.89.65:55848.service - OpenSSH per-connection server daemon (139.178.89.65:55848). May 13 23:50:25.101605 sshd[1776]: Accepted publickey for core from 139.178.89.65 port 55848 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:25.105178 sshd-session[1776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:25.115769 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 13 23:50:25.117917 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 13 23:50:25.126921 systemd-logind[1482]: New session 1 of user core. May 13 23:50:25.152341 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 13 23:50:25.157479 systemd[1]: Starting user@500.service - User Manager for UID 500... 
May 13 23:50:25.174830 (systemd)[1780]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 13 23:50:25.178232 systemd-logind[1482]: New session c1 of user core. May 13 23:50:25.321005 systemd[1780]: Queued start job for default target default.target. May 13 23:50:25.334173 systemd[1780]: Created slice app.slice - User Application Slice. May 13 23:50:25.334237 systemd[1780]: Reached target paths.target - Paths. May 13 23:50:25.334319 systemd[1780]: Reached target timers.target - Timers. May 13 23:50:25.335997 systemd[1780]: Starting dbus.socket - D-Bus User Message Bus Socket... May 13 23:50:25.349357 systemd[1780]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 13 23:50:25.349571 systemd[1780]: Reached target sockets.target - Sockets. May 13 23:50:25.349651 systemd[1780]: Reached target basic.target - Basic System. May 13 23:50:25.349724 systemd[1780]: Reached target default.target - Main User Target. May 13 23:50:25.349776 systemd[1780]: Startup finished in 162ms. May 13 23:50:25.350954 systemd[1]: Started user@500.service - User Manager for UID 500. May 13 23:50:25.361637 systemd[1]: Started session-1.scope - Session 1 of User core. May 13 23:50:26.064374 systemd[1]: Started sshd@1-188.245.195.87:22-139.178.89.65:55852.service - OpenSSH per-connection server daemon (139.178.89.65:55852). May 13 23:50:27.107566 sshd[1791]: Accepted publickey for core from 139.178.89.65 port 55852 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:27.109554 sshd-session[1791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:27.115671 systemd-logind[1482]: New session 2 of user core. May 13 23:50:27.123308 systemd[1]: Started session-2.scope - Session 2 of User core. May 13 23:50:27.718651 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 10. May 13 23:50:27.720507 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 13 23:50:27.799319 sshd[1793]: Connection closed by 139.178.89.65 port 55852 May 13 23:50:27.801243 sshd-session[1791]: pam_unix(sshd:session): session closed for user core May 13 23:50:27.809364 systemd[1]: sshd@1-188.245.195.87:22-139.178.89.65:55852.service: Deactivated successfully. May 13 23:50:27.812433 systemd[1]: session-2.scope: Deactivated successfully. May 13 23:50:27.815845 systemd-logind[1482]: Session 2 logged out. Waiting for processes to exit. May 13 23:50:27.817737 systemd-logind[1482]: Removed session 2. May 13 23:50:27.864112 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:27.874521 (kubelet)[1806]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:50:27.916726 kubelet[1806]: E0513 23:50:27.916645 1806 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:50:27.920094 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:50:27.920365 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:50:27.921054 systemd[1]: kubelet.service: Consumed 162ms CPU time, 96.1M memory peak. May 13 23:50:27.973963 systemd[1]: Started sshd@2-188.245.195.87:22-139.178.89.65:50836.service - OpenSSH per-connection server daemon (139.178.89.65:50836). May 13 23:50:28.994465 sshd[1815]: Accepted publickey for core from 139.178.89.65 port 50836 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:28.996868 sshd-session[1815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:29.005799 systemd-logind[1482]: New session 3 of user core. 
May 13 23:50:29.012567 systemd[1]: Started session-3.scope - Session 3 of User core. May 13 23:50:29.680072 sshd[1817]: Connection closed by 139.178.89.65 port 50836 May 13 23:50:29.681089 sshd-session[1815]: pam_unix(sshd:session): session closed for user core May 13 23:50:29.688090 systemd[1]: sshd@2-188.245.195.87:22-139.178.89.65:50836.service: Deactivated successfully. May 13 23:50:29.691514 systemd[1]: session-3.scope: Deactivated successfully. May 13 23:50:29.692963 systemd-logind[1482]: Session 3 logged out. Waiting for processes to exit. May 13 23:50:29.694650 systemd-logind[1482]: Removed session 3. May 13 23:50:29.858864 systemd[1]: Started sshd@3-188.245.195.87:22-139.178.89.65:50842.service - OpenSSH per-connection server daemon (139.178.89.65:50842). May 13 23:50:30.877938 sshd[1823]: Accepted publickey for core from 139.178.89.65 port 50842 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:30.880164 sshd-session[1823]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:30.886123 systemd-logind[1482]: New session 4 of user core. May 13 23:50:30.894312 systemd[1]: Started session-4.scope - Session 4 of User core. May 13 23:50:31.575014 sshd[1825]: Connection closed by 139.178.89.65 port 50842 May 13 23:50:31.574311 sshd-session[1823]: pam_unix(sshd:session): session closed for user core May 13 23:50:31.578725 systemd[1]: sshd@3-188.245.195.87:22-139.178.89.65:50842.service: Deactivated successfully. May 13 23:50:31.580646 systemd[1]: session-4.scope: Deactivated successfully. May 13 23:50:31.582651 systemd-logind[1482]: Session 4 logged out. Waiting for processes to exit. May 13 23:50:31.583779 systemd-logind[1482]: Removed session 4. May 13 23:50:31.753461 systemd[1]: Started sshd@4-188.245.195.87:22-139.178.89.65:50854.service - OpenSSH per-connection server daemon (139.178.89.65:50854). 
May 13 23:50:32.781877 sshd[1831]: Accepted publickey for core from 139.178.89.65 port 50854 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:32.783902 sshd-session[1831]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:32.791582 systemd-logind[1482]: New session 5 of user core. May 13 23:50:32.794265 systemd[1]: Started session-5.scope - Session 5 of User core. May 13 23:50:33.325947 sudo[1834]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 13 23:50:33.326327 sudo[1834]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:50:33.344553 sudo[1834]: pam_unix(sudo:session): session closed for user root May 13 23:50:33.510538 sshd[1833]: Connection closed by 139.178.89.65 port 50854 May 13 23:50:33.510334 sshd-session[1831]: pam_unix(sshd:session): session closed for user core May 13 23:50:33.516571 systemd[1]: sshd@4-188.245.195.87:22-139.178.89.65:50854.service: Deactivated successfully. May 13 23:50:33.518550 systemd[1]: session-5.scope: Deactivated successfully. May 13 23:50:33.520715 systemd-logind[1482]: Session 5 logged out. Waiting for processes to exit. May 13 23:50:33.522020 systemd-logind[1482]: Removed session 5. May 13 23:50:33.676648 systemd[1]: Started sshd@5-188.245.195.87:22-139.178.89.65:50856.service - OpenSSH per-connection server daemon (139.178.89.65:50856). May 13 23:50:34.676834 sshd[1840]: Accepted publickey for core from 139.178.89.65 port 50856 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:34.679950 sshd-session[1840]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:34.689078 systemd-logind[1482]: New session 6 of user core. May 13 23:50:34.694182 systemd[1]: Started session-6.scope - Session 6 of User core. 
May 13 23:50:35.200933 sudo[1844]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 13 23:50:35.201727 sudo[1844]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:50:35.206872 sudo[1844]: pam_unix(sudo:session): session closed for user root May 13 23:50:35.214847 sudo[1843]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 13 23:50:35.215345 sudo[1843]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:50:35.228044 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 13 23:50:35.278453 augenrules[1866]: No rules May 13 23:50:35.280315 systemd[1]: audit-rules.service: Deactivated successfully. May 13 23:50:35.280561 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 13 23:50:35.282301 sudo[1843]: pam_unix(sudo:session): session closed for user root May 13 23:50:35.441633 sshd[1842]: Connection closed by 139.178.89.65 port 50856 May 13 23:50:35.442633 sshd-session[1840]: pam_unix(sshd:session): session closed for user core May 13 23:50:35.447579 systemd-logind[1482]: Session 6 logged out. Waiting for processes to exit. May 13 23:50:35.447807 systemd[1]: sshd@5-188.245.195.87:22-139.178.89.65:50856.service: Deactivated successfully. May 13 23:50:35.450186 systemd[1]: session-6.scope: Deactivated successfully. May 13 23:50:35.453028 systemd-logind[1482]: Removed session 6. May 13 23:50:35.614685 systemd[1]: Started sshd@6-188.245.195.87:22-139.178.89.65:50870.service - OpenSSH per-connection server daemon (139.178.89.65:50870). 
May 13 23:50:36.624623 sshd[1875]: Accepted publickey for core from 139.178.89.65 port 50870 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI May 13 23:50:36.627120 sshd-session[1875]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 13 23:50:36.637033 systemd-logind[1482]: New session 7 of user core. May 13 23:50:36.646421 systemd[1]: Started session-7.scope - Session 7 of User core. May 13 23:50:37.152487 sudo[1878]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 13 23:50:37.152961 sudo[1878]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 13 23:50:37.501005 systemd[1]: Starting docker.service - Docker Application Container Engine... May 13 23:50:37.516673 (dockerd)[1894]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 13 23:50:37.749781 dockerd[1894]: time="2025-05-13T23:50:37.748893787Z" level=info msg="Starting up" May 13 23:50:37.753688 dockerd[1894]: time="2025-05-13T23:50:37.753273048Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 13 23:50:37.798513 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3239765508-merged.mount: Deactivated successfully. May 13 23:50:37.830137 systemd[1]: var-lib-docker-metacopy\x2dcheck4108887028-merged.mount: Deactivated successfully. May 13 23:50:37.840320 dockerd[1894]: time="2025-05-13T23:50:37.840243002Z" level=info msg="Loading containers: start." May 13 23:50:37.968597 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 11. May 13 23:50:37.970302 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
May 13 23:50:38.012023 kernel: Initializing XFRM netlink socket May 13 23:50:38.100378 systemd-networkd[1400]: docker0: Link UP May 13 23:50:38.141616 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:38.153530 (kubelet)[2054]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:50:38.164554 dockerd[1894]: time="2025-05-13T23:50:38.164499622Z" level=info msg="Loading containers: done." May 13 23:50:38.192246 dockerd[1894]: time="2025-05-13T23:50:38.192178007Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 13 23:50:38.195390 dockerd[1894]: time="2025-05-13T23:50:38.194642423Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 13 23:50:38.195390 dockerd[1894]: time="2025-05-13T23:50:38.195040872Z" level=info msg="Daemon has completed initialization" May 13 23:50:38.202083 kubelet[2054]: E0513 23:50:38.201920 2054 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:50:38.205777 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:50:38.205928 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:50:38.208076 systemd[1]: kubelet.service: Consumed 163ms CPU time, 96.4M memory peak. 
May 13 23:50:38.245778 dockerd[1894]: time="2025-05-13T23:50:38.245609615Z" level=info msg="API listen on /run/docker.sock" May 13 23:50:38.246137 systemd[1]: Started docker.service - Docker Application Container Engine. May 13 23:50:38.793210 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck806303374-merged.mount: Deactivated successfully. May 13 23:50:39.330809 containerd[1507]: time="2025-05-13T23:50:39.330726035Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\"" May 13 23:50:40.023913 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount94708055.mount: Deactivated successfully. May 13 23:50:41.445405 containerd[1507]: time="2025-05-13T23:50:41.445303392Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:41.447571 containerd[1507]: time="2025-05-13T23:50:41.447485359Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.8: active requests=0, bytes read=25554700" May 13 23:50:41.448879 containerd[1507]: time="2025-05-13T23:50:41.448392779Z" level=info msg="ImageCreate event name:\"sha256:ef8fb1ea7c9599dbedea6f9d5589975ebc5bf4ec72f6be6acaaec59a723a09b3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:41.452096 containerd[1507]: time="2025-05-13T23:50:41.452052178Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:41.453235 containerd[1507]: time="2025-05-13T23:50:41.453184283Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.8\" with image id \"sha256:ef8fb1ea7c9599dbedea6f9d5589975ebc5bf4ec72f6be6acaaec59a723a09b3\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.8\", repo digest \"registry.k8s.io/kube-apiserver@sha256:30090db6a7d53799163ce82dae9e8ddb645fd47db93f2ec9da0cc787fd825625\", size \"25551408\" in 2.122408767s"
May 13 23:50:41.453329 containerd[1507]: time="2025-05-13T23:50:41.453238484Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.8\" returns image reference \"sha256:ef8fb1ea7c9599dbedea6f9d5589975ebc5bf4ec72f6be6acaaec59a723a09b3\"" May 13 23:50:41.454153 containerd[1507]: time="2025-05-13T23:50:41.454100863Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\"" May 13 23:50:42.996075 containerd[1507]: time="2025-05-13T23:50:42.996013479Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:42.997522 containerd[1507]: time="2025-05-13T23:50:42.997450670Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.8: active requests=0, bytes read=22458998" May 13 23:50:42.998820 containerd[1507]: time="2025-05-13T23:50:42.998334609Z" level=info msg="ImageCreate event name:\"sha256:ea6e6085feca75547d0422ab0536fe0d18c9ff5831de7a9d6a707c968027bb6a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:43.001742 containerd[1507]: time="2025-05-13T23:50:43.001703561Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:43.002777 containerd[1507]: time="2025-05-13T23:50:43.002735223Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.8\" with image id \"sha256:ea6e6085feca75547d0422ab0536fe0d18c9ff5831de7a9d6a707c968027bb6a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.8\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:29eaddc64792a689df48506e78bbc641d063ac8bb92d2e66ae2ad05977420747\", size \"23900539\" in 1.548592599s" May 13 23:50:43.003771 containerd[1507]: time="2025-05-13T23:50:43.003713084Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.8\" returns image reference \"sha256:ea6e6085feca75547d0422ab0536fe0d18c9ff5831de7a9d6a707c968027bb6a\""
May 13 23:50:43.007540 containerd[1507]: time="2025-05-13T23:50:43.007490684Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\"" May 13 23:50:44.357034 containerd[1507]: time="2025-05-13T23:50:44.356029927Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:44.357568 containerd[1507]: time="2025-05-13T23:50:44.357510758Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.8: active requests=0, bytes read=17125833" May 13 23:50:44.358461 containerd[1507]: time="2025-05-13T23:50:44.358430577Z" level=info msg="ImageCreate event name:\"sha256:1d2db6ef0dd2f3e08bdfcd46afde7b755b05192841f563d8df54b807daaa7d8d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:44.361557 containerd[1507]: time="2025-05-13T23:50:44.361513002Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:44.362772 containerd[1507]: time="2025-05-13T23:50:44.362729107Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.8\" with image id \"sha256:1d2db6ef0dd2f3e08bdfcd46afde7b755b05192841f563d8df54b807daaa7d8d\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.8\", repo digest \"registry.k8s.io/kube-scheduler@sha256:22994a2632e81059720480b9f6bdeb133b08d58492d0b36dfd6e9768b159b22a\", size \"18567392\" in 1.355053939s" May 13 23:50:44.362772 containerd[1507]: time="2025-05-13T23:50:44.362765988Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.8\" returns image reference \"sha256:1d2db6ef0dd2f3e08bdfcd46afde7b755b05192841f563d8df54b807daaa7d8d\"" May 13 23:50:44.363802 containerd[1507]: time="2025-05-13T23:50:44.363773649Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\""
May 13 23:50:45.621900 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2741422135.mount: Deactivated successfully. May 13 23:50:45.970017 containerd[1507]: time="2025-05-13T23:50:45.969911800Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:45.971362 containerd[1507]: time="2025-05-13T23:50:45.971287389Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.8: active requests=0, bytes read=26871943" May 13 23:50:45.973999 containerd[1507]: time="2025-05-13T23:50:45.972287170Z" level=info msg="ImageCreate event name:\"sha256:c5361ece77e80334cd5fb082c0b678cb3244f5834ecacea1719ae6b38b465581\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:45.978851 containerd[1507]: time="2025-05-13T23:50:45.977854606Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:45.978851 containerd[1507]: time="2025-05-13T23:50:45.978713944Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.8\" with image id \"sha256:c5361ece77e80334cd5fb082c0b678cb3244f5834ecacea1719ae6b38b465581\", repo tag \"registry.k8s.io/kube-proxy:v1.31.8\", repo digest \"registry.k8s.io/kube-proxy@sha256:dd0c9a37670f209947b1ed880f06a2e93e1d41da78c037f52f94b13858769838\", size \"26870936\" in 1.614795171s" May 13 23:50:45.978851 containerd[1507]: time="2025-05-13T23:50:45.978750024Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.8\" returns image reference \"sha256:c5361ece77e80334cd5fb082c0b678cb3244f5834ecacea1719ae6b38b465581\"" May 13 23:50:45.979666 containerd[1507]: time="2025-05-13T23:50:45.979615442Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\""
May 13 23:50:46.639657 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2156153234.mount: Deactivated successfully. May 13 23:50:47.363041 containerd[1507]: time="2025-05-13T23:50:47.362957412Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:47.364937 containerd[1507]: time="2025-05-13T23:50:47.364844730Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.1: active requests=0, bytes read=16485461" May 13 23:50:47.365548 containerd[1507]: time="2025-05-13T23:50:47.365494504Z" level=info msg="ImageCreate event name:\"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:47.369215 containerd[1507]: time="2025-05-13T23:50:47.369161619Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:50:47.370341 containerd[1507]: time="2025-05-13T23:50:47.370280961Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.1\" with image id \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1\", size \"16482581\" in 1.390481755s" May 13 23:50:47.370341 containerd[1507]: time="2025-05-13T23:50:47.370328362Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" May 13 23:50:47.371142 containerd[1507]: time="2025-05-13T23:50:47.370884094Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 13 23:50:48.014901 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2080198701.mount: Deactivated successfully.
May 13 23:50:48.025940 containerd[1507]: time="2025-05-13T23:50:48.025014891Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:50:48.027516 containerd[1507]: time="2025-05-13T23:50:48.027438940Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" May 13 23:50:48.029613 containerd[1507]: time="2025-05-13T23:50:48.029527062Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:50:48.034620 containerd[1507]: time="2025-05-13T23:50:48.034535524Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 13 23:50:48.036399 containerd[1507]: time="2025-05-13T23:50:48.035813630Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 664.891935ms" May 13 23:50:48.036399 containerd[1507]: time="2025-05-13T23:50:48.035873471Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" May 13 23:50:48.036608 containerd[1507]: time="2025-05-13T23:50:48.036420002Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
May 13 23:50:48.218917 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 12. May 13 23:50:48.222711 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:50:48.377891 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:50:48.388527 (kubelet)[2234]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 13 23:50:48.441781 kubelet[2234]: E0513 23:50:48.441709 2234 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 13 23:50:48.444827 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 13 23:50:48.445171 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 13 23:50:48.445885 systemd[1]: kubelet.service: Consumed 164ms CPU time, 92.5M memory peak. May 13 23:50:48.627413 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1589357752.mount: Deactivated successfully.
May 13 23:50:50.513131 containerd[1507]: time="2025-05-13T23:50:50.513055643Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:50:50.515088 containerd[1507]: time="2025-05-13T23:50:50.515044576Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:50:50.515259 containerd[1507]: time="2025-05-13T23:50:50.515223868Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406533"
May 13 23:50:50.519378 containerd[1507]: time="2025-05-13T23:50:50.519323064Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 13 23:50:50.520638 containerd[1507]: time="2025-05-13T23:50:50.520595709Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.484139106s"
May 13 23:50:50.520845 containerd[1507]: time="2025-05-13T23:50:50.520763961Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
May 13 23:50:55.339456 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:50:55.340064 systemd[1]: kubelet.service: Consumed 164ms CPU time, 92.5M memory peak.
May 13 23:50:55.342486 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:50:55.377007 systemd[1]: Reload requested from client PID 2322 ('systemctl') (unit session-7.scope)...
May 13 23:50:55.377030 systemd[1]: Reloading...
May 13 23:50:55.522086 zram_generator::config[2370]: No configuration found.
May 13 23:50:55.634446 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 13 23:50:55.731295 systemd[1]: Reloading finished in 353 ms.
May 13 23:50:55.780641 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 13 23:50:55.780739 systemd[1]: kubelet.service: Failed with result 'signal'.
May 13 23:50:55.781148 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:50:55.783235 systemd[1]: kubelet.service: Consumed 103ms CPU time, 82.3M memory peak.
May 13 23:50:55.786852 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 13 23:50:55.917596 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 13 23:50:55.926412 (kubelet)[2415]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 13 23:50:55.990145 kubelet[2415]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:50:55.990145 kubelet[2415]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
May 13 23:50:55.990145 kubelet[2415]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 13 23:50:55.990145 kubelet[2415]: I0513 23:50:55.989759 2415 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 13 23:50:57.218243 kubelet[2415]: I0513 23:50:57.218181 2415 server.go:486] "Kubelet version" kubeletVersion="v1.31.0"
May 13 23:50:57.218243 kubelet[2415]: I0513 23:50:57.218230 2415 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 13 23:50:57.218656 kubelet[2415]: I0513 23:50:57.218590 2415 server.go:929] "Client rotation is on, will bootstrap in background"
May 13 23:50:57.259606 kubelet[2415]: E0513 23:50:57.259545 2415 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://188.245.195.87:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 188.245.195.87:6443: connect: connection refused" logger="UnhandledError"
May 13 23:50:57.259926 kubelet[2415]: I0513 23:50:57.259760 2415 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 13 23:50:57.277009 kubelet[2415]: I0513 23:50:57.276660 2415 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 13 23:50:57.282837 kubelet[2415]: I0513 23:50:57.282724 2415 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 13 23:50:57.284207 kubelet[2415]: I0513 23:50:57.284161 2415 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
May 13 23:50:57.284509 kubelet[2415]: I0513 23:50:57.284444 2415 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 13 23:50:57.284717 kubelet[2415]: I0513 23:50:57.284490 2415 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-40578dffbd","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 13 23:50:57.284963 kubelet[2415]: I0513 23:50:57.284937 2415 topology_manager.go:138] "Creating topology manager with none policy"
May 13 23:50:57.284963 kubelet[2415]: I0513 23:50:57.284961 2415 container_manager_linux.go:300] "Creating device plugin manager"
May 13 23:50:57.285240 kubelet[2415]: I0513 23:50:57.285194 2415 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:50:57.288004 kubelet[2415]: I0513 23:50:57.287733 2415 kubelet.go:408] "Attempting to sync node with API server"
May 13 23:50:57.288004 kubelet[2415]: I0513 23:50:57.287779 2415 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
May 13 23:50:57.288004 kubelet[2415]: I0513 23:50:57.287814 2415 kubelet.go:314] "Adding apiserver pod source"
May 13 23:50:57.288004 kubelet[2415]: I0513 23:50:57.287827 2415 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 13 23:50:57.296825 kubelet[2415]: W0513 23:50:57.296744 2415 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://188.245.195.87:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-40578dffbd&limit=500&resourceVersion=0": dial tcp 188.245.195.87:6443: connect: connection refused
May 13 23:50:57.297407 kubelet[2415]: E0513 23:50:57.297077 2415 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://188.245.195.87:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-40578dffbd&limit=500&resourceVersion=0\": dial tcp 188.245.195.87:6443: connect: connection refused" logger="UnhandledError"
May 13 23:50:57.297407 kubelet[2415]: I0513 23:50:57.297204 2415 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1"
May 13 23:50:57.300702 kubelet[2415]: I0513 23:50:57.300661 2415 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
May 13 23:50:57.301406 kubelet[2415]: W0513 23:50:57.301174 2415 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://188.245.195.87:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 188.245.195.87:6443: connect: connection refused
May 13 23:50:57.301406 kubelet[2415]: E0513 23:50:57.301263 2415 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://188.245.195.87:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 188.245.195.87:6443: connect: connection refused" logger="UnhandledError"
May 13 23:50:57.302040 kubelet[2415]: W0513 23:50:57.301960 2415 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
May 13 23:50:57.303053 kubelet[2415]: I0513 23:50:57.303025 2415 server.go:1269] "Started kubelet"
May 13 23:50:57.304969 kubelet[2415]: I0513 23:50:57.304934 2415 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 13 23:50:57.309316 kubelet[2415]: E0513 23:50:57.307063 2415 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://188.245.195.87:6443/api/v1/namespaces/default/events\": dial tcp 188.245.195.87:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-n-40578dffbd.183f3b2fe31ed7ea default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-40578dffbd,UID:ci-4284-0-0-n-40578dffbd,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-40578dffbd,},FirstTimestamp:2025-05-13 23:50:57.302992874 +0000 UTC m=+1.370684764,LastTimestamp:2025-05-13 23:50:57.302992874 +0000 UTC m=+1.370684764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-40578dffbd,}"
May 13 23:50:57.313010 kubelet[2415]: I0513 23:50:57.312897 2415 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
May 13 23:50:57.314330 kubelet[2415]: I0513 23:50:57.314289 2415 server.go:460] "Adding debug handlers to kubelet server"
May 13 23:50:57.315302 kubelet[2415]: I0513 23:50:57.315229 2415 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 13 23:50:57.315563 kubelet[2415]: I0513 23:50:57.315485 2415 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 13 23:50:57.315720 kubelet[2415]: I0513 23:50:57.315701 2415 volume_manager.go:289] "Starting Kubelet Volume Manager"
May 13 23:50:57.317093 kubelet[2415]: E0513 23:50:57.316635 2415 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-40578dffbd\" not found"
May 13 23:50:57.317304 kubelet[2415]: I0513 23:50:57.317272 2415 factory.go:221] Registration of the systemd container factory successfully
May 13 23:50:57.317427 kubelet[2415]: I0513 23:50:57.317403 2415 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 13 23:50:57.318246 kubelet[2415]: E0513 23:50:57.318045 2415 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 13 23:50:57.320093 kubelet[2415]: I0513 23:50:57.320054 2415 factory.go:221] Registration of the containerd container factory successfully
May 13 23:50:57.321462 kubelet[2415]: I0513 23:50:57.321422 2415 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
May 13 23:50:57.321462 kubelet[2415]: I0513 23:50:57.315734 2415 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 13 23:50:57.321603 kubelet[2415]: I0513 23:50:57.321589 2415 reconciler.go:26] "Reconciler: start to sync state"
May 13 23:50:57.330444 kubelet[2415]: E0513 23:50:57.330379 2415 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.195.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-40578dffbd?timeout=10s\": dial tcp 188.245.195.87:6443: connect: connection refused" interval="200ms"
May 13 23:50:57.333296 kubelet[2415]: W0513 23:50:57.333219 2415 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://188.245.195.87:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 188.245.195.87:6443: connect: connection refused
May 13 23:50:57.333422 kubelet[2415]: E0513 23:50:57.333296 2415 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://188.245.195.87:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 188.245.195.87:6443: connect: connection refused" logger="UnhandledError"
May 13 23:50:57.350420 kubelet[2415]: I0513 23:50:57.350134 2415 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
May 13 23:50:57.351360 kubelet[2415]: I0513 23:50:57.351065 2415 cpu_manager.go:214] "Starting CPU manager" policy="none"
May 13 23:50:57.351360 kubelet[2415]: I0513 23:50:57.351088 2415 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s"
May 13 23:50:57.351360 kubelet[2415]: I0513 23:50:57.351108 2415 state_mem.go:36] "Initialized new in-memory state store"
May 13 23:50:57.351732 kubelet[2415]: I0513 23:50:57.351697 2415 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
May 13 23:50:57.351732 kubelet[2415]: I0513 23:50:57.351731 2415 status_manager.go:217] "Starting to sync pod status with apiserver"
May 13 23:50:57.351808 kubelet[2415]: I0513 23:50:57.351755 2415 kubelet.go:2321] "Starting kubelet main sync loop"
May 13 23:50:57.351837 kubelet[2415]: E0513 23:50:57.351802 2415 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 13 23:50:57.353829 kubelet[2415]: W0513 23:50:57.353758 2415 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://188.245.195.87:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 188.245.195.87:6443: connect: connection refused
May 13 23:50:57.354618 kubelet[2415]: E0513 23:50:57.354540 2415 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://188.245.195.87:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 188.245.195.87:6443: connect: connection refused" logger="UnhandledError"
May 13 23:50:57.356523 kubelet[2415]: I0513 23:50:57.356497 2415 policy_none.go:49] "None policy: Start"
May 13 23:50:57.358646 kubelet[2415]: I0513 23:50:57.358620 2415 memory_manager.go:170] "Starting memorymanager" policy="None"
May 13 23:50:57.358749 kubelet[2415]: I0513 23:50:57.358664 2415 state_mem.go:35] "Initializing new in-memory state store"
May 13 23:50:57.369446 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 13 23:50:57.383937 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
May 13 23:50:57.393411 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
May 13 23:50:57.404605 kubelet[2415]: I0513 23:50:57.403413 2415 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
May 13 23:50:57.404605 kubelet[2415]: I0513 23:50:57.403671 2415 eviction_manager.go:189] "Eviction manager: starting control loop"
May 13 23:50:57.404605 kubelet[2415]: I0513 23:50:57.403686 2415 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 13 23:50:57.404605 kubelet[2415]: I0513 23:50:57.404325 2415 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 13 23:50:57.407406 kubelet[2415]: E0513 23:50:57.407340 2415 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-n-40578dffbd\" not found"
May 13 23:50:57.471959 systemd[1]: Created slice kubepods-burstable-podfcaceb0f23d0e6d40879dcf406e1f87e.slice - libcontainer container kubepods-burstable-podfcaceb0f23d0e6d40879dcf406e1f87e.slice.
May 13 23:50:57.488636 systemd[1]: Created slice kubepods-burstable-pod044e2bbae01eca3de2e1ee14c5840fd6.slice - libcontainer container kubepods-burstable-pod044e2bbae01eca3de2e1ee14c5840fd6.slice.
May 13 23:50:57.505061 systemd[1]: Created slice kubepods-burstable-podbde9116b07dfa7bf3cd524f3c218d4a1.slice - libcontainer container kubepods-burstable-podbde9116b07dfa7bf3cd524f3c218d4a1.slice.
May 13 23:50:57.506932 kubelet[2415]: I0513 23:50:57.506887 2415 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.508012 kubelet[2415]: E0513 23:50:57.507604 2415 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://188.245.195.87:6443/api/v1/nodes\": dial tcp 188.245.195.87:6443: connect: connection refused" node="ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.523158 kubelet[2415]: I0513 23:50:57.523100 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bde9116b07dfa7bf3cd524f3c218d4a1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-40578dffbd\" (UID: \"bde9116b07dfa7bf3cd524f3c218d4a1\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.523621 kubelet[2415]: I0513 23:50:57.523552 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.523687 kubelet[2415]: I0513 23:50:57.523664 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.523812 kubelet[2415]: I0513 23:50:57.523773 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.527158 kubelet[2415]: I0513 23:50:57.523840 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.527158 kubelet[2415]: I0513 23:50:57.523883 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bde9116b07dfa7bf3cd524f3c218d4a1-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-40578dffbd\" (UID: \"bde9116b07dfa7bf3cd524f3c218d4a1\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.527158 kubelet[2415]: I0513 23:50:57.523959 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.527158 kubelet[2415]: I0513 23:50:57.524016 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/044e2bbae01eca3de2e1ee14c5840fd6-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-40578dffbd\" (UID: \"044e2bbae01eca3de2e1ee14c5840fd6\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.527158 kubelet[2415]: I0513 23:50:57.524049 2415 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bde9116b07dfa7bf3cd524f3c218d4a1-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-40578dffbd\" (UID: \"bde9116b07dfa7bf3cd524f3c218d4a1\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.531679 kubelet[2415]: E0513 23:50:57.531604 2415 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.195.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-40578dffbd?timeout=10s\": dial tcp 188.245.195.87:6443: connect: connection refused" interval="400ms"
May 13 23:50:57.711858 kubelet[2415]: I0513 23:50:57.711368 2415 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.711858 kubelet[2415]: E0513 23:50:57.711750 2415 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://188.245.195.87:6443/api/v1/nodes\": dial tcp 188.245.195.87:6443: connect: connection refused" node="ci-4284-0-0-n-40578dffbd"
May 13 23:50:57.785486 containerd[1507]: time="2025-05-13T23:50:57.785270045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-40578dffbd,Uid:fcaceb0f23d0e6d40879dcf406e1f87e,Namespace:kube-system,Attempt:0,}"
May 13 23:50:57.805092 containerd[1507]: time="2025-05-13T23:50:57.804409683Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-40578dffbd,Uid:044e2bbae01eca3de2e1ee14c5840fd6,Namespace:kube-system,Attempt:0,}"
May 13 23:50:57.811702 containerd[1507]: time="2025-05-13T23:50:57.810999028Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-40578dffbd,Uid:bde9116b07dfa7bf3cd524f3c218d4a1,Namespace:kube-system,Attempt:0,}"
May 13 23:50:57.816342 containerd[1507]: time="2025-05-13T23:50:57.816301098Z" level=info msg="connecting to shim 281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa" address="unix:///run/containerd/s/4f9ecc41de700c7da03633dc6bf388b88e9f3fab823334393a28e3ebeb01305b" namespace=k8s.io protocol=ttrpc version=3
May 13 23:50:57.867245 containerd[1507]: time="2025-05-13T23:50:57.867197591Z" level=info msg="connecting to shim 54a42263a8e5b7e56875d56472bc4f93524681ddbad279f83d1074056819ef64" address="unix:///run/containerd/s/38b10f37c17d81eee18c262425a75fbbf6ffd365b47e83e1aa310e86af112de5" namespace=k8s.io protocol=ttrpc version=3
May 13 23:50:57.881207 systemd[1]: Started cri-containerd-281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa.scope - libcontainer container 281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa.
May 13 23:50:57.901017 containerd[1507]: time="2025-05-13T23:50:57.900922401Z" level=info msg="connecting to shim bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5" address="unix:///run/containerd/s/5f1acb794e908c9ed7b382764037f25f8b8ca1ab4e51a7e29adb00a083ceb98a" namespace=k8s.io protocol=ttrpc version=3
May 13 23:50:57.904479 systemd[1]: Started cri-containerd-54a42263a8e5b7e56875d56472bc4f93524681ddbad279f83d1074056819ef64.scope - libcontainer container 54a42263a8e5b7e56875d56472bc4f93524681ddbad279f83d1074056819ef64.
May 13 23:50:57.934342 kubelet[2415]: E0513 23:50:57.933916 2415 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://188.245.195.87:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-40578dffbd?timeout=10s\": dial tcp 188.245.195.87:6443: connect: connection refused" interval="800ms"
May 13 23:50:57.942222 systemd[1]: Started cri-containerd-bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5.scope - libcontainer container bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5.
May 13 23:50:57.964819 containerd[1507]: time="2025-05-13T23:50:57.964731488Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-40578dffbd,Uid:fcaceb0f23d0e6d40879dcf406e1f87e,Namespace:kube-system,Attempt:0,} returns sandbox id \"281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa\""
May 13 23:50:57.973200 containerd[1507]: time="2025-05-13T23:50:57.972900486Z" level=info msg="CreateContainer within sandbox \"281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}"
May 13 23:50:57.990331 containerd[1507]: time="2025-05-13T23:50:57.989169036Z" level=info msg="Container 558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6: CDI devices from CRI Config.CDIDevices: []"
May 13 23:50:57.999831 containerd[1507]: time="2025-05-13T23:50:57.999784816Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-40578dffbd,Uid:bde9116b07dfa7bf3cd524f3c218d4a1,Namespace:kube-system,Attempt:0,} returns sandbox id \"54a42263a8e5b7e56875d56472bc4f93524681ddbad279f83d1074056819ef64\""
May 13 23:50:58.004494 containerd[1507]: time="2025-05-13T23:50:58.004348599Z" level=info msg="CreateContainer within sandbox \"281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6\""
May 13 23:50:58.006478 containerd[1507]: time="2025-05-13T23:50:58.006419678Z" level=info msg="StartContainer for \"558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6\""
May 13 23:50:58.007851 containerd[1507]: time="2025-05-13T23:50:58.007799157Z" level=info msg="connecting to shim 558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6" address="unix:///run/containerd/s/4f9ecc41de700c7da03633dc6bf388b88e9f3fab823334393a28e3ebeb01305b" protocol=ttrpc version=3
May 13 23:50:58.010046 containerd[1507]: time="2025-05-13T23:50:58.009922678Z" level=info msg="CreateContainer within sandbox \"54a42263a8e5b7e56875d56472bc4f93524681ddbad279f83d1074056819ef64\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}"
May 13 23:50:58.025372 containerd[1507]: time="2025-05-13T23:50:58.025325841Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-40578dffbd,Uid:044e2bbae01eca3de2e1ee14c5840fd6,Namespace:kube-system,Attempt:0,} returns sandbox id \"bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5\""
May 13 23:50:58.029512 containerd[1507]: time="2025-05-13T23:50:58.029467159Z" level=info msg="Container fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1: CDI devices from CRI Config.CDIDevices: []"
May 13 23:50:58.031062 containerd[1507]: time="2025-05-13T23:50:58.030391532Z" level=info msg="CreateContainer within sandbox \"bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}"
May 13 23:50:58.038228 systemd[1]: Started cri-containerd-558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6.scope - libcontainer container 558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6.
May 13 23:50:58.048870 containerd[1507]: time="2025-05-13T23:50:58.048809587Z" level=info msg="CreateContainer within sandbox \"54a42263a8e5b7e56875d56472bc4f93524681ddbad279f83d1074056819ef64\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1\""
May 13 23:50:58.051020 containerd[1507]: time="2025-05-13T23:50:58.050909867Z" level=info msg="StartContainer for \"fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1\""
May 13 23:50:58.052658 containerd[1507]: time="2025-05-13T23:50:58.052598244Z" level=info msg="Container 28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3: CDI devices from CRI Config.CDIDevices: []"
May 13 23:50:58.060683 containerd[1507]: time="2025-05-13T23:50:58.060635025Z" level=info msg="connecting to shim fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1" address="unix:///run/containerd/s/38b10f37c17d81eee18c262425a75fbbf6ffd365b47e83e1aa310e86af112de5" protocol=ttrpc version=3
May 13 23:50:58.071075 containerd[1507]: time="2025-05-13T23:50:58.070919374Z" level=info msg="CreateContainer within sandbox \"bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3\""
May 13 23:50:58.072125 containerd[1507]: time="2025-05-13T23:50:58.071850508Z" level=info msg="StartContainer for \"28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3\""
May 13 23:50:58.073877 containerd[1507]: time="2025-05-13T23:50:58.073757697Z" level=info msg="connecting to shim 28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3" address="unix:///run/containerd/s/5f1acb794e908c9ed7b382764037f25f8b8ca1ab4e51a7e29adb00a083ceb98a" protocol=ttrpc version=3
May 13 23:50:58.092201 systemd[1]: Started cri-containerd-fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1.scope - libcontainer container fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1.
May 13 23:50:58.112206 systemd[1]: Started cri-containerd-28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3.scope - libcontainer container 28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3.
May 13 23:50:58.118659 kubelet[2415]: I0513 23:50:58.118435 2415 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-n-40578dffbd"
May 13 23:50:58.118848 kubelet[2415]: E0513 23:50:58.118819 2415 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://188.245.195.87:6443/api/v1/nodes\": dial tcp 188.245.195.87:6443: connect: connection refused" node="ci-4284-0-0-n-40578dffbd"
May 13 23:50:58.126341 containerd[1507]: time="2025-05-13T23:50:58.126291588Z" level=info msg="StartContainer for \"558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6\" returns successfully"
May 13 23:50:58.175092 containerd[1507]: time="2025-05-13T23:50:58.175022781Z" level=info msg="StartContainer for \"fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1\" returns successfully"
May 13 23:50:58.177660 kubelet[2415]: W0513 23:50:58.177527 2415 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://188.245.195.87:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-40578dffbd&limit=500&resourceVersion=0": dial tcp 188.245.195.87:6443: connect: connection refused
May 13 23:50:58.177660 kubelet[2415]: E0513 23:50:58.177605 2415 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://188.245.195.87:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-40578dffbd&limit=500&resourceVersion=0\": dial tcp 188.245.195.87:6443: connect: connection refused" logger="UnhandledError"
May 13 23:50:58.213088 containerd[1507]: time="2025-05-13T23:50:58.212880670Z" level=info msg="StartContainer for \"28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3\" returns successfully" May 13 23:50:58.920876 kubelet[2415]: I0513 23:50:58.920838 2415 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-n-40578dffbd" May 13 23:51:00.656297 kubelet[2415]: E0513 23:51:00.656242 2415 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-n-40578dffbd\" not found" node="ci-4284-0-0-n-40578dffbd" May 13 23:51:00.720996 kubelet[2415]: I0513 23:51:00.720865 2415 kubelet_node_status.go:75] "Successfully registered node" node="ci-4284-0-0-n-40578dffbd" May 13 23:51:00.720996 kubelet[2415]: E0513 23:51:00.720914 2415 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4284-0-0-n-40578dffbd\": node \"ci-4284-0-0-n-40578dffbd\" not found" May 13 23:51:00.746748 kubelet[2415]: E0513 23:51:00.746702 2415 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-40578dffbd\" not found" May 13 23:51:01.302990 kubelet[2415]: I0513 23:51:01.301640 2415 apiserver.go:52] "Watching apiserver" May 13 23:51:01.321933 kubelet[2415]: I0513 23:51:01.321893 2415 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 23:51:03.389181 systemd[1]: Reload requested from client PID 2683 ('systemctl') (unit session-7.scope)... May 13 23:51:03.389214 systemd[1]: Reloading... May 13 23:51:03.523017 zram_generator::config[2728]: No configuration found. May 13 23:51:03.637941 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 13 23:51:03.757156 systemd[1]: Reloading finished in 367 ms. 
May 13 23:51:03.784064 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:51:03.805208 systemd[1]: kubelet.service: Deactivated successfully. May 13 23:51:03.805621 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:51:03.805705 systemd[1]: kubelet.service: Consumed 1.833s CPU time, 119.4M memory peak. May 13 23:51:03.814398 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 13 23:51:03.967492 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 13 23:51:03.980601 (kubelet)[2772]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 13 23:51:04.033526 kubelet[2772]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 13 23:51:04.033526 kubelet[2772]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. May 13 23:51:04.033526 kubelet[2772]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 13 23:51:04.033526 kubelet[2772]: I0513 23:51:04.033403 2772 server.go:206] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 13 23:51:04.047397 kubelet[2772]: I0513 23:51:04.046831 2772 server.go:486] "Kubelet version" kubeletVersion="v1.31.0" May 13 23:51:04.047397 kubelet[2772]: I0513 23:51:04.046862 2772 server.go:488] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 13 23:51:04.047397 kubelet[2772]: I0513 23:51:04.047180 2772 server.go:929] "Client rotation is on, will bootstrap in background" May 13 23:51:04.049004 kubelet[2772]: I0513 23:51:04.048928 2772 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 13 23:51:04.051431 kubelet[2772]: I0513 23:51:04.051384 2772 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 13 23:51:04.058811 kubelet[2772]: I0513 23:51:04.058766 2772 server.go:1426] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 13 23:51:04.061619 kubelet[2772]: I0513 23:51:04.061225 2772 server.go:744] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 13 23:51:04.061619 kubelet[2772]: I0513 23:51:04.061405 2772 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" May 13 23:51:04.061619 kubelet[2772]: I0513 23:51:04.061509 2772 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 13 23:51:04.061835 kubelet[2772]: I0513 23:51:04.061534 2772 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-40578dffbd","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} May 13 23:51:04.061835 kubelet[2772]: I0513 23:51:04.061723 2772 topology_manager.go:138] "Creating topology manager with none policy" May 13 23:51:04.061835 kubelet[2772]: I0513 23:51:04.061732 2772 container_manager_linux.go:300] "Creating device plugin manager" May 13 23:51:04.061835 kubelet[2772]: I0513 23:51:04.061766 2772 state_mem.go:36] "Initialized new in-memory state store" May 13 23:51:04.061969 kubelet[2772]: I0513 23:51:04.061882 2772 kubelet.go:408] "Attempting to sync node with API server" May 13 23:51:04.061969 kubelet[2772]: I0513 23:51:04.061895 2772 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" May 13 23:51:04.061969 kubelet[2772]: I0513 23:51:04.061918 2772 kubelet.go:314] "Adding apiserver pod source" May 13 23:51:04.066989 kubelet[2772]: I0513 23:51:04.065004 2772 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 13 23:51:04.072321 kubelet[2772]: I0513 23:51:04.071730 2772 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 13 23:51:04.072764 kubelet[2772]: I0513 23:51:04.072333 2772 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 13 23:51:04.073008 kubelet[2772]: I0513 23:51:04.072841 2772 server.go:1269] "Started kubelet" May 13 23:51:04.081477 kubelet[2772]: I0513 23:51:04.076941 2772 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 13 23:51:04.089665 kubelet[2772]: I0513 23:51:04.089623 2772 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 13 23:51:04.094945 kubelet[2772]: I0513 23:51:04.094553 2772 volume_manager.go:289] "Starting Kubelet Volume Manager" May 13 23:51:04.094945 kubelet[2772]: E0513 23:51:04.094801 2772 kubelet_node_status.go:453] "Error getting the current node from lister" 
err="node \"ci-4284-0-0-n-40578dffbd\" not found" May 13 23:51:04.096755 kubelet[2772]: I0513 23:51:04.095458 2772 desired_state_of_world_populator.go:146] "Desired state populator starts to run" May 13 23:51:04.096755 kubelet[2772]: I0513 23:51:04.095617 2772 reconciler.go:26] "Reconciler: start to sync state" May 13 23:51:04.104569 kubelet[2772]: I0513 23:51:04.104503 2772 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 May 13 23:51:04.105547 kubelet[2772]: I0513 23:51:04.105529 2772 server.go:460] "Adding debug handlers to kubelet server" May 13 23:51:04.107289 kubelet[2772]: I0513 23:51:04.107173 2772 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 13 23:51:04.109277 kubelet[2772]: I0513 23:51:04.107747 2772 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 13 23:51:04.123814 kubelet[2772]: I0513 23:51:04.122354 2772 factory.go:221] Registration of the systemd container factory successfully May 13 23:51:04.123814 kubelet[2772]: I0513 23:51:04.122492 2772 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 13 23:51:04.126550 kubelet[2772]: I0513 23:51:04.126515 2772 factory.go:221] Registration of the containerd container factory successfully May 13 23:51:04.128730 kubelet[2772]: I0513 23:51:04.128215 2772 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 13 23:51:04.129588 kubelet[2772]: I0513 23:51:04.129550 2772 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 13 23:51:04.129588 kubelet[2772]: I0513 23:51:04.129586 2772 status_manager.go:217] "Starting to sync pod status with apiserver" May 13 23:51:04.129702 kubelet[2772]: I0513 23:51:04.129608 2772 kubelet.go:2321] "Starting kubelet main sync loop" May 13 23:51:04.129702 kubelet[2772]: E0513 23:51:04.129649 2772 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 13 23:51:04.133152 kubelet[2772]: E0513 23:51:04.133126 2772 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 13 23:51:04.210209 kubelet[2772]: I0513 23:51:04.210178 2772 cpu_manager.go:214] "Starting CPU manager" policy="none" May 13 23:51:04.210421 kubelet[2772]: I0513 23:51:04.210407 2772 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" May 13 23:51:04.210494 kubelet[2772]: I0513 23:51:04.210486 2772 state_mem.go:36] "Initialized new in-memory state store" May 13 23:51:04.210785 kubelet[2772]: I0513 23:51:04.210767 2772 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 13 23:51:04.210892 kubelet[2772]: I0513 23:51:04.210866 2772 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 13 23:51:04.210947 kubelet[2772]: I0513 23:51:04.210939 2772 policy_none.go:49] "None policy: Start" May 13 23:51:04.212352 kubelet[2772]: I0513 23:51:04.212322 2772 memory_manager.go:170] "Starting memorymanager" policy="None" May 13 23:51:04.212571 kubelet[2772]: I0513 23:51:04.212557 2772 state_mem.go:35] "Initializing new in-memory state store" May 13 23:51:04.212922 kubelet[2772]: I0513 23:51:04.212903 2772 state_mem.go:75] "Updated machine memory state" May 13 23:51:04.220670 kubelet[2772]: I0513 23:51:04.220237 2772 manager.go:510] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 13 23:51:04.220670 
kubelet[2772]: I0513 23:51:04.220451 2772 eviction_manager.go:189] "Eviction manager: starting control loop" May 13 23:51:04.220670 kubelet[2772]: I0513 23:51:04.220475 2772 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 13 23:51:04.221181 kubelet[2772]: I0513 23:51:04.221165 2772 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 13 23:51:04.343835 kubelet[2772]: I0513 23:51:04.343437 2772 kubelet_node_status.go:72] "Attempting to register node" node="ci-4284-0-0-n-40578dffbd" May 13 23:51:04.361815 kubelet[2772]: I0513 23:51:04.361681 2772 kubelet_node_status.go:111] "Node was previously registered" node="ci-4284-0-0-n-40578dffbd" May 13 23:51:04.361815 kubelet[2772]: I0513 23:51:04.361788 2772 kubelet_node_status.go:75] "Successfully registered node" node="ci-4284-0-0-n-40578dffbd" May 13 23:51:04.396802 kubelet[2772]: I0513 23:51:04.396416 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/bde9116b07dfa7bf3cd524f3c218d4a1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-40578dffbd\" (UID: \"bde9116b07dfa7bf3cd524f3c218d4a1\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-40578dffbd" May 13 23:51:04.396802 kubelet[2772]: I0513 23:51:04.396623 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd" May 13 23:51:04.396802 kubelet[2772]: I0513 23:51:04.396659 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-kubeconfig\") 
pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd" May 13 23:51:04.398193 kubelet[2772]: I0513 23:51:04.397201 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd" May 13 23:51:04.398193 kubelet[2772]: I0513 23:51:04.397737 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/044e2bbae01eca3de2e1ee14c5840fd6-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-40578dffbd\" (UID: \"044e2bbae01eca3de2e1ee14c5840fd6\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-40578dffbd" May 13 23:51:04.398193 kubelet[2772]: I0513 23:51:04.397946 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/bde9116b07dfa7bf3cd524f3c218d4a1-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-40578dffbd\" (UID: \"bde9116b07dfa7bf3cd524f3c218d4a1\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-40578dffbd" May 13 23:51:04.398193 kubelet[2772]: I0513 23:51:04.398084 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd" May 13 23:51:04.398193 kubelet[2772]: I0513 23:51:04.398111 2772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/fcaceb0f23d0e6d40879dcf406e1f87e-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-40578dffbd\" (UID: \"fcaceb0f23d0e6d40879dcf406e1f87e\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd" May 13 23:51:04.398598 kubelet[2772]: I0513 23:51:04.398127 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/bde9116b07dfa7bf3cd524f3c218d4a1-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-40578dffbd\" (UID: \"bde9116b07dfa7bf3cd524f3c218d4a1\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-40578dffbd" May 13 23:51:05.070835 kubelet[2772]: I0513 23:51:05.070769 2772 apiserver.go:52] "Watching apiserver" May 13 23:51:05.096374 kubelet[2772]: I0513 23:51:05.096335 2772 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" May 13 23:51:05.271722 kubelet[2772]: I0513 23:51:05.271650 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-n-40578dffbd" podStartSLOduration=1.271630752 podStartE2EDuration="1.271630752s" podCreationTimestamp="2025-05-13 23:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:51:05.248929245 +0000 UTC m=+1.262761770" watchObservedRunningTime="2025-05-13 23:51:05.271630752 +0000 UTC m=+1.285463277" May 13 23:51:05.294042 kubelet[2772]: I0513 23:51:05.293946 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-n-40578dffbd" podStartSLOduration=1.293926558 podStartE2EDuration="1.293926558s" podCreationTimestamp="2025-05-13 23:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-05-13 23:51:05.271957608 +0000 UTC m=+1.285790133" watchObservedRunningTime="2025-05-13 23:51:05.293926558 +0000 UTC m=+1.307759083" May 13 23:51:05.321286 kubelet[2772]: I0513 23:51:05.321143 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-40578dffbd" podStartSLOduration=1.321124612 podStartE2EDuration="1.321124612s" podCreationTimestamp="2025-05-13 23:51:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:51:05.2947696 +0000 UTC m=+1.308602125" watchObservedRunningTime="2025-05-13 23:51:05.321124612 +0000 UTC m=+1.334957137" May 13 23:51:08.806009 kubelet[2772]: I0513 23:51:08.805905 2772 kuberuntime_manager.go:1633] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 13 23:51:08.807071 containerd[1507]: time="2025-05-13T23:51:08.806929101Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 13 23:51:08.808346 kubelet[2772]: I0513 23:51:08.807176 2772 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 13 23:51:09.525737 sudo[1878]: pam_unix(sudo:session): session closed for user root May 13 23:51:09.688422 sshd[1877]: Connection closed by 139.178.89.65 port 50870 May 13 23:51:09.688253 sshd-session[1875]: pam_unix(sshd:session): session closed for user core May 13 23:51:09.695420 systemd-logind[1482]: Session 7 logged out. Waiting for processes to exit. May 13 23:51:09.696634 systemd[1]: sshd@6-188.245.195.87:22-139.178.89.65:50870.service: Deactivated successfully. May 13 23:51:09.702306 systemd[1]: session-7.scope: Deactivated successfully. May 13 23:51:09.702668 systemd[1]: session-7.scope: Consumed 6.533s CPU time, 224M memory peak. May 13 23:51:09.704880 systemd-logind[1482]: Removed session 7. 
May 13 23:51:09.838935 kubelet[2772]: I0513 23:51:09.838215 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/7982e64e-b547-4772-ab9e-162ebbdd6030-kube-proxy\") pod \"kube-proxy-l8hhb\" (UID: \"7982e64e-b547-4772-ab9e-162ebbdd6030\") " pod="kube-system/kube-proxy-l8hhb" May 13 23:51:09.838935 kubelet[2772]: I0513 23:51:09.838327 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7982e64e-b547-4772-ab9e-162ebbdd6030-lib-modules\") pod \"kube-proxy-l8hhb\" (UID: \"7982e64e-b547-4772-ab9e-162ebbdd6030\") " pod="kube-system/kube-proxy-l8hhb" May 13 23:51:09.838935 kubelet[2772]: I0513 23:51:09.838351 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g26rh\" (UniqueName: \"kubernetes.io/projected/7982e64e-b547-4772-ab9e-162ebbdd6030-kube-api-access-g26rh\") pod \"kube-proxy-l8hhb\" (UID: \"7982e64e-b547-4772-ab9e-162ebbdd6030\") " pod="kube-system/kube-proxy-l8hhb" May 13 23:51:09.838935 kubelet[2772]: I0513 23:51:09.838372 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7982e64e-b547-4772-ab9e-162ebbdd6030-xtables-lock\") pod \"kube-proxy-l8hhb\" (UID: \"7982e64e-b547-4772-ab9e-162ebbdd6030\") " pod="kube-system/kube-proxy-l8hhb" May 13 23:51:09.839083 systemd[1]: Created slice kubepods-besteffort-pod7982e64e_b547_4772_ab9e_162ebbdd6030.slice - libcontainer container kubepods-besteffort-pod7982e64e_b547_4772_ab9e_162ebbdd6030.slice. 
May 13 23:51:09.940540 kubelet[2772]: I0513 23:51:09.938707 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/60eb046b-6756-43a6-8d00-cc781cc176b1-var-lib-calico\") pod \"tigera-operator-6f6897fdc5-rgtqr\" (UID: \"60eb046b-6756-43a6-8d00-cc781cc176b1\") " pod="tigera-operator/tigera-operator-6f6897fdc5-rgtqr" May 13 23:51:09.940540 kubelet[2772]: I0513 23:51:09.938768 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfpdl\" (UniqueName: \"kubernetes.io/projected/60eb046b-6756-43a6-8d00-cc781cc176b1-kube-api-access-kfpdl\") pod \"tigera-operator-6f6897fdc5-rgtqr\" (UID: \"60eb046b-6756-43a6-8d00-cc781cc176b1\") " pod="tigera-operator/tigera-operator-6f6897fdc5-rgtqr" May 13 23:51:09.939704 systemd[1]: Created slice kubepods-besteffort-pod60eb046b_6756_43a6_8d00_cc781cc176b1.slice - libcontainer container kubepods-besteffort-pod60eb046b_6756_43a6_8d00_cc781cc176b1.slice. May 13 23:51:10.151430 containerd[1507]: time="2025-05-13T23:51:10.151212256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-l8hhb,Uid:7982e64e-b547-4772-ab9e-162ebbdd6030,Namespace:kube-system,Attempt:0,}" May 13 23:51:10.180827 containerd[1507]: time="2025-05-13T23:51:10.180658025Z" level=info msg="connecting to shim 73d4f60b757c4a8dbc0470fe207eb1a26a811f997ead7347b6a611bcef538f1d" address="unix:///run/containerd/s/f2e80b10cfba0552241b8c69de0bb8b505f039e7dde7de04537115efc4ef49fd" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:10.215286 systemd[1]: Started cri-containerd-73d4f60b757c4a8dbc0470fe207eb1a26a811f997ead7347b6a611bcef538f1d.scope - libcontainer container 73d4f60b757c4a8dbc0470fe207eb1a26a811f997ead7347b6a611bcef538f1d. 
May 13 23:51:10.244585 containerd[1507]: time="2025-05-13T23:51:10.244440790Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-rgtqr,Uid:60eb046b-6756-43a6-8d00-cc781cc176b1,Namespace:tigera-operator,Attempt:0,}" May 13 23:51:10.254243 containerd[1507]: time="2025-05-13T23:51:10.253654539Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-l8hhb,Uid:7982e64e-b547-4772-ab9e-162ebbdd6030,Namespace:kube-system,Attempt:0,} returns sandbox id \"73d4f60b757c4a8dbc0470fe207eb1a26a811f997ead7347b6a611bcef538f1d\"" May 13 23:51:10.262031 containerd[1507]: time="2025-05-13T23:51:10.261552306Z" level=info msg="CreateContainer within sandbox \"73d4f60b757c4a8dbc0470fe207eb1a26a811f997ead7347b6a611bcef538f1d\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 13 23:51:10.289440 containerd[1507]: time="2025-05-13T23:51:10.288960340Z" level=info msg="Container 77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:10.291966 containerd[1507]: time="2025-05-13T23:51:10.291907478Z" level=info msg="connecting to shim 7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86" address="unix:///run/containerd/s/ca9d0a967c686b70bcf17a3d6924371019368267ed178488c187385ff40d08fb" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:10.305162 containerd[1507]: time="2025-05-13T23:51:10.305113012Z" level=info msg="CreateContainer within sandbox \"73d4f60b757c4a8dbc0470fe207eb1a26a811f997ead7347b6a611bcef538f1d\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c\"" May 13 23:51:10.307591 containerd[1507]: time="2025-05-13T23:51:10.307403118Z" level=info msg="StartContainer for \"77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c\"" May 13 23:51:10.310645 containerd[1507]: time="2025-05-13T23:51:10.310595106Z" level=info msg="connecting to shim 
77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c" address="unix:///run/containerd/s/f2e80b10cfba0552241b8c69de0bb8b505f039e7dde7de04537115efc4ef49fd" protocol=ttrpc version=3 May 13 23:51:10.335118 systemd[1]: Started cri-containerd-7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86.scope - libcontainer container 7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86. May 13 23:51:10.350325 systemd[1]: Started cri-containerd-77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c.scope - libcontainer container 77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c. May 13 23:51:10.388903 containerd[1507]: time="2025-05-13T23:51:10.388243357Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-6f6897fdc5-rgtqr,Uid:60eb046b-6756-43a6-8d00-cc781cc176b1,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86\"" May 13 23:51:10.393270 containerd[1507]: time="2025-05-13T23:51:10.391807883Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 13 23:51:10.424867 containerd[1507]: time="2025-05-13T23:51:10.424693492Z" level=info msg="StartContainer for \"77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c\" returns successfully" May 13 23:51:12.053911 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2696266313.mount: Deactivated successfully. 
May 13 23:51:12.442266 containerd[1507]: time="2025-05-13T23:51:12.441953732Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:12.443501 containerd[1507]: time="2025-05-13T23:51:12.443424078Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=19323084" May 13 23:51:12.444690 containerd[1507]: time="2025-05-13T23:51:12.444285077Z" level=info msg="ImageCreate event name:\"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:12.447206 containerd[1507]: time="2025-05-13T23:51:12.447139326Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:12.447787 containerd[1507]: time="2025-05-13T23:51:12.447751393Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"19319079\" in 2.055891149s" May 13 23:51:12.447919 containerd[1507]: time="2025-05-13T23:51:12.447903000Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:27f7c2cfac802523e44ecd16453a4cc992f6c7d610c13054f2715a7cb4370565\"" May 13 23:51:12.455020 containerd[1507]: time="2025-05-13T23:51:12.452492567Z" level=info msg="CreateContainer within sandbox \"7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 13 23:51:12.468018 containerd[1507]: time="2025-05-13T23:51:12.464092450Z" level=info msg="Container 
98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:12.465369 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3049447412.mount: Deactivated successfully. May 13 23:51:12.474992 containerd[1507]: time="2025-05-13T23:51:12.474906297Z" level=info msg="CreateContainer within sandbox \"7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\"" May 13 23:51:12.478106 containerd[1507]: time="2025-05-13T23:51:12.476687497Z" level=info msg="StartContainer for \"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\"" May 13 23:51:12.478106 containerd[1507]: time="2025-05-13T23:51:12.477647981Z" level=info msg="connecting to shim 98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026" address="unix:///run/containerd/s/ca9d0a967c686b70bcf17a3d6924371019368267ed178488c187385ff40d08fb" protocol=ttrpc version=3 May 13 23:51:12.504210 systemd[1]: Started cri-containerd-98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026.scope - libcontainer container 98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026. 
May 13 23:51:12.544296 containerd[1507]: time="2025-05-13T23:51:12.544211900Z" level=info msg="StartContainer for \"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\" returns successfully" May 13 23:51:12.696413 kubelet[2772]: I0513 23:51:12.696248 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-l8hhb" podStartSLOduration=3.69622559 podStartE2EDuration="3.69622559s" podCreationTimestamp="2025-05-13 23:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:51:11.251445117 +0000 UTC m=+7.265277642" watchObservedRunningTime="2025-05-13 23:51:12.69622559 +0000 UTC m=+8.710058115" May 13 23:51:13.246448 kubelet[2772]: I0513 23:51:13.246239 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-6f6897fdc5-rgtqr" podStartSLOduration=2.188020512 podStartE2EDuration="4.246215886s" podCreationTimestamp="2025-05-13 23:51:09 +0000 UTC" firstStartedPulling="2025-05-13 23:51:10.391158692 +0000 UTC m=+6.404991217" lastFinishedPulling="2025-05-13 23:51:12.449354066 +0000 UTC m=+8.463186591" observedRunningTime="2025-05-13 23:51:13.244242918 +0000 UTC m=+9.258075483" watchObservedRunningTime="2025-05-13 23:51:13.246215886 +0000 UTC m=+9.260048451" May 13 23:51:17.081994 kubelet[2772]: W0513 23:51:17.081875 2772 reflector.go:561] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ci-4284-0-0-n-40578dffbd" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4284-0-0-n-40578dffbd' and this object May 13 23:51:17.082712 kubelet[2772]: E0513 23:51:17.081960 2772 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"tigera-ca-bundle\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: 
configmaps \"tigera-ca-bundle\" is forbidden: User \"system:node:ci-4284-0-0-n-40578dffbd\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4284-0-0-n-40578dffbd' and this object" logger="UnhandledError" May 13 23:51:17.082712 kubelet[2772]: W0513 23:51:17.082648 2772 reflector.go:561] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-n-40578dffbd" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4284-0-0-n-40578dffbd' and this object May 13 23:51:17.082712 kubelet[2772]: E0513 23:51:17.082690 2772 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4284-0-0-n-40578dffbd\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4284-0-0-n-40578dffbd' and this object" logger="UnhandledError" May 13 23:51:17.082712 kubelet[2772]: W0513 23:51:17.081899 2772 reflector.go:561] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-4284-0-0-n-40578dffbd" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-4284-0-0-n-40578dffbd' and this object May 13 23:51:17.083083 kubelet[2772]: E0513 23:51:17.082818 2772 reflector.go:158] "Unhandled Error" err="object-\"calico-system\"/\"typha-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"typha-certs\" is forbidden: User \"system:node:ci-4284-0-0-n-40578dffbd\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 
'ci-4284-0-0-n-40578dffbd' and this object" logger="UnhandledError" May 13 23:51:17.086990 systemd[1]: Created slice kubepods-besteffort-pod7f8beb27_df29_4d64_82a3_c5a23cef9546.slice - libcontainer container kubepods-besteffort-pod7f8beb27_df29_4d64_82a3_c5a23cef9546.slice. May 13 23:51:17.182114 kubelet[2772]: I0513 23:51:17.181219 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/7f8beb27-df29-4d64-82a3-c5a23cef9546-typha-certs\") pod \"calico-typha-6b74f96f4-7mp6k\" (UID: \"7f8beb27-df29-4d64-82a3-c5a23cef9546\") " pod="calico-system/calico-typha-6b74f96f4-7mp6k" May 13 23:51:17.182114 kubelet[2772]: I0513 23:51:17.181285 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf7v8\" (UniqueName: \"kubernetes.io/projected/7f8beb27-df29-4d64-82a3-c5a23cef9546-kube-api-access-vf7v8\") pod \"calico-typha-6b74f96f4-7mp6k\" (UID: \"7f8beb27-df29-4d64-82a3-c5a23cef9546\") " pod="calico-system/calico-typha-6b74f96f4-7mp6k" May 13 23:51:17.182114 kubelet[2772]: I0513 23:51:17.181919 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7f8beb27-df29-4d64-82a3-c5a23cef9546-tigera-ca-bundle\") pod \"calico-typha-6b74f96f4-7mp6k\" (UID: \"7f8beb27-df29-4d64-82a3-c5a23cef9546\") " pod="calico-system/calico-typha-6b74f96f4-7mp6k" May 13 23:51:17.233315 systemd[1]: Created slice kubepods-besteffort-pod7e83aff7_6ebd_46bd_aed4_bcf6027bd328.slice - libcontainer container kubepods-besteffort-pod7e83aff7_6ebd_46bd_aed4_bcf6027bd328.slice. 
May 13 23:51:17.283892 kubelet[2772]: I0513 23:51:17.283033 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-policysync\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.283892 kubelet[2772]: I0513 23:51:17.283081 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-node-certs\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.283892 kubelet[2772]: I0513 23:51:17.283117 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-lib-modules\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.283892 kubelet[2772]: I0513 23:51:17.283159 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-xtables-lock\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.283892 kubelet[2772]: I0513 23:51:17.283181 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-var-lib-calico\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.284226 kubelet[2772]: I0513 23:51:17.283199 2772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-cni-bin-dir\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.284226 kubelet[2772]: I0513 23:51:17.283222 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqxdb\" (UniqueName: \"kubernetes.io/projected/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-kube-api-access-mqxdb\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.284226 kubelet[2772]: I0513 23:51:17.283243 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-tigera-ca-bundle\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.284226 kubelet[2772]: I0513 23:51:17.283260 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-cni-log-dir\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.284226 kubelet[2772]: I0513 23:51:17.283291 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-var-run-calico\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.284383 kubelet[2772]: I0513 23:51:17.283309 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-cni-net-dir\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.284383 kubelet[2772]: I0513 23:51:17.283330 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-flexvol-driver-host\") pod \"calico-node-bc4rn\" (UID: \"7e83aff7-6ebd-46bd-aed4-bcf6027bd328\") " pod="calico-system/calico-node-bc4rn" May 13 23:51:17.363222 kubelet[2772]: E0513 23:51:17.362242 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-68jvj" podUID="25916451-c2fa-46e7-8188-33b7982635fd" May 13 23:51:17.384368 kubelet[2772]: I0513 23:51:17.383565 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/25916451-c2fa-46e7-8188-33b7982635fd-socket-dir\") pod \"csi-node-driver-68jvj\" (UID: \"25916451-c2fa-46e7-8188-33b7982635fd\") " pod="calico-system/csi-node-driver-68jvj" May 13 23:51:17.384368 kubelet[2772]: I0513 23:51:17.383611 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/25916451-c2fa-46e7-8188-33b7982635fd-registration-dir\") pod \"csi-node-driver-68jvj\" (UID: \"25916451-c2fa-46e7-8188-33b7982635fd\") " pod="calico-system/csi-node-driver-68jvj" May 13 23:51:17.384368 kubelet[2772]: I0513 23:51:17.383653 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: 
\"kubernetes.io/host-path/25916451-c2fa-46e7-8188-33b7982635fd-varrun\") pod \"csi-node-driver-68jvj\" (UID: \"25916451-c2fa-46e7-8188-33b7982635fd\") " pod="calico-system/csi-node-driver-68jvj" May 13 23:51:17.384368 kubelet[2772]: I0513 23:51:17.383684 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4cf2\" (UniqueName: \"kubernetes.io/projected/25916451-c2fa-46e7-8188-33b7982635fd-kube-api-access-r4cf2\") pod \"csi-node-driver-68jvj\" (UID: \"25916451-c2fa-46e7-8188-33b7982635fd\") " pod="calico-system/csi-node-driver-68jvj" May 13 23:51:17.384368 kubelet[2772]: I0513 23:51:17.383803 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/25916451-c2fa-46e7-8188-33b7982635fd-kubelet-dir\") pod \"csi-node-driver-68jvj\" (UID: \"25916451-c2fa-46e7-8188-33b7982635fd\") " pod="calico-system/csi-node-driver-68jvj" May 13 23:51:17.394947 kubelet[2772]: E0513 23:51:17.394902 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.394947 kubelet[2772]: W0513 23:51:17.394931 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.395121 kubelet[2772]: E0513 23:51:17.394966 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.484569 kubelet[2772]: E0513 23:51:17.484372 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.484569 kubelet[2772]: W0513 23:51:17.484398 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.484569 kubelet[2772]: E0513 23:51:17.484421 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.485224 kubelet[2772]: E0513 23:51:17.484847 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.485224 kubelet[2772]: W0513 23:51:17.484865 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.485224 kubelet[2772]: E0513 23:51:17.484888 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.486001 kubelet[2772]: E0513 23:51:17.485506 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.486001 kubelet[2772]: W0513 23:51:17.485528 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.486001 kubelet[2772]: E0513 23:51:17.485555 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.486463 kubelet[2772]: E0513 23:51:17.486344 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.486463 kubelet[2772]: W0513 23:51:17.486364 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.486463 kubelet[2772]: E0513 23:51:17.486449 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.488196 kubelet[2772]: E0513 23:51:17.488111 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.488196 kubelet[2772]: W0513 23:51:17.488132 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.488196 kubelet[2772]: E0513 23:51:17.488161 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.489205 kubelet[2772]: E0513 23:51:17.489174 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.489205 kubelet[2772]: W0513 23:51:17.489200 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.489354 kubelet[2772]: E0513 23:51:17.489227 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.490886 kubelet[2772]: E0513 23:51:17.490841 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.490886 kubelet[2772]: W0513 23:51:17.490869 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.491125 kubelet[2772]: E0513 23:51:17.491017 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.492261 kubelet[2772]: E0513 23:51:17.492233 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.492261 kubelet[2772]: W0513 23:51:17.492255 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.492379 kubelet[2772]: E0513 23:51:17.492353 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.492598 kubelet[2772]: E0513 23:51:17.492577 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.492598 kubelet[2772]: W0513 23:51:17.492593 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.492701 kubelet[2772]: E0513 23:51:17.492680 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.492931 kubelet[2772]: E0513 23:51:17.492903 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.492931 kubelet[2772]: W0513 23:51:17.492923 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.493202 kubelet[2772]: E0513 23:51:17.493048 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.493340 kubelet[2772]: E0513 23:51:17.493319 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.493374 kubelet[2772]: W0513 23:51:17.493339 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.493374 kubelet[2772]: E0513 23:51:17.493360 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.493696 kubelet[2772]: E0513 23:51:17.493665 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.493696 kubelet[2772]: W0513 23:51:17.493685 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.493696 kubelet[2772]: E0513 23:51:17.493704 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.494924 kubelet[2772]: E0513 23:51:17.494884 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.494924 kubelet[2772]: W0513 23:51:17.494910 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.495055 kubelet[2772]: E0513 23:51:17.494938 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.495553 kubelet[2772]: E0513 23:51:17.495506 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.495553 kubelet[2772]: W0513 23:51:17.495543 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.496063 kubelet[2772]: E0513 23:51:17.495714 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.496063 kubelet[2772]: E0513 23:51:17.495884 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.496063 kubelet[2772]: W0513 23:51:17.495896 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.496063 kubelet[2772]: E0513 23:51:17.496026 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.498198 kubelet[2772]: E0513 23:51:17.498143 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.498198 kubelet[2772]: W0513 23:51:17.498173 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.498382 kubelet[2772]: E0513 23:51:17.498338 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.498520 kubelet[2772]: E0513 23:51:17.498503 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.498520 kubelet[2772]: W0513 23:51:17.498517 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.499026 kubelet[2772]: E0513 23:51:17.498608 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.499026 kubelet[2772]: E0513 23:51:17.498833 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.499026 kubelet[2772]: W0513 23:51:17.498847 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.499026 kubelet[2772]: E0513 23:51:17.498933 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.499326 kubelet[2772]: E0513 23:51:17.499293 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.499326 kubelet[2772]: W0513 23:51:17.499315 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.499495 kubelet[2772]: E0513 23:51:17.499454 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.499495 kubelet[2772]: E0513 23:51:17.499482 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.499495 kubelet[2772]: W0513 23:51:17.499490 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.499650 kubelet[2772]: E0513 23:51:17.499509 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.499962 kubelet[2772]: E0513 23:51:17.499668 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.499962 kubelet[2772]: W0513 23:51:17.499677 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.499962 kubelet[2772]: E0513 23:51:17.499695 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.501466 kubelet[2772]: E0513 23:51:17.501240 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.501466 kubelet[2772]: W0513 23:51:17.501267 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.501466 kubelet[2772]: E0513 23:51:17.501297 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.501646 kubelet[2772]: E0513 23:51:17.501631 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.501707 kubelet[2772]: W0513 23:51:17.501695 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.501812 kubelet[2772]: E0513 23:51:17.501785 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.502225 kubelet[2772]: E0513 23:51:17.502204 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.502344 kubelet[2772]: W0513 23:51:17.502328 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.503010 kubelet[2772]: E0513 23:51:17.502429 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:17.503554 kubelet[2772]: E0513 23:51:17.503442 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.503554 kubelet[2772]: W0513 23:51:17.503463 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.503554 kubelet[2772]: E0513 23:51:17.503523 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:17.504750 kubelet[2772]: E0513 23:51:17.504007 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:17.504750 kubelet[2772]: W0513 23:51:17.504031 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:17.504750 kubelet[2772]: E0513 23:51:17.504106 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.229118 kubelet[2772]: E0513 23:51:18.229104 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.229118 kubelet[2772]: W0513 23:51:18.229116 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.229198 kubelet[2772]: E0513 23:51:18.229127 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.229326 kubelet[2772]: E0513 23:51:18.229311 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.229356 kubelet[2772]: W0513 23:51:18.229325 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.229356 kubelet[2772]: E0513 23:51:18.229335 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.284999 kubelet[2772]: E0513 23:51:18.283787 2772 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.284999 kubelet[2772]: E0513 23:51:18.283919 2772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f8beb27-df29-4d64-82a3-c5a23cef9546-tigera-ca-bundle podName:7f8beb27-df29-4d64-82a3-c5a23cef9546 nodeName:}" failed. No retries permitted until 2025-05-13 23:51:18.783892981 +0000 UTC m=+14.797725506 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/7f8beb27-df29-4d64-82a3-c5a23cef9546-tigera-ca-bundle") pod "calico-typha-6b74f96f4-7mp6k" (UID: "7f8beb27-df29-4d64-82a3-c5a23cef9546") : failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.296321 kubelet[2772]: E0513 23:51:18.295698 2772 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.296321 kubelet[2772]: E0513 23:51:18.295749 2772 projected.go:194] Error preparing data for projected volume kube-api-access-vf7v8 for pod calico-system/calico-typha-6b74f96f4-7mp6k: failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.296321 kubelet[2772]: E0513 23:51:18.295863 2772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f8beb27-df29-4d64-82a3-c5a23cef9546-kube-api-access-vf7v8 podName:7f8beb27-df29-4d64-82a3-c5a23cef9546 nodeName:}" failed. No retries permitted until 2025-05-13 23:51:18.795838594 +0000 UTC m=+14.809671119 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-vf7v8" (UniqueName: "kubernetes.io/projected/7f8beb27-df29-4d64-82a3-c5a23cef9546-kube-api-access-vf7v8") pod "calico-typha-6b74f96f4-7mp6k" (UID: "7f8beb27-df29-4d64-82a3-c5a23cef9546") : failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.332725 kubelet[2772]: E0513 23:51:18.331405 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.332725 kubelet[2772]: W0513 23:51:18.331435 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.332725 kubelet[2772]: E0513 23:51:18.331459 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.332725 kubelet[2772]: E0513 23:51:18.331827 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.332725 kubelet[2772]: W0513 23:51:18.331843 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.332725 kubelet[2772]: E0513 23:51:18.331862 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.332725 kubelet[2772]: E0513 23:51:18.332097 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.332725 kubelet[2772]: W0513 23:51:18.332111 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.332725 kubelet[2772]: E0513 23:51:18.332123 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.332725 kubelet[2772]: E0513 23:51:18.332308 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.333216 kubelet[2772]: W0513 23:51:18.332318 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.333216 kubelet[2772]: E0513 23:51:18.332329 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.333216 kubelet[2772]: E0513 23:51:18.332541 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.333216 kubelet[2772]: W0513 23:51:18.332551 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.333216 kubelet[2772]: E0513 23:51:18.332562 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.384952 kubelet[2772]: E0513 23:51:18.384842 2772 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.384952 kubelet[2772]: E0513 23:51:18.384962 2772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-tigera-ca-bundle podName:7e83aff7-6ebd-46bd-aed4-bcf6027bd328 nodeName:}" failed. No retries permitted until 2025-05-13 23:51:18.884933871 +0000 UTC m=+14.898766396 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-tigera-ca-bundle") pod "calico-node-bc4rn" (UID: "7e83aff7-6ebd-46bd-aed4-bcf6027bd328") : failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.407894 kubelet[2772]: E0513 23:51:18.407843 2772 projected.go:288] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.407894 kubelet[2772]: E0513 23:51:18.407891 2772 projected.go:194] Error preparing data for projected volume kube-api-access-mqxdb for pod calico-system/calico-node-bc4rn: failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.408120 kubelet[2772]: E0513 23:51:18.407962 2772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-kube-api-access-mqxdb podName:7e83aff7-6ebd-46bd-aed4-bcf6027bd328 nodeName:}" failed. No retries permitted until 2025-05-13 23:51:18.90793966 +0000 UTC m=+14.921772185 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-mqxdb" (UniqueName: "kubernetes.io/projected/7e83aff7-6ebd-46bd-aed4-bcf6027bd328-kube-api-access-mqxdb") pod "calico-node-bc4rn" (UID: "7e83aff7-6ebd-46bd-aed4-bcf6027bd328") : failed to sync configmap cache: timed out waiting for the condition May 13 23:51:18.433575 kubelet[2772]: E0513 23:51:18.433427 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.433575 kubelet[2772]: W0513 23:51:18.433471 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.433575 kubelet[2772]: E0513 23:51:18.433493 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.434004 kubelet[2772]: E0513 23:51:18.433774 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.434004 kubelet[2772]: W0513 23:51:18.433792 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.434004 kubelet[2772]: E0513 23:51:18.433842 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.436235 kubelet[2772]: E0513 23:51:18.436188 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.436235 kubelet[2772]: W0513 23:51:18.436214 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.436235 kubelet[2772]: E0513 23:51:18.436235 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.437079 kubelet[2772]: E0513 23:51:18.437054 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.437079 kubelet[2772]: W0513 23:51:18.437073 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.437223 kubelet[2772]: E0513 23:51:18.437089 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.437333 kubelet[2772]: E0513 23:51:18.437320 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.437359 kubelet[2772]: W0513 23:51:18.437333 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.437359 kubelet[2772]: E0513 23:51:18.437343 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.462153 kubelet[2772]: E0513 23:51:18.462106 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.462153 kubelet[2772]: W0513 23:51:18.462141 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.462324 kubelet[2772]: E0513 23:51:18.462171 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.540052 kubelet[2772]: E0513 23:51:18.538770 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.540052 kubelet[2772]: W0513 23:51:18.538838 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.540052 kubelet[2772]: E0513 23:51:18.538881 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.541031 kubelet[2772]: E0513 23:51:18.540933 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.541031 kubelet[2772]: W0513 23:51:18.540985 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.541181 kubelet[2772]: E0513 23:51:18.541089 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.541642 kubelet[2772]: E0513 23:51:18.541586 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.541642 kubelet[2772]: W0513 23:51:18.541629 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.541642 kubelet[2772]: E0513 23:51:18.541648 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.542129 kubelet[2772]: E0513 23:51:18.542104 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.542129 kubelet[2772]: W0513 23:51:18.542127 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.542268 kubelet[2772]: E0513 23:51:18.542245 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.644486 kubelet[2772]: E0513 23:51:18.644430 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.644486 kubelet[2772]: W0513 23:51:18.644482 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.644786 kubelet[2772]: E0513 23:51:18.644525 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.645803 kubelet[2772]: E0513 23:51:18.645754 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.645939 kubelet[2772]: W0513 23:51:18.645798 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.645939 kubelet[2772]: E0513 23:51:18.645854 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.647069 kubelet[2772]: E0513 23:51:18.647027 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.647069 kubelet[2772]: W0513 23:51:18.647065 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.647220 kubelet[2772]: E0513 23:51:18.647099 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.647524 kubelet[2772]: E0513 23:51:18.647495 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.647524 kubelet[2772]: W0513 23:51:18.647520 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.647663 kubelet[2772]: E0513 23:51:18.647542 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.749489 kubelet[2772]: E0513 23:51:18.748965 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.749489 kubelet[2772]: W0513 23:51:18.749013 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.749489 kubelet[2772]: E0513 23:51:18.749038 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.749489 kubelet[2772]: E0513 23:51:18.749361 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.749489 kubelet[2772]: W0513 23:51:18.749375 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.749489 kubelet[2772]: E0513 23:51:18.749390 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.749858 kubelet[2772]: E0513 23:51:18.749626 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.749858 kubelet[2772]: W0513 23:51:18.749643 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.749858 kubelet[2772]: E0513 23:51:18.749656 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.749966 kubelet[2772]: E0513 23:51:18.749951 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.750021 kubelet[2772]: W0513 23:51:18.749966 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.750021 kubelet[2772]: E0513 23:51:18.750006 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.851566 kubelet[2772]: E0513 23:51:18.851215 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.851566 kubelet[2772]: W0513 23:51:18.851246 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.851566 kubelet[2772]: E0513 23:51:18.851269 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.851767 kubelet[2772]: E0513 23:51:18.851727 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.851767 kubelet[2772]: W0513 23:51:18.851746 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.851851 kubelet[2772]: E0513 23:51:18.851768 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.852335 kubelet[2772]: E0513 23:51:18.852295 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.852335 kubelet[2772]: W0513 23:51:18.852321 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.852548 kubelet[2772]: E0513 23:51:18.852403 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.853070 kubelet[2772]: E0513 23:51:18.852704 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.853070 kubelet[2772]: W0513 23:51:18.852734 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.853070 kubelet[2772]: E0513 23:51:18.852750 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.853303 kubelet[2772]: E0513 23:51:18.853269 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.853303 kubelet[2772]: W0513 23:51:18.853294 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.853370 kubelet[2772]: E0513 23:51:18.853323 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.854079 kubelet[2772]: E0513 23:51:18.853659 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.854079 kubelet[2772]: W0513 23:51:18.853682 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.854079 kubelet[2772]: E0513 23:51:18.853707 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.854079 kubelet[2772]: E0513 23:51:18.854015 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.854079 kubelet[2772]: W0513 23:51:18.854034 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.854079 kubelet[2772]: E0513 23:51:18.854057 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.854718 kubelet[2772]: E0513 23:51:18.854660 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.854718 kubelet[2772]: W0513 23:51:18.854696 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.854718 kubelet[2772]: E0513 23:51:18.854716 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.857299 kubelet[2772]: E0513 23:51:18.856101 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.857299 kubelet[2772]: W0513 23:51:18.856121 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.857299 kubelet[2772]: E0513 23:51:18.856152 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.857299 kubelet[2772]: E0513 23:51:18.857176 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.857299 kubelet[2772]: W0513 23:51:18.857193 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.857299 kubelet[2772]: E0513 23:51:18.857210 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.858071 kubelet[2772]: E0513 23:51:18.858050 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.858667 kubelet[2772]: W0513 23:51:18.858625 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.858667 kubelet[2772]: E0513 23:51:18.858661 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.859220 kubelet[2772]: E0513 23:51:18.859147 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.859220 kubelet[2772]: W0513 23:51:18.859178 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.859220 kubelet[2772]: E0513 23:51:18.859194 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.859455 kubelet[2772]: E0513 23:51:18.859413 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.859455 kubelet[2772]: W0513 23:51:18.859449 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.859575 kubelet[2772]: E0513 23:51:18.859462 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.869677 kubelet[2772]: E0513 23:51:18.868638 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.869677 kubelet[2772]: W0513 23:51:18.868671 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.869677 kubelet[2772]: E0513 23:51:18.868706 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.893116 containerd[1507]: time="2025-05-13T23:51:18.893049638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b74f96f4-7mp6k,Uid:7f8beb27-df29-4d64-82a3-c5a23cef9546,Namespace:calico-system,Attempt:0,}" May 13 23:51:18.951489 containerd[1507]: time="2025-05-13T23:51:18.951189837Z" level=info msg="connecting to shim 06be926168658e1d4ccb5f55cefc13dea9be0c7e65ac089ac367f513ab418d23" address="unix:///run/containerd/s/cf3d3f1513ac056ca2c3ab5ccdbbabd884ea89dc47195011ab5c6d6ba4221c95" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:18.956433 kubelet[2772]: E0513 23:51:18.956394 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.956433 kubelet[2772]: W0513 23:51:18.956424 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.956961 kubelet[2772]: E0513 23:51:18.956551 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.957336 kubelet[2772]: E0513 23:51:18.957266 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.957336 kubelet[2772]: W0513 23:51:18.957291 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.957336 kubelet[2772]: E0513 23:51:18.957318 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.958303 kubelet[2772]: E0513 23:51:18.958277 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.958303 kubelet[2772]: W0513 23:51:18.958300 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.958570 kubelet[2772]: E0513 23:51:18.958323 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.958707 kubelet[2772]: E0513 23:51:18.958689 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.958760 kubelet[2772]: W0513 23:51:18.958707 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.958820 kubelet[2772]: E0513 23:51:18.958794 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.959558 kubelet[2772]: E0513 23:51:18.959536 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.959645 kubelet[2772]: W0513 23:51:18.959560 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.959645 kubelet[2772]: E0513 23:51:18.959584 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.962347 kubelet[2772]: E0513 23:51:18.962316 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.962347 kubelet[2772]: W0513 23:51:18.962341 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.962486 kubelet[2772]: E0513 23:51:18.962369 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.965293 kubelet[2772]: E0513 23:51:18.964321 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.965293 kubelet[2772]: W0513 23:51:18.964348 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.965293 kubelet[2772]: E0513 23:51:18.964737 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.965293 kubelet[2772]: W0513 23:51:18.964748 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.965293 kubelet[2772]: E0513 23:51:18.964764 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.965293 kubelet[2772]: E0513 23:51:18.965198 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.965293 kubelet[2772]: W0513 23:51:18.965211 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.965293 kubelet[2772]: E0513 23:51:18.965223 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.965718 kubelet[2772]: E0513 23:51:18.965586 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.969412 kubelet[2772]: E0513 23:51:18.966723 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.969589 kubelet[2772]: W0513 23:51:18.969569 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.969697 kubelet[2772]: E0513 23:51:18.969682 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:18.970248 kubelet[2772]: E0513 23:51:18.970211 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.970248 kubelet[2772]: W0513 23:51:18.970238 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.970373 kubelet[2772]: E0513 23:51:18.970257 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:18.977997 kubelet[2772]: E0513 23:51:18.976512 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:18.977997 kubelet[2772]: W0513 23:51:18.976640 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:18.977997 kubelet[2772]: E0513 23:51:18.976667 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:19.000271 systemd[1]: Started cri-containerd-06be926168658e1d4ccb5f55cefc13dea9be0c7e65ac089ac367f513ab418d23.scope - libcontainer container 06be926168658e1d4ccb5f55cefc13dea9be0c7e65ac089ac367f513ab418d23. May 13 23:51:19.042012 containerd[1507]: time="2025-05-13T23:51:19.040726309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bc4rn,Uid:7e83aff7-6ebd-46bd-aed4-bcf6027bd328,Namespace:calico-system,Attempt:0,}" May 13 23:51:19.054515 containerd[1507]: time="2025-05-13T23:51:19.054361344Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b74f96f4-7mp6k,Uid:7f8beb27-df29-4d64-82a3-c5a23cef9546,Namespace:calico-system,Attempt:0,} returns sandbox id \"06be926168658e1d4ccb5f55cefc13dea9be0c7e65ac089ac367f513ab418d23\"" May 13 23:51:19.058203 containerd[1507]: time="2025-05-13T23:51:19.057721761Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 13 23:51:19.077357 containerd[1507]: time="2025-05-13T23:51:19.077299598Z" level=info msg="connecting to shim ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be" address="unix:///run/containerd/s/ed1e82d4ed91061ec0224aacf214cc30a12d59607756da8563797992d2e4baec" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:19.110318 
systemd[1]: Started cri-containerd-ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be.scope - libcontainer container ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be. May 13 23:51:19.131044 kubelet[2772]: E0513 23:51:19.130988 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-68jvj" podUID="25916451-c2fa-46e7-8188-33b7982635fd" May 13 23:51:19.155496 containerd[1507]: time="2025-05-13T23:51:19.155449379Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-bc4rn,Uid:7e83aff7-6ebd-46bd-aed4-bcf6027bd328,Namespace:calico-system,Attempt:0,} returns sandbox id \"ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be\"" May 13 23:51:21.104010 containerd[1507]: time="2025-05-13T23:51:21.103815337Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:21.105843 containerd[1507]: time="2025-05-13T23:51:21.105722132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=28370571" May 13 23:51:21.107021 containerd[1507]: time="2025-05-13T23:51:21.106684890Z" level=info msg="ImageCreate event name:\"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:21.109359 containerd[1507]: time="2025-05-13T23:51:21.109318795Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:21.109859 containerd[1507]: time="2025-05-13T23:51:21.109823175Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" 
with image id \"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"29739745\" in 2.051191536s" May 13 23:51:21.109938 containerd[1507]: time="2025-05-13T23:51:21.109860616Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:26e730979a07ea7452715da6ac48076016018bc982c06ebd32d5e095f42d3d54\"" May 13 23:51:21.112333 containerd[1507]: time="2025-05-13T23:51:21.112284032Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 13 23:51:21.130218 kubelet[2772]: E0513 23:51:21.129849 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-68jvj" podUID="25916451-c2fa-46e7-8188-33b7982635fd" May 13 23:51:21.132683 containerd[1507]: time="2025-05-13T23:51:21.132348707Z" level=info msg="CreateContainer within sandbox \"06be926168658e1d4ccb5f55cefc13dea9be0c7e65ac089ac367f513ab418d23\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 13 23:51:21.150406 containerd[1507]: time="2025-05-13T23:51:21.150355381Z" level=info msg="Container 2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:21.156524 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount984569017.mount: Deactivated successfully. 
May 13 23:51:21.165352 containerd[1507]: time="2025-05-13T23:51:21.165186088Z" level=info msg="CreateContainer within sandbox \"06be926168658e1d4ccb5f55cefc13dea9be0c7e65ac089ac367f513ab418d23\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f\"" May 13 23:51:21.168620 containerd[1507]: time="2025-05-13T23:51:21.166765951Z" level=info msg="StartContainer for \"2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f\"" May 13 23:51:21.170094 containerd[1507]: time="2025-05-13T23:51:21.170007759Z" level=info msg="connecting to shim 2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f" address="unix:///run/containerd/s/cf3d3f1513ac056ca2c3ab5ccdbbabd884ea89dc47195011ab5c6d6ba4221c95" protocol=ttrpc version=3 May 13 23:51:21.205494 systemd[1]: Started cri-containerd-2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f.scope - libcontainer container 2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f. 
May 13 23:51:21.267892 containerd[1507]: time="2025-05-13T23:51:21.267798274Z" level=info msg="StartContainer for \"2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f\" returns successfully" May 13 23:51:22.274880 kubelet[2772]: I0513 23:51:22.274780 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b74f96f4-7mp6k" podStartSLOduration=3.220383128 podStartE2EDuration="5.274762193s" podCreationTimestamp="2025-05-13 23:51:17 +0000 UTC" firstStartedPulling="2025-05-13 23:51:19.056668558 +0000 UTC m=+15.070501083" lastFinishedPulling="2025-05-13 23:51:21.111047623 +0000 UTC m=+17.124880148" observedRunningTime="2025-05-13 23:51:22.273458902 +0000 UTC m=+18.287291427" watchObservedRunningTime="2025-05-13 23:51:22.274762193 +0000 UTC m=+18.288594718" May 13 23:51:22.323327 kubelet[2772]: E0513 23:51:22.322573 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.323327 kubelet[2772]: W0513 23:51:22.322720 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.323327 kubelet[2772]: E0513 23:51:22.322766 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.325092 kubelet[2772]: E0513 23:51:22.324440 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.325092 kubelet[2772]: W0513 23:51:22.324471 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.325092 kubelet[2772]: E0513 23:51:22.324498 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.325092 kubelet[2772]: E0513 23:51:22.324800 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.325092 kubelet[2772]: W0513 23:51:22.324814 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.325092 kubelet[2772]: E0513 23:51:22.324830 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.326084 kubelet[2772]: E0513 23:51:22.325661 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.326084 kubelet[2772]: W0513 23:51:22.326017 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.327776 kubelet[2772]: E0513 23:51:22.326330 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.327776 kubelet[2772]: E0513 23:51:22.326670 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.327776 kubelet[2772]: W0513 23:51:22.326685 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.327776 kubelet[2772]: E0513 23:51:22.326703 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.327776 kubelet[2772]: E0513 23:51:22.326902 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.327776 kubelet[2772]: W0513 23:51:22.326912 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.327776 kubelet[2772]: E0513 23:51:22.326922 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.327776 kubelet[2772]: E0513 23:51:22.327149 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.327776 kubelet[2772]: W0513 23:51:22.327161 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.327776 kubelet[2772]: E0513 23:51:22.327172 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.328155 kubelet[2772]: E0513 23:51:22.327368 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.328155 kubelet[2772]: W0513 23:51:22.327377 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.328155 kubelet[2772]: E0513 23:51:22.327388 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.328155 kubelet[2772]: E0513 23:51:22.327571 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.328155 kubelet[2772]: W0513 23:51:22.327587 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.328155 kubelet[2772]: E0513 23:51:22.327597 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.330414 kubelet[2772]: E0513 23:51:22.329342 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.330414 kubelet[2772]: W0513 23:51:22.329371 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.330414 kubelet[2772]: E0513 23:51:22.329387 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.330414 kubelet[2772]: E0513 23:51:22.329560 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.330414 kubelet[2772]: W0513 23:51:22.329568 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.330414 kubelet[2772]: E0513 23:51:22.329577 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.330414 kubelet[2772]: E0513 23:51:22.329729 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.330414 kubelet[2772]: W0513 23:51:22.329740 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.330414 kubelet[2772]: E0513 23:51:22.329748 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.330414 kubelet[2772]: E0513 23:51:22.329908 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.330741 kubelet[2772]: W0513 23:51:22.329917 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.330741 kubelet[2772]: E0513 23:51:22.329925 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.332775 kubelet[2772]: E0513 23:51:22.332262 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.332775 kubelet[2772]: W0513 23:51:22.332294 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.332775 kubelet[2772]: E0513 23:51:22.332312 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.332775 kubelet[2772]: E0513 23:51:22.332478 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.332775 kubelet[2772]: W0513 23:51:22.332487 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.332775 kubelet[2772]: E0513 23:51:22.332496 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.385011 kubelet[2772]: E0513 23:51:22.384906 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.385011 kubelet[2772]: W0513 23:51:22.384963 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.385011 kubelet[2772]: E0513 23:51:22.385004 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.385458 kubelet[2772]: E0513 23:51:22.385264 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.385458 kubelet[2772]: W0513 23:51:22.385278 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.385458 kubelet[2772]: E0513 23:51:22.385299 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.386416 kubelet[2772]: E0513 23:51:22.385530 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.386416 kubelet[2772]: W0513 23:51:22.385543 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.386416 kubelet[2772]: E0513 23:51:22.385562 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.388519 kubelet[2772]: E0513 23:51:22.388455 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.388519 kubelet[2772]: W0513 23:51:22.388493 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.388904 kubelet[2772]: E0513 23:51:22.388734 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.389430 kubelet[2772]: E0513 23:51:22.389304 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.389430 kubelet[2772]: W0513 23:51:22.389324 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.389430 kubelet[2772]: E0513 23:51:22.389387 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.390048 kubelet[2772]: E0513 23:51:22.389806 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.390048 kubelet[2772]: W0513 23:51:22.389847 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.390048 kubelet[2772]: E0513 23:51:22.389892 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.391392 kubelet[2772]: E0513 23:51:22.391089 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.391392 kubelet[2772]: W0513 23:51:22.391157 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.391392 kubelet[2772]: E0513 23:51:22.391209 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.392012 kubelet[2772]: E0513 23:51:22.391722 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.392012 kubelet[2772]: W0513 23:51:22.391756 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.392012 kubelet[2772]: E0513 23:51:22.391800 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.392616 kubelet[2772]: E0513 23:51:22.392400 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.392616 kubelet[2772]: W0513 23:51:22.392524 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.392616 kubelet[2772]: E0513 23:51:22.392572 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.393324 kubelet[2772]: E0513 23:51:22.393170 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.393324 kubelet[2772]: W0513 23:51:22.393191 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.393324 kubelet[2772]: E0513 23:51:22.393225 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.393834 kubelet[2772]: E0513 23:51:22.393639 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.393834 kubelet[2772]: W0513 23:51:22.393656 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.393834 kubelet[2772]: E0513 23:51:22.393684 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.394338 kubelet[2772]: E0513 23:51:22.394123 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.394338 kubelet[2772]: W0513 23:51:22.394165 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.394338 kubelet[2772]: E0513 23:51:22.394215 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.394744 kubelet[2772]: E0513 23:51:22.394603 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.394744 kubelet[2772]: W0513 23:51:22.394619 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.394744 kubelet[2772]: E0513 23:51:22.394639 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.395154 kubelet[2772]: E0513 23:51:22.395051 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.395154 kubelet[2772]: W0513 23:51:22.395069 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.395154 kubelet[2772]: E0513 23:51:22.395105 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.395474 kubelet[2772]: E0513 23:51:22.395381 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.395474 kubelet[2772]: W0513 23:51:22.395395 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.395474 kubelet[2772]: E0513 23:51:22.395443 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.395779 kubelet[2772]: E0513 23:51:22.395694 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.395779 kubelet[2772]: W0513 23:51:22.395706 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.395779 kubelet[2772]: E0513 23:51:22.395729 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.396149 kubelet[2772]: E0513 23:51:22.396076 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.396149 kubelet[2772]: W0513 23:51:22.396095 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.396149 kubelet[2772]: E0513 23:51:22.396110 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 13 23:51:22.396489 kubelet[2772]: E0513 23:51:22.396466 2772 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 13 23:51:22.396524 kubelet[2772]: W0513 23:51:22.396489 2772 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 13 23:51:22.396524 kubelet[2772]: E0513 23:51:22.396504 2772 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 13 23:51:22.494291 containerd[1507]: time="2025-05-13T23:51:22.494219735Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:22.496217 containerd[1507]: time="2025-05-13T23:51:22.496128489Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5122903" May 13 23:51:22.497717 containerd[1507]: time="2025-05-13T23:51:22.497661589Z" level=info msg="ImageCreate event name:\"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:22.500190 containerd[1507]: time="2025-05-13T23:51:22.500112405Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:22.501076 containerd[1507]: time="2025-05-13T23:51:22.500873435Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6492045\" in 1.388541121s" May 13 23:51:22.501076 containerd[1507]: time="2025-05-13T23:51:22.500915877Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:dd8e710a588cc6f5834c4d84f7e12458efae593d3dfe527ca9e757c89239ecb8\"" May 13 23:51:22.505158 containerd[1507]: time="2025-05-13T23:51:22.505109921Z" level=info msg="CreateContainer within sandbox \"ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 13 23:51:22.516012 containerd[1507]: time="2025-05-13T23:51:22.515933144Z" level=info msg="Container 0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:22.527313 containerd[1507]: time="2025-05-13T23:51:22.527023298Z" level=info msg="CreateContainer within sandbox \"ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048\"" May 13 23:51:22.531051 containerd[1507]: time="2025-05-13T23:51:22.529468713Z" level=info msg="StartContainer for \"0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048\"" May 13 23:51:22.532050 containerd[1507]: time="2025-05-13T23:51:22.532002252Z" level=info msg="connecting to shim 0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048" address="unix:///run/containerd/s/ed1e82d4ed91061ec0224aacf214cc30a12d59607756da8563797992d2e4baec" protocol=ttrpc version=3 May 13 23:51:22.563262 systemd[1]: Started cri-containerd-0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048.scope - libcontainer container 0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048. May 13 23:51:22.617800 containerd[1507]: time="2025-05-13T23:51:22.617650402Z" level=info msg="StartContainer for \"0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048\" returns successfully" May 13 23:51:22.632735 systemd[1]: cri-containerd-0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048.scope: Deactivated successfully. 
May 13 23:51:22.637217 containerd[1507]: time="2025-05-13T23:51:22.637026199Z" level=info msg="received exit event container_id:\"0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048\" id:\"0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048\" pid:3472 exited_at:{seconds:1747180282 nanos:636434696}" May 13 23:51:22.637217 containerd[1507]: time="2025-05-13T23:51:22.637122363Z" level=info msg="TaskExit event in podsandbox handler container_id:\"0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048\" id:\"0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048\" pid:3472 exited_at:{seconds:1747180282 nanos:636434696}" May 13 23:51:22.669369 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048-rootfs.mount: Deactivated successfully. May 13 23:51:23.131351 kubelet[2772]: E0513 23:51:23.131278 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-68jvj" podUID="25916451-c2fa-46e7-8188-33b7982635fd" May 13 23:51:23.260287 kubelet[2772]: I0513 23:51:23.259917 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:23.263868 containerd[1507]: time="2025-05-13T23:51:23.263580732Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 13 23:51:24.332242 kubelet[2772]: I0513 23:51:24.331454 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:25.131236 kubelet[2772]: E0513 23:51:25.131175 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-68jvj" podUID="25916451-c2fa-46e7-8188-33b7982635fd" May 13 23:51:26.886388 containerd[1507]: time="2025-05-13T23:51:26.886304046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:26.889397 containerd[1507]: time="2025-05-13T23:51:26.888539289Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=91256270" May 13 23:51:26.891218 containerd[1507]: time="2025-05-13T23:51:26.890536044Z" level=info msg="ImageCreate event name:\"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:26.893873 containerd[1507]: time="2025-05-13T23:51:26.893828926Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:26.895054 containerd[1507]: time="2025-05-13T23:51:26.894587915Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"92625452\" in 3.630966101s" May 13 23:51:26.895851 containerd[1507]: time="2025-05-13T23:51:26.895823400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:add6372545fb406bb017769f222d84c50549ce13e3b19f1fbaee3d8a4aaef627\"" May 13 23:51:26.901076 containerd[1507]: time="2025-05-13T23:51:26.899752787Z" level=info msg="CreateContainer within sandbox \"ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 13 23:51:26.912790 
containerd[1507]: time="2025-05-13T23:51:26.908108818Z" level=info msg="Container d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:26.923962 containerd[1507]: time="2025-05-13T23:51:26.923763880Z" level=info msg="CreateContainer within sandbox \"ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69\"" May 13 23:51:26.926461 containerd[1507]: time="2025-05-13T23:51:26.924674354Z" level=info msg="StartContainer for \"d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69\"" May 13 23:51:26.928291 containerd[1507]: time="2025-05-13T23:51:26.928234806Z" level=info msg="connecting to shim d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69" address="unix:///run/containerd/s/ed1e82d4ed91061ec0224aacf214cc30a12d59607756da8563797992d2e4baec" protocol=ttrpc version=3 May 13 23:51:26.956235 systemd[1]: Started cri-containerd-d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69.scope - libcontainer container d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69. 
May 13 23:51:27.014480 containerd[1507]: time="2025-05-13T23:51:27.014316763Z" level=info msg="StartContainer for \"d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69\" returns successfully" May 13 23:51:27.130787 kubelet[2772]: E0513 23:51:27.130449 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-68jvj" podUID="25916451-c2fa-46e7-8188-33b7982635fd" May 13 23:51:27.540047 containerd[1507]: time="2025-05-13T23:51:27.539998891Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 13 23:51:27.544471 systemd[1]: cri-containerd-d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69.scope: Deactivated successfully. May 13 23:51:27.545312 systemd[1]: cri-containerd-d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69.scope: Consumed 499ms CPU time, 170.5M memory peak, 150.3M written to disk. 
May 13 23:51:27.548313 containerd[1507]: time="2025-05-13T23:51:27.547590771Z" level=info msg="TaskExit event in podsandbox handler container_id:\"d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69\" id:\"d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69\" pid:3532 exited_at:{seconds:1747180287 nanos:546439128}" May 13 23:51:27.548313 containerd[1507]: time="2025-05-13T23:51:27.547681174Z" level=info msg="received exit event container_id:\"d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69\" id:\"d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69\" pid:3532 exited_at:{seconds:1747180287 nanos:546439128}" May 13 23:51:27.570035 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69-rootfs.mount: Deactivated successfully. May 13 23:51:27.624586 kubelet[2772]: I0513 23:51:27.624549 2772 kubelet_node_status.go:488] "Fast updating node status as it just became ready" May 13 23:51:27.674409 systemd[1]: Created slice kubepods-burstable-pod7a20d30b_a034_4ee2_8bb1_f36d725100bb.slice - libcontainer container kubepods-burstable-pod7a20d30b_a034_4ee2_8bb1_f36d725100bb.slice. May 13 23:51:27.691072 systemd[1]: Created slice kubepods-burstable-pod7e1be591_b8a4_4385_a412_aa3deff29976.slice - libcontainer container kubepods-burstable-pod7e1be591_b8a4_4385_a412_aa3deff29976.slice. May 13 23:51:27.712731 systemd[1]: Created slice kubepods-besteffort-podae83f87c_567b_4c35_a4bc_9b1afafaba01.slice - libcontainer container kubepods-besteffort-podae83f87c_567b_4c35_a4bc_9b1afafaba01.slice. May 13 23:51:27.724515 systemd[1]: Created slice kubepods-besteffort-poda9cfa8c1_0881_48db_9b0c_a8081ef8041d.slice - libcontainer container kubepods-besteffort-poda9cfa8c1_0881_48db_9b0c_a8081ef8041d.slice. 
May 13 23:51:27.734993 kubelet[2772]: I0513 23:51:27.734857 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7a20d30b-a034-4ee2-8bb1-f36d725100bb-config-volume\") pod \"coredns-6f6b679f8f-4sdg7\" (UID: \"7a20d30b-a034-4ee2-8bb1-f36d725100bb\") " pod="kube-system/coredns-6f6b679f8f-4sdg7" May 13 23:51:27.735421 systemd[1]: Created slice kubepods-besteffort-poda9e7fa11_c2d7_4fc9_bc76_9c7b1719d09c.slice - libcontainer container kubepods-besteffort-poda9e7fa11_c2d7_4fc9_bc76_9c7b1719d09c.slice. May 13 23:51:27.739170 kubelet[2772]: I0513 23:51:27.738875 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qzzq\" (UniqueName: \"kubernetes.io/projected/7a20d30b-a034-4ee2-8bb1-f36d725100bb-kube-api-access-8qzzq\") pod \"coredns-6f6b679f8f-4sdg7\" (UID: \"7a20d30b-a034-4ee2-8bb1-f36d725100bb\") " pod="kube-system/coredns-6f6b679f8f-4sdg7" May 13 23:51:27.739170 kubelet[2772]: I0513 23:51:27.738922 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e1be591-b8a4-4385-a412-aa3deff29976-config-volume\") pod \"coredns-6f6b679f8f-qjs7p\" (UID: \"7e1be591-b8a4-4385-a412-aa3deff29976\") " pod="kube-system/coredns-6f6b679f8f-qjs7p" May 13 23:51:27.739170 kubelet[2772]: I0513 23:51:27.738941 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlw5h\" (UniqueName: \"kubernetes.io/projected/7e1be591-b8a4-4385-a412-aa3deff29976-kube-api-access-dlw5h\") pod \"coredns-6f6b679f8f-qjs7p\" (UID: \"7e1be591-b8a4-4385-a412-aa3deff29976\") " pod="kube-system/coredns-6f6b679f8f-qjs7p" May 13 23:51:27.739170 kubelet[2772]: I0513 23:51:27.738960 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/ae83f87c-567b-4c35-a4bc-9b1afafaba01-calico-apiserver-certs\") pod \"calico-apiserver-7fd5cd4657-xd97q\" (UID: \"ae83f87c-567b-4c35-a4bc-9b1afafaba01\") " pod="calico-apiserver/calico-apiserver-7fd5cd4657-xd97q" May 13 23:51:27.739170 kubelet[2772]: I0513 23:51:27.738999 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkft9\" (UniqueName: \"kubernetes.io/projected/ae83f87c-567b-4c35-a4bc-9b1afafaba01-kube-api-access-bkft9\") pod \"calico-apiserver-7fd5cd4657-xd97q\" (UID: \"ae83f87c-567b-4c35-a4bc-9b1afafaba01\") " pod="calico-apiserver/calico-apiserver-7fd5cd4657-xd97q" May 13 23:51:27.842863 kubelet[2772]: I0513 23:51:27.840046 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s6qx\" (UniqueName: \"kubernetes.io/projected/a9cfa8c1-0881-48db-9b0c-a8081ef8041d-kube-api-access-5s6qx\") pod \"calico-apiserver-7fd5cd4657-7ls8m\" (UID: \"a9cfa8c1-0881-48db-9b0c-a8081ef8041d\") " pod="calico-apiserver/calico-apiserver-7fd5cd4657-7ls8m" May 13 23:51:27.842863 kubelet[2772]: I0513 23:51:27.840221 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/a9cfa8c1-0881-48db-9b0c-a8081ef8041d-calico-apiserver-certs\") pod \"calico-apiserver-7fd5cd4657-7ls8m\" (UID: \"a9cfa8c1-0881-48db-9b0c-a8081ef8041d\") " pod="calico-apiserver/calico-apiserver-7fd5cd4657-7ls8m" May 13 23:51:27.842863 kubelet[2772]: I0513 23:51:27.840332 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxtxs\" (UniqueName: \"kubernetes.io/projected/a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c-kube-api-access-fxtxs\") pod \"calico-kube-controllers-67895f656d-fr7gd\" (UID: \"a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c\") " 
pod="calico-system/calico-kube-controllers-67895f656d-fr7gd" May 13 23:51:27.842863 kubelet[2772]: I0513 23:51:27.840425 2772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c-tigera-ca-bundle\") pod \"calico-kube-controllers-67895f656d-fr7gd\" (UID: \"a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c\") " pod="calico-system/calico-kube-controllers-67895f656d-fr7gd" May 13 23:51:27.985619 containerd[1507]: time="2025-05-13T23:51:27.985553673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4sdg7,Uid:7a20d30b-a034-4ee2-8bb1-f36d725100bb,Namespace:kube-system,Attempt:0,}" May 13 23:51:28.008320 containerd[1507]: time="2025-05-13T23:51:28.008012256Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-qjs7p,Uid:7e1be591-b8a4-4385-a412-aa3deff29976,Namespace:kube-system,Attempt:0,}" May 13 23:51:28.018837 containerd[1507]: time="2025-05-13T23:51:28.018797848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fd5cd4657-xd97q,Uid:ae83f87c-567b-4c35-a4bc-9b1afafaba01,Namespace:calico-apiserver,Attempt:0,}" May 13 23:51:28.034676 containerd[1507]: time="2025-05-13T23:51:28.034514179Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fd5cd4657-7ls8m,Uid:a9cfa8c1-0881-48db-9b0c-a8081ef8041d,Namespace:calico-apiserver,Attempt:0,}" May 13 23:51:28.049580 containerd[1507]: time="2025-05-13T23:51:28.049287116Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-67895f656d-fr7gd,Uid:a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c,Namespace:calico-system,Attempt:0,}" May 13 23:51:28.166199 containerd[1507]: time="2025-05-13T23:51:28.165533381Z" level=error msg="Failed to destroy network for sandbox \"bf546ade6225da2c53cab13bbfe628c96a7954b83d9a20094c57b6394df5bc60\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.168853 containerd[1507]: time="2025-05-13T23:51:28.168799820Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4sdg7,Uid:7a20d30b-a034-4ee2-8bb1-f36d725100bb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf546ade6225da2c53cab13bbfe628c96a7954b83d9a20094c57b6394df5bc60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.169412 kubelet[2772]: E0513 23:51:28.169358 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf546ade6225da2c53cab13bbfe628c96a7954b83d9a20094c57b6394df5bc60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.169720 kubelet[2772]: E0513 23:51:28.169439 2772 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf546ade6225da2c53cab13bbfe628c96a7954b83d9a20094c57b6394df5bc60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4sdg7" May 13 23:51:28.169720 kubelet[2772]: E0513 23:51:28.169460 2772 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bf546ade6225da2c53cab13bbfe628c96a7954b83d9a20094c57b6394df5bc60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-4sdg7" May 13 23:51:28.169720 kubelet[2772]: E0513 23:51:28.169500 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-4sdg7_kube-system(7a20d30b-a034-4ee2-8bb1-f36d725100bb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-6f6b679f8f-4sdg7_kube-system(7a20d30b-a034-4ee2-8bb1-f36d725100bb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bf546ade6225da2c53cab13bbfe628c96a7954b83d9a20094c57b6394df5bc60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-4sdg7" podUID="7a20d30b-a034-4ee2-8bb1-f36d725100bb" May 13 23:51:28.177054 containerd[1507]: time="2025-05-13T23:51:28.177008558Z" level=error msg="Failed to destroy network for sandbox \"6a6bde334a52c5114f9a6d7fc219a398d7a54be1e6bb2dfc5b84b0e7bb76f683\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.177641 containerd[1507]: time="2025-05-13T23:51:28.177609340Z" level=error msg="Failed to destroy network for sandbox \"619b30db26d18471be08f6f58cb5e93eb9cc7080112d9660ae3fcaee3d97b4ae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.178527 containerd[1507]: time="2025-05-13T23:51:28.178485812Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-qjs7p,Uid:7e1be591-b8a4-4385-a412-aa3deff29976,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = 
Unknown desc = failed to setup network for sandbox \"6a6bde334a52c5114f9a6d7fc219a398d7a54be1e6bb2dfc5b84b0e7bb76f683\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.178727 kubelet[2772]: E0513 23:51:28.178690 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a6bde334a52c5114f9a6d7fc219a398d7a54be1e6bb2dfc5b84b0e7bb76f683\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.178789 kubelet[2772]: E0513 23:51:28.178750 2772 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a6bde334a52c5114f9a6d7fc219a398d7a54be1e6bb2dfc5b84b0e7bb76f683\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-qjs7p" May 13 23:51:28.178789 kubelet[2772]: E0513 23:51:28.178769 2772 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a6bde334a52c5114f9a6d7fc219a398d7a54be1e6bb2dfc5b84b0e7bb76f683\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-6f6b679f8f-qjs7p" May 13 23:51:28.178837 kubelet[2772]: E0513 23:51:28.178803 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-6f6b679f8f-qjs7p_kube-system(7e1be591-b8a4-4385-a412-aa3deff29976)\" with CreatePodSandboxError: \"Failed to 
create sandbox for pod \\\"coredns-6f6b679f8f-qjs7p_kube-system(7e1be591-b8a4-4385-a412-aa3deff29976)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a6bde334a52c5114f9a6d7fc219a398d7a54be1e6bb2dfc5b84b0e7bb76f683\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-6f6b679f8f-qjs7p" podUID="7e1be591-b8a4-4385-a412-aa3deff29976" May 13 23:51:28.180610 containerd[1507]: time="2025-05-13T23:51:28.180568047Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fd5cd4657-xd97q,Uid:ae83f87c-567b-4c35-a4bc-9b1afafaba01,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"619b30db26d18471be08f6f58cb5e93eb9cc7080112d9660ae3fcaee3d97b4ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.181481 kubelet[2772]: E0513 23:51:28.180943 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"619b30db26d18471be08f6f58cb5e93eb9cc7080112d9660ae3fcaee3d97b4ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.181767 kubelet[2772]: E0513 23:51:28.181735 2772 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"619b30db26d18471be08f6f58cb5e93eb9cc7080112d9660ae3fcaee3d97b4ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fd5cd4657-xd97q" May 13 23:51:28.181842 kubelet[2772]: E0513 23:51:28.181768 2772 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"619b30db26d18471be08f6f58cb5e93eb9cc7080112d9660ae3fcaee3d97b4ae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fd5cd4657-xd97q" May 13 23:51:28.181842 kubelet[2772]: E0513 23:51:28.181816 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7fd5cd4657-xd97q_calico-apiserver(ae83f87c-567b-4c35-a4bc-9b1afafaba01)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7fd5cd4657-xd97q_calico-apiserver(ae83f87c-567b-4c35-a4bc-9b1afafaba01)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"619b30db26d18471be08f6f58cb5e93eb9cc7080112d9660ae3fcaee3d97b4ae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7fd5cd4657-xd97q" podUID="ae83f87c-567b-4c35-a4bc-9b1afafaba01" May 13 23:51:28.192798 containerd[1507]: time="2025-05-13T23:51:28.192752050Z" level=error msg="Failed to destroy network for sandbox \"2774397c7766c9dc52be0c0c184132614ce37fd6ea64b190b4fab902be9082bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.195430 containerd[1507]: time="2025-05-13T23:51:28.195375785Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-67895f656d-fr7gd,Uid:a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2774397c7766c9dc52be0c0c184132614ce37fd6ea64b190b4fab902be9082bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.195656 kubelet[2772]: E0513 23:51:28.195611 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2774397c7766c9dc52be0c0c184132614ce37fd6ea64b190b4fab902be9082bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.195715 kubelet[2772]: E0513 23:51:28.195679 2772 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2774397c7766c9dc52be0c0c184132614ce37fd6ea64b190b4fab902be9082bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-67895f656d-fr7gd" May 13 23:51:28.195715 kubelet[2772]: E0513 23:51:28.195700 2772 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2774397c7766c9dc52be0c0c184132614ce37fd6ea64b190b4fab902be9082bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-67895f656d-fr7gd" May 13 23:51:28.195770 kubelet[2772]: E0513 
23:51:28.195737 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-67895f656d-fr7gd_calico-system(a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-67895f656d-fr7gd_calico-system(a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2774397c7766c9dc52be0c0c184132614ce37fd6ea64b190b4fab902be9082bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-67895f656d-fr7gd" podUID="a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c" May 13 23:51:28.197670 containerd[1507]: time="2025-05-13T23:51:28.197618547Z" level=error msg="Failed to destroy network for sandbox \"78c6f40145d55b525207ac6e036f4ff2893ae348fded4c91b247b8211a229f2e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.199418 containerd[1507]: time="2025-05-13T23:51:28.199368731Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fd5cd4657-7ls8m,Uid:a9cfa8c1-0881-48db-9b0c-a8081ef8041d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"78c6f40145d55b525207ac6e036f4ff2893ae348fded4c91b247b8211a229f2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.199680 kubelet[2772]: E0513 23:51:28.199611 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"78c6f40145d55b525207ac6e036f4ff2893ae348fded4c91b247b8211a229f2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:28.199758 kubelet[2772]: E0513 23:51:28.199670 2772 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78c6f40145d55b525207ac6e036f4ff2893ae348fded4c91b247b8211a229f2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fd5cd4657-7ls8m" May 13 23:51:28.199758 kubelet[2772]: E0513 23:51:28.199700 2772 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"78c6f40145d55b525207ac6e036f4ff2893ae348fded4c91b247b8211a229f2e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7fd5cd4657-7ls8m" May 13 23:51:28.199758 kubelet[2772]: E0513 23:51:28.199742 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7fd5cd4657-7ls8m_calico-apiserver(a9cfa8c1-0881-48db-9b0c-a8081ef8041d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7fd5cd4657-7ls8m_calico-apiserver(a9cfa8c1-0881-48db-9b0c-a8081ef8041d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"78c6f40145d55b525207ac6e036f4ff2893ae348fded4c91b247b8211a229f2e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-7fd5cd4657-7ls8m" podUID="a9cfa8c1-0881-48db-9b0c-a8081ef8041d" May 13 23:51:28.295660 containerd[1507]: time="2025-05-13T23:51:28.295070649Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\"" May 13 23:51:28.913725 systemd[1]: run-netns-cni\x2dbebc122d\x2dcf94\x2d97fe\x2d9209\x2d521930316df8.mount: Deactivated successfully. May 13 23:51:29.139403 systemd[1]: Created slice kubepods-besteffort-pod25916451_c2fa_46e7_8188_33b7982635fd.slice - libcontainer container kubepods-besteffort-pod25916451_c2fa_46e7_8188_33b7982635fd.slice. May 13 23:51:29.146826 containerd[1507]: time="2025-05-13T23:51:29.146521133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-68jvj,Uid:25916451-c2fa-46e7-8188-33b7982635fd,Namespace:calico-system,Attempt:0,}" May 13 23:51:29.218196 containerd[1507]: time="2025-05-13T23:51:29.217839096Z" level=error msg="Failed to destroy network for sandbox \"0975a52a02441e717482a1439b4116ccb5247154d0c117d08333d7350c1746c8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:29.223682 systemd[1]: run-netns-cni\x2dd2faf905\x2d124a\x2d4021\x2df5d4\x2d7088162f1e19.mount: Deactivated successfully. 
May 13 23:51:29.226480 containerd[1507]: time="2025-05-13T23:51:29.225775941Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-68jvj,Uid:25916451-c2fa-46e7-8188-33b7982635fd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0975a52a02441e717482a1439b4116ccb5247154d0c117d08333d7350c1746c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:29.226652 kubelet[2772]: E0513 23:51:29.226121 2772 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0975a52a02441e717482a1439b4116ccb5247154d0c117d08333d7350c1746c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 13 23:51:29.226652 kubelet[2772]: E0513 23:51:29.226185 2772 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0975a52a02441e717482a1439b4116ccb5247154d0c117d08333d7350c1746c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-68jvj" May 13 23:51:29.226652 kubelet[2772]: E0513 23:51:29.226215 2772 kuberuntime_manager.go:1168] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0975a52a02441e717482a1439b4116ccb5247154d0c117d08333d7350c1746c8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-68jvj" 
May 13 23:51:29.228699 kubelet[2772]: E0513 23:51:29.226255 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-68jvj_calico-system(25916451-c2fa-46e7-8188-33b7982635fd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-68jvj_calico-system(25916451-c2fa-46e7-8188-33b7982635fd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0975a52a02441e717482a1439b4116ccb5247154d0c117d08333d7350c1746c8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-68jvj" podUID="25916451-c2fa-46e7-8188-33b7982635fd" May 13 23:51:33.873271 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1947717713.mount: Deactivated successfully. May 13 23:51:33.909682 containerd[1507]: time="2025-05-13T23:51:33.909620511Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:33.911163 containerd[1507]: time="2025-05-13T23:51:33.910669787Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=138981893" May 13 23:51:33.911952 containerd[1507]: time="2025-05-13T23:51:33.911871629Z" level=info msg="ImageCreate event name:\"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:33.915077 containerd[1507]: time="2025-05-13T23:51:33.914967775Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:33.916023 containerd[1507]: time="2025-05-13T23:51:33.915549515Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"138981755\" in 5.620432465s" May 13 23:51:33.916023 containerd[1507]: time="2025-05-13T23:51:33.915592237Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:cdcce3ec4624a24c28cdc07b0ee29ddf6703628edee7452a3f8a8b4816bfd057\"" May 13 23:51:33.931218 containerd[1507]: time="2025-05-13T23:51:33.931174813Z" level=info msg="CreateContainer within sandbox \"ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 13 23:51:33.943967 containerd[1507]: time="2025-05-13T23:51:33.942717010Z" level=info msg="Container 31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:33.956023 containerd[1507]: time="2025-05-13T23:51:33.955916224Z" level=info msg="CreateContainer within sandbox \"ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\"" May 13 23:51:33.959200 containerd[1507]: time="2025-05-13T23:51:33.958122860Z" level=info msg="StartContainer for \"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\"" May 13 23:51:33.961938 containerd[1507]: time="2025-05-13T23:51:33.961898310Z" level=info msg="connecting to shim 31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc" address="unix:///run/containerd/s/ed1e82d4ed91061ec0224aacf214cc30a12d59607756da8563797992d2e4baec" protocol=ttrpc version=3 May 13 23:51:33.986195 systemd[1]: Started 
cri-containerd-31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc.scope - libcontainer container 31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc. May 13 23:51:34.035255 containerd[1507]: time="2025-05-13T23:51:34.035127618Z" level=info msg="StartContainer for \"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" returns successfully" May 13 23:51:34.161888 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 13 23:51:34.162575 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 13 23:51:34.365271 kubelet[2772]: I0513 23:51:34.363946 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-bc4rn" podStartSLOduration=2.604702562 podStartE2EDuration="17.363910415s" podCreationTimestamp="2025-05-13 23:51:17 +0000 UTC" firstStartedPulling="2025-05-13 23:51:19.157498782 +0000 UTC m=+15.171331307" lastFinishedPulling="2025-05-13 23:51:33.916706675 +0000 UTC m=+29.930539160" observedRunningTime="2025-05-13 23:51:34.362037552 +0000 UTC m=+30.375870077" watchObservedRunningTime="2025-05-13 23:51:34.363910415 +0000 UTC m=+30.377742940" May 13 23:51:34.450555 containerd[1507]: time="2025-05-13T23:51:34.450515765Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"29e42c12e8b6ec8200b9cd4e1296997c70ca899350c57e1a26478f1ad5208eb8\" pid:3813 exit_status:1 exited_at:{seconds:1747180294 nanos:450087390}" May 13 23:51:35.390955 containerd[1507]: time="2025-05-13T23:51:35.390851420Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"e2f06a06167e5d8808b7949be93ad07b5e38abb37bcbbbf8afbbf549ea1ec8de\" pid:3847 exit_status:1 exited_at:{seconds:1747180295 nanos:390208278}" May 13 23:51:36.111015 kernel: bpftool[3981]: memfd_create() called without 
MFD_EXEC or MFD_NOEXEC_SEAL set May 13 23:51:36.308107 systemd-networkd[1400]: vxlan.calico: Link UP May 13 23:51:36.308116 systemd-networkd[1400]: vxlan.calico: Gained carrier May 13 23:51:37.819058 systemd-networkd[1400]: vxlan.calico: Gained IPv6LL May 13 23:51:39.131875 containerd[1507]: time="2025-05-13T23:51:39.131473845Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fd5cd4657-xd97q,Uid:ae83f87c-567b-4c35-a4bc-9b1afafaba01,Namespace:calico-apiserver,Attempt:0,}" May 13 23:51:39.132957 containerd[1507]: time="2025-05-13T23:51:39.132694005Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-qjs7p,Uid:7e1be591-b8a4-4385-a412-aa3deff29976,Namespace:kube-system,Attempt:0,}" May 13 23:51:39.415397 systemd-networkd[1400]: cali52eb1c97d82: Link UP May 13 23:51:39.416651 systemd-networkd[1400]: cali52eb1c97d82: Gained carrier May 13 23:51:39.450497 containerd[1507]: 2025-05-13 23:51:39.221 [INFO][4054] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0 calico-apiserver-7fd5cd4657- calico-apiserver ae83f87c-567b-4c35-a4bc-9b1afafaba01 715 0 2025-05-13 23:51:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7fd5cd4657 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-40578dffbd calico-apiserver-7fd5cd4657-xd97q eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali52eb1c97d82 [] []}} ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-xd97q" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-" May 13 23:51:39.450497 containerd[1507]: 
2025-05-13 23:51:39.221 [INFO][4054] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-xd97q" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" May 13 23:51:39.450497 containerd[1507]: 2025-05-13 23:51:39.285 [INFO][4085] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" HandleID="k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.312 [INFO][4085] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" HandleID="k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001fac80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-40578dffbd", "pod":"calico-apiserver-7fd5cd4657-xd97q", "timestamp":"2025-05-13 23:51:39.285589528 +0000 UTC"}, Hostname:"ci-4284-0-0-n-40578dffbd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.312 [INFO][4085] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.312 [INFO][4085] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.312 [INFO][4085] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-40578dffbd' May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.318 [INFO][4085] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.327 [INFO][4085] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.349 [INFO][4085] ipam/ipam.go 489: Trying affinity for 192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.353 [INFO][4085] ipam/ipam.go 155: Attempting to load block cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451452 containerd[1507]: 2025-05-13 23:51:39.361 [INFO][4085] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451719 containerd[1507]: 2025-05-13 23:51:39.362 [INFO][4085] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.48.192/26 handle="k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451719 containerd[1507]: 2025-05-13 23:51:39.368 [INFO][4085] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a May 13 23:51:39.451719 containerd[1507]: 2025-05-13 23:51:39.395 [INFO][4085] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.48.192/26 handle="k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451719 containerd[1507]: 2025-05-13 23:51:39.405 [INFO][4085] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.48.193/26] block=192.168.48.192/26 handle="k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451719 containerd[1507]: 2025-05-13 23:51:39.405 [INFO][4085] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.48.193/26] handle="k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.451719 containerd[1507]: 2025-05-13 23:51:39.405 [INFO][4085] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:51:39.451719 containerd[1507]: 2025-05-13 23:51:39.405 [INFO][4085] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.193/26] IPv6=[] ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" HandleID="k8s-pod-network.afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" May 13 23:51:39.451856 containerd[1507]: 2025-05-13 23:51:39.408 [INFO][4054] cni-plugin/k8s.go 386: Populated endpoint ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-xd97q" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0", GenerateName:"calico-apiserver-7fd5cd4657-", Namespace:"calico-apiserver", SelfLink:"", UID:"ae83f87c-567b-4c35-a4bc-9b1afafaba01", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fd5cd4657", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"", Pod:"calico-apiserver-7fd5cd4657-xd97q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali52eb1c97d82", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:39.451906 containerd[1507]: 2025-05-13 23:51:39.409 [INFO][4054] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.48.193/32] ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-xd97q" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" May 13 23:51:39.451906 containerd[1507]: 2025-05-13 23:51:39.409 [INFO][4054] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali52eb1c97d82 ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-xd97q" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" May 13 23:51:39.451906 containerd[1507]: 2025-05-13 23:51:39.418 [INFO][4054] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-xd97q" 
WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" May 13 23:51:39.451967 containerd[1507]: 2025-05-13 23:51:39.419 [INFO][4054] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-xd97q" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0", GenerateName:"calico-apiserver-7fd5cd4657-", Namespace:"calico-apiserver", SelfLink:"", UID:"ae83f87c-567b-4c35-a4bc-9b1afafaba01", ResourceVersion:"715", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fd5cd4657", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a", Pod:"calico-apiserver-7fd5cd4657-xd97q", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali52eb1c97d82", MAC:"fa:7d:87:e7:f4:a3", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:39.452040 containerd[1507]: 2025-05-13 23:51:39.446 [INFO][4054] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-xd97q" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--xd97q-eth0" May 13 23:51:39.529471 containerd[1507]: time="2025-05-13T23:51:39.528767861Z" level=info msg="connecting to shim afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a" address="unix:///run/containerd/s/a8b6c1030af634cf431c42064f7e3fc58e58aa98c1635637c86f0a211691a9f6" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:39.561696 systemd-networkd[1400]: cali5e98a2c78c7: Link UP May 13 23:51:39.564674 systemd-networkd[1400]: cali5e98a2c78c7: Gained carrier May 13 23:51:39.594550 containerd[1507]: 2025-05-13 23:51:39.221 [INFO][4060] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0 coredns-6f6b679f8f- kube-system 7e1be591-b8a4-4385-a412-aa3deff29976 714 0 2025-05-13 23:51:09 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-40578dffbd coredns-6f6b679f8f-qjs7p eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5e98a2c78c7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Namespace="kube-system" Pod="coredns-6f6b679f8f-qjs7p" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-" May 13 23:51:39.594550 containerd[1507]: 2025-05-13 23:51:39.221 [INFO][4060] cni-plugin/k8s.go 77: Extracted 
identifiers for CmdAddK8s ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Namespace="kube-system" Pod="coredns-6f6b679f8f-qjs7p" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" May 13 23:51:39.594550 containerd[1507]: 2025-05-13 23:51:39.286 [INFO][4079] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" HandleID="k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Workload="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.315 [INFO][4079] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" HandleID="k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Workload="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d8e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-40578dffbd", "pod":"coredns-6f6b679f8f-qjs7p", "timestamp":"2025-05-13 23:51:39.28656732 +0000 UTC"}, Hostname:"ci-4284-0-0-n-40578dffbd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.315 [INFO][4079] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.405 [INFO][4079] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.405 [INFO][4079] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-40578dffbd' May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.423 [INFO][4079] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.437 [INFO][4079] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.453 [INFO][4079] ipam/ipam.go 489: Trying affinity for 192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.457 [INFO][4079] ipam/ipam.go 155: Attempting to load block cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.594761 containerd[1507]: 2025-05-13 23:51:39.466 [INFO][4079] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.595141 containerd[1507]: 2025-05-13 23:51:39.466 [INFO][4079] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.48.192/26 handle="k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.595141 containerd[1507]: 2025-05-13 23:51:39.471 [INFO][4079] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f May 13 23:51:39.595141 containerd[1507]: 2025-05-13 23:51:39.513 [INFO][4079] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.48.192/26 handle="k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.595141 containerd[1507]: 2025-05-13 23:51:39.541 [INFO][4079] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.48.194/26] block=192.168.48.192/26 handle="k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.595141 containerd[1507]: 2025-05-13 23:51:39.541 [INFO][4079] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.48.194/26] handle="k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:39.595141 containerd[1507]: 2025-05-13 23:51:39.541 [INFO][4079] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:51:39.595141 containerd[1507]: 2025-05-13 23:51:39.541 [INFO][4079] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.194/26] IPv6=[] ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" HandleID="k8s-pod-network.d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Workload="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" May 13 23:51:39.595513 containerd[1507]: 2025-05-13 23:51:39.557 [INFO][4060] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Namespace="kube-system" Pod="coredns-6f6b679f8f-qjs7p" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"7e1be591-b8a4-4385-a412-aa3deff29976", ResourceVersion:"714", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"", Pod:"coredns-6f6b679f8f-qjs7p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5e98a2c78c7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:39.595513 containerd[1507]: 2025-05-13 23:51:39.558 [INFO][4060] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.48.194/32] ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Namespace="kube-system" Pod="coredns-6f6b679f8f-qjs7p" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" May 13 23:51:39.595513 containerd[1507]: 2025-05-13 23:51:39.558 [INFO][4060] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5e98a2c78c7 ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Namespace="kube-system" Pod="coredns-6f6b679f8f-qjs7p" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" May 13 23:51:39.595513 containerd[1507]: 2025-05-13 23:51:39.565 [INFO][4060] cni-plugin/dataplane_linux.go 508: Disabling 
IPv4 forwarding ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Namespace="kube-system" Pod="coredns-6f6b679f8f-qjs7p" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" May 13 23:51:39.595513 containerd[1507]: 2025-05-13 23:51:39.566 [INFO][4060] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Namespace="kube-system" Pod="coredns-6f6b679f8f-qjs7p" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"7e1be591-b8a4-4385-a412-aa3deff29976", ResourceVersion:"714", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f", Pod:"coredns-6f6b679f8f-qjs7p", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5e98a2c78c7", MAC:"ca:8c:ac:e2:c4:e6", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:39.595513 containerd[1507]: 2025-05-13 23:51:39.585 [INFO][4060] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" Namespace="kube-system" Pod="coredns-6f6b679f8f-qjs7p" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--qjs7p-eth0" May 13 23:51:39.619211 systemd[1]: Started cri-containerd-afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a.scope - libcontainer container afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a. 
May 13 23:51:39.674047 containerd[1507]: time="2025-05-13T23:51:39.673860731Z" level=info msg="connecting to shim d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f" address="unix:///run/containerd/s/d9e12b959e3c160ae5f7dd687b27556cfce82857122e170348944aa60fa432cd" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:39.717392 containerd[1507]: time="2025-05-13T23:51:39.717256740Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fd5cd4657-xd97q,Uid:ae83f87c-567b-4c35-a4bc-9b1afafaba01,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a\"" May 13 23:51:39.723660 containerd[1507]: time="2025-05-13T23:51:39.723379899Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 23:51:39.726768 systemd[1]: Started cri-containerd-d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f.scope - libcontainer container d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f. 
May 13 23:51:39.782198 containerd[1507]: time="2025-05-13T23:51:39.782154166Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-qjs7p,Uid:7e1be591-b8a4-4385-a412-aa3deff29976,Namespace:kube-system,Attempt:0,} returns sandbox id \"d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f\"" May 13 23:51:39.787729 containerd[1507]: time="2025-05-13T23:51:39.787669185Z" level=info msg="CreateContainer within sandbox \"d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:51:39.806380 containerd[1507]: time="2025-05-13T23:51:39.805604728Z" level=info msg="Container 1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:39.812431 containerd[1507]: time="2025-05-13T23:51:39.812372587Z" level=info msg="CreateContainer within sandbox \"d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd\"" May 13 23:51:39.813322 containerd[1507]: time="2025-05-13T23:51:39.813105331Z" level=info msg="StartContainer for \"1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd\"" May 13 23:51:39.820758 containerd[1507]: time="2025-05-13T23:51:39.820431569Z" level=info msg="connecting to shim 1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd" address="unix:///run/containerd/s/d9e12b959e3c160ae5f7dd687b27556cfce82857122e170348944aa60fa432cd" protocol=ttrpc version=3 May 13 23:51:39.844593 systemd[1]: Started cri-containerd-1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd.scope - libcontainer container 1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd. 
May 13 23:51:39.885054 containerd[1507]: time="2025-05-13T23:51:39.885017385Z" level=info msg="StartContainer for \"1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd\" returns successfully" May 13 23:51:40.364423 kubelet[2772]: I0513 23:51:40.364326 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-qjs7p" podStartSLOduration=31.364203194 podStartE2EDuration="31.364203194s" podCreationTimestamp="2025-05-13 23:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:51:40.362840391 +0000 UTC m=+36.376672956" watchObservedRunningTime="2025-05-13 23:51:40.364203194 +0000 UTC m=+36.378035719" May 13 23:51:41.133411 containerd[1507]: time="2025-05-13T23:51:41.133334620Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4sdg7,Uid:7a20d30b-a034-4ee2-8bb1-f36d725100bb,Namespace:kube-system,Attempt:0,}" May 13 23:51:41.210369 systemd-networkd[1400]: cali52eb1c97d82: Gained IPv6LL May 13 23:51:41.333621 systemd-networkd[1400]: calic5a77404044: Link UP May 13 23:51:41.334421 systemd-networkd[1400]: calic5a77404044: Gained carrier May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.198 [INFO][4251] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0 coredns-6f6b679f8f- kube-system 7a20d30b-a034-4ee2-8bb1-f36d725100bb 713 0 2025-05-13 23:51:09 +0000 UTC map[k8s-app:kube-dns pod-template-hash:6f6b679f8f projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-40578dffbd coredns-6f6b679f8f-4sdg7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic5a77404044 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} 
ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Namespace="kube-system" Pod="coredns-6f6b679f8f-4sdg7" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.198 [INFO][4251] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Namespace="kube-system" Pod="coredns-6f6b679f8f-4sdg7" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.253 [INFO][4263] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" HandleID="k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Workload="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.272 [INFO][4263] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" HandleID="k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Workload="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400028c900), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-40578dffbd", "pod":"coredns-6f6b679f8f-4sdg7", "timestamp":"2025-05-13 23:51:41.253912345 +0000 UTC"}, Hostname:"ci-4284-0-0-n-40578dffbd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.272 [INFO][4263] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.272 [INFO][4263] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.272 [INFO][4263] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-40578dffbd' May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.277 [INFO][4263] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.286 [INFO][4263] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.298 [INFO][4263] ipam/ipam.go 489: Trying affinity for 192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.301 [INFO][4263] ipam/ipam.go 155: Attempting to load block cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.306 [INFO][4263] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.306 [INFO][4263] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.48.192/26 handle="k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.310 [INFO][4263] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5 May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.315 [INFO][4263] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.48.192/26 handle="k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" 
host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.326 [INFO][4263] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.48.195/26] block=192.168.48.192/26 handle="k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.326 [INFO][4263] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.48.195/26] handle="k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.326 [INFO][4263] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:51:41.351535 containerd[1507]: 2025-05-13 23:51:41.327 [INFO][4263] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.195/26] IPv6=[] ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" HandleID="k8s-pod-network.ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Workload="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" May 13 23:51:41.353813 containerd[1507]: 2025-05-13 23:51:41.330 [INFO][4251] cni-plugin/k8s.go 386: Populated endpoint ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Namespace="kube-system" Pod="coredns-6f6b679f8f-4sdg7" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"7a20d30b-a034-4ee2-8bb1-f36d725100bb", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"", Pod:"coredns-6f6b679f8f-4sdg7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic5a77404044", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:41.353813 containerd[1507]: 2025-05-13 23:51:41.330 [INFO][4251] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.48.195/32] ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Namespace="kube-system" Pod="coredns-6f6b679f8f-4sdg7" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" May 13 23:51:41.353813 containerd[1507]: 2025-05-13 23:51:41.330 [INFO][4251] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic5a77404044 ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Namespace="kube-system" Pod="coredns-6f6b679f8f-4sdg7" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" 
May 13 23:51:41.353813 containerd[1507]: 2025-05-13 23:51:41.333 [INFO][4251] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Namespace="kube-system" Pod="coredns-6f6b679f8f-4sdg7" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" May 13 23:51:41.353813 containerd[1507]: 2025-05-13 23:51:41.335 [INFO][4251] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Namespace="kube-system" Pod="coredns-6f6b679f8f-4sdg7" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0", GenerateName:"coredns-6f6b679f8f-", Namespace:"kube-system", SelfLink:"", UID:"7a20d30b-a034-4ee2-8bb1-f36d725100bb", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"6f6b679f8f", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5", Pod:"coredns-6f6b679f8f-4sdg7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.48.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", 
"ksa.kube-system.coredns"}, InterfaceName:"calic5a77404044", MAC:"6a:62:85:75:7e:9d", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:41.353813 containerd[1507]: 2025-05-13 23:51:41.347 [INFO][4251] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" Namespace="kube-system" Pod="coredns-6f6b679f8f-4sdg7" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-coredns--6f6b679f8f--4sdg7-eth0" May 13 23:51:41.389261 containerd[1507]: time="2025-05-13T23:51:41.389124657Z" level=info msg="connecting to shim ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5" address="unix:///run/containerd/s/f572f43d1304e91f41ff9b99bb204c68b10c79e761cb1643c967640504e9fa91" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:41.402281 systemd-networkd[1400]: cali5e98a2c78c7: Gained IPv6LL May 13 23:51:41.421267 systemd[1]: Started cri-containerd-ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5.scope - libcontainer container ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5. 
May 13 23:51:41.468861 containerd[1507]: time="2025-05-13T23:51:41.468799157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-6f6b679f8f-4sdg7,Uid:7a20d30b-a034-4ee2-8bb1-f36d725100bb,Namespace:kube-system,Attempt:0,} returns sandbox id \"ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5\"" May 13 23:51:41.474208 containerd[1507]: time="2025-05-13T23:51:41.474162848Z" level=info msg="CreateContainer within sandbox \"ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 13 23:51:41.490526 containerd[1507]: time="2025-05-13T23:51:41.490119637Z" level=info msg="Container bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:41.500321 containerd[1507]: time="2025-05-13T23:51:41.500253000Z" level=info msg="CreateContainer within sandbox \"ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f\"" May 13 23:51:41.501213 containerd[1507]: time="2025-05-13T23:51:41.501169870Z" level=info msg="StartContainer for \"bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f\"" May 13 23:51:41.509138 containerd[1507]: time="2025-05-13T23:51:41.509099642Z" level=info msg="connecting to shim bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f" address="unix:///run/containerd/s/f572f43d1304e91f41ff9b99bb204c68b10c79e761cb1643c967640504e9fa91" protocol=ttrpc version=3 May 13 23:51:41.532273 systemd[1]: Started cri-containerd-bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f.scope - libcontainer container bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f. 
May 13 23:51:41.575135 containerd[1507]: time="2025-05-13T23:51:41.574927582Z" level=info msg="StartContainer for \"bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f\" returns successfully" May 13 23:51:42.135562 containerd[1507]: time="2025-05-13T23:51:42.135217572Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-68jvj,Uid:25916451-c2fa-46e7-8188-33b7982635fd,Namespace:calico-system,Attempt:0,}" May 13 23:51:42.136758 containerd[1507]: time="2025-05-13T23:51:42.136466132Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fd5cd4657-7ls8m,Uid:a9cfa8c1-0881-48db-9b0c-a8081ef8041d,Namespace:calico-apiserver,Attempt:0,}" May 13 23:51:42.382301 systemd-networkd[1400]: cali183b7699b14: Link UP May 13 23:51:42.385697 systemd-networkd[1400]: cali183b7699b14: Gained carrier May 13 23:51:42.416374 kubelet[2772]: I0513 23:51:42.414633 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-6f6b679f8f-4sdg7" podStartSLOduration=33.414611286 podStartE2EDuration="33.414611286s" podCreationTimestamp="2025-05-13 23:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-13 23:51:42.378407141 +0000 UTC m=+38.392239706" watchObservedRunningTime="2025-05-13 23:51:42.414611286 +0000 UTC m=+38.428443811" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.206 [INFO][4366] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0 csi-node-driver- calico-system 25916451-c2fa-46e7-8188-33b7982635fd 622 0 2025-05-13 23:51:17 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5bcd8f69 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-n-40578dffbd csi-node-driver-68jvj eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali183b7699b14 [] []}} ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Namespace="calico-system" Pod="csi-node-driver-68jvj" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.207 [INFO][4366] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Namespace="calico-system" Pod="csi-node-driver-68jvj" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.265 [INFO][4392] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" HandleID="k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Workload="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.297 [INFO][4392] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" HandleID="k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Workload="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004de20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-40578dffbd", "pod":"csi-node-driver-68jvj", "timestamp":"2025-05-13 23:51:42.265902424 +0000 UTC"}, Hostname:"ci-4284-0-0-n-40578dffbd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), 
HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.297 [INFO][4392] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.297 [INFO][4392] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.297 [INFO][4392] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-40578dffbd' May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.306 [INFO][4392] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.319 [INFO][4392] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.330 [INFO][4392] ipam/ipam.go 489: Trying affinity for 192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.335 [INFO][4392] ipam/ipam.go 155: Attempting to load block cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.340 [INFO][4392] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.341 [INFO][4392] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.48.192/26 handle="k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.345 [INFO][4392] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565 May 13 23:51:42.418699 
containerd[1507]: 2025-05-13 23:51:42.352 [INFO][4392] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.48.192/26 handle="k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.366 [INFO][4392] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.48.196/26] block=192.168.48.192/26 handle="k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.366 [INFO][4392] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.48.196/26] handle="k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.366 [INFO][4392] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:51:42.418699 containerd[1507]: 2025-05-13 23:51:42.366 [INFO][4392] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.196/26] IPv6=[] ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" HandleID="k8s-pod-network.780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Workload="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" May 13 23:51:42.420966 containerd[1507]: 2025-05-13 23:51:42.370 [INFO][4366] cni-plugin/k8s.go 386: Populated endpoint ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Namespace="calico-system" Pod="csi-node-driver-68jvj" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", 
UID:"25916451-c2fa-46e7-8188-33b7982635fd", ResourceVersion:"622", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"", Pod:"csi-node-driver-68jvj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.48.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali183b7699b14", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:42.420966 containerd[1507]: 2025-05-13 23:51:42.371 [INFO][4366] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.48.196/32] ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Namespace="calico-system" Pod="csi-node-driver-68jvj" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" May 13 23:51:42.420966 containerd[1507]: 2025-05-13 23:51:42.371 [INFO][4366] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali183b7699b14 ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Namespace="calico-system" Pod="csi-node-driver-68jvj" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" May 13 23:51:42.420966 containerd[1507]: 2025-05-13 23:51:42.386 
[INFO][4366] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Namespace="calico-system" Pod="csi-node-driver-68jvj" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" May 13 23:51:42.420966 containerd[1507]: 2025-05-13 23:51:42.388 [INFO][4366] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Namespace="calico-system" Pod="csi-node-driver-68jvj" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"25916451-c2fa-46e7-8188-33b7982635fd", ResourceVersion:"622", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5bcd8f69", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565", Pod:"csi-node-driver-68jvj", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.48.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali183b7699b14", MAC:"72:67:e6:3a:e1:44", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:42.420966 containerd[1507]: 2025-05-13 23:51:42.415 [INFO][4366] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" Namespace="calico-system" Pod="csi-node-driver-68jvj" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0" May 13 23:51:42.497761 containerd[1507]: time="2025-05-13T23:51:42.497122534Z" level=info msg="connecting to shim 780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565" address="unix:///run/containerd/s/17d19db7b61bf1cfefdfddddc78ad2cfe07b6a86a21766d6b1cb8e3910db7928" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:42.516623 systemd-networkd[1400]: calif91f486c577: Link UP May 13 23:51:42.517442 systemd-networkd[1400]: calif91f486c577: Gained carrier May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.216 [INFO][4373] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0 calico-apiserver-7fd5cd4657- calico-apiserver a9cfa8c1-0881-48db-9b0c-a8081ef8041d 716 0 2025-05-13 23:51:16 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7fd5cd4657 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-40578dffbd calico-apiserver-7fd5cd4657-7ls8m eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif91f486c577 [] []}} ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Namespace="calico-apiserver" 
Pod="calico-apiserver-7fd5cd4657-7ls8m" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.216 [INFO][4373] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-7ls8m" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.304 [INFO][4397] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" HandleID="k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.320 [INFO][4397] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" HandleID="k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400030b5e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-40578dffbd", "pod":"calico-apiserver-7fd5cd4657-7ls8m", "timestamp":"2025-05-13 23:51:42.304064751 +0000 UTC"}, Hostname:"ci-4284-0-0-n-40578dffbd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.320 [INFO][4397] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.366 [INFO][4397] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.367 [INFO][4397] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-40578dffbd' May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.412 [INFO][4397] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.433 [INFO][4397] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.449 [INFO][4397] ipam/ipam.go 489: Trying affinity for 192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.454 [INFO][4397] ipam/ipam.go 155: Attempting to load block cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.462 [INFO][4397] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.462 [INFO][4397] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.48.192/26 handle="k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.466 [INFO][4397] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.483 [INFO][4397] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.48.192/26 handle="k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" 
host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.497 [INFO][4397] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.48.197/26] block=192.168.48.192/26 handle="k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.498 [INFO][4397] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.48.197/26] handle="k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.499 [INFO][4397] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:51:42.548657 containerd[1507]: 2025-05-13 23:51:42.501 [INFO][4397] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.197/26] IPv6=[] ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" HandleID="k8s-pod-network.e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" May 13 23:51:42.549643 containerd[1507]: 2025-05-13 23:51:42.512 [INFO][4373] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-7ls8m" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0", GenerateName:"calico-apiserver-7fd5cd4657-", Namespace:"calico-apiserver", SelfLink:"", UID:"a9cfa8c1-0881-48db-9b0c-a8081ef8041d", ResourceVersion:"716", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 16, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fd5cd4657", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"", Pod:"calico-apiserver-7fd5cd4657-7ls8m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif91f486c577", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:42.549643 containerd[1507]: 2025-05-13 23:51:42.512 [INFO][4373] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.48.197/32] ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-7ls8m" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" May 13 23:51:42.549643 containerd[1507]: 2025-05-13 23:51:42.512 [INFO][4373] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif91f486c577 ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-7ls8m" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" May 13 23:51:42.549643 containerd[1507]: 2025-05-13 23:51:42.518 [INFO][4373] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-7ls8m" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" May 13 23:51:42.549643 containerd[1507]: 2025-05-13 23:51:42.521 [INFO][4373] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-7ls8m" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0", GenerateName:"calico-apiserver-7fd5cd4657-", Namespace:"calico-apiserver", SelfLink:"", UID:"a9cfa8c1-0881-48db-9b0c-a8081ef8041d", ResourceVersion:"716", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7fd5cd4657", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d", Pod:"calico-apiserver-7fd5cd4657-7ls8m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.48.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", 
Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif91f486c577", MAC:"86:ef:16:cf:85:b8", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:42.549643 containerd[1507]: 2025-05-13 23:51:42.543 [INFO][4373] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" Namespace="calico-apiserver" Pod="calico-apiserver-7fd5cd4657-7ls8m" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--apiserver--7fd5cd4657--7ls8m-eth0" May 13 23:51:42.563465 systemd[1]: Started cri-containerd-780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565.scope - libcontainer container 780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565. May 13 23:51:42.605902 containerd[1507]: time="2025-05-13T23:51:42.605661606Z" level=info msg="connecting to shim e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d" address="unix:///run/containerd/s/98a2d573fd8d90cb1e0e60ba550b1107f2f6868dda2b9049b2f2f72bd367ff7f" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:42.621550 containerd[1507]: time="2025-05-13T23:51:42.620446513Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-68jvj,Uid:25916451-c2fa-46e7-8188-33b7982635fd,Namespace:calico-system,Attempt:0,} returns sandbox id \"780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565\"" May 13 23:51:42.643267 systemd[1]: Started cri-containerd-e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d.scope - libcontainer container e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d. 
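The WorkloadEndpoint names in these entries (e.g. `ci--4284--0--0--n--40578dffbd-k8s-csi--node--driver--68jvj-eth0`) follow a flattening convention: each component has its dashes doubled, then node, orchestrator, pod, and interface are joined with single dashes. A small sketch of that convention as it appears in these logs (the authoritative encoding lives in libcalico-go; the function name here is ours):

```python
def workload_endpoint_name(node: str, pod: str, iface: str,
                           orchestrator: str = "k8s") -> str:
    """Reconstruct the flattened WorkloadEndpoint name: '-' inside each
    component is escaped to '--' so single '-' can act as the separator."""
    esc = lambda s: s.replace("-", "--")
    return "-".join([esc(node), orchestrator, esc(pod), iface])
```

The doubled dashes make the name unambiguous to split again: any lone `-` is a field separator, any `--` is a literal dash from the original node or pod name.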
May 13 23:51:42.734712 containerd[1507]: time="2025-05-13T23:51:42.734655644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7fd5cd4657-7ls8m,Uid:a9cfa8c1-0881-48db-9b0c-a8081ef8041d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d\"" May 13 23:51:43.066281 systemd-networkd[1400]: calic5a77404044: Gained IPv6LL May 13 23:51:43.135497 containerd[1507]: time="2025-05-13T23:51:43.135185792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-67895f656d-fr7gd,Uid:a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c,Namespace:calico-system,Attempt:0,}" May 13 23:51:43.313616 systemd-networkd[1400]: calid33da281dda: Link UP May 13 23:51:43.315486 systemd-networkd[1400]: calid33da281dda: Gained carrier May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.184 [INFO][4533] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0 calico-kube-controllers-67895f656d- calico-system a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c 717 0 2025-05-13 23:51:17 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:67895f656d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-40578dffbd calico-kube-controllers-67895f656d-fr7gd eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calid33da281dda [] []}} ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Namespace="calico-system" Pod="calico-kube-controllers-67895f656d-fr7gd" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.185 
[INFO][4533] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Namespace="calico-system" Pod="calico-kube-controllers-67895f656d-fr7gd" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.227 [INFO][4545] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" HandleID="k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.245 [INFO][4545] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" HandleID="k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002ea330), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-40578dffbd", "pod":"calico-kube-controllers-67895f656d-fr7gd", "timestamp":"2025-05-13 23:51:43.227738134 +0000 UTC"}, Hostname:"ci-4284-0-0-n-40578dffbd", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.245 [INFO][4545] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.245 [INFO][4545] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.245 [INFO][4545] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-40578dffbd' May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.251 [INFO][4545] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.260 [INFO][4545] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.271 [INFO][4545] ipam/ipam.go 489: Trying affinity for 192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.276 [INFO][4545] ipam/ipam.go 155: Attempting to load block cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.281 [INFO][4545] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.48.192/26 host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.281 [INFO][4545] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.48.192/26 handle="k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.285 [INFO][4545] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44 May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.292 [INFO][4545] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.48.192/26 handle="k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.306 [INFO][4545] ipam/ipam.go 1216: 
Successfully claimed IPs: [192.168.48.198/26] block=192.168.48.192/26 handle="k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.306 [INFO][4545] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.48.198/26] handle="k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" host="ci-4284-0-0-n-40578dffbd" May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.306 [INFO][4545] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 13 23:51:43.341788 containerd[1507]: 2025-05-13 23:51:43.306 [INFO][4545] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.48.198/26] IPv6=[] ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" HandleID="k8s-pod-network.611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Workload="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" May 13 23:51:43.342526 containerd[1507]: 2025-05-13 23:51:43.309 [INFO][4533] cni-plugin/k8s.go 386: Populated endpoint ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Namespace="calico-system" Pod="calico-kube-controllers-67895f656d-fr7gd" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0", GenerateName:"calico-kube-controllers-67895f656d-", Namespace:"calico-system", SelfLink:"", UID:"a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c", ResourceVersion:"717", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"67895f656d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"", Pod:"calico-kube-controllers-67895f656d-fr7gd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.48.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid33da281dda", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:43.342526 containerd[1507]: 2025-05-13 23:51:43.309 [INFO][4533] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.48.198/32] ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Namespace="calico-system" Pod="calico-kube-controllers-67895f656d-fr7gd" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" May 13 23:51:43.342526 containerd[1507]: 2025-05-13 23:51:43.309 [INFO][4533] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid33da281dda ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Namespace="calico-system" Pod="calico-kube-controllers-67895f656d-fr7gd" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" May 13 23:51:43.342526 containerd[1507]: 2025-05-13 23:51:43.315 [INFO][4533] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Namespace="calico-system" Pod="calico-kube-controllers-67895f656d-fr7gd" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" May 13 23:51:43.342526 containerd[1507]: 2025-05-13 23:51:43.316 [INFO][4533] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Namespace="calico-system" Pod="calico-kube-controllers-67895f656d-fr7gd" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0", GenerateName:"calico-kube-controllers-67895f656d-", Namespace:"calico-system", SelfLink:"", UID:"a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c", ResourceVersion:"717", Generation:0, CreationTimestamp:time.Date(2025, time.May, 13, 23, 51, 17, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"67895f656d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-40578dffbd", ContainerID:"611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44", Pod:"calico-kube-controllers-67895f656d-fr7gd", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.48.198/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calid33da281dda", MAC:"b6:cb:76:0c:06:66", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 13 23:51:43.342526 containerd[1507]: 2025-05-13 23:51:43.335 [INFO][4533] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" Namespace="calico-system" Pod="calico-kube-controllers-67895f656d-fr7gd" WorkloadEndpoint="ci--4284--0--0--n--40578dffbd-k8s-calico--kube--controllers--67895f656d--fr7gd-eth0" May 13 23:51:43.386065 containerd[1507]: time="2025-05-13T23:51:43.385396317Z" level=info msg="connecting to shim 611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44" address="unix:///run/containerd/s/b22af93eb5592f7dc65a0d3987c240017bd5494dc987dd5bc5648784a1655372" namespace=k8s.io protocol=ttrpc version=3 May 13 23:51:43.416388 systemd[1]: Started cri-containerd-611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44.scope - libcontainer container 611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44. 
May 13 23:51:43.452047 systemd-networkd[1400]: cali183b7699b14: Gained IPv6LL May 13 23:51:43.460778 containerd[1507]: time="2025-05-13T23:51:43.460708638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-67895f656d-fr7gd,Uid:a9e7fa11-c2d7-4fc9-bc76-9c7b1719d09c,Namespace:calico-system,Attempt:0,} returns sandbox id \"611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44\"" May 13 23:51:43.642437 systemd-networkd[1400]: calif91f486c577: Gained IPv6LL May 13 23:51:44.216226 containerd[1507]: time="2025-05-13T23:51:44.215351443Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:44.218070 containerd[1507]: time="2025-05-13T23:51:44.218007326Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=40247603" May 13 23:51:44.219529 containerd[1507]: time="2025-05-13T23:51:44.219490092Z" level=info msg="ImageCreate event name:\"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:44.222691 containerd[1507]: time="2025-05-13T23:51:44.222631469Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:44.223364 containerd[1507]: time="2025-05-13T23:51:44.223243528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 4.499798748s" May 13 23:51:44.223499 containerd[1507]: time="2025-05-13T23:51:44.223480416Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\"" May 13 23:51:44.242001 containerd[1507]: time="2025-05-13T23:51:44.241940070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 13 23:51:44.264009 containerd[1507]: time="2025-05-13T23:51:44.263285574Z" level=info msg="CreateContainer within sandbox \"afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 23:51:44.272327 containerd[1507]: time="2025-05-13T23:51:44.272284013Z" level=info msg="Container 16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:44.277375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2327507001.mount: Deactivated successfully. May 13 23:51:44.286942 containerd[1507]: time="2025-05-13T23:51:44.286895028Z" level=info msg="CreateContainer within sandbox \"afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359\"" May 13 23:51:44.287693 containerd[1507]: time="2025-05-13T23:51:44.287657892Z" level=info msg="StartContainer for \"16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359\"" May 13 23:51:44.290710 containerd[1507]: time="2025-05-13T23:51:44.290644104Z" level=info msg="connecting to shim 16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359" address="unix:///run/containerd/s/a8b6c1030af634cf431c42064f7e3fc58e58aa98c1635637c86f0a211691a9f6" protocol=ttrpc version=3 May 13 23:51:44.323574 systemd[1]: Started cri-containerd-16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359.scope - libcontainer container 16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359. 
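The pull entries above give enough to estimate transfer rate: 40247603 bytes read for the apiserver image over the logged 4.499798748s. A quick sketch of that arithmetic (helper name is ours; the later 431ms pull of the same tag reads only 77 bytes because the layers are already present):

```python
def pull_rate_mib_s(bytes_read: int, seconds: float) -> float:
    """Effective registry transfer rate in MiB/s from containerd's
    'stop pulling image' byte count and the logged pull duration."""
    return bytes_read / seconds / (1024 * 1024)

rate = pull_rate_mib_s(40247603, 4.499798748)  # roughly 8.5 MiB/s
```

Note the "size" in the Pulled message (41616801) is the unpacked content size, slightly larger than the bytes actually read from the registry.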
May 13 23:51:44.406896 containerd[1507]: time="2025-05-13T23:51:44.406852598Z" level=info msg="StartContainer for \"16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359\" returns successfully" May 13 23:51:45.307104 systemd-networkd[1400]: calid33da281dda: Gained IPv6LL May 13 23:51:45.732079 containerd[1507]: time="2025-05-13T23:51:45.731770697Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:45.734125 containerd[1507]: time="2025-05-13T23:51:45.733417308Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7474935" May 13 23:51:45.736719 containerd[1507]: time="2025-05-13T23:51:45.736670249Z" level=info msg="ImageCreate event name:\"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:45.741508 containerd[1507]: time="2025-05-13T23:51:45.741042263Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:45.742321 containerd[1507]: time="2025-05-13T23:51:45.742290942Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"8844117\" in 1.500081103s" May 13 23:51:45.742383 containerd[1507]: time="2025-05-13T23:51:45.742329463Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:15faf29e8b518d846c91c15785ff89e783d356ea0f2b22826f47a556ea32645b\"" May 13 23:51:45.744373 containerd[1507]: time="2025-05-13T23:51:45.744325685Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 13 23:51:45.745564 containerd[1507]: time="2025-05-13T23:51:45.745518521Z" level=info msg="CreateContainer within sandbox \"780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 13 23:51:45.766052 containerd[1507]: time="2025-05-13T23:51:45.764826517Z" level=info msg="Container dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:45.786035 containerd[1507]: time="2025-05-13T23:51:45.785544876Z" level=info msg="CreateContainer within sandbox \"780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a\"" May 13 23:51:45.787364 containerd[1507]: time="2025-05-13T23:51:45.787211928Z" level=info msg="StartContainer for \"dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a\"" May 13 23:51:45.788907 containerd[1507]: time="2025-05-13T23:51:45.788831218Z" level=info msg="connecting to shim dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a" address="unix:///run/containerd/s/17d19db7b61bf1cfefdfddddc78ad2cfe07b6a86a21766d6b1cb8e3910db7928" protocol=ttrpc version=3 May 13 23:51:45.819206 systemd[1]: Started cri-containerd-dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a.scope - libcontainer container dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a. 
May 13 23:51:45.867350 containerd[1507]: time="2025-05-13T23:51:45.867205235Z" level=info msg="StartContainer for \"dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a\" returns successfully" May 13 23:51:45.998830 kubelet[2772]: I0513 23:51:45.997986 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7fd5cd4657-xd97q" podStartSLOduration=25.476944417 podStartE2EDuration="29.997952229s" podCreationTimestamp="2025-05-13 23:51:16 +0000 UTC" firstStartedPulling="2025-05-13 23:51:39.720539606 +0000 UTC m=+35.734372131" lastFinishedPulling="2025-05-13 23:51:44.241547458 +0000 UTC m=+40.255379943" observedRunningTime="2025-05-13 23:51:45.435320392 +0000 UTC m=+41.449152957" watchObservedRunningTime="2025-05-13 23:51:45.997952229 +0000 UTC m=+42.011784714" May 13 23:51:46.171943 containerd[1507]: time="2025-05-13T23:51:46.171236174Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:46.172366 containerd[1507]: time="2025-05-13T23:51:46.172294086Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 13 23:51:46.175462 containerd[1507]: time="2025-05-13T23:51:46.175415261Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"41616801\" in 431.038655ms" May 13 23:51:46.175662 containerd[1507]: time="2025-05-13T23:51:46.175641828Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:eca64fb9fcc40e83ed2310ac1fab340ba460a939c54e10dc0b7428f02b9b6253\"" May 13 23:51:46.177594 containerd[1507]: 
time="2025-05-13T23:51:46.177126834Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 13 23:51:46.180129 containerd[1507]: time="2025-05-13T23:51:46.179270699Z" level=info msg="CreateContainer within sandbox \"e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 13 23:51:46.188384 containerd[1507]: time="2025-05-13T23:51:46.188342057Z" level=info msg="Container f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:46.203440 containerd[1507]: time="2025-05-13T23:51:46.203361197Z" level=info msg="CreateContainer within sandbox \"e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3\"" May 13 23:51:46.204420 containerd[1507]: time="2025-05-13T23:51:46.204377068Z" level=info msg="StartContainer for \"f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3\"" May 13 23:51:46.209163 containerd[1507]: time="2025-05-13T23:51:46.208164184Z" level=info msg="connecting to shim f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3" address="unix:///run/containerd/s/98a2d573fd8d90cb1e0e60ba550b1107f2f6868dda2b9049b2f2f72bd367ff7f" protocol=ttrpc version=3 May 13 23:51:46.238231 systemd[1]: Started cri-containerd-f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3.scope - libcontainer container f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3. 
May 13 23:51:46.293541 containerd[1507]: time="2025-05-13T23:51:46.292880137Z" level=info msg="StartContainer for \"f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3\" returns successfully" May 13 23:51:47.416850 kubelet[2772]: I0513 23:51:47.416446 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:51:48.453129 containerd[1507]: time="2025-05-13T23:51:48.452959455Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:48.456720 containerd[1507]: time="2025-05-13T23:51:48.456267395Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=32554116" May 13 23:51:48.458255 containerd[1507]: time="2025-05-13T23:51:48.458024368Z" level=info msg="ImageCreate event name:\"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:48.463703 containerd[1507]: time="2025-05-13T23:51:48.463621137Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:48.466348 containerd[1507]: time="2025-05-13T23:51:48.466278137Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"33923266\" in 2.289105501s" May 13 23:51:48.466348 containerd[1507]: time="2025-05-13T23:51:48.466329818Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference 
\"sha256:ec7c64189a2fd01b24b044fea1840d441e9884a0df32c2e9d6982cfbbea1f814\"" May 13 23:51:48.468599 containerd[1507]: time="2025-05-13T23:51:48.468385200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 13 23:51:48.511028 containerd[1507]: time="2025-05-13T23:51:48.510353586Z" level=info msg="CreateContainer within sandbox \"611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 13 23:51:48.523101 containerd[1507]: time="2025-05-13T23:51:48.522130981Z" level=info msg="Container 3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:48.537723 containerd[1507]: time="2025-05-13T23:51:48.537678610Z" level=info msg="CreateContainer within sandbox \"611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\"" May 13 23:51:48.539431 containerd[1507]: time="2025-05-13T23:51:48.538567716Z" level=info msg="StartContainer for \"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\"" May 13 23:51:48.541031 containerd[1507]: time="2025-05-13T23:51:48.540933988Z" level=info msg="connecting to shim 3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea" address="unix:///run/containerd/s/b22af93eb5592f7dc65a0d3987c240017bd5494dc987dd5bc5648784a1655372" protocol=ttrpc version=3 May 13 23:51:48.570172 systemd[1]: Started cri-containerd-3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea.scope - libcontainer container 3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea. 
May 13 23:51:48.629411 containerd[1507]: time="2025-05-13T23:51:48.629355254Z" level=info msg="StartContainer for \"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" returns successfully" May 13 23:51:49.453207 kubelet[2772]: I0513 23:51:49.452852 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-67895f656d-fr7gd" podStartSLOduration=27.447174244 podStartE2EDuration="32.452831264s" podCreationTimestamp="2025-05-13 23:51:17 +0000 UTC" firstStartedPulling="2025-05-13 23:51:43.462470213 +0000 UTC m=+39.476302738" lastFinishedPulling="2025-05-13 23:51:48.468127233 +0000 UTC m=+44.481959758" observedRunningTime="2025-05-13 23:51:49.452314849 +0000 UTC m=+45.466147374" watchObservedRunningTime="2025-05-13 23:51:49.452831264 +0000 UTC m=+45.466663789" May 13 23:51:49.454116 kubelet[2772]: I0513 23:51:49.453924 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7fd5cd4657-7ls8m" podStartSLOduration=30.014296917 podStartE2EDuration="33.453905136s" podCreationTimestamp="2025-05-13 23:51:16 +0000 UTC" firstStartedPulling="2025-05-13 23:51:42.737114122 +0000 UTC m=+38.750946647" lastFinishedPulling="2025-05-13 23:51:46.176722341 +0000 UTC m=+42.190554866" observedRunningTime="2025-05-13 23:51:46.428778417 +0000 UTC m=+42.442610942" watchObservedRunningTime="2025-05-13 23:51:49.453905136 +0000 UTC m=+45.467737661" May 13 23:51:49.483570 containerd[1507]: time="2025-05-13T23:51:49.483500982Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"6a22b47df2dd2816eaad2feb9628d0b6e7d6af63e2388780c9a386138405b26b\" pid:4782 exited_at:{seconds:1747180309 nanos:482712998}" May 13 23:51:50.200063 containerd[1507]: time="2025-05-13T23:51:50.199661777Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:50.201366 containerd[1507]: time="2025-05-13T23:51:50.201298466Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13124299" May 13 23:51:50.207607 containerd[1507]: time="2025-05-13T23:51:50.206470539Z" level=info msg="ImageCreate event name:\"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:50.212600 containerd[1507]: time="2025-05-13T23:51:50.212279392Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 13 23:51:50.214308 containerd[1507]: time="2025-05-13T23:51:50.214112567Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"14493433\" in 1.745574081s" May 13 23:51:50.214308 containerd[1507]: time="2025-05-13T23:51:50.214166048Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:a91b1f00752edc175f270a01b33683fa80818734aa2274388785eaf3364315dc\"" May 13 23:51:50.218042 containerd[1507]: time="2025-05-13T23:51:50.217356423Z" level=info msg="CreateContainer within sandbox \"780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 13 23:51:50.238685 containerd[1507]: time="2025-05-13T23:51:50.238611015Z" level=info msg="Container 
d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825: CDI devices from CRI Config.CDIDevices: []" May 13 23:51:50.254613 containerd[1507]: time="2025-05-13T23:51:50.254369643Z" level=info msg="CreateContainer within sandbox \"780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825\"" May 13 23:51:50.257012 containerd[1507]: time="2025-05-13T23:51:50.255549118Z" level=info msg="StartContainer for \"d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825\"" May 13 23:51:50.257920 containerd[1507]: time="2025-05-13T23:51:50.257881507Z" level=info msg="connecting to shim d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825" address="unix:///run/containerd/s/17d19db7b61bf1cfefdfddddc78ad2cfe07b6a86a21766d6b1cb8e3910db7928" protocol=ttrpc version=3 May 13 23:51:50.292334 systemd[1]: Started cri-containerd-d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825.scope - libcontainer container d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825. 
May 13 23:51:50.356181 containerd[1507]: time="2025-05-13T23:51:50.356133788Z" level=info msg="StartContainer for \"d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825\" returns successfully" May 13 23:51:50.454382 kubelet[2772]: I0513 23:51:50.453834 2772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-68jvj" podStartSLOduration=25.862425471999998 podStartE2EDuration="33.453810891s" podCreationTimestamp="2025-05-13 23:51:17 +0000 UTC" firstStartedPulling="2025-05-13 23:51:42.623788179 +0000 UTC m=+38.637620704" lastFinishedPulling="2025-05-13 23:51:50.215173598 +0000 UTC m=+46.229006123" observedRunningTime="2025-05-13 23:51:50.452829662 +0000 UTC m=+46.466662227" watchObservedRunningTime="2025-05-13 23:51:50.453810891 +0000 UTC m=+46.467643456" May 13 23:51:51.262640 kubelet[2772]: I0513 23:51:51.262600 2772 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 13 23:51:51.269897 kubelet[2772]: I0513 23:51:51.268713 2772 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 13 23:52:03.609724 containerd[1507]: time="2025-05-13T23:52:03.609671284Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"752187ce178fb7cd3258efeee1d041a78059abda8be1bbe40cfb06b0bba92261\" pid:4853 exited_at:{seconds:1747180323 nanos:609332675}" May 13 23:52:05.686093 containerd[1507]: time="2025-05-13T23:52:05.686030919Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"fc767ff10d2b84f96b114ddd5342f9b37e8c020ec9098660ae6ef4a5feddd946\" pid:4880 exited_at:{seconds:1747180325 nanos:685668389}" May 13 23:52:13.040663 kubelet[2772]: 
I0513 23:52:13.038692 2772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 13 23:52:13.850293 containerd[1507]: time="2025-05-13T23:52:13.850031343Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"a995824bf22b7edf7a862c8f43177b60179d6ca88fddfb87b555b7839be87f8c\" pid:4908 exited_at:{seconds:1747180333 nanos:849448208}" May 13 23:52:33.588767 containerd[1507]: time="2025-05-13T23:52:33.588713483Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"7cf41c2baf0ff766e9022c89ca4ed6a2a8e7d57ef9431593bc9761b832c1683a\" pid:4939 exited_at:{seconds:1747180353 nanos:588202551}" May 13 23:52:35.677883 containerd[1507]: time="2025-05-13T23:52:35.677837462Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"ca2790aa204fc660a30d88c92af8690751fae198a818c87830d94ad2013d574b\" pid:4962 exited_at:{seconds:1747180355 nanos:677203647}" May 13 23:53:03.614213 containerd[1507]: time="2025-05-13T23:53:03.614140541Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"346d2c8a7940ab7623d7478a1bbcb0aec8c3101c787b4b140d07c61b7c907ee9\" pid:4998 exited_at:{seconds:1747180383 nanos:613630325}" May 13 23:53:05.676188 containerd[1507]: time="2025-05-13T23:53:05.675964341Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"b635ddc9e15ac6a3f33fcb3b5b806836bd17727d791e1f34e6b4aac2fb3d07f4\" pid:5024 exited_at:{seconds:1747180385 nanos:675625397}" May 13 23:53:13.841455 containerd[1507]: time="2025-05-13T23:53:13.841331200Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"ae0d1265dca2471287acf9ce47a84bcd257a8794ea2b02f564b5f7eb61f39042\" pid:5069 exited_at:{seconds:1747180393 nanos:840181764}" May 13 23:53:30.953168 update_engine[1483]: I20250513 23:53:30.952916 1483 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs May 13 23:53:30.953168 update_engine[1483]: I20250513 23:53:30.953022 1483 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs May 13 23:53:30.954801 update_engine[1483]: I20250513 23:53:30.953608 1483 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs May 13 23:53:30.957853 update_engine[1483]: I20250513 23:53:30.957129 1483 omaha_request_params.cc:62] Current group set to alpha May 13 23:53:30.958378 update_engine[1483]: I20250513 23:53:30.958256 1483 update_attempter.cc:499] Already updated boot flags. Skipping. May 13 23:53:30.958378 update_engine[1483]: I20250513 23:53:30.958289 1483 update_attempter.cc:643] Scheduling an action processor start. 
May 13 23:53:30.958378 update_engine[1483]: I20250513 23:53:30.958310 1483 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 13 23:53:30.959705 update_engine[1483]: I20250513 23:53:30.959410 1483 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs May 13 23:53:30.959705 update_engine[1483]: I20250513 23:53:30.959525 1483 omaha_request_action.cc:271] Posting an Omaha request to disabled May 13 23:53:30.959705 update_engine[1483]: I20250513 23:53:30.959534 1483 omaha_request_action.cc:272] Request: May 13 23:53:30.959705 update_engine[1483]: May 13 23:53:30.959705 update_engine[1483]: May 13 23:53:30.959705 update_engine[1483]: May 13 23:53:30.959705 update_engine[1483]: May 13 23:53:30.959705 update_engine[1483]: May 13 23:53:30.959705 update_engine[1483]: May 13 23:53:30.959705 update_engine[1483]: May 13 23:53:30.959705 update_engine[1483]: May 13 23:53:30.959705 update_engine[1483]: I20250513 23:53:30.959542 1483 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:53:30.964784 locksmithd[1521]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 May 13 23:53:30.965963 update_engine[1483]: I20250513 23:53:30.965888 1483 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:53:30.966392 update_engine[1483]: I20250513 23:53:30.966346 1483 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:53:30.968968 update_engine[1483]: E20250513 23:53:30.968889 1483 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:53:30.969122 update_engine[1483]: I20250513 23:53:30.969018 1483 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 May 13 23:53:33.591661 containerd[1507]: time="2025-05-13T23:53:33.591599637Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"b3d65a0bdf3f9f948aefc61ffc66012525c90ecd9282192fc0ef08b64ec82a0c\" pid:5090 exited_at:{seconds:1747180413 nanos:590282549}" May 13 23:53:35.679107 containerd[1507]: time="2025-05-13T23:53:35.678882592Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"9b1ec46eba09a37b700e9ab65fa6d62cc82da8a143f8add084887efd48418fd3\" pid:5114 exited_at:{seconds:1747180415 nanos:678314485}" May 13 23:53:40.874690 update_engine[1483]: I20250513 23:53:40.873924 1483 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:53:40.874690 update_engine[1483]: I20250513 23:53:40.874487 1483 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:53:40.878282 update_engine[1483]: I20250513 23:53:40.874827 1483 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:53:40.878282 update_engine[1483]: E20250513 23:53:40.876552 1483 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:53:40.878282 update_engine[1483]: I20250513 23:53:40.876647 1483 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 May 13 23:53:50.875881 update_engine[1483]: I20250513 23:53:50.875694 1483 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:53:50.876568 update_engine[1483]: I20250513 23:53:50.876012 1483 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:53:50.876568 update_engine[1483]: I20250513 23:53:50.876363 1483 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. May 13 23:53:50.877789 update_engine[1483]: E20250513 23:53:50.877693 1483 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:53:50.878015 update_engine[1483]: I20250513 23:53:50.877804 1483 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 May 13 23:54:00.872996 update_engine[1483]: I20250513 23:54:00.872838 1483 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:54:00.873457 update_engine[1483]: I20250513 23:54:00.873234 1483 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:54:00.873627 update_engine[1483]: I20250513 23:54:00.873533 1483 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:54:00.875356 update_engine[1483]: E20250513 23:54:00.875288 1483 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:54:00.875493 update_engine[1483]: I20250513 23:54:00.875416 1483 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 13 23:54:00.875493 update_engine[1483]: I20250513 23:54:00.875427 1483 omaha_request_action.cc:617] Omaha request response: May 13 23:54:00.875564 update_engine[1483]: E20250513 23:54:00.875520 1483 omaha_request_action.cc:636] Omaha request network transfer failed. May 13 23:54:00.875564 update_engine[1483]: I20250513 23:54:00.875540 1483 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. May 13 23:54:00.875564 update_engine[1483]: I20250513 23:54:00.875548 1483 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:54:00.875564 update_engine[1483]: I20250513 23:54:00.875553 1483 update_attempter.cc:306] Processing Done. May 13 23:54:00.875711 update_engine[1483]: E20250513 23:54:00.875569 1483 update_attempter.cc:619] Update failed. May 13 23:54:00.875711 update_engine[1483]: I20250513 23:54:00.875576 1483 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse May 13 23:54:00.875711 update_engine[1483]: I20250513 23:54:00.875581 1483 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) May 13 23:54:00.875711 update_engine[1483]: I20250513 23:54:00.875588 1483 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. 
May 13 23:54:00.875711 update_engine[1483]: I20250513 23:54:00.875662 1483 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction May 13 23:54:00.875711 update_engine[1483]: I20250513 23:54:00.875689 1483 omaha_request_action.cc:271] Posting an Omaha request to disabled May 13 23:54:00.875711 update_engine[1483]: I20250513 23:54:00.875696 1483 omaha_request_action.cc:272] Request: May 13 23:54:00.875711 update_engine[1483]: May 13 23:54:00.875711 update_engine[1483]: May 13 23:54:00.875711 update_engine[1483]: May 13 23:54:00.875711 update_engine[1483]: May 13 23:54:00.875711 update_engine[1483]: May 13 23:54:00.875711 update_engine[1483]: May 13 23:54:00.875711 update_engine[1483]: I20250513 23:54:00.875705 1483 libcurl_http_fetcher.cc:47] Starting/Resuming transfer May 13 23:54:00.876149 update_engine[1483]: I20250513 23:54:00.875862 1483 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP May 13 23:54:00.876191 update_engine[1483]: I20250513 23:54:00.876145 1483 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
May 13 23:54:00.876583 locksmithd[1521]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 May 13 23:54:00.877286 update_engine[1483]: E20250513 23:54:00.877245 1483 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled May 13 23:54:00.877360 update_engine[1483]: I20250513 23:54:00.877341 1483 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded May 13 23:54:00.877360 update_engine[1483]: I20250513 23:54:00.877354 1483 omaha_request_action.cc:617] Omaha request response: May 13 23:54:00.877431 update_engine[1483]: I20250513 23:54:00.877362 1483 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:54:00.877431 update_engine[1483]: I20250513 23:54:00.877368 1483 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction May 13 23:54:00.877431 update_engine[1483]: I20250513 23:54:00.877373 1483 update_attempter.cc:306] Processing Done. May 13 23:54:00.877431 update_engine[1483]: I20250513 23:54:00.877380 1483 update_attempter.cc:310] Error event sent. 
May 13 23:54:00.877431 update_engine[1483]: I20250513 23:54:00.877390 1483 update_check_scheduler.cc:74] Next update check in 48m48s May 13 23:54:00.878018 locksmithd[1521]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 May 13 23:54:03.589759 containerd[1507]: time="2025-05-13T23:54:03.589707669Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"ca142aca68832be135473f663fad751063efb201cf3ff243a3798bad499b6ccd\" pid:5144 exited_at:{seconds:1747180443 nanos:589333673}" May 13 23:54:05.675345 containerd[1507]: time="2025-05-13T23:54:05.675193277Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"bd2a44d2498dfcbaf2b5bdab1d42e57499e1e9c60e72b13a20f19ebe90c50a42\" pid:5170 exited_at:{seconds:1747180445 nanos:674275285}" May 13 23:54:13.852326 containerd[1507]: time="2025-05-13T23:54:13.852050198Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"f2e2f4266e9421343c583a9d43f5cdc43ff2ddf617a6906d14a3d24414c50efd\" pid:5195 exited_at:{seconds:1747180453 nanos:851669720}" May 13 23:54:33.598698 containerd[1507]: time="2025-05-13T23:54:33.598634249Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"dbdb1909f760a1e5fe7ca2f31998419195128befa75bf0fc331a4d38ded184b0\" pid:5224 exited_at:{seconds:1747180473 nanos:597969289}" May 13 23:54:35.675778 containerd[1507]: time="2025-05-13T23:54:35.675698337Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"c5fc33ef717f62f11fc14e42a09359e76297019caa0abd3622316cb34e1db53d\" pid:5250 exited_at:{seconds:1747180475 nanos:675152857}" May 13 
23:55:03.587028 containerd[1507]: time="2025-05-13T23:55:03.586917526Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"53471d763fc31ee3b78b1681ce031299465338bbb6b2bdf7806f9f269179e848\" pid:5292 exited_at:{seconds:1747180503 nanos:586337643}" May 13 23:55:05.692887 containerd[1507]: time="2025-05-13T23:55:05.692836550Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"cd52a89f969cf093e13902ff87d6379441db0fe7c9ef64357b076d2ffd34391f\" pid:5319 exited_at:{seconds:1747180505 nanos:692279987}" May 13 23:55:13.840580 containerd[1507]: time="2025-05-13T23:55:13.840528551Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"1ca4adce4c8baaea496a6e2c1cdc2a09ba1d7615e0911830af4ac7f302732191\" pid:5343 exited_at:{seconds:1747180513 nanos:840115588}" May 13 23:55:33.593910 containerd[1507]: time="2025-05-13T23:55:33.593858840Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"737ef95da4143e7141e77f52c30df35a0dfc45bcaf196917ca59d140cb92c133\" pid:5365 exited_at:{seconds:1747180533 nanos:593433476}" May 13 23:55:35.678581 containerd[1507]: time="2025-05-13T23:55:35.678449264Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"8428a9282e1b3e6f75d713be78f90520bb9a31a58532ba473476ce521c913803\" pid:5389 exited_at:{seconds:1747180535 nanos:677936820}" May 13 23:55:47.512213 systemd[1]: Started sshd@7-188.245.195.87:22-139.178.89.65:55262.service - OpenSSH per-connection server daemon (139.178.89.65:55262). 
May 13 23:55:48.556806 sshd[5404]: Accepted publickey for core from 139.178.89.65 port 55262 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:55:48.560374 sshd-session[5404]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:48.568181 systemd-logind[1482]: New session 8 of user core.
May 13 23:55:48.572199 systemd[1]: Started session-8.scope - Session 8 of User core.
May 13 23:55:49.382995 sshd[5406]: Connection closed by 139.178.89.65 port 55262
May 13 23:55:49.383512 sshd-session[5404]: pam_unix(sshd:session): session closed for user core
May 13 23:55:49.387697 systemd-logind[1482]: Session 8 logged out. Waiting for processes to exit.
May 13 23:55:49.388135 systemd[1]: sshd@7-188.245.195.87:22-139.178.89.65:55262.service: Deactivated successfully.
May 13 23:55:49.391843 systemd[1]: session-8.scope: Deactivated successfully.
May 13 23:55:49.395624 systemd-logind[1482]: Removed session 8.
May 13 23:55:54.556995 systemd[1]: Started sshd@8-188.245.195.87:22-139.178.89.65:55264.service - OpenSSH per-connection server daemon (139.178.89.65:55264).
May 13 23:55:55.572028 sshd[5420]: Accepted publickey for core from 139.178.89.65 port 55264 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:55:55.573744 sshd-session[5420]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:55.580041 systemd-logind[1482]: New session 9 of user core.
May 13 23:55:55.585147 systemd[1]: Started session-9.scope - Session 9 of User core.
May 13 23:55:56.352998 sshd[5422]: Connection closed by 139.178.89.65 port 55264
May 13 23:55:56.353704 sshd-session[5420]: pam_unix(sshd:session): session closed for user core
May 13 23:55:56.358887 systemd[1]: sshd@8-188.245.195.87:22-139.178.89.65:55264.service: Deactivated successfully.
May 13 23:55:56.362395 systemd[1]: session-9.scope: Deactivated successfully.
May 13 23:55:56.363820 systemd-logind[1482]: Session 9 logged out. Waiting for processes to exit.
May 13 23:55:56.364897 systemd-logind[1482]: Removed session 9.
May 13 23:55:56.529927 systemd[1]: Started sshd@9-188.245.195.87:22-139.178.89.65:55272.service - OpenSSH per-connection server daemon (139.178.89.65:55272).
May 13 23:55:57.552013 sshd[5435]: Accepted publickey for core from 139.178.89.65 port 55272 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:55:57.554372 sshd-session[5435]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:57.564730 systemd-logind[1482]: New session 10 of user core.
May 13 23:55:57.571407 systemd[1]: Started session-10.scope - Session 10 of User core.
May 13 23:55:57.970222 containerd[1507]: time="2025-05-13T23:55:57.965473974Z" level=warning msg="container event discarded" container=281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa type=CONTAINER_CREATED_EVENT
May 13 23:55:57.987022 containerd[1507]: time="2025-05-13T23:55:57.986399881Z" level=warning msg="container event discarded" container=281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa type=CONTAINER_STARTED_EVENT
May 13 23:55:58.010786 containerd[1507]: time="2025-05-13T23:55:58.010692706Z" level=warning msg="container event discarded" container=54a42263a8e5b7e56875d56472bc4f93524681ddbad279f83d1074056819ef64 type=CONTAINER_CREATED_EVENT
May 13 23:55:58.010786 containerd[1507]: time="2025-05-13T23:55:58.010749707Z" level=warning msg="container event discarded" container=54a42263a8e5b7e56875d56472bc4f93524681ddbad279f83d1074056819ef64 type=CONTAINER_STARTED_EVENT
May 13 23:55:58.010786 containerd[1507]: time="2025-05-13T23:55:58.010759107Z" level=warning msg="container event discarded" container=558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6 type=CONTAINER_CREATED_EVENT
May 13 23:55:58.036605 containerd[1507]: time="2025-05-13T23:55:58.036409628Z" level=warning msg="container event discarded" container=bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5 type=CONTAINER_CREATED_EVENT
May 13 23:55:58.036605 containerd[1507]: time="2025-05-13T23:55:58.036479509Z" level=warning msg="container event discarded" container=bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5 type=CONTAINER_STARTED_EVENT
May 13 23:55:58.057154 containerd[1507]: time="2025-05-13T23:55:58.057058254Z" level=warning msg="container event discarded" container=fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1 type=CONTAINER_CREATED_EVENT
May 13 23:55:58.081115 containerd[1507]: time="2025-05-13T23:55:58.080933515Z" level=warning msg="container event discarded" container=28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3 type=CONTAINER_CREATED_EVENT
May 13 23:55:58.123130 containerd[1507]: time="2025-05-13T23:55:58.123039336Z" level=warning msg="container event discarded" container=558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6 type=CONTAINER_STARTED_EVENT
May 13 23:55:58.183761 containerd[1507]: time="2025-05-13T23:55:58.183671280Z" level=warning msg="container event discarded" container=fd9194ed0223d790205ba5afa3d849604a51f345dbdb12bc45971317f587d8e1 type=CONTAINER_STARTED_EVENT
May 13 23:55:58.222303 containerd[1507]: time="2025-05-13T23:55:58.222116021Z" level=warning msg="container event discarded" container=28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3 type=CONTAINER_STARTED_EVENT
May 13 23:55:58.374173 sshd[5437]: Connection closed by 139.178.89.65 port 55272
May 13 23:55:58.375593 sshd-session[5435]: pam_unix(sshd:session): session closed for user core
May 13 23:55:58.381072 systemd[1]: sshd@9-188.245.195.87:22-139.178.89.65:55272.service: Deactivated successfully.
May 13 23:55:58.384380 systemd[1]: session-10.scope: Deactivated successfully.
May 13 23:55:58.385604 systemd-logind[1482]: Session 10 logged out. Waiting for processes to exit.
May 13 23:55:58.388451 systemd-logind[1482]: Removed session 10.
May 13 23:55:58.545507 systemd[1]: Started sshd@10-188.245.195.87:22-139.178.89.65:55280.service - OpenSSH per-connection server daemon (139.178.89.65:55280).
May 13 23:55:59.545049 sshd[5447]: Accepted publickey for core from 139.178.89.65 port 55280 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:55:59.547139 sshd-session[5447]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:55:59.556530 systemd-logind[1482]: New session 11 of user core.
May 13 23:55:59.560189 systemd[1]: Started session-11.scope - Session 11 of User core.
May 13 23:56:00.335683 sshd[5449]: Connection closed by 139.178.89.65 port 55280
May 13 23:56:00.336499 sshd-session[5447]: pam_unix(sshd:session): session closed for user core
May 13 23:56:00.343297 systemd[1]: sshd@10-188.245.195.87:22-139.178.89.65:55280.service: Deactivated successfully.
May 13 23:56:00.347743 systemd[1]: session-11.scope: Deactivated successfully.
May 13 23:56:00.348805 systemd-logind[1482]: Session 11 logged out. Waiting for processes to exit.
May 13 23:56:00.350077 systemd-logind[1482]: Removed session 11.
May 13 23:56:03.602613 containerd[1507]: time="2025-05-13T23:56:03.602559472Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"c9048d3b164cbf68727af78f6e72cdf74b0fc34c9c6cd128879615c0738d94f1\" pid:5477 exit_status:1 exited_at:{seconds:1747180563 nanos:602122387}"
May 13 23:56:05.523052 systemd[1]: Started sshd@11-188.245.195.87:22-139.178.89.65:55296.service - OpenSSH per-connection server daemon (139.178.89.65:55296).
May 13 23:56:05.673226 containerd[1507]: time="2025-05-13T23:56:05.672829770Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"79afa08c3d5cbf6b321c1fc54ab2a32cf919176523ae29d285dba8f412927c25\" pid:5506 exited_at:{seconds:1747180565 nanos:672107641}"
May 13 23:56:06.550352 sshd[5492]: Accepted publickey for core from 139.178.89.65 port 55296 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:06.553498 sshd-session[5492]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:06.572788 systemd-logind[1482]: New session 12 of user core.
May 13 23:56:06.578313 systemd[1]: Started session-12.scope - Session 12 of User core.
May 13 23:56:07.365382 sshd[5515]: Connection closed by 139.178.89.65 port 55296
May 13 23:56:07.366129 sshd-session[5492]: pam_unix(sshd:session): session closed for user core
May 13 23:56:07.371777 systemd[1]: sshd@11-188.245.195.87:22-139.178.89.65:55296.service: Deactivated successfully.
May 13 23:56:07.374594 systemd[1]: session-12.scope: Deactivated successfully.
May 13 23:56:07.377579 systemd-logind[1482]: Session 12 logged out. Waiting for processes to exit.
May 13 23:56:07.378931 systemd-logind[1482]: Removed session 12.
May 13 23:56:10.263904 containerd[1507]: time="2025-05-13T23:56:10.263730497Z" level=warning msg="container event discarded" container=73d4f60b757c4a8dbc0470fe207eb1a26a811f997ead7347b6a611bcef538f1d type=CONTAINER_CREATED_EVENT
May 13 23:56:10.263904 containerd[1507]: time="2025-05-13T23:56:10.263799018Z" level=warning msg="container event discarded" container=73d4f60b757c4a8dbc0470fe207eb1a26a811f997ead7347b6a611bcef538f1d type=CONTAINER_STARTED_EVENT
May 13 23:56:10.315116 containerd[1507]: time="2025-05-13T23:56:10.315018933Z" level=warning msg="container event discarded" container=77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c type=CONTAINER_CREATED_EVENT
May 13 23:56:10.398400 containerd[1507]: time="2025-05-13T23:56:10.398326022Z" level=warning msg="container event discarded" container=7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86 type=CONTAINER_CREATED_EVENT
May 13 23:56:10.398400 containerd[1507]: time="2025-05-13T23:56:10.398386583Z" level=warning msg="container event discarded" container=7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86 type=CONTAINER_STARTED_EVENT
May 13 23:56:10.432720 containerd[1507]: time="2025-05-13T23:56:10.432627141Z" level=warning msg="container event discarded" container=77c867969c2c0933b475f8a770d01777df11320e3075424531c6122e0d198f4c type=CONTAINER_STARTED_EVENT
May 13 23:56:12.484428 containerd[1507]: time="2025-05-13T23:56:12.484340178Z" level=warning msg="container event discarded" container=98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026 type=CONTAINER_CREATED_EVENT
May 13 23:56:12.540283 systemd[1]: Started sshd@12-188.245.195.87:22-139.178.89.65:33224.service - OpenSSH per-connection server daemon (139.178.89.65:33224).
May 13 23:56:12.552944 containerd[1507]: time="2025-05-13T23:56:12.552821621Z" level=warning msg="container event discarded" container=98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026 type=CONTAINER_STARTED_EVENT
May 13 23:56:13.566181 sshd[5530]: Accepted publickey for core from 139.178.89.65 port 33224 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:13.568691 sshd-session[5530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:13.576726 systemd-logind[1482]: New session 13 of user core.
May 13 23:56:13.581189 systemd[1]: Started session-13.scope - Session 13 of User core.
May 13 23:56:13.851368 containerd[1507]: time="2025-05-13T23:56:13.851128891Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"d9be8a5831d88538e7c44d64503f9795600c71f758673c1edb1478be13890e44\" pid:5545 exited_at:{seconds:1747180573 nanos:850739887}"
May 13 23:56:14.341758 sshd[5532]: Connection closed by 139.178.89.65 port 33224
May 13 23:56:14.342261 sshd-session[5530]: pam_unix(sshd:session): session closed for user core
May 13 23:56:14.351239 systemd[1]: sshd@12-188.245.195.87:22-139.178.89.65:33224.service: Deactivated successfully.
May 13 23:56:14.357517 systemd[1]: session-13.scope: Deactivated successfully.
May 13 23:56:14.360792 systemd-logind[1482]: Session 13 logged out. Waiting for processes to exit.
May 13 23:56:14.363155 systemd-logind[1482]: Removed session 13.
May 13 23:56:19.065061 containerd[1507]: time="2025-05-13T23:56:19.064840996Z" level=warning msg="container event discarded" container=06be926168658e1d4ccb5f55cefc13dea9be0c7e65ac089ac367f513ab418d23 type=CONTAINER_CREATED_EVENT
May 13 23:56:19.065061 containerd[1507]: time="2025-05-13T23:56:19.064965998Z" level=warning msg="container event discarded" container=06be926168658e1d4ccb5f55cefc13dea9be0c7e65ac089ac367f513ab418d23 type=CONTAINER_STARTED_EVENT
May 13 23:56:19.166568 containerd[1507]: time="2025-05-13T23:56:19.166305300Z" level=warning msg="container event discarded" container=ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be type=CONTAINER_CREATED_EVENT
May 13 23:56:19.166568 containerd[1507]: time="2025-05-13T23:56:19.166397821Z" level=warning msg="container event discarded" container=ca8d710b13687069d5d3293e27ce8b60258ade20b0a11780e0b583d0dddbf2be type=CONTAINER_STARTED_EVENT
May 13 23:56:19.523113 systemd[1]: Started sshd@13-188.245.195.87:22-139.178.89.65:40154.service - OpenSSH per-connection server daemon (139.178.89.65:40154).
May 13 23:56:20.548841 sshd[5578]: Accepted publickey for core from 139.178.89.65 port 40154 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:20.552354 sshd-session[5578]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:20.562646 systemd-logind[1482]: New session 14 of user core.
May 13 23:56:20.568234 systemd[1]: Started session-14.scope - Session 14 of User core.
May 13 23:56:21.174726 containerd[1507]: time="2025-05-13T23:56:21.174531218Z" level=warning msg="container event discarded" container=2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f type=CONTAINER_CREATED_EVENT
May 13 23:56:21.273113 containerd[1507]: time="2025-05-13T23:56:21.272959334Z" level=warning msg="container event discarded" container=2ace13d2221a7c91c1dfc1be98d4d3908a2a8722073db7a44e5a2fab860ccf3f type=CONTAINER_STARTED_EVENT
May 13 23:56:21.329879 sshd[5580]: Connection closed by 139.178.89.65 port 40154
May 13 23:56:21.330672 sshd-session[5578]: pam_unix(sshd:session): session closed for user core
May 13 23:56:21.337179 systemd[1]: sshd@13-188.245.195.87:22-139.178.89.65:40154.service: Deactivated successfully.
May 13 23:56:21.340170 systemd[1]: session-14.scope: Deactivated successfully.
May 13 23:56:21.341485 systemd-logind[1482]: Session 14 logged out. Waiting for processes to exit.
May 13 23:56:21.343255 systemd-logind[1482]: Removed session 14.
May 13 23:56:21.501643 systemd[1]: Started sshd@14-188.245.195.87:22-139.178.89.65:40162.service - OpenSSH per-connection server daemon (139.178.89.65:40162).
May 13 23:56:22.515762 sshd[5592]: Accepted publickey for core from 139.178.89.65 port 40162 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:22.517407 sshd-session[5592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:22.523373 systemd-logind[1482]: New session 15 of user core.
May 13 23:56:22.535052 systemd[1]: Started session-15.scope - Session 15 of User core.
May 13 23:56:22.537301 containerd[1507]: time="2025-05-13T23:56:22.535757418Z" level=warning msg="container event discarded" container=0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048 type=CONTAINER_CREATED_EVENT
May 13 23:56:22.627199 containerd[1507]: time="2025-05-13T23:56:22.627087611Z" level=warning msg="container event discarded" container=0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048 type=CONTAINER_STARTED_EVENT
May 13 23:56:22.786496 containerd[1507]: time="2025-05-13T23:56:22.786333232Z" level=warning msg="container event discarded" container=0876d9d619628e3c96dec1daa824f2cef9f72d8d860c07209ce6fd6cee166048 type=CONTAINER_STOPPED_EVENT
May 13 23:56:23.404505 sshd[5599]: Connection closed by 139.178.89.65 port 40162
May 13 23:56:23.407251 sshd-session[5592]: pam_unix(sshd:session): session closed for user core
May 13 23:56:23.413299 systemd[1]: sshd@14-188.245.195.87:22-139.178.89.65:40162.service: Deactivated successfully.
May 13 23:56:23.416329 systemd[1]: session-15.scope: Deactivated successfully.
May 13 23:56:23.418718 systemd-logind[1482]: Session 15 logged out. Waiting for processes to exit.
May 13 23:56:23.419897 systemd-logind[1482]: Removed session 15.
May 13 23:56:23.585228 systemd[1]: Started sshd@15-188.245.195.87:22-139.178.89.65:40174.service - OpenSSH per-connection server daemon (139.178.89.65:40174).
May 13 23:56:24.606378 sshd[5609]: Accepted publickey for core from 139.178.89.65 port 40174 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:24.608407 sshd-session[5609]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:24.614110 systemd-logind[1482]: New session 16 of user core.
May 13 23:56:24.619186 systemd[1]: Started session-16.scope - Session 16 of User core.
May 13 23:56:26.932260 containerd[1507]: time="2025-05-13T23:56:26.932186139Z" level=warning msg="container event discarded" container=d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69 type=CONTAINER_CREATED_EVENT
May 13 23:56:27.021501 containerd[1507]: time="2025-05-13T23:56:27.021434402Z" level=warning msg="container event discarded" container=d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69 type=CONTAINER_STARTED_EVENT
May 13 23:56:27.557017 sshd[5611]: Connection closed by 139.178.89.65 port 40174
May 13 23:56:27.556463 sshd-session[5609]: pam_unix(sshd:session): session closed for user core
May 13 23:56:27.562505 systemd[1]: sshd@15-188.245.195.87:22-139.178.89.65:40174.service: Deactivated successfully.
May 13 23:56:27.565825 systemd[1]: session-16.scope: Deactivated successfully.
May 13 23:56:27.566155 systemd[1]: session-16.scope: Consumed 599ms CPU time, 71.8M memory peak.
May 13 23:56:27.567429 systemd-logind[1482]: Session 16 logged out. Waiting for processes to exit.
May 13 23:56:27.569927 systemd-logind[1482]: Removed session 16.
May 13 23:56:27.724561 containerd[1507]: time="2025-05-13T23:56:27.724472557Z" level=warning msg="container event discarded" container=d44472e94337c8774078b94e9cca5458563720a2a9c668d8b30bc7b7e3722a69 type=CONTAINER_STOPPED_EVENT
May 13 23:56:27.727388 systemd[1]: Started sshd@16-188.245.195.87:22-139.178.89.65:53508.service - OpenSSH per-connection server daemon (139.178.89.65:53508).
May 13 23:56:28.735388 sshd[5629]: Accepted publickey for core from 139.178.89.65 port 53508 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:28.737802 sshd-session[5629]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:28.748020 systemd-logind[1482]: New session 17 of user core.
May 13 23:56:28.754616 systemd[1]: Started session-17.scope - Session 17 of User core.
May 13 23:56:29.630960 sshd[5631]: Connection closed by 139.178.89.65 port 53508
May 13 23:56:29.632212 sshd-session[5629]: pam_unix(sshd:session): session closed for user core
May 13 23:56:29.637842 systemd[1]: sshd@16-188.245.195.87:22-139.178.89.65:53508.service: Deactivated successfully.
May 13 23:56:29.642826 systemd[1]: session-17.scope: Deactivated successfully.
May 13 23:56:29.647209 systemd-logind[1482]: Session 17 logged out. Waiting for processes to exit.
May 13 23:56:29.648701 systemd-logind[1482]: Removed session 17.
May 13 23:56:29.813133 systemd[1]: Started sshd@17-188.245.195.87:22-139.178.89.65:53512.service - OpenSSH per-connection server daemon (139.178.89.65:53512).
May 13 23:56:30.848004 sshd[5641]: Accepted publickey for core from 139.178.89.65 port 53512 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:30.850316 sshd-session[5641]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:30.858250 systemd-logind[1482]: New session 18 of user core.
May 13 23:56:30.866211 systemd[1]: Started session-18.scope - Session 18 of User core.
May 13 23:56:31.636005 sshd[5643]: Connection closed by 139.178.89.65 port 53512
May 13 23:56:31.636675 sshd-session[5641]: pam_unix(sshd:session): session closed for user core
May 13 23:56:31.641393 systemd[1]: sshd@17-188.245.195.87:22-139.178.89.65:53512.service: Deactivated successfully.
May 13 23:56:31.645586 systemd[1]: session-18.scope: Deactivated successfully.
May 13 23:56:31.647002 systemd-logind[1482]: Session 18 logged out. Waiting for processes to exit.
May 13 23:56:31.648016 systemd-logind[1482]: Removed session 18.
May 13 23:56:33.588490 containerd[1507]: time="2025-05-13T23:56:33.588440952Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"87dfe4a71b67997247ad5e51cd0859f31414872effd6a32897c9badffd50dd65\" pid:5670 exited_at:{seconds:1747180593 nanos:588032387}"
May 13 23:56:33.964932 containerd[1507]: time="2025-05-13T23:56:33.964647780Z" level=warning msg="container event discarded" container=31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc type=CONTAINER_CREATED_EVENT
May 13 23:56:34.043213 containerd[1507]: time="2025-05-13T23:56:34.042941090Z" level=warning msg="container event discarded" container=31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc type=CONTAINER_STARTED_EVENT
May 13 23:56:35.677879 containerd[1507]: time="2025-05-13T23:56:35.677820766Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"dd2e419d8bfa4063218f138df31ad298931c2f052f6008339e264af1ae6dd4bd\" pid:5695 exited_at:{seconds:1747180595 nanos:677323880}"
May 13 23:56:36.815105 systemd[1]: Started sshd@18-188.245.195.87:22-139.178.89.65:38064.service - OpenSSH per-connection server daemon (139.178.89.65:38064).
May 13 23:56:37.853383 sshd[5705]: Accepted publickey for core from 139.178.89.65 port 38064 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:37.855556 sshd-session[5705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:37.862967 systemd-logind[1482]: New session 19 of user core.
May 13 23:56:37.873378 systemd[1]: Started session-19.scope - Session 19 of User core.
May 13 23:56:38.632995 sshd[5707]: Connection closed by 139.178.89.65 port 38064
May 13 23:56:38.633292 sshd-session[5705]: pam_unix(sshd:session): session closed for user core
May 13 23:56:38.643659 systemd[1]: sshd@18-188.245.195.87:22-139.178.89.65:38064.service: Deactivated successfully.
May 13 23:56:38.647301 systemd[1]: session-19.scope: Deactivated successfully.
May 13 23:56:38.648610 systemd-logind[1482]: Session 19 logged out. Waiting for processes to exit.
May 13 23:56:38.649940 systemd-logind[1482]: Removed session 19.
May 13 23:56:39.727668 containerd[1507]: time="2025-05-13T23:56:39.727535867Z" level=warning msg="container event discarded" container=afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a type=CONTAINER_CREATED_EVENT
May 13 23:56:39.727668 containerd[1507]: time="2025-05-13T23:56:39.727624548Z" level=warning msg="container event discarded" container=afeff18632648d95ab95bc085d0a25208093d7fc76708387a3d0816eb13a551a type=CONTAINER_STARTED_EVENT
May 13 23:56:39.793239 containerd[1507]: time="2025-05-13T23:56:39.793023867Z" level=warning msg="container event discarded" container=d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f type=CONTAINER_CREATED_EVENT
May 13 23:56:39.793239 containerd[1507]: time="2025-05-13T23:56:39.793085108Z" level=warning msg="container event discarded" container=d7c3c6218a4662f3d016177b081a28ee193bf0921cb489cc19d5e538d0e4253f type=CONTAINER_STARTED_EVENT
May 13 23:56:39.822439 containerd[1507]: time="2025-05-13T23:56:39.822351563Z" level=warning msg="container event discarded" container=1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd type=CONTAINER_CREATED_EVENT
May 13 23:56:39.893828 containerd[1507]: time="2025-05-13T23:56:39.893725879Z" level=warning msg="container event discarded" container=1c99807a3a9d659920b69237cd9a4f52d4fdbb89a59b7049399fe1817e7344dd type=CONTAINER_STARTED_EVENT
May 13 23:56:41.479696 containerd[1507]: time="2025-05-13T23:56:41.479606483Z" level=warning msg="container event discarded" container=ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5 type=CONTAINER_CREATED_EVENT
May 13 23:56:41.479696 containerd[1507]: time="2025-05-13T23:56:41.479666844Z" level=warning msg="container event discarded" container=ee2b3fb9f26d19008fa2ebc46f92a60ecaf857150f9b309c15f7b473ff05dec5 type=CONTAINER_STARTED_EVENT
May 13 23:56:41.510038 containerd[1507]: time="2025-05-13T23:56:41.509892274Z" level=warning msg="container event discarded" container=bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f type=CONTAINER_CREATED_EVENT
May 13 23:56:41.582327 containerd[1507]: time="2025-05-13T23:56:41.582245126Z" level=warning msg="container event discarded" container=bec2e855a7737f75863282aea1986c10d1852b30af36163f114bd08df94ddd6f type=CONTAINER_STARTED_EVENT
May 13 23:56:42.631247 containerd[1507]: time="2025-05-13T23:56:42.631033346Z" level=warning msg="container event discarded" container=780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565 type=CONTAINER_CREATED_EVENT
May 13 23:56:42.631247 containerd[1507]: time="2025-05-13T23:56:42.631136787Z" level=warning msg="container event discarded" container=780ead374676bca200fda1d3e4eda5e4eb03cc29b3a6f72b32d180c26ff0b565 type=CONTAINER_STARTED_EVENT
May 13 23:56:42.744991 containerd[1507]: time="2025-05-13T23:56:42.744882897Z" level=warning msg="container event discarded" container=e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d type=CONTAINER_CREATED_EVENT
May 13 23:56:42.745174 containerd[1507]: time="2025-05-13T23:56:42.744965658Z" level=warning msg="container event discarded" container=e9e4837a0ed00002d93a951fbd5d6aeba69e6afc7f4d0e620256fd8b72a2ea0d type=CONTAINER_STARTED_EVENT
May 13 23:56:43.471611 containerd[1507]: time="2025-05-13T23:56:43.471465820Z" level=warning msg="container event discarded" container=611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44 type=CONTAINER_CREATED_EVENT
May 13 23:56:43.471611 containerd[1507]: time="2025-05-13T23:56:43.471571582Z" level=warning msg="container event discarded" container=611e3a1415b8e12165075217f9c671e8a5ab26623bd8b3a433c1539dce3f4e44 type=CONTAINER_STARTED_EVENT
May 13 23:56:43.813360 systemd[1]: Started sshd@19-188.245.195.87:22-139.178.89.65:38074.service - OpenSSH per-connection server daemon (139.178.89.65:38074).
May 13 23:56:44.296114 containerd[1507]: time="2025-05-13T23:56:44.296058470Z" level=warning msg="container event discarded" container=16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359 type=CONTAINER_CREATED_EVENT
May 13 23:56:44.416070 containerd[1507]: time="2025-05-13T23:56:44.415903186Z" level=warning msg="container event discarded" container=16cc1972d47b4f5562769ac7a2a63f1a0617e0eda1569b85c44f606d7273d359 type=CONTAINER_STARTED_EVENT
May 13 23:56:44.850705 sshd[5721]: Accepted publickey for core from 139.178.89.65 port 38074 ssh2: RSA SHA256:2AVjOJVQfxOW/VtYd+RUuCe04o4LOcgB/L6rdW4ABGI
May 13 23:56:44.852958 sshd-session[5721]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 13 23:56:44.859240 systemd-logind[1482]: New session 20 of user core.
May 13 23:56:44.866601 systemd[1]: Started session-20.scope - Session 20 of User core.
May 13 23:56:45.678142 sshd[5723]: Connection closed by 139.178.89.65 port 38074
May 13 23:56:45.678762 sshd-session[5721]: pam_unix(sshd:session): session closed for user core
May 13 23:56:45.686858 systemd-logind[1482]: Session 20 logged out. Waiting for processes to exit.
May 13 23:56:45.687300 systemd[1]: sshd@19-188.245.195.87:22-139.178.89.65:38074.service: Deactivated successfully.
May 13 23:56:45.691258 systemd[1]: session-20.scope: Deactivated successfully.
May 13 23:56:45.693342 systemd-logind[1482]: Removed session 20.
May 13 23:56:45.789891 containerd[1507]: time="2025-05-13T23:56:45.789767127Z" level=warning msg="container event discarded" container=dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a type=CONTAINER_CREATED_EVENT
May 13 23:56:45.876421 containerd[1507]: time="2025-05-13T23:56:45.876323333Z" level=warning msg="container event discarded" container=dc37a5b5bb33c5eb30e9a38b920420436c7bcbe7bed3c3d50bc451e63a84f96a type=CONTAINER_STARTED_EVENT
May 13 23:56:46.213278 containerd[1507]: time="2025-05-13T23:56:46.213151163Z" level=warning msg="container event discarded" container=f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3 type=CONTAINER_CREATED_EVENT
May 13 23:56:46.301656 containerd[1507]: time="2025-05-13T23:56:46.301527115Z" level=warning msg="container event discarded" container=f32cbf5f34e7a8376cfab6e2f4323549e42deacef52f2c892e02d32f01e63fe3 type=CONTAINER_STARTED_EVENT
May 13 23:56:48.545523 containerd[1507]: time="2025-05-13T23:56:48.545411242Z" level=warning msg="container event discarded" container=3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea type=CONTAINER_CREATED_EVENT
May 13 23:56:48.636877 containerd[1507]: time="2025-05-13T23:56:48.636588796Z" level=warning msg="container event discarded" container=3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea type=CONTAINER_STARTED_EVENT
May 13 23:56:50.263837 containerd[1507]: time="2025-05-13T23:56:50.263716235Z" level=warning msg="container event discarded" container=d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825 type=CONTAINER_CREATED_EVENT
May 13 23:56:50.362201 containerd[1507]: time="2025-05-13T23:56:50.362092009Z" level=warning msg="container event discarded" container=d101e12416b5c9bf1fe2c8363026ff3a6f881fe9c8fd620b1b1801c99f89f825 type=CONTAINER_STARTED_EVENT
May 13 23:57:03.595754 containerd[1507]: time="2025-05-13T23:57:03.595701626Z" level=info msg="TaskExit event in podsandbox handler container_id:\"31a4144284996cba49259ad8f22433f69404813566164bcd2319e49b1d12eadc\" id:\"9a8e66eafcd693139078ecb70b97108d9643d1fa26dd1e5b074c204bef2893d0\" pid:5753 exited_at:{seconds:1747180623 nanos:595344261}"
May 13 23:57:05.680618 containerd[1507]: time="2025-05-13T23:57:05.680568218Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"fb59eac1029957149bae4f1a9b69edba0b3d60c21a6c76a26cc5eddec94c7126\" pid:5781 exited_at:{seconds:1747180625 nanos:680028171}"
May 13 23:57:13.842114 containerd[1507]: time="2025-05-13T23:57:13.841799456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3c5fb2d2e4cad4ba4920801b1ac65be2b855e4b096d8b2f679fd0847ca9bceea\" id:\"15e4c20c267959172cea7f05e70df8d679a139946a7d4b7616238e4ef714de14\" pid:5804 exited_at:{seconds:1747180633 nanos:841486292}"
May 13 23:57:17.672112 systemd[1]: cri-containerd-558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6.scope: Deactivated successfully.
May 13 23:57:17.672574 systemd[1]: cri-containerd-558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6.scope: Consumed 7.190s CPU time, 64.2M memory peak, 3.8M read from disk.
May 13 23:57:17.676072 containerd[1507]: time="2025-05-13T23:57:17.675949837Z" level=info msg="received exit event container_id:\"558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6\" id:\"558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6\" pid:2599 exit_status:1 exited_at:{seconds:1747180637 nanos:675425319}"
May 13 23:57:17.676800 containerd[1507]: time="2025-05-13T23:57:17.676395635Z" level=info msg="TaskExit event in podsandbox handler container_id:\"558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6\" id:\"558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6\" pid:2599 exit_status:1 exited_at:{seconds:1747180637 nanos:675425319}"
May 13 23:57:17.704414 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6-rootfs.mount: Deactivated successfully.
May 13 23:57:18.045882 kubelet[2772]: E0513 23:57:18.045818    2772 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:34970->10.0.0.2:2379: read: connection timed out"
May 13 23:57:18.052187 systemd[1]: cri-containerd-28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3.scope: Deactivated successfully.
May 13 23:57:18.052584 systemd[1]: cri-containerd-28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3.scope: Consumed 3.294s CPU time, 23.8M memory peak, 3.1M read from disk.
May 13 23:57:18.058616 containerd[1507]: time="2025-05-13T23:57:18.058258281Z" level=info msg="received exit event container_id:\"28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3\" id:\"28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3\" pid:2646 exit_status:1 exited_at:{seconds:1747180638 nanos:56780126}"
May 13 23:57:18.059230 containerd[1507]: time="2025-05-13T23:57:18.058622680Z" level=info msg="TaskExit event in podsandbox handler container_id:\"28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3\" id:\"28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3\" pid:2646 exit_status:1 exited_at:{seconds:1747180638 nanos:56780126}"
May 13 23:57:18.087404 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3-rootfs.mount: Deactivated successfully.
May 13 23:57:18.464354 kubelet[2772]: I0513 23:57:18.464195    2772 scope.go:117] "RemoveContainer" containerID="558d47e04f618903205da713052711f336048d8fae7aee952a246ce637d1d5e6"
May 13 23:57:18.469921 kubelet[2772]: I0513 23:57:18.469887    2772 scope.go:117] "RemoveContainer" containerID="28655717a7c33734edf3a123df3df054f40592279e3d2dcb6b00dcb9fba879c3"
May 13 23:57:18.471042 containerd[1507]: time="2025-05-13T23:57:18.470501058Z" level=info msg="CreateContainer within sandbox \"281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
May 13 23:57:18.473106 containerd[1507]: time="2025-05-13T23:57:18.473071809Z" level=info msg="CreateContainer within sandbox \"bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
May 13 23:57:18.486211 containerd[1507]: time="2025-05-13T23:57:18.486134721Z" level=info msg="Container 951a7f12355c4da018fcbd5ba43f1066983f47a400adc2330ac1382551a1009a: CDI devices from CRI Config.CDIDevices: []"
May 13 23:57:18.490605 containerd[1507]: time="2025-05-13T23:57:18.490473625Z" level=info msg="Container 7f05d3d95e71f2ac63b6af12c526846ff7744db25fe566db00a12061063a5726: CDI devices from CRI Config.CDIDevices: []"
May 13 23:57:18.498394 containerd[1507]: time="2025-05-13T23:57:18.498320037Z" level=info msg="CreateContainer within sandbox \"281ef7c12689b1a5993ad7ffc0be4f98cb6affce5dd34256a60515457db460fa\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"951a7f12355c4da018fcbd5ba43f1066983f47a400adc2330ac1382551a1009a\""
May 13 23:57:18.500091 containerd[1507]: time="2025-05-13T23:57:18.499949191Z" level=info msg="StartContainer for \"951a7f12355c4da018fcbd5ba43f1066983f47a400adc2330ac1382551a1009a\""
May 13 23:57:18.501739 containerd[1507]: time="2025-05-13T23:57:18.501698544Z" level=info msg="connecting to shim 951a7f12355c4da018fcbd5ba43f1066983f47a400adc2330ac1382551a1009a" address="unix:///run/containerd/s/4f9ecc41de700c7da03633dc6bf388b88e9f3fab823334393a28e3ebeb01305b" protocol=ttrpc version=3
May 13 23:57:18.502480 containerd[1507]: time="2025-05-13T23:57:18.502449462Z" level=info msg="CreateContainer within sandbox \"bfdf3cb17427a274d3ea9e5ddecf78fd865a477337fc5112be6a0851925694f5\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"7f05d3d95e71f2ac63b6af12c526846ff7744db25fe566db00a12061063a5726\""
May 13 23:57:18.503698 containerd[1507]: time="2025-05-13T23:57:18.503050379Z" level=info msg="StartContainer for \"7f05d3d95e71f2ac63b6af12c526846ff7744db25fe566db00a12061063a5726\""
May 13 23:57:18.504573 containerd[1507]: time="2025-05-13T23:57:18.504505134Z" level=info msg="connecting to shim 7f05d3d95e71f2ac63b6af12c526846ff7744db25fe566db00a12061063a5726" address="unix:///run/containerd/s/5f1acb794e908c9ed7b382764037f25f8b8ca1ab4e51a7e29adb00a083ceb98a" protocol=ttrpc version=3
May 13 23:57:18.525126 systemd[1]: Started cri-containerd-951a7f12355c4da018fcbd5ba43f1066983f47a400adc2330ac1382551a1009a.scope - libcontainer container 951a7f12355c4da018fcbd5ba43f1066983f47a400adc2330ac1382551a1009a.
May 13 23:57:18.533362 systemd[1]: Started cri-containerd-7f05d3d95e71f2ac63b6af12c526846ff7744db25fe566db00a12061063a5726.scope - libcontainer container 7f05d3d95e71f2ac63b6af12c526846ff7744db25fe566db00a12061063a5726.
May 13 23:57:18.593362 containerd[1507]: time="2025-05-13T23:57:18.593221371Z" level=info msg="StartContainer for \"951a7f12355c4da018fcbd5ba43f1066983f47a400adc2330ac1382551a1009a\" returns successfully"
May 13 23:57:18.603642 containerd[1507]: time="2025-05-13T23:57:18.603315014Z" level=info msg="StartContainer for \"7f05d3d95e71f2ac63b6af12c526846ff7744db25fe566db00a12061063a5726\" returns successfully"
May 13 23:57:19.262321 systemd[1]: cri-containerd-98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026.scope: Deactivated successfully.
May 13 23:57:19.263156 systemd[1]: cri-containerd-98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026.scope: Consumed 6.661s CPU time, 43.8M memory peak.
May 13 23:57:19.269068 containerd[1507]: time="2025-05-13T23:57:19.269021341Z" level=info msg="received exit event container_id:\"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\" id:\"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\" pid:3126 exit_status:1 exited_at:{seconds:1747180639 nanos:268582903}" May 13 23:57:19.269415 containerd[1507]: time="2025-05-13T23:57:19.269317220Z" level=info msg="TaskExit event in podsandbox handler container_id:\"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\" id:\"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\" pid:3126 exit_status:1 exited_at:{seconds:1747180639 nanos:268582903}" May 13 23:57:19.316634 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026-rootfs.mount: Deactivated successfully. May 13 23:57:19.487748 kubelet[2772]: I0513 23:57:19.487710 2772 scope.go:117] "RemoveContainer" containerID="98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026" May 13 23:57:19.502569 containerd[1507]: time="2025-05-13T23:57:19.502511520Z" level=info msg="CreateContainer within sandbox \"7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" May 13 23:57:19.517132 containerd[1507]: time="2025-05-13T23:57:19.514138119Z" level=info msg="Container a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5: CDI devices from CRI Config.CDIDevices: []" May 13 23:57:19.525643 containerd[1507]: time="2025-05-13T23:57:19.525590239Z" level=info msg="CreateContainer within sandbox \"7b572f8c6d7c2317c275876f0a2fa95ee8f66035af87bc7af049fc14b2675b86\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5\"" May 13 23:57:19.526162 containerd[1507]: time="2025-05-13T23:57:19.526130517Z" level=info msg="StartContainer for 
\"a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5\"" May 13 23:57:19.528216 containerd[1507]: time="2025-05-13T23:57:19.526952554Z" level=info msg="connecting to shim a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5" address="unix:///run/containerd/s/ca9d0a967c686b70bcf17a3d6924371019368267ed178488c187385ff40d08fb" protocol=ttrpc version=3 May 13 23:57:19.562185 systemd[1]: Started cri-containerd-a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5.scope - libcontainer container a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5. May 13 23:57:19.940585 containerd[1507]: time="2025-05-13T23:57:19.940088421Z" level=info msg="StartContainer for \"a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5\" returns successfully" May 13 23:57:21.122750 kubelet[2772]: E0513 23:57:21.116609 2772 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:34798->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4284-0-0-n-40578dffbd.183f3b86cf91cec0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4284-0-0-n-40578dffbd,UID:bde9116b07dfa7bf3cd524f3c218d4a1,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Liveness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-40578dffbd,},FirstTimestamp:2025-05-13 23:57:10.6371376 +0000 UTC m=+366.650970125,LastTimestamp:2025-05-13 23:57:10.6371376 +0000 UTC m=+366.650970125,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-40578dffbd,}" May 13 23:57:22.575685 systemd[1]: Started sshd@20-188.245.195.87:22-197.5.145.150:37346.service - OpenSSH 
per-connection server daemon (197.5.145.150:37346). May 13 23:57:22.990066 kubelet[2772]: E0513 23:57:22.989623 2772 kubelet_node_status.go:535] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"NetworkUnavailable\\\"},{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-05-13T23:57:12Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-05-13T23:57:12Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-05-13T23:57:12Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-05-13T23:57:12Z\\\",\\\"type\\\":\\\"Ready\\\"}]}}\" for node \"ci-4284-0-0-n-40578dffbd\": Patch \"https://188.245.195.87:6443/api/v1/nodes/ci-4284-0-0-n-40578dffbd/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" May 13 23:57:22.991781 sshd[5947]: Received disconnect from 197.5.145.150 port 37346:11: Bye Bye [preauth] May 13 23:57:22.991781 sshd[5947]: Disconnected from authenticating user root 197.5.145.150 port 37346 [preauth] May 13 23:57:22.996163 systemd[1]: sshd@20-188.245.195.87:22-197.5.145.150:37346.service: Deactivated successfully. May 13 23:57:23.228644 kubelet[2772]: E0513 23:57:23.228449 2772 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4284-0-0-n-40578dffbd\": rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:34898->10.0.0.2:2379: read: connection timed out" May 13 23:57:23.234395 systemd[1]: cri-containerd-a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5.scope: Deactivated successfully. 
May 13 23:57:23.237357 containerd[1507]: time="2025-05-13T23:57:23.235548428Z" level=info msg="received exit event container_id:\"a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5\" id:\"a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5\" pid:5926 exit_status:1 exited_at:{seconds:1747180643 nanos:234845150}" May 13 23:57:23.237357 containerd[1507]: time="2025-05-13T23:57:23.235909427Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5\" id:\"a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5\" pid:5926 exit_status:1 exited_at:{seconds:1747180643 nanos:234845150}" May 13 23:57:23.236685 systemd[1]: cri-containerd-a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5.scope: Consumed 78ms CPU time, 12.5M memory peak, 2M read from disk. May 13 23:57:23.266911 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5-rootfs.mount: Deactivated successfully. 
May 13 23:57:23.507607 kubelet[2772]: I0513 23:57:23.507567 2772 scope.go:117] "RemoveContainer" containerID="98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026" May 13 23:57:23.508069 kubelet[2772]: I0513 23:57:23.507960 2772 scope.go:117] "RemoveContainer" containerID="a2b13ecd320de3fb70604502edb713da325b3977fe1db8911a82cfe9663db6d5" May 13 23:57:23.508215 kubelet[2772]: E0513 23:57:23.508133 2772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-6f6897fdc5-rgtqr_tigera-operator(60eb046b-6756-43a6-8d00-cc781cc176b1)\"" pod="tigera-operator/tigera-operator-6f6897fdc5-rgtqr" podUID="60eb046b-6756-43a6-8d00-cc781cc176b1" May 13 23:57:23.511317 containerd[1507]: time="2025-05-13T23:57:23.511067317Z" level=info msg="RemoveContainer for \"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\"" May 13 23:57:23.523631 containerd[1507]: time="2025-05-13T23:57:23.523273441Z" level=info msg="RemoveContainer for \"98da301f10ed65f4ea95c8857b325a66329a4b2d70631d7c802fcd9ee3df9026\" returns successfully"