May 14 01:03:09.084230 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 13 22:08:35 -00 2025
May 14 01:03:09.084257 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130
May 14 01:03:09.084267 kernel: BIOS-provided physical RAM map:
May 14 01:03:09.084275 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
May 14 01:03:09.084283 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
May 14 01:03:09.084292 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
May 14 01:03:09.084301 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable
May 14 01:03:09.084308 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved
May 14 01:03:09.084316 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
May 14 01:03:09.084324 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
May 14 01:03:09.084331 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable
May 14 01:03:09.084339 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved
May 14 01:03:09.084347 kernel: NX (Execute Disable) protection: active
May 14 01:03:09.084354 kernel: APIC: Static calls initialized
May 14 01:03:09.084365 kernel: SMBIOS 3.0.0 present.
May 14 01:03:09.084373 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014
May 14 01:03:09.084381 kernel: Hypervisor detected: KVM
May 14 01:03:09.084404 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
May 14 01:03:09.084412 kernel: kvm-clock: using sched offset of 3476784646 cycles
May 14 01:03:09.084420 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
May 14 01:03:09.088483 kernel: tsc: Detected 1996.249 MHz processor
May 14 01:03:09.088493 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 14 01:03:09.088509 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 14 01:03:09.088518 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000
May 14 01:03:09.088527 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
May 14 01:03:09.088535 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 14 01:03:09.088543 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000
May 14 01:03:09.088552 kernel: ACPI: Early table checksum verification disabled
May 14 01:03:09.088564 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS )
May 14 01:03:09.088572 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 14 01:03:09.088580 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 14 01:03:09.088588 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 14 01:03:09.088596 kernel: ACPI: FACS 0x00000000BFFE0000 000040
May 14 01:03:09.088605 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
May 14 01:03:09.088613 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001)
May 14 01:03:09.088621 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc]
May 14 01:03:09.088629 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48]
May 14 01:03:09.088639 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f]
May 14 01:03:09.088647 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c]
May 14 01:03:09.088656 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64]
May 14 01:03:09.088667 kernel: No NUMA configuration found
May 14 01:03:09.088676 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff]
May 14 01:03:09.088684 kernel: NODE_DATA(0) allocated [mem 0x13fffa000-0x13fffffff]
May 14 01:03:09.088693 kernel: Zone ranges:
May 14 01:03:09.088703 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 14 01:03:09.088712 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
May 14 01:03:09.088720 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff]
May 14 01:03:09.088729 kernel: Movable zone start for each node
May 14 01:03:09.088737 kernel: Early memory node ranges
May 14 01:03:09.088746 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff]
May 14 01:03:09.088754 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff]
May 14 01:03:09.088763 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff]
May 14 01:03:09.088773 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff]
May 14 01:03:09.088782 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 14 01:03:09.088790 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
May 14 01:03:09.088799 kernel: On node 0, zone Normal: 35 pages in unavailable ranges
May 14 01:03:09.088808 kernel: ACPI: PM-Timer IO Port: 0x608
May 14 01:03:09.088816 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
May 14 01:03:09.088825 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 14 01:03:09.088833 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
May 14 01:03:09.088842 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
May 14 01:03:09.088852 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 14 01:03:09.088861 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
May 14 01:03:09.088869 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
May 14 01:03:09.088878 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 14 01:03:09.088887 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs
May 14 01:03:09.088895 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
May 14 01:03:09.088904 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
May 14 01:03:09.088912 kernel: Booting paravirtualized kernel on KVM
May 14 01:03:09.088921 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 14 01:03:09.088931 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
May 14 01:03:09.088940 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576
May 14 01:03:09.088948 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152
May 14 01:03:09.088957 kernel: pcpu-alloc: [0] 0 1
May 14 01:03:09.088965 kernel: kvm-guest: PV spinlocks disabled, no host support
May 14 01:03:09.088975 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130
May 14 01:03:09.088985 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 14 01:03:09.088993 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 14 01:03:09.089004 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
May 14 01:03:09.089013 kernel: Fallback order for Node 0: 0
May 14 01:03:09.089021 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901
May 14 01:03:09.089030 kernel: Policy zone: Normal
May 14 01:03:09.089038 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 14 01:03:09.089047 kernel: software IO TLB: area num 2.
May 14 01:03:09.089056 kernel: Memory: 3962120K/4193772K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43604K init, 1468K bss, 231392K reserved, 0K cma-reserved)
May 14 01:03:09.089064 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
May 14 01:03:09.089073 kernel: ftrace: allocating 37993 entries in 149 pages
May 14 01:03:09.089083 kernel: ftrace: allocated 149 pages with 4 groups
May 14 01:03:09.089092 kernel: Dynamic Preempt: voluntary
May 14 01:03:09.089101 kernel: rcu: Preemptible hierarchical RCU implementation.
May 14 01:03:09.089111 kernel: rcu: RCU event tracing is enabled.
May 14 01:03:09.089119 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
May 14 01:03:09.089128 kernel: Trampoline variant of Tasks RCU enabled.
May 14 01:03:09.089137 kernel: Rude variant of Tasks RCU enabled.
May 14 01:03:09.089146 kernel: Tracing variant of Tasks RCU enabled.
May 14 01:03:09.089154 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 14 01:03:09.089164 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
May 14 01:03:09.089173 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16
May 14 01:03:09.089181 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 14 01:03:09.089190 kernel: Console: colour VGA+ 80x25
May 14 01:03:09.089198 kernel: printk: console [tty0] enabled
May 14 01:03:09.089208 kernel: printk: console [ttyS0] enabled
May 14 01:03:09.089216 kernel: ACPI: Core revision 20230628
May 14 01:03:09.089225 kernel: APIC: Switch to symmetric I/O mode setup
May 14 01:03:09.089234 kernel: x2apic enabled
May 14 01:03:09.089244 kernel: APIC: Switched APIC routing to: physical x2apic
May 14 01:03:09.089253 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1
May 14 01:03:09.089261 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
May 14 01:03:09.089270 kernel: Calibrating delay loop (skipped) preset value.. 3992.49 BogoMIPS (lpj=1996249)
May 14 01:03:09.089279 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
May 14 01:03:09.089287 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
May 14 01:03:09.089296 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 14 01:03:09.089305 kernel: Spectre V2 : Mitigation: Retpolines
May 14 01:03:09.089313 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 14 01:03:09.089323 kernel: Speculative Store Bypass: Vulnerable
May 14 01:03:09.089332 kernel: x86/fpu: x87 FPU will use FXSAVE
May 14 01:03:09.089340 kernel: Freeing SMP alternatives memory: 32K
May 14 01:03:09.089349 kernel: pid_max: default: 32768 minimum: 301
May 14 01:03:09.089364 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity
May 14 01:03:09.089374 kernel: landlock: Up and running.
May 14 01:03:09.089383 kernel: SELinux: Initializing.
May 14 01:03:09.089392 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 14 01:03:09.089401 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
May 14 01:03:09.089410 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3)
May 14 01:03:09.089419 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 14 01:03:09.089983 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 14 01:03:09.089998 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 14 01:03:09.090008 kernel: Performance Events: AMD PMU driver.
May 14 01:03:09.090017 kernel: ... version: 0
May 14 01:03:09.090026 kernel: ... bit width: 48
May 14 01:03:09.090035 kernel: ... generic registers: 4
May 14 01:03:09.090046 kernel: ... value mask: 0000ffffffffffff
May 14 01:03:09.090055 kernel: ... max period: 00007fffffffffff
May 14 01:03:09.090064 kernel: ... fixed-purpose events: 0
May 14 01:03:09.090072 kernel: ... event mask: 000000000000000f
May 14 01:03:09.090081 kernel: signal: max sigframe size: 1440
May 14 01:03:09.090090 kernel: rcu: Hierarchical SRCU implementation.
May 14 01:03:09.090100 kernel: rcu: Max phase no-delay instances is 400.
May 14 01:03:09.090109 kernel: smp: Bringing up secondary CPUs ...
May 14 01:03:09.090118 kernel: smpboot: x86: Booting SMP configuration:
May 14 01:03:09.090128 kernel: .... node #0, CPUs: #1
May 14 01:03:09.090137 kernel: smp: Brought up 1 node, 2 CPUs
May 14 01:03:09.090146 kernel: smpboot: Max logical packages: 2
May 14 01:03:09.090155 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS)
May 14 01:03:09.090164 kernel: devtmpfs: initialized
May 14 01:03:09.090173 kernel: x86/mm: Memory block size: 128MB
May 14 01:03:09.090182 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 14 01:03:09.090191 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
May 14 01:03:09.090200 kernel: pinctrl core: initialized pinctrl subsystem
May 14 01:03:09.090211 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 14 01:03:09.090220 kernel: audit: initializing netlink subsys (disabled)
May 14 01:03:09.090230 kernel: audit: type=2000 audit(1747184588.019:1): state=initialized audit_enabled=0 res=1
May 14 01:03:09.090239 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 14 01:03:09.090248 kernel: thermal_sys: Registered thermal governor 'user_space'
May 14 01:03:09.090257 kernel: cpuidle: using governor menu
May 14 01:03:09.090266 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 14 01:03:09.090275 kernel: dca service started, version 1.12.1
May 14 01:03:09.090284 kernel: PCI: Using configuration type 1 for base access
May 14 01:03:09.090294 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 14 01:03:09.090304 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 14 01:03:09.090313 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 14 01:03:09.090322 kernel: ACPI: Added _OSI(Module Device)
May 14 01:03:09.090330 kernel: ACPI: Added _OSI(Processor Device)
May 14 01:03:09.090339 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 14 01:03:09.090348 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 14 01:03:09.090358 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 14 01:03:09.090366 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
May 14 01:03:09.090377 kernel: ACPI: Interpreter enabled
May 14 01:03:09.090386 kernel: ACPI: PM: (supports S0 S3 S5)
May 14 01:03:09.090395 kernel: ACPI: Using IOAPIC for interrupt routing
May 14 01:03:09.090404 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 14 01:03:09.090413 kernel: PCI: Using E820 reservations for host bridge windows
May 14 01:03:09.090422 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
May 14 01:03:09.090453 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
May 14 01:03:09.090602 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3]
May 14 01:03:09.090703 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI]
May 14 01:03:09.090796 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge
May 14 01:03:09.090810 kernel: acpiphp: Slot [3] registered
May 14 01:03:09.090819 kernel: acpiphp: Slot [4] registered
May 14 01:03:09.090828 kernel: acpiphp: Slot [5] registered
May 14 01:03:09.090837 kernel: acpiphp: Slot [6] registered
May 14 01:03:09.090846 kernel: acpiphp: Slot [7] registered
May 14 01:03:09.090855 kernel: acpiphp: Slot [8] registered
May 14 01:03:09.090867 kernel: acpiphp: Slot [9] registered
May 14 01:03:09.090876 kernel: acpiphp: Slot [10] registered
May 14 01:03:09.090885 kernel: acpiphp: Slot [11] registered
May 14 01:03:09.090893 kernel: acpiphp: Slot [12] registered
May 14 01:03:09.090902 kernel: acpiphp: Slot [13] registered
May 14 01:03:09.090911 kernel: acpiphp: Slot [14] registered
May 14 01:03:09.090920 kernel: acpiphp: Slot [15] registered
May 14 01:03:09.090929 kernel: acpiphp: Slot [16] registered
May 14 01:03:09.090937 kernel: acpiphp: Slot [17] registered
May 14 01:03:09.090946 kernel: acpiphp: Slot [18] registered
May 14 01:03:09.090957 kernel: acpiphp: Slot [19] registered
May 14 01:03:09.090966 kernel: acpiphp: Slot [20] registered
May 14 01:03:09.090974 kernel: acpiphp: Slot [21] registered
May 14 01:03:09.090983 kernel: acpiphp: Slot [22] registered
May 14 01:03:09.090992 kernel: acpiphp: Slot [23] registered
May 14 01:03:09.091001 kernel: acpiphp: Slot [24] registered
May 14 01:03:09.091010 kernel: acpiphp: Slot [25] registered
May 14 01:03:09.091018 kernel: acpiphp: Slot [26] registered
May 14 01:03:09.091027 kernel: acpiphp: Slot [27] registered
May 14 01:03:09.091038 kernel: acpiphp: Slot [28] registered
May 14 01:03:09.091047 kernel: acpiphp: Slot [29] registered
May 14 01:03:09.091056 kernel: acpiphp: Slot [30] registered
May 14 01:03:09.091064 kernel: acpiphp: Slot [31] registered
May 14 01:03:09.091073 kernel: PCI host bridge to bus 0000:00
May 14 01:03:09.091175 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window]
May 14 01:03:09.091261 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window]
May 14 01:03:09.091345 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
May 14 01:03:09.091451 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
May 14 01:03:09.091540 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window]
May 14 01:03:09.091623 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
May 14 01:03:09.091738 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
May 14 01:03:09.091844 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
May 14 01:03:09.091950 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
May 14 01:03:09.092051 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f]
May 14 01:03:09.092145 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7]
May 14 01:03:09.092237 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6]
May 14 01:03:09.095244 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177]
May 14 01:03:09.095349 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376]
May 14 01:03:09.095509 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
May 14 01:03:09.095655 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI
May 14 01:03:09.095758 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB
May 14 01:03:09.095866 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
May 14 01:03:09.095964 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
May 14 01:03:09.096059 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref]
May 14 01:03:09.096155 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
May 14 01:03:09.096250 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
May 14 01:03:09.096346 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
May 14 01:03:09.098533 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
May 14 01:03:09.098637 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf]
May 14 01:03:09.098732 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
May 14 01:03:09.098827 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref]
May 14 01:03:09.098923 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
May 14 01:03:09.099034 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
May 14 01:03:09.099131 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f]
May 14 01:03:09.099232 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
May 14 01:03:09.099327 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref]
May 14 01:03:09.103857 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
May 14 01:03:09.104132 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff]
May 14 01:03:09.104327 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref]
May 14 01:03:09.104662 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
May 14 01:03:09.104853 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f]
May 14 01:03:09.105045 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff]
May 14 01:03:09.105228 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref]
May 14 01:03:09.105256 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
May 14 01:03:09.105274 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
May 14 01:03:09.105292 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
May 14 01:03:09.105309 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
May 14 01:03:09.105327 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
May 14 01:03:09.105344 kernel: iommu: Default domain type: Translated
May 14 01:03:09.105368 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 14 01:03:09.105386 kernel: PCI: Using ACPI for IRQ routing
May 14 01:03:09.105403 kernel: PCI: pci_cache_line_size set to 64 bytes
May 14 01:03:09.105420 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
May 14 01:03:09.105486 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff]
May 14 01:03:09.105678 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
May 14 01:03:09.105859 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
May 14 01:03:09.106037 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
May 14 01:03:09.106063 kernel: vgaarb: loaded
May 14 01:03:09.106087 kernel: clocksource: Switched to clocksource kvm-clock
May 14 01:03:09.106105 kernel: VFS: Disk quotas dquot_6.6.0
May 14 01:03:09.106122 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 14 01:03:09.106140 kernel: pnp: PnP ACPI init
May 14 01:03:09.106327 kernel: pnp 00:03: [dma 2]
May 14 01:03:09.106356 kernel: pnp: PnP ACPI: found 5 devices
May 14 01:03:09.106374 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 14 01:03:09.106392 kernel: NET: Registered PF_INET protocol family
May 14 01:03:09.106414 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 14 01:03:09.107601 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear)
May 14 01:03:09.107625 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 14 01:03:09.107643 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear)
May 14 01:03:09.107661 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear)
May 14 01:03:09.107678 kernel: TCP: Hash tables configured (established 32768 bind 32768)
May 14 01:03:09.107695 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 14 01:03:09.107712 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear)
May 14 01:03:09.107730 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 14 01:03:09.107756 kernel: NET: Registered PF_XDP protocol family
May 14 01:03:09.107940 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window]
May 14 01:03:09.108102 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window]
May 14 01:03:09.108261 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
May 14 01:03:09.109192 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
May 14 01:03:09.109370 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window]
May 14 01:03:09.109589 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
May 14 01:03:09.109774 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
May 14 01:03:09.109810 kernel: PCI: CLS 0 bytes, default 64
May 14 01:03:09.109828 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
May 14 01:03:09.109846 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB)
May 14 01:03:09.109864 kernel: Initialise system trusted keyrings
May 14 01:03:09.109881 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0
May 14 01:03:09.109899 kernel: Key type asymmetric registered
May 14 01:03:09.109915 kernel: Asymmetric key parser 'x509' registered
May 14 01:03:09.109932 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251)
May 14 01:03:09.109949 kernel: io scheduler mq-deadline registered
May 14 01:03:09.109970 kernel: io scheduler kyber registered
May 14 01:03:09.109987 kernel: io scheduler bfq registered
May 14 01:03:09.110004 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 14 01:03:09.110022 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
May 14 01:03:09.110040 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
May 14 01:03:09.110057 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
May 14 01:03:09.110075 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
May 14 01:03:09.110092 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 14 01:03:09.110109 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 14 01:03:09.110129 kernel: random: crng init done
May 14 01:03:09.110146 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
May 14 01:03:09.110164 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
May 14 01:03:09.110182 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
May 14 01:03:09.110358 kernel: rtc_cmos 00:04: RTC can wake from S4
May 14 01:03:09.110386 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0
May 14 01:03:09.110613 kernel: rtc_cmos 00:04: registered as rtc0
May 14 01:03:09.110780 kernel: rtc_cmos 00:04: setting system clock to 2025-05-14T01:03:08 UTC (1747184588)
May 14 01:03:09.110952 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
May 14 01:03:09.110978 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
May 14 01:03:09.110997 kernel: NET: Registered PF_INET6 protocol family
May 14 01:03:09.111014 kernel: Segment Routing with IPv6
May 14 01:03:09.111031 kernel: In-situ OAM (IOAM) with IPv6
May 14 01:03:09.111048 kernel: NET: Registered PF_PACKET protocol family
May 14 01:03:09.111065 kernel: Key type dns_resolver registered
May 14 01:03:09.111082 kernel: IPI shorthand broadcast: enabled
May 14 01:03:09.111099 kernel: sched_clock: Marking stable (996011073, 172309043)->(1191004228, -22684112)
May 14 01:03:09.111122 kernel: registered taskstats version 1
May 14 01:03:09.111139 kernel: Loading compiled-in X.509 certificates
May 14 01:03:09.111157 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 166efda032ca4d6e9037c569aca9b53585ee6f94'
May 14 01:03:09.111174 kernel: Key type .fscrypt registered
May 14 01:03:09.111190 kernel: Key type fscrypt-provisioning registered
May 14 01:03:09.111207 kernel: ima: No TPM chip found, activating TPM-bypass!
May 14 01:03:09.111225 kernel: ima: Allocated hash algorithm: sha1
May 14 01:03:09.111242 kernel: ima: No architecture policies found
May 14 01:03:09.111263 kernel: clk: Disabling unused clocks
May 14 01:03:09.111280 kernel: Freeing unused kernel image (initmem) memory: 43604K
May 14 01:03:09.111297 kernel: Write protecting the kernel read-only data: 40960k
May 14 01:03:09.111314 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K
May 14 01:03:09.111331 kernel: Run /init as init process
May 14 01:03:09.111348 kernel: with arguments:
May 14 01:03:09.111365 kernel: /init
May 14 01:03:09.111381 kernel: with environment:
May 14 01:03:09.111398 kernel: HOME=/
May 14 01:03:09.111414 kernel: TERM=linux
May 14 01:03:09.111548 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 14 01:03:09.111571 systemd[1]: Successfully made /usr/ read-only.
May 14 01:03:09.111598 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 14 01:03:09.111618 systemd[1]: Detected virtualization kvm.
May 14 01:03:09.111636 systemd[1]: Detected architecture x86-64.
May 14 01:03:09.111654 systemd[1]: Running in initrd.
May 14 01:03:09.111678 systemd[1]: No hostname configured, using default hostname.
May 14 01:03:09.111698 systemd[1]: Hostname set to .
May 14 01:03:09.111716 systemd[1]: Initializing machine ID from VM UUID.
May 14 01:03:09.111734 systemd[1]: Queued start job for default target initrd.target.
May 14 01:03:09.111752 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 14 01:03:09.111771 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 14 01:03:09.111792 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 14 01:03:09.111827 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 14 01:03:09.111849 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 14 01:03:09.111869 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 14 01:03:09.111890 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 14 01:03:09.111909 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 14 01:03:09.111928 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 14 01:03:09.111950 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 14 01:03:09.111969 systemd[1]: Reached target paths.target - Path Units.
May 14 01:03:09.111988 systemd[1]: Reached target slices.target - Slice Units.
May 14 01:03:09.112006 systemd[1]: Reached target swap.target - Swaps.
May 14 01:03:09.112025 systemd[1]: Reached target timers.target - Timer Units.
May 14 01:03:09.112044 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 14 01:03:09.112062 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 14 01:03:09.112081 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 14 01:03:09.112100 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 14 01:03:09.112123 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 14 01:03:09.112142 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 14 01:03:09.112161 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 14 01:03:09.112180 systemd[1]: Reached target sockets.target - Socket Units.
May 14 01:03:09.112198 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 14 01:03:09.112218 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 14 01:03:09.112236 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 14 01:03:09.112255 systemd[1]: Starting systemd-fsck-usr.service...
May 14 01:03:09.112277 systemd[1]: Starting systemd-journald.service - Journal Service...
May 14 01:03:09.112296 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 14 01:03:09.112314 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 14 01:03:09.112333 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 14 01:03:09.112417 systemd-journald[184]: Collecting audit messages is disabled.
May 14 01:03:09.112511 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 14 01:03:09.112533 systemd[1]: Finished systemd-fsck-usr.service.
May 14 01:03:09.112553 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 14 01:03:09.112578 systemd-journald[184]: Journal started
May 14 01:03:09.112619 systemd-journald[184]: Runtime Journal (/run/log/journal/aa07cbf376954a6eb62813c8aedf9c74) is 8M, max 78.2M, 70.2M free.
May 14 01:03:09.102247 systemd-modules-load[186]: Inserted module 'overlay'
May 14 01:03:09.169733 systemd[1]: Started systemd-journald.service - Journal Service.
May 14 01:03:09.169765 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 14 01:03:09.169780 kernel: Bridge firewalling registered
May 14 01:03:09.140786 systemd-modules-load[186]: Inserted module 'br_netfilter'
May 14 01:03:09.169626 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 14 01:03:09.171262 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 14 01:03:09.173150 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 14 01:03:09.181161 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 14 01:03:09.185663 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 14 01:03:09.190577 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 14 01:03:09.193539 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 14 01:03:09.204812 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 14 01:03:09.217654 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 14 01:03:09.220538 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
May 14 01:03:09.221933 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 14 01:03:09.223316 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 14 01:03:09.235573 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 14 01:03:09.241988 dracut-cmdline[218]: dracut-dracut-053
May 14 01:03:09.247636 dracut-cmdline[218]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=8b3c5774a4242053287d41edc0d029958b7c22c131f7dd36b16a68182354e130
May 14 01:03:09.285540 systemd-resolved[221]: Positive Trust Anchors:
May 14 01:03:09.285555 systemd-resolved[221]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 14 01:03:09.285599 systemd-resolved[221]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 14 01:03:09.288568 systemd-resolved[221]: Defaulting to hostname 'linux'.
May 14 01:03:09.289554 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 14 01:03:09.292195 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 14 01:03:09.338485 kernel: SCSI subsystem initialized
May 14 01:03:09.350518 kernel: Loading iSCSI transport class v2.0-870.
May 14 01:03:09.363765 kernel: iscsi: registered transport (tcp)
May 14 01:03:09.388576 kernel: iscsi: registered transport (qla4xxx)
May 14 01:03:09.388698 kernel: QLogic iSCSI HBA Driver
May 14 01:03:09.450810 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
May 14 01:03:09.456963 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 14 01:03:09.524691 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
May 14 01:03:09.524832 kernel: device-mapper: uevent: version 1.0.3
May 14 01:03:09.524892 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com
May 14 01:03:09.593563 kernel: raid6: sse2x4 gen() 5627 MB/s
May 14 01:03:09.612679 kernel: raid6: sse2x2 gen() 5992 MB/s
May 14 01:03:09.630946 kernel: raid6: sse2x1 gen() 8518 MB/s
May 14 01:03:09.631025 kernel: raid6: using algorithm sse2x1 gen() 8518 MB/s
May 14 01:03:09.650032 kernel: raid6: .... xor() 7107 MB/s, rmw enabled
May 14 01:03:09.650094 kernel: raid6: using ssse3x2 recovery algorithm
May 14 01:03:09.672488 kernel: xor: measuring software checksum speed
May 14 01:03:09.672551 kernel: prefetch64-sse : 15998 MB/sec
May 14 01:03:09.674996 kernel: generic_sse : 15753 MB/sec
May 14 01:03:09.675040 kernel: xor: using function: prefetch64-sse (15998 MB/sec)
May 14 01:03:09.858533 kernel: Btrfs loaded, zoned=no, fsverity=no
May 14 01:03:09.877815 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
May 14 01:03:09.884647 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 14 01:03:09.912316 systemd-udevd[404]: Using default interface naming scheme 'v255'.
May 14 01:03:09.917595 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 14 01:03:09.923782 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
May 14 01:03:09.948142 dracut-pre-trigger[413]: rd.md=0: removing MD RAID activation
May 14 01:03:09.991674 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
May 14 01:03:09.996663 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 14 01:03:10.070516 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 14 01:03:10.076461 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
May 14 01:03:10.122482 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
May 14 01:03:10.124033 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 14 01:03:10.127061 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 14 01:03:10.129878 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 14 01:03:10.136490 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
May 14 01:03:10.162662 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
May 14 01:03:10.177445 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues
May 14 01:03:10.189723 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB)
May 14 01:03:10.202105 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
May 14 01:03:10.202155 kernel: GPT:17805311 != 20971519
May 14 01:03:10.202169 kernel: GPT:Alternate GPT header not at the end of the disk.
May 14 01:03:10.202182 kernel: GPT:17805311 != 20971519
May 14 01:03:10.202194 kernel: GPT: Use GNU Parted to correct GPT errors.
May 14 01:03:10.202215 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 14 01:03:10.201182 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 14 01:03:10.201322 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 14 01:03:10.211546 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 14 01:03:10.212086 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 14 01:03:10.212298 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 14 01:03:10.213937 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
May 14 01:03:10.218093 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 14 01:03:10.221726 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 14 01:03:10.224480 kernel: libata version 3.00 loaded.
May 14 01:03:10.231534 kernel: ata_piix 0000:00:01.1: version 2.13
May 14 01:03:10.233461 kernel: scsi host0: ata_piix
May 14 01:03:10.246521 kernel: scsi host1: ata_piix
May 14 01:03:10.254375 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14
May 14 01:03:10.254419 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15
May 14 01:03:10.261479 kernel: BTRFS: device fsid d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8 devid 1 transid 39 /dev/vda3 scanned by (udev-worker) (462)
May 14 01:03:10.264671 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (464)
May 14 01:03:10.288149 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
May 14 01:03:10.311678 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 14 01:03:10.323047 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
May 14 01:03:10.338855 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
May 14 01:03:10.339452 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132.
May 14 01:03:10.350935 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 14 01:03:10.352583 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
May 14 01:03:10.356645 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 14 01:03:10.378621 disk-uuid[508]: Primary Header is updated.
May 14 01:03:10.378621 disk-uuid[508]: Secondary Entries is updated.
May 14 01:03:10.378621 disk-uuid[508]: Secondary Header is updated.
May 14 01:03:10.387463 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 14 01:03:10.406662 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 14 01:03:11.408144 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
May 14 01:03:11.408932 disk-uuid[511]: The operation has completed successfully.
May 14 01:03:11.492002 systemd[1]: disk-uuid.service: Deactivated successfully.
May 14 01:03:11.492122 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
May 14 01:03:11.533214 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr...
May 14 01:03:11.547448 sh[530]: Success
May 14 01:03:11.561493 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3"
May 14 01:03:11.629686 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr.
May 14 01:03:11.643591 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr...
May 14 01:03:11.645679 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr.
May 14 01:03:11.679536 kernel: BTRFS info (device dm-0): first mount of filesystem d2fbd39e-42cb-4ccb-87ec-99f56cfe77f8
May 14 01:03:11.679619 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm
May 14 01:03:11.683031 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead
May 14 01:03:11.683084 kernel: BTRFS info (device dm-0): disabling log replay at mount time
May 14 01:03:11.684749 kernel: BTRFS info (device dm-0): using free space tree
May 14 01:03:11.703529 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr.
May 14 01:03:11.705838 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
May 14 01:03:11.709606 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
May 14 01:03:11.714684 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
May 14 01:03:11.761387 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 14 01:03:11.761513 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 14 01:03:11.766482 kernel: BTRFS info (device vda6): using free space tree
May 14 01:03:11.778485 kernel: BTRFS info (device vda6): auto enabling async discard
May 14 01:03:11.791465 kernel: BTRFS info (device vda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 14 01:03:11.807231 systemd[1]: Finished ignition-setup.service - Ignition (setup).
May 14 01:03:11.810043 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
May 14 01:03:11.838779 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 14 01:03:11.840981 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 14 01:03:11.879978 systemd-networkd[709]: lo: Link UP
May 14 01:03:11.879987 systemd-networkd[709]: lo: Gained carrier
May 14 01:03:11.881691 systemd-networkd[709]: Enumeration completed
May 14 01:03:11.881792 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 14 01:03:11.882331 systemd-networkd[709]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 14 01:03:11.882336 systemd-networkd[709]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 14 01:03:11.883736 systemd-networkd[709]: eth0: Link UP
May 14 01:03:11.883739 systemd-networkd[709]: eth0: Gained carrier
May 14 01:03:11.883747 systemd-networkd[709]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 14 01:03:11.887411 systemd[1]: Reached target network.target - Network.
May 14 01:03:11.897510 systemd-networkd[709]: eth0: DHCPv4 address 172.24.4.64/24, gateway 172.24.4.1 acquired from 172.24.4.1
May 14 01:03:11.951377 ignition[674]: Ignition 2.20.0
May 14 01:03:11.951392 ignition[674]: Stage: fetch-offline
May 14 01:03:11.951464 ignition[674]: no configs at "/usr/lib/ignition/base.d"
May 14 01:03:11.951490 ignition[674]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
May 14 01:03:11.951599 ignition[674]: parsed url from cmdline: ""
May 14 01:03:11.951603 ignition[674]: no config URL provided
May 14 01:03:11.951609 ignition[674]: reading system config file "/usr/lib/ignition/user.ign"
May 14 01:03:11.954935 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
May 14 01:03:11.951619 ignition[674]: no config at "/usr/lib/ignition/user.ign"
May 14 01:03:11.951625 ignition[674]: failed to fetch config: resource requires networking
May 14 01:03:11.951802 ignition[674]: Ignition finished successfully
May 14 01:03:11.958627 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
May 14 01:03:11.979653 ignition[720]: Ignition 2.20.0
May 14 01:03:11.979665 ignition[720]: Stage: fetch
May 14 01:03:11.979828 ignition[720]: no configs at "/usr/lib/ignition/base.d"
May 14 01:03:11.979840 ignition[720]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
May 14 01:03:11.979923 ignition[720]: parsed url from cmdline: ""
May 14 01:03:11.979927 ignition[720]: no config URL provided
May 14 01:03:11.979933 ignition[720]: reading system config file "/usr/lib/ignition/user.ign"
May 14 01:03:11.979941 ignition[720]: no config at "/usr/lib/ignition/user.ign"
May 14 01:03:11.980052 ignition[720]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
May 14 01:03:11.980110 ignition[720]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
May 14 01:03:11.980138 ignition[720]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
May 14 01:03:12.245402 ignition[720]: GET result: OK
May 14 01:03:12.245628 ignition[720]: parsing config with SHA512: 445882baeb49aec196fff75873926198431f44bb526bbe5f83a5c04961381ca5420eac250821b69c8a24c6b34df21791e167db3ab514976693a0b96b277eff81
May 14 01:03:12.261539 unknown[720]: fetched base config from "system"
May 14 01:03:12.261565 unknown[720]: fetched base config from "system"
May 14 01:03:12.262691 ignition[720]: fetch: fetch complete
May 14 01:03:12.261579 unknown[720]: fetched user config from "openstack"
May 14 01:03:12.262705 ignition[720]: fetch: fetch passed
May 14 01:03:12.267121 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
May 14 01:03:12.262810 ignition[720]: Ignition finished successfully
May 14 01:03:12.271822 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
May 14 01:03:12.318935 ignition[726]: Ignition 2.20.0
May 14 01:03:12.318959 ignition[726]: Stage: kargs
May 14 01:03:12.319311 ignition[726]: no configs at "/usr/lib/ignition/base.d"
May 14 01:03:12.319337 ignition[726]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
May 14 01:03:12.323902 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
May 14 01:03:12.321759 ignition[726]: kargs: kargs passed
May 14 01:03:12.321858 ignition[726]: Ignition finished successfully
May 14 01:03:12.329758 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
May 14 01:03:12.372999 ignition[732]: Ignition 2.20.0
May 14 01:03:12.373549 ignition[732]: Stage: disks
May 14 01:03:12.373911 ignition[732]: no configs at "/usr/lib/ignition/base.d"
May 14 01:03:12.373937 ignition[732]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
May 14 01:03:12.376133 ignition[732]: disks: disks passed
May 14 01:03:12.378071 systemd[1]: Finished ignition-disks.service - Ignition (disks).
May 14 01:03:12.376222 ignition[732]: Ignition finished successfully
May 14 01:03:12.380814 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
May 14 01:03:12.383231 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
May 14 01:03:12.385679 systemd[1]: Reached target local-fs.target - Local File Systems.
May 14 01:03:12.388197 systemd[1]: Reached target sysinit.target - System Initialization.
May 14 01:03:12.390924 systemd[1]: Reached target basic.target - Basic System.
May 14 01:03:12.396652 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 14 01:03:12.438410 systemd-fsck[740]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks
May 14 01:03:12.450289 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
May 14 01:03:12.456131 systemd[1]: Mounting sysroot.mount - /sysroot...
May 14 01:03:12.614479 kernel: EXT4-fs (vda9): mounted filesystem c413e98b-da35-46b1-9852-45706e1b1f52 r/w with ordered data mode. Quota mode: none.
May 14 01:03:12.615068 systemd[1]: Mounted sysroot.mount - /sysroot.
May 14 01:03:12.615945 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
May 14 01:03:12.619241 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 14 01:03:12.621517 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
May 14 01:03:12.622706 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
May 14 01:03:12.625551 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
May 14 01:03:12.627567 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
May 14 01:03:12.628669 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
May 14 01:03:12.641558 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
May 14 01:03:12.644532 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
May 14 01:03:12.663478 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (748)
May 14 01:03:12.674459 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 14 01:03:12.674528 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 14 01:03:12.679917 kernel: BTRFS info (device vda6): using free space tree
May 14 01:03:12.691494 kernel: BTRFS info (device vda6): auto enabling async discard
May 14 01:03:12.696240 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 14 01:03:12.777637 initrd-setup-root[775]: cut: /sysroot/etc/passwd: No such file or directory
May 14 01:03:12.782994 initrd-setup-root[783]: cut: /sysroot/etc/group: No such file or directory
May 14 01:03:12.787446 initrd-setup-root[790]: cut: /sysroot/etc/shadow: No such file or directory
May 14 01:03:12.792142 initrd-setup-root[797]: cut: /sysroot/etc/gshadow: No such file or directory
May 14 01:03:12.885063 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
May 14 01:03:12.887595 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
May 14 01:03:12.899568 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
May 14 01:03:12.907472 kernel: BTRFS info (device vda6): last unmount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 14 01:03:12.907907 systemd[1]: sysroot-oem.mount: Deactivated successfully.
May 14 01:03:12.937127 ignition[865]: INFO : Ignition 2.20.0
May 14 01:03:12.937127 ignition[865]: INFO : Stage: mount
May 14 01:03:12.939884 ignition[865]: INFO : no configs at "/usr/lib/ignition/base.d"
May 14 01:03:12.939884 ignition[865]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
May 14 01:03:12.939884 ignition[865]: INFO : mount: mount passed
May 14 01:03:12.939884 ignition[865]: INFO : Ignition finished successfully
May 14 01:03:12.941286 systemd[1]: Finished ignition-mount.service - Ignition (mount).
May 14 01:03:12.955758 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
May 14 01:03:13.514717 systemd-networkd[709]: eth0: Gained IPv6LL
May 14 01:03:19.837386 coreos-metadata[750]: May 14 01:03:19.837 WARN failed to locate config-drive, using the metadata service API instead
May 14 01:03:19.881885 coreos-metadata[750]: May 14 01:03:19.881 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
May 14 01:03:19.897880 coreos-metadata[750]: May 14 01:03:19.897 INFO Fetch successful
May 14 01:03:19.899298 coreos-metadata[750]: May 14 01:03:19.898 INFO wrote hostname ci-4284-0-0-n-4a8b92fa55.novalocal to /sysroot/etc/hostname
May 14 01:03:19.902245 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
May 14 01:03:19.902525 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
May 14 01:03:19.910171 systemd[1]: Starting ignition-files.service - Ignition (files)...
May 14 01:03:19.947038 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
May 14 01:03:19.980539 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (882)
May 14 01:03:19.987672 kernel: BTRFS info (device vda6): first mount of filesystem c0e200fb-7321-4d2d-86ff-b28bdae5fafc
May 14 01:03:19.987747 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm
May 14 01:03:19.991967 kernel: BTRFS info (device vda6): using free space tree
May 14 01:03:20.002499 kernel: BTRFS info (device vda6): auto enabling async discard
May 14 01:03:20.008121 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
May 14 01:03:20.050249 ignition[900]: INFO : Ignition 2.20.0
May 14 01:03:20.050249 ignition[900]: INFO : Stage: files
May 14 01:03:20.053706 ignition[900]: INFO : no configs at "/usr/lib/ignition/base.d"
May 14 01:03:20.053706 ignition[900]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
May 14 01:03:20.053706 ignition[900]: DEBUG : files: compiled without relabeling support, skipping
May 14 01:03:20.059830 ignition[900]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
May 14 01:03:20.059830 ignition[900]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
May 14 01:03:20.063998 ignition[900]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
May 14 01:03:20.063998 ignition[900]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
May 14 01:03:20.068184 ignition[900]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
May 14 01:03:20.065702 unknown[900]: wrote ssh authorized keys file for user: core
May 14 01:03:20.072175 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
May 14 01:03:20.072175 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1
May 14 01:03:20.149329 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
May 14 01:03:20.720641 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz"
May 14 01:03:20.720641 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
May 14 01:03:20.725483 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1
May 14 01:03:21.416755 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
May 14 01:03:23.063594 ignition[900]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw"
May 14 01:03:23.063594 ignition[900]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
May 14 01:03:23.066324 ignition[900]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 14 01:03:23.068279 ignition[900]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
May 14 01:03:23.068279 ignition[900]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
May 14 01:03:23.068279 ignition[900]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
May 14 01:03:23.071454 ignition[900]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
May 14 01:03:23.071454 ignition[900]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
May 14 01:03:23.071454 ignition[900]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
May 14 01:03:23.071454 ignition[900]: INFO : files: files passed
May 14 01:03:23.071454 ignition[900]: INFO : Ignition finished successfully
May 14 01:03:23.075176 systemd[1]: Finished ignition-files.service - Ignition (files).
May 14 01:03:23.081268 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 14 01:03:23.085665 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 14 01:03:23.095191 systemd[1]: ignition-quench.service: Deactivated successfully.
May 14 01:03:23.095956 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 14 01:03:23.106041 initrd-setup-root-after-ignition[934]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 14 01:03:23.107646 initrd-setup-root-after-ignition[930]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 14 01:03:23.107646 initrd-setup-root-after-ignition[930]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 14 01:03:23.110588 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 14 01:03:23.111336 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 14 01:03:23.115500 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 14 01:03:23.165178 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 14 01:03:23.166493 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 14 01:03:23.167992 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 14 01:03:23.173629 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 14 01:03:23.175741 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 14 01:03:23.176526 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 14 01:03:23.202916 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 14 01:03:23.205597 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 14 01:03:23.233901 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 14 01:03:23.236319 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 14 01:03:23.237075 systemd[1]: Stopped target timers.target - Timer Units.
May 14 01:03:23.239196 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 14 01:03:23.239321 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 14 01:03:23.241649 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 14 01:03:23.242715 systemd[1]: Stopped target basic.target - Basic System.
May 14 01:03:23.244829 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 14 01:03:23.246581 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 14 01:03:23.248299 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 14 01:03:23.250410 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 14 01:03:23.252545 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 14 01:03:23.254701 systemd[1]: Stopped target sysinit.target - System Initialization.
May 14 01:03:23.256724 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 14 01:03:23.258866 systemd[1]: Stopped target swap.target - Swaps.
May 14 01:03:23.260836 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 14 01:03:23.260949 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 14 01:03:23.263048 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 14 01:03:23.263988 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 14 01:03:23.264987 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 14 01:03:23.266654 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 14 01:03:23.267544 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 14 01:03:23.267656 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 14 01:03:23.269103 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 14 01:03:23.269230 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 14 01:03:23.270360 systemd[1]: ignition-files.service: Deactivated successfully.
May 14 01:03:23.270492 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 14 01:03:23.273995 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 14 01:03:23.285608 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 14 01:03:23.286609 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 14 01:03:23.287461 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 14 01:03:23.290396 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 14 01:03:23.291139 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 14 01:03:23.296752 ignition[954]: INFO : Ignition 2.20.0
May 14 01:03:23.296752 ignition[954]: INFO : Stage: umount
May 14 01:03:23.301738 ignition[954]: INFO : no configs at "/usr/lib/ignition/base.d"
May 14 01:03:23.301738 ignition[954]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
May 14 01:03:23.301738 ignition[954]: INFO : umount: umount passed
May 14 01:03:23.301738 ignition[954]: INFO : Ignition finished successfully
May 14 01:03:23.300533 systemd[1]: ignition-mount.service: Deactivated successfully.
May 14 01:03:23.300650 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 14 01:03:23.305398 systemd[1]: ignition-disks.service: Deactivated successfully.
May 14 01:03:23.305546 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 14 01:03:23.306085 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 14 01:03:23.306127 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 14 01:03:23.306632 systemd[1]: ignition-fetch.service: Deactivated successfully.
May 14 01:03:23.306672 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
May 14 01:03:23.307209 systemd[1]: Stopped target network.target - Network.
May 14 01:03:23.308619 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 14 01:03:23.308666 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 14 01:03:23.310626 systemd[1]: Stopped target paths.target - Path Units.
May 14 01:03:23.311393 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 14 01:03:23.314763 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 14 01:03:23.315880 systemd[1]: Stopped target slices.target - Slice Units.
May 14 01:03:23.316805 systemd[1]: Stopped target sockets.target - Socket Units.
May 14 01:03:23.318802 systemd[1]: iscsid.socket: Deactivated successfully.
May 14 01:03:23.318840 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 14 01:03:23.319693 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 14 01:03:23.319725 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 14 01:03:23.320211 systemd[1]: ignition-setup.service: Deactivated successfully.
May 14 01:03:23.320254 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 14 01:03:23.320756 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 14 01:03:23.320795 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 14 01:03:23.321394 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 14 01:03:23.322679 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 14 01:03:23.324842 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 14 01:03:23.326041 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 14 01:03:23.326120 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 14 01:03:23.328214 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 14 01:03:23.328685 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 14 01:03:23.331602 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 14 01:03:23.331818 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 14 01:03:23.331912 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 14 01:03:23.333667 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 14 01:03:23.333897 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 14 01:03:23.333987 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 14 01:03:23.337234 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 14 01:03:23.337569 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 14 01:03:23.338718 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 14 01:03:23.338809 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 14 01:03:23.340522 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 14 01:03:23.341743 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 14 01:03:23.341790 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 14 01:03:23.343764 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 14 01:03:23.343805 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 14 01:03:23.345173 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 14 01:03:23.345213 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 14 01:03:23.347270 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 14 01:03:23.347323 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 14 01:03:23.348786 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 14 01:03:23.354516 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 14 01:03:23.354577 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 14 01:03:23.368897 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 14 01:03:23.369501 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 14 01:03:23.371086 systemd[1]: network-cleanup.service: Deactivated successfully.
May 14 01:03:23.371163 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 14 01:03:23.372257 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 14 01:03:23.372308 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 14 01:03:23.373524 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 14 01:03:23.373553 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 14 01:03:23.374593 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 14 01:03:23.374636 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 14 01:03:23.376232 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 14 01:03:23.376274 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 14 01:03:23.377396 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 14 01:03:23.377455 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 14 01:03:23.380527 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 14 01:03:23.381457 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 14 01:03:23.381507 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 14 01:03:23.383157 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
May 14 01:03:23.383201 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 14 01:03:23.384558 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 14 01:03:23.384601 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 14 01:03:23.385784 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 14 01:03:23.385828 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 14 01:03:23.388379 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 14 01:03:23.388460 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 14 01:03:23.393418 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 14 01:03:23.393561 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 14 01:03:23.394934 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 14 01:03:23.397559 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 14 01:03:23.415099 systemd[1]: Switching root.
May 14 01:03:23.448911 systemd-journald[184]: Journal stopped
May 14 01:03:25.354671 systemd-journald[184]: Received SIGTERM from PID 1 (systemd).
May 14 01:03:25.354724 kernel: SELinux: policy capability network_peer_controls=1
May 14 01:03:25.354742 kernel: SELinux: policy capability open_perms=1
May 14 01:03:25.354757 kernel: SELinux: policy capability extended_socket_class=1
May 14 01:03:25.354770 kernel: SELinux: policy capability always_check_network=0
May 14 01:03:25.354781 kernel: SELinux: policy capability cgroup_seclabel=1
May 14 01:03:25.354792 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 14 01:03:25.354803 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 14 01:03:25.354814 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 14 01:03:25.354826 kernel: audit: type=1403 audit(1747184604.118:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 14 01:03:25.354842 systemd[1]: Successfully loaded SELinux policy in 79.619ms.
May 14 01:03:25.354863 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 26.455ms.
May 14 01:03:25.354879 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 14 01:03:25.354891 systemd[1]: Detected virtualization kvm.
May 14 01:03:25.354904 systemd[1]: Detected architecture x86-64.
May 14 01:03:25.354915 systemd[1]: Detected first boot.
May 14 01:03:25.354933 systemd[1]: Hostname set to .
May 14 01:03:25.354945 systemd[1]: Initializing machine ID from VM UUID.
May 14 01:03:25.354957 kernel: Guest personality initialized and is inactive
May 14 01:03:25.354969 zram_generator::config[1000]: No configuration found.
May 14 01:03:25.354984 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 14 01:03:25.354995 kernel: Initialized host personality
May 14 01:03:25.355006 kernel: NET: Registered PF_VSOCK protocol family
May 14 01:03:25.355018 systemd[1]: Populated /etc with preset unit settings.
May 14 01:03:25.355031 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 14 01:03:25.355043 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 14 01:03:25.355055 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 14 01:03:25.355067 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 14 01:03:25.355079 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 14 01:03:25.355094 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 14 01:03:25.355106 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 14 01:03:25.355118 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 14 01:03:25.355131 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 14 01:03:25.355143 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 14 01:03:25.355158 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 14 01:03:25.355170 systemd[1]: Created slice user.slice - User and Session Slice.
May 14 01:03:25.355182 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 14 01:03:25.355197 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 14 01:03:25.355209 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 14 01:03:25.355221 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 14 01:03:25.355234 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 14 01:03:25.355247 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 14 01:03:25.355259 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 14 01:03:25.355273 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 14 01:03:25.355285 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 14 01:03:25.355298 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 14 01:03:25.355310 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 14 01:03:25.355322 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 14 01:03:25.355334 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 14 01:03:25.355346 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 14 01:03:25.355359 systemd[1]: Reached target slices.target - Slice Units.
May 14 01:03:25.355371 systemd[1]: Reached target swap.target - Swaps.
May 14 01:03:25.355385 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 14 01:03:25.355399 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 14 01:03:25.355411 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 14 01:03:25.355423 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 14 01:03:25.362465 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 14 01:03:25.362482 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 14 01:03:25.362495 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 14 01:03:25.362508 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 14 01:03:25.362525 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 14 01:03:25.362537 systemd[1]: Mounting media.mount - External Media Directory...
May 14 01:03:25.362553 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 14 01:03:25.362565 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 14 01:03:25.362577 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 14 01:03:25.362589 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 14 01:03:25.362602 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 14 01:03:25.362614 systemd[1]: Reached target machines.target - Containers.
May 14 01:03:25.362627 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 14 01:03:25.362639 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 14 01:03:25.362653 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 14 01:03:25.362665 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 14 01:03:25.362694 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 14 01:03:25.362709 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 14 01:03:25.362721 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 14 01:03:25.362732 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 14 01:03:25.362744 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 14 01:03:25.362757 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 14 01:03:25.362772 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 14 01:03:25.362784 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 14 01:03:25.362796 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 14 01:03:25.362808 systemd[1]: Stopped systemd-fsck-usr.service.
May 14 01:03:25.362821 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 14 01:03:25.362834 systemd[1]: Starting systemd-journald.service - Journal Service...
May 14 01:03:25.362846 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 14 01:03:25.362858 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 14 01:03:25.362870 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 14 01:03:25.362885 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 14 01:03:25.362897 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 14 01:03:25.362910 systemd[1]: verity-setup.service: Deactivated successfully.
May 14 01:03:25.362924 systemd[1]: Stopped verity-setup.service.
May 14 01:03:25.362937 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 14 01:03:25.362953 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 14 01:03:25.362965 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 14 01:03:25.362977 systemd[1]: Mounted media.mount - External Media Directory.
May 14 01:03:25.362989 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 14 01:03:25.363001 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 14 01:03:25.363015 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 14 01:03:25.363027 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 14 01:03:25.363040 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 14 01:03:25.363052 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 14 01:03:25.363064 kernel: loop: module loaded
May 14 01:03:25.363076 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 14 01:03:25.363088 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 14 01:03:25.363119 systemd-journald[1087]: Collecting audit messages is disabled.
May 14 01:03:25.363146 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 14 01:03:25.363159 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 14 01:03:25.363171 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 14 01:03:25.363184 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 14 01:03:25.363196 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 14 01:03:25.363209 systemd-journald[1087]: Journal started
May 14 01:03:25.363235 systemd-journald[1087]: Runtime Journal (/run/log/journal/aa07cbf376954a6eb62813c8aedf9c74) is 8M, max 78.2M, 70.2M free.
May 14 01:03:25.016620 systemd[1]: Queued start job for default target multi-user.target.
May 14 01:03:25.029564 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 14 01:03:25.029995 systemd[1]: systemd-journald.service: Deactivated successfully.
May 14 01:03:25.373474 systemd[1]: Started systemd-journald.service - Journal Service.
May 14 01:03:25.368130 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 14 01:03:25.369661 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 14 01:03:25.376970 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 14 01:03:25.381450 kernel: fuse: init (API version 7.39)
May 14 01:03:25.383083 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 14 01:03:25.383713 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 14 01:03:25.383752 systemd[1]: Reached target local-fs.target - Local File Systems.
May 14 01:03:25.385497 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 14 01:03:25.388540 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 14 01:03:25.390586 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 14 01:03:25.391614 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 14 01:03:25.395134 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 14 01:03:25.401939 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 14 01:03:25.402604 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 14 01:03:25.406954 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 14 01:03:25.412597 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 14 01:03:25.420683 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 14 01:03:25.427248 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 14 01:03:25.433678 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 14 01:03:25.440550 systemd-journald[1087]: Time spent on flushing to /var/log/journal/aa07cbf376954a6eb62813c8aedf9c74 is 119.236ms for 948 entries.
May 14 01:03:25.440550 systemd-journald[1087]: System Journal (/var/log/journal/aa07cbf376954a6eb62813c8aedf9c74) is 8M, max 584.8M, 576.8M free.
May 14 01:03:25.575087 systemd-journald[1087]: Received client request to flush runtime journal.
May 14 01:03:25.575137 kernel: loop0: detected capacity change from 0 to 8
May 14 01:03:25.575154 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 14 01:03:25.575172 kernel: ACPI: bus type drm_connector registered
May 14 01:03:25.575189 kernel: loop1: detected capacity change from 0 to 151640
May 14 01:03:25.438585 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 14 01:03:25.439385 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 14 01:03:25.439610 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 14 01:03:25.444285 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 14 01:03:25.445723 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 14 01:03:25.447818 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 14 01:03:25.456439 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 14 01:03:25.479012 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 14 01:03:25.480768 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 14 01:03:25.481384 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 14 01:03:25.493296 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 14 01:03:25.516167 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 14 01:03:25.517288 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 14 01:03:25.553292 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 14 01:03:25.584086 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 14 01:03:25.591913 systemd-tmpfiles[1135]: ACLs are not supported, ignoring.
May 14 01:03:25.591934 systemd-tmpfiles[1135]: ACLs are not supported, ignoring.
May 14 01:03:25.598052 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 14 01:03:25.601934 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization...
May 14 01:03:25.604856 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 14 01:03:25.609404 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 14 01:03:25.614248 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 14 01:03:25.642270 udevadm[1155]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in.
May 14 01:03:25.663469 kernel: loop2: detected capacity change from 0 to 109808
May 14 01:03:25.705268 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 14 01:03:25.709897 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 14 01:03:25.718467 kernel: loop3: detected capacity change from 0 to 218376
May 14 01:03:25.751499 systemd-tmpfiles[1163]: ACLs are not supported, ignoring.
May 14 01:03:25.751517 systemd-tmpfiles[1163]: ACLs are not supported, ignoring.
May 14 01:03:25.761163 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 14 01:03:25.800038 kernel: loop4: detected capacity change from 0 to 8
May 14 01:03:25.800120 kernel: loop5: detected capacity change from 0 to 151640
May 14 01:03:25.852475 kernel: loop6: detected capacity change from 0 to 109808
May 14 01:03:25.898501 kernel: loop7: detected capacity change from 0 to 218376
May 14 01:03:25.976731 (sd-merge)[1167]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'.
May 14 01:03:25.977556 (sd-merge)[1167]: Merged extensions into '/usr'.
May 14 01:03:25.988763 systemd[1]: Reload requested from client PID 1134 ('systemd-sysext') (unit systemd-sysext.service)...
May 14 01:03:25.988794 systemd[1]: Reloading...
May 14 01:03:26.103574 zram_generator::config[1192]: No configuration found.
May 14 01:03:26.377142 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 14 01:03:26.449221 ldconfig[1126]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 14 01:03:26.462805 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 14 01:03:26.463004 systemd[1]: Reloading finished in 472 ms.
May 14 01:03:26.484239 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 14 01:03:26.485167 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 14 01:03:26.486029 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 14 01:03:26.498554 systemd[1]: Starting ensure-sysext.service...
May 14 01:03:26.501574 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 14 01:03:26.505255 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 14 01:03:26.525560 systemd[1]: Reload requested from client PID 1253 ('systemctl') (unit ensure-sysext.service)...
May 14 01:03:26.525577 systemd[1]: Reloading...
May 14 01:03:26.527289 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 14 01:03:26.527594 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 14 01:03:26.528814 systemd-tmpfiles[1254]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 14 01:03:26.529185 systemd-tmpfiles[1254]: ACLs are not supported, ignoring.
May 14 01:03:26.529317 systemd-tmpfiles[1254]: ACLs are not supported, ignoring.
May 14 01:03:26.532830 systemd-tmpfiles[1254]: Detected autofs mount point /boot during canonicalization of boot.
May 14 01:03:26.532917 systemd-tmpfiles[1254]: Skipping /boot
May 14 01:03:26.542156 systemd-tmpfiles[1254]: Detected autofs mount point /boot during canonicalization of boot.
May 14 01:03:26.542250 systemd-tmpfiles[1254]: Skipping /boot
May 14 01:03:26.568861 systemd-udevd[1255]: Using default interface naming scheme 'v255'.
May 14 01:03:26.609475 zram_generator::config[1284]: No configuration found.
May 14 01:03:26.729110 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1290)
May 14 01:03:26.767490 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
May 14 01:03:26.778483 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2
May 14 01:03:26.791456 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3
May 14 01:03:26.801453 kernel: ACPI: button: Power Button [PWRF]
May 14 01:03:26.883274 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 14 01:03:26.886466 kernel: mousedev: PS/2 mouse device common for all mice
May 14 01:03:26.930455 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
May 14 01:03:26.932449 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
May 14 01:03:26.936934 kernel: Console: switching to colour dummy device 80x25
May 14 01:03:26.938910 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
May 14 01:03:26.938948 kernel: [drm] features: -context_init
May 14 01:03:26.941555 kernel: [drm] number of scanouts: 1
May 14 01:03:26.941596 kernel: [drm] number of cap sets: 0
May 14 01:03:26.947481 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0
May 14 01:03:26.956700 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
May 14 01:03:26.956811 kernel: Console: switching to colour frame buffer device 160x50
May 14 01:03:26.963465 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
May 14 01:03:27.016723 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 14 01:03:27.017208 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
May 14 01:03:27.018972 systemd[1]: Reloading finished in 493 ms.
May 14 01:03:27.031008 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 14 01:03:27.031514 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 14 01:03:27.084958 systemd[1]: Finished ensure-sysext.service.
May 14 01:03:27.089991 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization.
May 14 01:03:27.097537 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 14 01:03:27.098805 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 14 01:03:27.106539 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 14 01:03:27.106772 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 14 01:03:27.109008 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes...
May 14 01:03:27.115902 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 14 01:03:27.119745 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 14 01:03:27.124037 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 14 01:03:27.128301 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 14 01:03:27.128763 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 14 01:03:27.132357 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 14 01:03:27.132479 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 14 01:03:27.137665 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 14 01:03:27.140395 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 14 01:03:27.146781 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 14 01:03:27.151765 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 14 01:03:27.157464 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 14 01:03:27.159589 lvm[1376]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
May 14 01:03:27.166615 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 14 01:03:27.166957 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 14 01:03:27.174212 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 14 01:03:27.207005 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes.
May 14 01:03:27.207786 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 14 01:03:27.209913 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes...
May 14 01:03:27.218858 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 14 01:03:27.232297 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 14 01:03:27.246789 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 14 01:03:27.247862 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 14 01:03:27.252669 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 14 01:03:27.252894 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 14 01:03:27.254283 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 14 01:03:27.258774 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 14 01:03:27.258997 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 14 01:03:27.260963 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 14 01:03:27.262852 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 14 01:03:27.265099 lvm[1398]: WARNING: Failed to connect to lvmetad. Falling back to device scanning.
May 14 01:03:27.265814 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 14 01:03:27.270150 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 14 01:03:27.272374 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 14 01:03:27.309033 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 14 01:03:27.311523 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes.
May 14 01:03:27.319003 augenrules[1424]: No rules
May 14 01:03:27.320675 systemd[1]: audit-rules.service: Deactivated successfully.
May 14 01:03:27.320904 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 14 01:03:27.326328 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 14 01:03:27.349925 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 14 01:03:27.358087 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 14 01:03:27.362579 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 14 01:03:27.423475 systemd-resolved[1385]: Positive Trust Anchors:
May 14 01:03:27.423491 systemd-resolved[1385]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 14 01:03:27.423539 systemd-resolved[1385]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 14 01:03:27.430739 systemd-resolved[1385]: Using system hostname 'ci-4284-0-0-n-4a8b92fa55.novalocal'.
May 14 01:03:27.432685 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 14 01:03:27.433462 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 14 01:03:27.447677 systemd-networkd[1384]: lo: Link UP
May 14 01:03:27.447689 systemd-networkd[1384]: lo: Gained carrier
May 14 01:03:27.448939 systemd-networkd[1384]: Enumeration completed
May 14 01:03:27.449017 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 14 01:03:27.450990 systemd[1]: Reached target network.target - Network.
May 14 01:03:27.453258 systemd-networkd[1384]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 14 01:03:27.454223 systemd-networkd[1384]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 14 01:03:27.454740 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 14 01:03:27.458117 systemd-networkd[1384]: eth0: Link UP
May 14 01:03:27.458127 systemd-networkd[1384]: eth0: Gained carrier
May 14 01:03:27.458154 systemd-networkd[1384]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 14 01:03:27.459715 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 14 01:03:27.462246 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
May 14 01:03:27.464929 systemd[1]: Reached target sysinit.target - System Initialization.
May 14 01:03:27.467217 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 14 01:03:27.468693 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 14 01:03:27.471329 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 14 01:03:27.473354 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 14 01:03:27.473388 systemd[1]: Reached target paths.target - Path Units.
May 14 01:03:27.476087 systemd[1]: Reached target time-set.target - System Time Set.
May 14 01:03:27.479086 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 14 01:03:27.480373 systemd-networkd[1384]: eth0: DHCPv4 address 172.24.4.64/24, gateway 172.24.4.1 acquired from 172.24.4.1
May 14 01:03:27.481211 systemd-timesyncd[1386]: Network configuration changed, trying to establish connection.
May 14 01:03:27.482703 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 14 01:03:27.483359 systemd[1]: Reached target timers.target - Timer Units.
May 14 01:03:27.488424 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 14 01:03:27.491741 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 14 01:03:27.500118 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 14 01:03:27.502758 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 14 01:03:27.503231 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 14 01:03:27.516075 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 14 01:03:27.517022 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 14 01:03:27.519513 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 14 01:03:27.522116 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 14 01:03:27.523085 systemd[1]: Reached target sockets.target - Socket Units.
May 14 01:03:27.524882 systemd[1]: Reached target basic.target - Basic System.
May 14 01:03:27.526676 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 14 01:03:27.526714 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 14 01:03:27.528217 systemd[1]: Starting containerd.service - containerd container runtime...
May 14 01:03:27.532511 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 14 01:03:27.539599 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 14 01:03:27.545557 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 14 01:03:27.550629 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 14 01:03:27.553160 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 14 01:03:27.557731 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 14 01:03:27.561923 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 14 01:03:27.568471 jq[1452]: false
May 14 01:03:27.569114 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 14 01:03:27.581554 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 14 01:03:27.586844 systemd[1]: Starting systemd-logind.service - User Login Management...
May 14 01:03:27.591101 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 14 01:03:27.591620 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 14 01:03:27.594654 systemd[1]: Starting update-engine.service - Update Engine...
May 14 01:03:27.606852 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 14 01:03:27.610478 extend-filesystems[1453]: Found loop4
May 14 01:03:27.610478 extend-filesystems[1453]: Found loop5
May 14 01:03:27.610478 extend-filesystems[1453]: Found loop6
May 14 01:03:27.610478 extend-filesystems[1453]: Found loop7
May 14 01:03:27.610478 extend-filesystems[1453]: Found vda
May 14 01:03:27.610478 extend-filesystems[1453]: Found vda1
May 14 01:03:27.610478 extend-filesystems[1453]: Found vda2
May 14 01:03:27.610478 extend-filesystems[1453]: Found vda3
May 14 01:03:27.610478 extend-filesystems[1453]: Found usr
May 14 01:03:27.610478 extend-filesystems[1453]: Found vda4
May 14 01:03:27.610478 extend-filesystems[1453]: Found vda6
May 14 01:03:27.610478 extend-filesystems[1453]: Found vda7
May 14 01:03:27.610478 extend-filesystems[1453]: Found vda9
May 14 01:03:27.610478 extend-filesystems[1453]: Checking size of /dev/vda9
May 14 01:03:27.618592 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 14 01:03:27.618895 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 14 01:03:27.627452 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 14 01:03:27.627661 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 14 01:03:27.658036 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 14 01:03:27.666912 systemd[1]: motdgen.service: Deactivated successfully.
May 14 01:03:27.667148 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 14 01:03:27.686042 dbus-daemon[1449]: [system] SELinux support is enabled
May 14 01:03:27.750783 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks
May 14 01:03:27.750820 kernel: EXT4-fs (vda9): resized filesystem to 2014203
May 14 01:03:27.750839 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (1303)
May 14 01:03:27.751147 extend-filesystems[1453]: Resized partition /dev/vda9
May 14 01:03:27.760705 extend-filesystems[1481]: resize2fs 1.47.2 (1-Jan-2025)
May 14 01:03:27.760705 extend-filesystems[1481]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
May 14 01:03:27.760705 extend-filesystems[1481]: old_desc_blocks = 1, new_desc_blocks = 1
May 14 01:03:27.760705 extend-filesystems[1481]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long.
May 14 01:03:27.772667 update_engine[1460]: I20250514 01:03:27.663733 1460 main.cc:92] Flatcar Update Engine starting
May 14 01:03:27.772667 update_engine[1460]: I20250514 01:03:27.732518 1460 update_check_scheduler.cc:74] Next update check in 9m28s
May 14 01:03:27.773085 extend-filesystems[1453]: Resized filesystem in /dev/vda9
May 14 01:03:27.787461 jq[1463]: true
May 14 01:03:27.787845 tar[1471]: linux-amd64/LICENSE
May 14 01:03:27.787845 tar[1471]: linux-amd64/helm
May 14 01:03:27.692877 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 14 01:03:27.711760 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 14 01:03:27.711790 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 14 01:03:27.716002 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 14 01:03:27.716030 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 14 01:03:27.719188 (ntainerd)[1486]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 14 01:03:27.726552 systemd[1]: Started update-engine.service - Update Engine.
May 14 01:03:27.752987 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 14 01:03:27.760406 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 14 01:03:27.763032 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 14 01:03:27.788250 jq[1485]: true
May 14 01:03:27.836694 systemd-logind[1459]: New seat seat0.
May 14 01:03:27.838039 systemd-logind[1459]: Watching system buttons on /dev/input/event2 (Power Button)
May 14 01:03:27.838135 systemd-logind[1459]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 14 01:03:27.838379 systemd[1]: Started systemd-logind.service - User Login Management.
May 14 01:03:27.966907 bash[1508]: Updated "/home/core/.ssh/authorized_keys"
May 14 01:03:27.969398 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 14 01:03:27.977788 systemd[1]: Starting sshkeys.service...
May 14 01:03:28.010201 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
May 14 01:03:28.020788 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
May 14 01:03:28.068034 locksmithd[1490]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 14 01:03:28.084258 sshd_keygen[1483]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 14 01:03:28.128591 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 14 01:03:28.132579 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 14 01:03:28.146873 systemd[1]: Started sshd@0-172.24.4.64:22-172.24.4.1:59976.service - OpenSSH per-connection server daemon (172.24.4.1:59976).
May 14 01:03:28.158033 systemd[1]: issuegen.service: Deactivated successfully.
May 14 01:03:28.158296 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 14 01:03:28.167988 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 14 01:03:28.207061 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 14 01:03:28.213127 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 14 01:03:28.222718 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 14 01:03:28.225684 systemd[1]: Reached target getty.target - Login Prompts.
May 14 01:03:28.277621 containerd[1486]: time="2025-05-14T01:03:28Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 14 01:03:28.278811 containerd[1486]: time="2025-05-14T01:03:28.278781620Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 14 01:03:28.290955 containerd[1486]: time="2025-05-14T01:03:28.290906400Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.773µs" May 14 01:03:28.290955 containerd[1486]: time="2025-05-14T01:03:28.290950593Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 14 01:03:28.291022 containerd[1486]: time="2025-05-14T01:03:28.290976371Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 14 01:03:28.291170 containerd[1486]: time="2025-05-14T01:03:28.291145408Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 14 01:03:28.291207 containerd[1486]: time="2025-05-14T01:03:28.291175705Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 14 01:03:28.291236 containerd[1486]: time="2025-05-14T01:03:28.291203828Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 14 01:03:28.291307 containerd[1486]: time="2025-05-14T01:03:28.291272918Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 14 01:03:28.291307 containerd[1486]: time="2025-05-14T01:03:28.291298305Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 14 
01:03:28.291583 containerd[1486]: time="2025-05-14T01:03:28.291556700Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 14 01:03:28.291618 containerd[1486]: time="2025-05-14T01:03:28.291584031Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 14 01:03:28.291618 containerd[1486]: time="2025-05-14T01:03:28.291598699Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 14 01:03:28.291618 containerd[1486]: time="2025-05-14T01:03:28.291613737Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 14 01:03:28.291718 containerd[1486]: time="2025-05-14T01:03:28.291694809Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 14 01:03:28.291921 containerd[1486]: time="2025-05-14T01:03:28.291898210Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 14 01:03:28.291961 containerd[1486]: time="2025-05-14T01:03:28.291940490Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 14 01:03:28.291988 containerd[1486]: time="2025-05-14T01:03:28.291961649Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 14 01:03:28.292009 containerd[1486]: time="2025-05-14T01:03:28.291990984Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 14 01:03:28.292402 
containerd[1486]: time="2025-05-14T01:03:28.292377389Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 14 01:03:28.292509 containerd[1486]: time="2025-05-14T01:03:28.292469592Z" level=info msg="metadata content store policy set" policy=shared May 14 01:03:28.300532 containerd[1486]: time="2025-05-14T01:03:28.300423309Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 14 01:03:28.300532 containerd[1486]: time="2025-05-14T01:03:28.300491527Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 14 01:03:28.300532 containerd[1486]: time="2025-05-14T01:03:28.300508809Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 14 01:03:28.300532 containerd[1486]: time="2025-05-14T01:03:28.300524138Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 14 01:03:28.300674 containerd[1486]: time="2025-05-14T01:03:28.300538054Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 14 01:03:28.300674 containerd[1486]: time="2025-05-14T01:03:28.300551600Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 14 01:03:28.300674 containerd[1486]: time="2025-05-14T01:03:28.300565265Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 14 01:03:28.300674 containerd[1486]: time="2025-05-14T01:03:28.300579302Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 14 01:03:28.300674 containerd[1486]: time="2025-05-14T01:03:28.300591294Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 14 01:03:28.300674 containerd[1486]: 
time="2025-05-14T01:03:28.300603507Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 14 01:03:28.300674 containerd[1486]: time="2025-05-14T01:03:28.300614578Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 14 01:03:28.300674 containerd[1486]: time="2025-05-14T01:03:28.300627382Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300740915Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300764659Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300779718Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300792381Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300804895Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300816527Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300828970Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300840932Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300858646Z" level=info msg="loading plugin" 
id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 14 01:03:28.300873 containerd[1486]: time="2025-05-14T01:03:28.300871900Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 14 01:03:28.301147 containerd[1486]: time="2025-05-14T01:03:28.300884935Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 14 01:03:28.301147 containerd[1486]: time="2025-05-14T01:03:28.300940890Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 14 01:03:28.301147 containerd[1486]: time="2025-05-14T01:03:28.300959926Z" level=info msg="Start snapshots syncer" May 14 01:03:28.301147 containerd[1486]: time="2025-05-14T01:03:28.300981266Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 14 01:03:28.302300 containerd[1486]: time="2025-05-14T01:03:28.301232777Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 14 01:03:28.302300 containerd[1486]: time="2025-05-14T01:03:28.301298911Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301378090Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301481384Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301505749Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301517842Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301529554Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301542137Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301553930Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301565471Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301586080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301599485Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301614303Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301642947Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301658255Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 14 01:03:28.302478 containerd[1486]: time="2025-05-14T01:03:28.301668815Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301679215Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301688793Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301702749Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301714511Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301730270Z" level=info msg="runtime interface created" May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301735971Z" level=info msg="created NRI interface" May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301744176Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301755037Z" level=info msg="Connect containerd service" May 14 01:03:28.302753 containerd[1486]: time="2025-05-14T01:03:28.301779052Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 14 01:03:28.302753 
containerd[1486]: time="2025-05-14T01:03:28.302390759Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476312289Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476397258Z" level=info msg=serving... address=/run/containerd/containerd.sock May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476422125Z" level=info msg="Start subscribing containerd event" May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476466759Z" level=info msg="Start recovering state" May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476556367Z" level=info msg="Start event monitor" May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476573389Z" level=info msg="Start cni network conf syncer for default" May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476581103Z" level=info msg="Start streaming server" May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476593917Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476603114Z" level=info msg="runtime interface starting up..." May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476609586Z" level=info msg="starting plugins..." May 14 01:03:28.476647 containerd[1486]: time="2025-05-14T01:03:28.476624745Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 14 01:03:28.479874 containerd[1486]: time="2025-05-14T01:03:28.476734330Z" level=info msg="containerd successfully booted in 0.199508s" May 14 01:03:28.476841 systemd[1]: Started containerd.service - containerd container runtime. 
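[editor's note] The `failed to load cni during init` error above is containerd reporting that `/etc/cni/net.d` contains no network config yet; the CRI plugin retries once one appears. A minimal bridge-network sketch of such a file is shown below — the file name, network name, and subnet are illustrative assumptions, not values taken from this log, and a real cluster would normally get its CNI config from its network add-on instead:

```json
{
  "cniVersion": "1.0.0",
  "name": "examplenet",
  "plugins": [
    {
      "type": "bridge",
      "bridge": "cni0",
      "isGateway": true,
      "ipMasq": true,
      "ipam": {
        "type": "host-local",
        "subnet": "10.88.0.0/16",
        "routes": [{ "dst": "0.0.0.0/0" }]
      }
    }
  ]
}
```

Dropped in as, say, `/etc/cni/net.d/10-examplenet.conflist`, the cni network conf syncer started later in this log would pick it up without a containerd restart.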
May 14 01:03:28.513839 tar[1471]: linux-amd64/README.md May 14 01:03:28.531130 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 14 01:03:28.618791 systemd-networkd[1384]: eth0: Gained IPv6LL May 14 01:03:28.621818 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 14 01:03:28.628711 systemd[1]: Reached target network-online.target - Network is Online. May 14 01:03:28.637809 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:03:28.644816 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 14 01:03:28.699756 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 14 01:03:29.113041 sshd[1532]: Accepted publickey for core from 172.24.4.1 port 59976 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:29.118361 sshd-session[1532]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:29.136779 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 14 01:03:29.141587 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 14 01:03:29.162845 systemd-logind[1459]: New session 1 of user core. May 14 01:03:29.186853 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 14 01:03:29.193597 systemd[1]: Starting user@500.service - User Manager for UID 500... May 14 01:03:29.211638 (systemd)[1574]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 14 01:03:29.216376 systemd-logind[1459]: New session c1 of user core. May 14 01:03:29.468685 systemd[1574]: Queued start job for default target default.target. May 14 01:03:29.476544 systemd[1574]: Created slice app.slice - User Application Slice. May 14 01:03:29.476681 systemd[1574]: Reached target paths.target - Paths. May 14 01:03:29.476794 systemd[1574]: Reached target timers.target - Timers. 
May 14 01:03:29.478100 systemd[1574]: Starting dbus.socket - D-Bus User Message Bus Socket... May 14 01:03:29.524851 systemd[1574]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 14 01:03:29.525056 systemd[1574]: Reached target sockets.target - Sockets. May 14 01:03:29.525130 systemd[1574]: Reached target basic.target - Basic System. May 14 01:03:29.525195 systemd[1574]: Reached target default.target - Main User Target. May 14 01:03:29.525239 systemd[1574]: Startup finished in 300ms. May 14 01:03:29.525921 systemd[1]: Started user@500.service - User Manager for UID 500. May 14 01:03:29.536746 systemd[1]: Started session-1.scope - Session 1 of User core. May 14 01:03:30.031373 systemd[1]: Started sshd@1-172.24.4.64:22-172.24.4.1:59980.service - OpenSSH per-connection server daemon (172.24.4.1:59980). May 14 01:03:30.917389 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 01:03:30.938076 (kubelet)[1593]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 01:03:31.741169 sshd[1585]: Accepted publickey for core from 172.24.4.1 port 59980 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:31.742212 sshd-session[1585]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:31.755407 systemd-logind[1459]: New session 2 of user core. May 14 01:03:31.761022 systemd[1]: Started session-2.scope - Session 2 of User core. 
May 14 01:03:32.368552 kubelet[1593]: E0514 01:03:32.368392 1593 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 01:03:32.373280 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 01:03:32.374119 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 01:03:32.374882 systemd[1]: kubelet.service: Consumed 2.273s CPU time, 256M memory peak. May 14 01:03:32.382740 sshd[1599]: Connection closed by 172.24.4.1 port 59980 May 14 01:03:32.383691 sshd-session[1585]: pam_unix(sshd:session): session closed for user core May 14 01:03:32.400590 systemd[1]: sshd@1-172.24.4.64:22-172.24.4.1:59980.service: Deactivated successfully. May 14 01:03:32.404054 systemd[1]: session-2.scope: Deactivated successfully. May 14 01:03:32.409595 systemd-logind[1459]: Session 2 logged out. Waiting for processes to exit. May 14 01:03:32.411863 systemd[1]: Started sshd@2-172.24.4.64:22-172.24.4.1:59992.service - OpenSSH per-connection server daemon (172.24.4.1:59992). May 14 01:03:32.419556 systemd-logind[1459]: Removed session 2. May 14 01:03:33.378243 login[1538]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 14 01:03:33.390774 systemd-logind[1459]: New session 3 of user core. May 14 01:03:33.401695 login[1539]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 14 01:03:33.402469 systemd[1]: Started session-3.scope - Session 3 of User core. May 14 01:03:33.417817 systemd-logind[1459]: New session 4 of user core. May 14 01:03:33.428291 systemd[1]: Started session-4.scope - Session 4 of User core. 
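[editor's note] The kubelet exit above (and its later scheduled restarts) is the expected crash loop on a node that has not yet joined a cluster: `/var/lib/kubelet/config.yaml` is normally written by `kubeadm init`/`kubeadm join`, so the service fails until that runs. For reference, a minimal hand-written `KubeletConfiguration` sketch follows — these particular field values are assumptions for illustration, not recovered from this log:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
authentication:
  anonymous:
    enabled: false
authorization:
  mode: Webhook
cgroupDriver: systemd
```

In practice the fix is to run kubeadm (or the distro's bootstrap) rather than to author this file by hand; the restart counter entries later in the log simply record systemd retrying until then.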
May 14 01:03:33.553805 sshd[1606]: Accepted publickey for core from 172.24.4.1 port 59992 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:33.556628 sshd-session[1606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:33.567243 systemd-logind[1459]: New session 5 of user core. May 14 01:03:33.583858 systemd[1]: Started session-5.scope - Session 5 of User core. May 14 01:03:34.295564 sshd[1633]: Connection closed by 172.24.4.1 port 59992 May 14 01:03:34.296674 sshd-session[1606]: pam_unix(sshd:session): session closed for user core May 14 01:03:34.304406 systemd[1]: sshd@2-172.24.4.64:22-172.24.4.1:59992.service: Deactivated successfully. May 14 01:03:34.308280 systemd[1]: session-5.scope: Deactivated successfully. May 14 01:03:34.310180 systemd-logind[1459]: Session 5 logged out. Waiting for processes to exit. May 14 01:03:34.312859 systemd-logind[1459]: Removed session 5. May 14 01:03:34.603019 coreos-metadata[1448]: May 14 01:03:34.602 WARN failed to locate config-drive, using the metadata service API instead May 14 01:03:34.651746 coreos-metadata[1448]: May 14 01:03:34.651 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 May 14 01:03:34.838033 coreos-metadata[1448]: May 14 01:03:34.837 INFO Fetch successful May 14 01:03:34.838033 coreos-metadata[1448]: May 14 01:03:34.837 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 14 01:03:34.852671 coreos-metadata[1448]: May 14 01:03:34.852 INFO Fetch successful May 14 01:03:34.852671 coreos-metadata[1448]: May 14 01:03:34.852 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 May 14 01:03:34.866160 coreos-metadata[1448]: May 14 01:03:34.865 INFO Fetch successful May 14 01:03:34.866160 coreos-metadata[1448]: May 14 01:03:34.865 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 May 14 01:03:34.880099 coreos-metadata[1448]: May 14 
01:03:34.880 INFO Fetch successful May 14 01:03:34.880099 coreos-metadata[1448]: May 14 01:03:34.880 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 May 14 01:03:34.894128 coreos-metadata[1448]: May 14 01:03:34.893 INFO Fetch successful May 14 01:03:34.894128 coreos-metadata[1448]: May 14 01:03:34.894 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 May 14 01:03:34.909044 coreos-metadata[1448]: May 14 01:03:34.908 INFO Fetch successful May 14 01:03:34.958237 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 14 01:03:34.961249 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 14 01:03:35.126582 coreos-metadata[1514]: May 14 01:03:35.126 WARN failed to locate config-drive, using the metadata service API instead May 14 01:03:35.169227 coreos-metadata[1514]: May 14 01:03:35.169 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 May 14 01:03:35.183724 coreos-metadata[1514]: May 14 01:03:35.183 INFO Fetch successful May 14 01:03:35.183724 coreos-metadata[1514]: May 14 01:03:35.183 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 May 14 01:03:35.197790 coreos-metadata[1514]: May 14 01:03:35.197 INFO Fetch successful May 14 01:03:35.203548 unknown[1514]: wrote ssh authorized keys file for user: core May 14 01:03:35.245799 update-ssh-keys[1648]: Updated "/home/core/.ssh/authorized_keys" May 14 01:03:35.246748 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 14 01:03:35.250265 systemd[1]: Finished sshkeys.service. May 14 01:03:35.256336 systemd[1]: Reached target multi-user.target - Multi-User System. May 14 01:03:35.256657 systemd[1]: Startup finished in 1.236s (kernel) + 15.274s (initrd) + 11.215s (userspace) = 27.727s. 
May 14 01:03:37.853313 systemd-timesyncd[1386]: Timed out waiting for reply from 172.234.37.140:123 (0.flatcar.pool.ntp.org). May 14 01:03:38.468326 systemd-timesyncd[1386]: Contacted time server 216.31.17.12:123 (0.flatcar.pool.ntp.org). May 14 01:03:38.468387 systemd-resolved[1385]: Clock change detected. Flushing caches. May 14 01:03:38.468421 systemd-timesyncd[1386]: Initial clock synchronization to Wed 2025-05-14 01:03:38.468100 UTC. May 14 01:03:43.049537 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 14 01:03:43.052661 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:03:43.401872 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 01:03:43.414575 (kubelet)[1659]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 01:03:43.494530 kubelet[1659]: E0514 01:03:43.494406 1659 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 01:03:43.497309 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 01:03:43.497612 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 01:03:43.498564 systemd[1]: kubelet.service: Consumed 283ms CPU time, 105.7M memory peak. May 14 01:03:44.743638 systemd[1]: Started sshd@3-172.24.4.64:22-172.24.4.1:43178.service - OpenSSH per-connection server daemon (172.24.4.1:43178). 
May 14 01:03:45.899117 sshd[1667]: Accepted publickey for core from 172.24.4.1 port 43178 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:45.902629 sshd-session[1667]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:45.914553 systemd-logind[1459]: New session 6 of user core. May 14 01:03:45.925373 systemd[1]: Started session-6.scope - Session 6 of User core. May 14 01:03:46.642083 sshd[1669]: Connection closed by 172.24.4.1 port 43178 May 14 01:03:46.642365 sshd-session[1667]: pam_unix(sshd:session): session closed for user core May 14 01:03:46.657856 systemd[1]: sshd@3-172.24.4.64:22-172.24.4.1:43178.service: Deactivated successfully. May 14 01:03:46.660780 systemd[1]: session-6.scope: Deactivated successfully. May 14 01:03:46.662492 systemd-logind[1459]: Session 6 logged out. Waiting for processes to exit. May 14 01:03:46.666609 systemd[1]: Started sshd@4-172.24.4.64:22-172.24.4.1:43190.service - OpenSSH per-connection server daemon (172.24.4.1:43190). May 14 01:03:46.668811 systemd-logind[1459]: Removed session 6. May 14 01:03:47.821144 sshd[1674]: Accepted publickey for core from 172.24.4.1 port 43190 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:47.823659 sshd-session[1674]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:47.834541 systemd-logind[1459]: New session 7 of user core. May 14 01:03:47.845357 systemd[1]: Started session-7.scope - Session 7 of User core. May 14 01:03:48.443134 sshd[1677]: Connection closed by 172.24.4.1 port 43190 May 14 01:03:48.444920 sshd-session[1674]: pam_unix(sshd:session): session closed for user core May 14 01:03:48.458238 systemd[1]: sshd@4-172.24.4.64:22-172.24.4.1:43190.service: Deactivated successfully. May 14 01:03:48.461470 systemd[1]: session-7.scope: Deactivated successfully. May 14 01:03:48.464364 systemd-logind[1459]: Session 7 logged out. Waiting for processes to exit. 
May 14 01:03:48.467501 systemd[1]: Started sshd@5-172.24.4.64:22-172.24.4.1:43206.service - OpenSSH per-connection server daemon (172.24.4.1:43206). May 14 01:03:48.469905 systemd-logind[1459]: Removed session 7. May 14 01:03:49.602856 sshd[1682]: Accepted publickey for core from 172.24.4.1 port 43206 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:49.605667 sshd-session[1682]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:49.618151 systemd-logind[1459]: New session 8 of user core. May 14 01:03:49.628350 systemd[1]: Started session-8.scope - Session 8 of User core. May 14 01:03:50.347463 sshd[1685]: Connection closed by 172.24.4.1 port 43206 May 14 01:03:50.345414 sshd-session[1682]: pam_unix(sshd:session): session closed for user core May 14 01:03:50.361701 systemd[1]: sshd@5-172.24.4.64:22-172.24.4.1:43206.service: Deactivated successfully. May 14 01:03:50.364659 systemd[1]: session-8.scope: Deactivated successfully. May 14 01:03:50.366194 systemd-logind[1459]: Session 8 logged out. Waiting for processes to exit. May 14 01:03:50.369827 systemd[1]: Started sshd@6-172.24.4.64:22-172.24.4.1:43218.service - OpenSSH per-connection server daemon (172.24.4.1:43218). May 14 01:03:50.372089 systemd-logind[1459]: Removed session 8. May 14 01:03:51.879568 sshd[1690]: Accepted publickey for core from 172.24.4.1 port 43218 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:51.882066 sshd-session[1690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:51.894141 systemd-logind[1459]: New session 9 of user core. May 14 01:03:51.904363 systemd[1]: Started session-9.scope - Session 9 of User core. 
May 14 01:03:52.606292 sudo[1694]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 14 01:03:52.606920 sudo[1694]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 01:03:52.623893 sudo[1694]: pam_unix(sudo:session): session closed for user root May 14 01:03:52.897324 sshd[1693]: Connection closed by 172.24.4.1 port 43218 May 14 01:03:52.898086 sshd-session[1690]: pam_unix(sshd:session): session closed for user core May 14 01:03:52.911839 systemd[1]: sshd@6-172.24.4.64:22-172.24.4.1:43218.service: Deactivated successfully. May 14 01:03:52.914877 systemd[1]: session-9.scope: Deactivated successfully. May 14 01:03:52.918311 systemd-logind[1459]: Session 9 logged out. Waiting for processes to exit. May 14 01:03:52.920917 systemd[1]: Started sshd@7-172.24.4.64:22-172.24.4.1:43230.service - OpenSSH per-connection server daemon (172.24.4.1:43230). May 14 01:03:52.923846 systemd-logind[1459]: Removed session 9. May 14 01:03:53.552963 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 14 01:03:53.556255 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:03:53.879486 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 14 01:03:53.888322 (kubelet)[1710]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 01:03:53.999523 kubelet[1710]: E0514 01:03:53.999383 1710 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 01:03:54.003428 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 01:03:54.003773 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 01:03:54.004313 systemd[1]: kubelet.service: Consumed 284ms CPU time, 104M memory peak. May 14 01:03:54.190530 sshd[1699]: Accepted publickey for core from 172.24.4.1 port 43230 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:54.193698 sshd-session[1699]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:54.205724 systemd-logind[1459]: New session 10 of user core. May 14 01:03:54.219390 systemd[1]: Started session-10.scope - Session 10 of User core. May 14 01:03:54.654004 sudo[1720]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 14 01:03:54.654923 sudo[1720]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 01:03:54.662863 sudo[1720]: pam_unix(sudo:session): session closed for user root May 14 01:03:54.675510 sudo[1719]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 14 01:03:54.676833 sudo[1719]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 01:03:54.700153 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
May 14 01:03:54.784945 augenrules[1742]: No rules May 14 01:03:54.787452 systemd[1]: audit-rules.service: Deactivated successfully. May 14 01:03:54.788014 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 14 01:03:54.791484 sudo[1719]: pam_unix(sudo:session): session closed for user root May 14 01:03:55.058544 sshd[1718]: Connection closed by 172.24.4.1 port 43230 May 14 01:03:55.059704 sshd-session[1699]: pam_unix(sshd:session): session closed for user core May 14 01:03:55.079003 systemd[1]: sshd@7-172.24.4.64:22-172.24.4.1:43230.service: Deactivated successfully. May 14 01:03:55.082900 systemd[1]: session-10.scope: Deactivated successfully. May 14 01:03:55.086359 systemd-logind[1459]: Session 10 logged out. Waiting for processes to exit. May 14 01:03:55.090996 systemd[1]: Started sshd@8-172.24.4.64:22-172.24.4.1:47414.service - OpenSSH per-connection server daemon (172.24.4.1:47414). May 14 01:03:55.095166 systemd-logind[1459]: Removed session 10. May 14 01:03:56.413868 sshd[1750]: Accepted publickey for core from 172.24.4.1 port 47414 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:03:56.416894 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:03:56.430155 systemd-logind[1459]: New session 11 of user core. May 14 01:03:56.438423 systemd[1]: Started session-11.scope - Session 11 of User core. May 14 01:03:56.931076 sudo[1754]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 14 01:03:56.931957 sudo[1754]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 14 01:03:57.626906 systemd[1]: Starting docker.service - Docker Application Container Engine... 
May 14 01:03:57.637334 (dockerd)[1773]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 14 01:03:58.170586 dockerd[1773]: time="2025-05-14T01:03:58.170452960Z" level=info msg="Starting up" May 14 01:03:58.173753 dockerd[1773]: time="2025-05-14T01:03:58.173686094Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 14 01:03:58.263297 systemd[1]: var-lib-docker-metacopy\x2dcheck1665646234-merged.mount: Deactivated successfully. May 14 01:03:58.308968 dockerd[1773]: time="2025-05-14T01:03:58.308931537Z" level=info msg="Loading containers: start." May 14 01:03:58.517225 kernel: Initializing XFRM netlink socket May 14 01:03:58.625633 systemd-networkd[1384]: docker0: Link UP May 14 01:03:58.678576 dockerd[1773]: time="2025-05-14T01:03:58.678509911Z" level=info msg="Loading containers: done." May 14 01:03:58.693234 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck80587245-merged.mount: Deactivated successfully. May 14 01:03:58.705410 dockerd[1773]: time="2025-05-14T01:03:58.705347906Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 14 01:03:58.705531 dockerd[1773]: time="2025-05-14T01:03:58.705503177Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 14 01:03:58.705758 dockerd[1773]: time="2025-05-14T01:03:58.705719883Z" level=info msg="Daemon has completed initialization" May 14 01:03:58.770464 dockerd[1773]: time="2025-05-14T01:03:58.770410811Z" level=info msg="API listen on /run/docker.sock" May 14 01:03:58.770753 systemd[1]: Started docker.service - Docker Application Container Engine. 
May 14 01:04:00.308503 containerd[1486]: time="2025-05-14T01:04:00.308321933Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\"" May 14 01:04:01.057007 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1953873759.mount: Deactivated successfully. May 14 01:04:02.861583 containerd[1486]: time="2025-05-14T01:04:02.861516759Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:02.862732 containerd[1486]: time="2025-05-14T01:04:02.862670864Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.4: active requests=0, bytes read=28682887" May 14 01:04:02.863579 containerd[1486]: time="2025-05-14T01:04:02.863513114Z" level=info msg="ImageCreate event name:\"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:02.866406 containerd[1486]: time="2025-05-14T01:04:02.866356015Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:02.867732 containerd[1486]: time="2025-05-14T01:04:02.867509970Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.4\" with image id \"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\", size \"28679679\" in 2.559118466s" May 14 01:04:02.867732 containerd[1486]: time="2025-05-14T01:04:02.867552179Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\" returns image reference \"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\"" May 14 01:04:02.868201 containerd[1486]: 
time="2025-05-14T01:04:02.868121416Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\"" May 14 01:04:04.053010 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 14 01:04:04.057029 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:04:04.438942 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 01:04:04.452952 (kubelet)[2035]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 01:04:04.538077 kubelet[2035]: E0514 01:04:04.537559 2035 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 01:04:04.540445 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 01:04:04.540579 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 01:04:04.541278 systemd[1]: kubelet.service: Consumed 194ms CPU time, 101.9M memory peak. 
May 14 01:04:05.139993 containerd[1486]: time="2025-05-14T01:04:05.139943398Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:05.141330 containerd[1486]: time="2025-05-14T01:04:05.141238256Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.4: active requests=0, bytes read=24779597" May 14 01:04:05.142341 containerd[1486]: time="2025-05-14T01:04:05.142281954Z" level=info msg="ImageCreate event name:\"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:05.145354 containerd[1486]: time="2025-05-14T01:04:05.145311666Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:05.147021 containerd[1486]: time="2025-05-14T01:04:05.146918470Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.4\" with image id \"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\", size \"26267962\" in 2.278652232s" May 14 01:04:05.147021 containerd[1486]: time="2025-05-14T01:04:05.146950810Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\" returns image reference \"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\"" May 14 01:04:05.147583 containerd[1486]: time="2025-05-14T01:04:05.147412286Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\"" May 14 01:04:07.146310 containerd[1486]: time="2025-05-14T01:04:07.146228776Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/kube-scheduler:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:07.147651 containerd[1486]: time="2025-05-14T01:04:07.147397157Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.4: active requests=0, bytes read=19169946" May 14 01:04:07.148661 containerd[1486]: time="2025-05-14T01:04:07.148602758Z" level=info msg="ImageCreate event name:\"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:07.151757 containerd[1486]: time="2025-05-14T01:04:07.151713532Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:07.153395 containerd[1486]: time="2025-05-14T01:04:07.153349981Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.4\" with image id \"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\", size \"20658329\" in 2.005911296s" May 14 01:04:07.153454 containerd[1486]: time="2025-05-14T01:04:07.153393172Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\" returns image reference \"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\"" May 14 01:04:07.154207 containerd[1486]: time="2025-05-14T01:04:07.154175109Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\"" May 14 01:04:08.543185 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1814864239.mount: Deactivated successfully. 
May 14 01:04:09.083858 containerd[1486]: time="2025-05-14T01:04:09.083697788Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:09.084955 containerd[1486]: time="2025-05-14T01:04:09.084756644Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.4: active requests=0, bytes read=30917864" May 14 01:04:09.086064 containerd[1486]: time="2025-05-14T01:04:09.085981441Z" level=info msg="ImageCreate event name:\"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:09.088479 containerd[1486]: time="2025-05-14T01:04:09.088436666Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:09.089434 containerd[1486]: time="2025-05-14T01:04:09.089117393Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.4\" with image id \"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\", repo tag \"registry.k8s.io/kube-proxy:v1.32.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\", size \"30916875\" in 1.934907689s" May 14 01:04:09.089434 containerd[1486]: time="2025-05-14T01:04:09.089165012Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\" returns image reference \"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\"" May 14 01:04:09.089878 containerd[1486]: time="2025-05-14T01:04:09.089660271Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 14 01:04:09.712227 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1385305384.mount: Deactivated successfully. 
May 14 01:04:10.929439 containerd[1486]: time="2025-05-14T01:04:10.929309002Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:10.933313 containerd[1486]: time="2025-05-14T01:04:10.933147401Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" May 14 01:04:10.935458 containerd[1486]: time="2025-05-14T01:04:10.935304927Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:10.942115 containerd[1486]: time="2025-05-14T01:04:10.941428271Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:10.945566 containerd[1486]: time="2025-05-14T01:04:10.944346224Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.854644415s" May 14 01:04:10.945566 containerd[1486]: time="2025-05-14T01:04:10.944421946Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 14 01:04:10.945566 containerd[1486]: time="2025-05-14T01:04:10.945156273Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 14 01:04:11.525534 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2411878433.mount: Deactivated successfully. 
May 14 01:04:11.537977 containerd[1486]: time="2025-05-14T01:04:11.537836427Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 01:04:11.540708 containerd[1486]: time="2025-05-14T01:04:11.540311938Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 14 01:04:11.542594 containerd[1486]: time="2025-05-14T01:04:11.542469020Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 01:04:11.547403 containerd[1486]: time="2025-05-14T01:04:11.547328700Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 14 01:04:11.549817 containerd[1486]: time="2025-05-14T01:04:11.549550152Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 604.334798ms" May 14 01:04:11.549817 containerd[1486]: time="2025-05-14T01:04:11.549615755Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 14 01:04:11.550945 containerd[1486]: time="2025-05-14T01:04:11.550577536Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" May 14 01:04:12.213643 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2810459543.mount: 
Deactivated successfully. May 14 01:04:12.968152 update_engine[1460]: I20250514 01:04:12.968086 1460 update_attempter.cc:509] Updating boot flags... May 14 01:04:13.012072 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2171) May 14 01:04:13.103072 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 39 scanned by (udev-worker) (2169) May 14 01:04:14.552860 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 14 01:04:14.556071 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:04:14.767256 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 01:04:14.776435 (kubelet)[2189]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 14 01:04:14.824829 kubelet[2189]: E0514 01:04:14.824572 2189 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 14 01:04:14.828157 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 14 01:04:14.828296 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 14 01:04:14.828552 systemd[1]: kubelet.service: Consumed 209ms CPU time, 105.7M memory peak. 
May 14 01:04:15.216255 containerd[1486]: time="2025-05-14T01:04:15.215914198Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:15.217567 containerd[1486]: time="2025-05-14T01:04:15.217518396Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551368" May 14 01:04:15.219161 containerd[1486]: time="2025-05-14T01:04:15.219100793Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:15.223778 containerd[1486]: time="2025-05-14T01:04:15.223260468Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:15.224481 containerd[1486]: time="2025-05-14T01:04:15.224446849Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 3.673816273s" May 14 01:04:15.224534 containerd[1486]: time="2025-05-14T01:04:15.224480083Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" May 14 01:04:19.685592 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 01:04:19.687369 systemd[1]: kubelet.service: Consumed 209ms CPU time, 105.7M memory peak. May 14 01:04:19.692600 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:04:19.743772 systemd[1]: Reload requested from client PID 2227 ('systemctl') (unit session-11.scope)... 
May 14 01:04:19.743942 systemd[1]: Reloading... May 14 01:04:19.838082 zram_generator::config[2273]: No configuration found. May 14 01:04:20.007023 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 01:04:20.132050 systemd[1]: Reloading finished in 387 ms. May 14 01:04:20.684480 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 14 01:04:20.684664 systemd[1]: kubelet.service: Failed with result 'signal'. May 14 01:04:20.685255 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 01:04:20.685350 systemd[1]: kubelet.service: Consumed 114ms CPU time, 77.5M memory peak. May 14 01:04:20.691822 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:04:21.640155 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 14 01:04:21.653989 (kubelet)[2337]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 14 01:04:21.729695 kubelet[2337]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 01:04:21.731116 kubelet[2337]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 14 01:04:21.731116 kubelet[2337]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
May 14 01:04:21.731116 kubelet[2337]: I0514 01:04:21.730099 2337 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 14 01:04:22.295658 kubelet[2337]: I0514 01:04:22.295602 2337 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" May 14 01:04:22.295658 kubelet[2337]: I0514 01:04:22.295631 2337 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 14 01:04:22.295920 kubelet[2337]: I0514 01:04:22.295902 2337 server.go:954] "Client rotation is on, will bootstrap in background" May 14 01:04:22.328306 kubelet[2337]: E0514 01:04:22.328109 2337 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.24.4.64:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.24.4.64:6443: connect: connection refused" logger="UnhandledError" May 14 01:04:22.331772 kubelet[2337]: I0514 01:04:22.331669 2337 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 01:04:22.349456 kubelet[2337]: I0514 01:04:22.349244 2337 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 14 01:04:22.352246 kubelet[2337]: I0514 01:04:22.352221 2337 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 14 01:04:22.352422 kubelet[2337]: I0514 01:04:22.352399 2337 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 14 01:04:22.352625 kubelet[2337]: I0514 01:04:22.352424 2337 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-4a8b92fa55.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 14 01:04:22.352625 kubelet[2337]: I0514 01:04:22.352611 2337 topology_manager.go:138] "Creating topology 
manager with none policy" May 14 01:04:22.352625 kubelet[2337]: I0514 01:04:22.352622 2337 container_manager_linux.go:304] "Creating device plugin manager" May 14 01:04:22.353027 kubelet[2337]: I0514 01:04:22.352735 2337 state_mem.go:36] "Initialized new in-memory state store" May 14 01:04:22.358777 kubelet[2337]: I0514 01:04:22.358679 2337 kubelet.go:446] "Attempting to sync node with API server" May 14 01:04:22.358777 kubelet[2337]: I0514 01:04:22.358697 2337 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 14 01:04:22.358777 kubelet[2337]: I0514 01:04:22.358718 2337 kubelet.go:352] "Adding apiserver pod source" May 14 01:04:22.358777 kubelet[2337]: I0514 01:04:22.358729 2337 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 14 01:04:22.365736 kubelet[2337]: W0514 01:04:22.364877 2337 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.64:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.64:6443: connect: connection refused May 14 01:04:22.365736 kubelet[2337]: E0514 01:04:22.364930 2337 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.64:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.64:6443: connect: connection refused" logger="UnhandledError" May 14 01:04:22.365736 kubelet[2337]: W0514 01:04:22.365009 2337 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.64:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-4a8b92fa55.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.64:6443: connect: connection refused May 14 01:04:22.365736 kubelet[2337]: E0514 01:04:22.365062 2337 reflector.go:166] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.24.4.64:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-4a8b92fa55.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.64:6443: connect: connection refused" logger="UnhandledError" May 14 01:04:22.365736 kubelet[2337]: I0514 01:04:22.365205 2337 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 14 01:04:22.365736 kubelet[2337]: I0514 01:04:22.365606 2337 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 14 01:04:22.367079 kubelet[2337]: W0514 01:04:22.366459 2337 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 14 01:04:22.368994 kubelet[2337]: I0514 01:04:22.368980 2337 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 14 01:04:22.369106 kubelet[2337]: I0514 01:04:22.369097 2337 server.go:1287] "Started kubelet" May 14 01:04:22.376575 kubelet[2337]: I0514 01:04:22.376559 2337 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 14 01:04:22.381053 kubelet[2337]: I0514 01:04:22.380947 2337 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 14 01:04:22.383686 kubelet[2337]: I0514 01:04:22.383671 2337 server.go:490] "Adding debug handlers to kubelet server" May 14 01:04:22.385525 kubelet[2337]: E0514 01:04:22.383090 2337 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.64:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.64:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-n-4a8b92fa55.novalocal.183f3f3185850b3f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-4a8b92fa55.novalocal,UID:ci-4284-0-0-n-4a8b92fa55.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-4a8b92fa55.novalocal,},FirstTimestamp:2025-05-14 01:04:22.369078079 +0000 UTC m=+0.707682643,LastTimestamp:2025-05-14 01:04:22.369078079 +0000 UTC m=+0.707682643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-4a8b92fa55.novalocal,}" May 14 01:04:22.385696 kubelet[2337]: I0514 01:04:22.385651 2337 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 14 01:04:22.386013 kubelet[2337]: I0514 01:04:22.385998 2337 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 14 01:04:22.386320 kubelet[2337]: I0514 01:04:22.386305 2337 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 14 01:04:22.387909 kubelet[2337]: I0514 01:04:22.387244 2337 volume_manager.go:297] "Starting Kubelet Volume Manager" May 14 01:04:22.387909 kubelet[2337]: E0514 01:04:22.387699 2337 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" May 14 01:04:22.388764 kubelet[2337]: I0514 01:04:22.388751 2337 factory.go:221] Registration of the systemd container factory successfully May 14 01:04:22.388985 kubelet[2337]: I0514 01:04:22.388969 2337 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 14 01:04:22.390094 kubelet[2337]: E0514 01:04:22.390071 2337 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://172.24.4.64:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-4a8b92fa55.novalocal?timeout=10s\": dial tcp 172.24.4.64:6443: connect: connection refused" interval="200ms" May 14 01:04:22.390210 kubelet[2337]: I0514 01:04:22.390199 2337 reconciler.go:26] "Reconciler: start to sync state" May 14 01:04:22.390638 kubelet[2337]: I0514 01:04:22.390284 2337 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 14 01:04:22.390638 kubelet[2337]: W0514 01:04:22.390568 2337 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.64:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.64:6443: connect: connection refused May 14 01:04:22.390638 kubelet[2337]: E0514 01:04:22.390608 2337 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.64:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.64:6443: connect: connection refused" logger="UnhandledError" May 14 01:04:22.392647 kubelet[2337]: I0514 01:04:22.391316 2337 factory.go:221] Registration of the containerd container factory successfully May 14 01:04:22.413345 kubelet[2337]: I0514 01:04:22.413252 2337 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 14 01:04:22.414392 kubelet[2337]: I0514 01:04:22.414346 2337 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 14 01:04:22.414434 kubelet[2337]: I0514 01:04:22.414395 2337 status_manager.go:227] "Starting to sync pod status with apiserver" May 14 01:04:22.414467 kubelet[2337]: I0514 01:04:22.414436 2337 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 14 01:04:22.414467 kubelet[2337]: I0514 01:04:22.414452 2337 kubelet.go:2388] "Starting kubelet main sync loop" May 14 01:04:22.414607 kubelet[2337]: E0514 01:04:22.414540 2337 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 14 01:04:22.426018 kubelet[2337]: W0514 01:04:22.425922 2337 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.64:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.64:6443: connect: connection refused May 14 01:04:22.426271 kubelet[2337]: E0514 01:04:22.426135 2337 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.64:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.64:6443: connect: connection refused" logger="UnhandledError" May 14 01:04:22.430889 kubelet[2337]: I0514 01:04:22.430871 2337 cpu_manager.go:221] "Starting CPU manager" policy="none" May 14 01:04:22.431099 kubelet[2337]: I0514 01:04:22.431007 2337 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 14 01:04:22.431099 kubelet[2337]: I0514 01:04:22.431026 2337 state_mem.go:36] "Initialized new in-memory state store" May 14 01:04:22.437303 kubelet[2337]: I0514 01:04:22.437105 2337 policy_none.go:49] "None policy: Start" May 14 01:04:22.437303 kubelet[2337]: I0514 01:04:22.437122 2337 memory_manager.go:186] "Starting memorymanager" policy="None" May 14 01:04:22.437303 kubelet[2337]: I0514 01:04:22.437133 2337 state_mem.go:35] "Initializing new in-memory state store" May 14 01:04:22.447362 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 14 01:04:22.459345 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
May 14 01:04:22.464348 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 14 01:04:22.472522 kubelet[2337]: I0514 01:04:22.471940 2337 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 14 01:04:22.472522 kubelet[2337]: I0514 01:04:22.472118 2337 eviction_manager.go:189] "Eviction manager: starting control loop" May 14 01:04:22.472522 kubelet[2337]: I0514 01:04:22.472129 2337 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 14 01:04:22.472522 kubelet[2337]: I0514 01:04:22.472365 2337 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 14 01:04:22.474043 kubelet[2337]: E0514 01:04:22.474012 2337 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 14 01:04:22.474162 kubelet[2337]: E0514 01:04:22.474150 2337 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" May 14 01:04:22.535323 systemd[1]: Created slice kubepods-burstable-podbaca503596635a962767924c91d74317.slice - libcontainer container kubepods-burstable-podbaca503596635a962767924c91d74317.slice. May 14 01:04:22.554654 kubelet[2337]: E0514 01:04:22.554299 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.563783 systemd[1]: Created slice kubepods-burstable-pod7df2f3ce5247e92ba25e31fddc339c0a.slice - libcontainer container kubepods-burstable-pod7df2f3ce5247e92ba25e31fddc339c0a.slice. 
May 14 01:04:22.568491 kubelet[2337]: E0514 01:04:22.568409 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.575494 systemd[1]: Created slice kubepods-burstable-pod62ef88b4a2822e6a77c821a89794081a.slice - libcontainer container kubepods-burstable-pod62ef88b4a2822e6a77c821a89794081a.slice. May 14 01:04:22.577908 kubelet[2337]: I0514 01:04:22.577595 2337 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.578882 kubelet[2337]: E0514 01:04:22.578759 2337 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.24.4.64:6443/api/v1/nodes\": dial tcp 172.24.4.64:6443: connect: connection refused" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.581291 kubelet[2337]: E0514 01:04:22.581194 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591267 kubelet[2337]: I0514 01:04:22.590749 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591267 kubelet[2337]: I0514 01:04:22.590822 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " 
pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591267 kubelet[2337]: I0514 01:04:22.590872 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591267 kubelet[2337]: I0514 01:04:22.590922 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/baca503596635a962767924c91d74317-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"baca503596635a962767924c91d74317\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591592 kubelet[2337]: I0514 01:04:22.590964 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/baca503596635a962767924c91d74317-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"baca503596635a962767924c91d74317\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591592 kubelet[2337]: I0514 01:04:22.591011 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/baca503596635a962767924c91d74317-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"baca503596635a962767924c91d74317\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591592 kubelet[2337]: E0514 01:04:22.591006 2337 controller.go:145] "Failed to ensure lease exists, 
will retry" err="Get \"https://172.24.4.64:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-4a8b92fa55.novalocal?timeout=10s\": dial tcp 172.24.4.64:6443: connect: connection refused" interval="400ms" May 14 01:04:22.591592 kubelet[2337]: I0514 01:04:22.591102 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591592 kubelet[2337]: I0514 01:04:22.591151 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.591890 kubelet[2337]: I0514 01:04:22.591195 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7df2f3ce5247e92ba25e31fddc339c0a-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"7df2f3ce5247e92ba25e31fddc339c0a\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.782020 kubelet[2337]: I0514 01:04:22.781922 2337 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.783124 kubelet[2337]: E0514 01:04:22.782563 2337 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.24.4.64:6443/api/v1/nodes\": dial tcp 172.24.4.64:6443: connect: connection refused" 
node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:22.857311 containerd[1486]: time="2025-05-14T01:04:22.857099566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal,Uid:baca503596635a962767924c91d74317,Namespace:kube-system,Attempt:0,}" May 14 01:04:22.871977 containerd[1486]: time="2025-05-14T01:04:22.871482472Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal,Uid:7df2f3ce5247e92ba25e31fddc339c0a,Namespace:kube-system,Attempt:0,}" May 14 01:04:22.882836 containerd[1486]: time="2025-05-14T01:04:22.882638805Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal,Uid:62ef88b4a2822e6a77c821a89794081a,Namespace:kube-system,Attempt:0,}" May 14 01:04:22.941599 containerd[1486]: time="2025-05-14T01:04:22.941409822Z" level=info msg="connecting to shim 3f6d05eb161aae6f2efdf5c6ac84395b7d174e52d71ef22b2a8fb29789f9e5cb" address="unix:///run/containerd/s/372df0cb2e82c08921245b5f044f0d176c735d3489561a6626ffcc67fdeab9ed" namespace=k8s.io protocol=ttrpc version=3 May 14 01:04:22.971413 containerd[1486]: time="2025-05-14T01:04:22.971277706Z" level=info msg="connecting to shim c4bcc2ad08da6615f4a910e6e1a55e5b630d8fd6ed385c55dbe2610ea30ccb84" address="unix:///run/containerd/s/356b4262aaa41475936d99dddb971c67e853aeaa1532e2ebd1fe829f157a8519" namespace=k8s.io protocol=ttrpc version=3 May 14 01:04:22.992458 kubelet[2337]: E0514 01:04:22.992329 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.64:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-4a8b92fa55.novalocal?timeout=10s\": dial tcp 172.24.4.64:6443: connect: connection refused" interval="800ms" May 14 01:04:22.996298 containerd[1486]: time="2025-05-14T01:04:22.996162196Z" level=info msg="connecting to shim 
9bae8999dedc873272a2055dd9b5ecedab6905a24ea807480817b9f883a53c86" address="unix:///run/containerd/s/c8fad915d2318ee45a28dcc9be6d4ad9d34a88d0c89a182d3228c9a2cd60e62c" namespace=k8s.io protocol=ttrpc version=3 May 14 01:04:23.006234 systemd[1]: Started cri-containerd-3f6d05eb161aae6f2efdf5c6ac84395b7d174e52d71ef22b2a8fb29789f9e5cb.scope - libcontainer container 3f6d05eb161aae6f2efdf5c6ac84395b7d174e52d71ef22b2a8fb29789f9e5cb. May 14 01:04:23.028261 systemd[1]: Started cri-containerd-c4bcc2ad08da6615f4a910e6e1a55e5b630d8fd6ed385c55dbe2610ea30ccb84.scope - libcontainer container c4bcc2ad08da6615f4a910e6e1a55e5b630d8fd6ed385c55dbe2610ea30ccb84. May 14 01:04:23.035902 systemd[1]: Started cri-containerd-9bae8999dedc873272a2055dd9b5ecedab6905a24ea807480817b9f883a53c86.scope - libcontainer container 9bae8999dedc873272a2055dd9b5ecedab6905a24ea807480817b9f883a53c86. May 14 01:04:23.069904 containerd[1486]: time="2025-05-14T01:04:23.069852422Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal,Uid:baca503596635a962767924c91d74317,Namespace:kube-system,Attempt:0,} returns sandbox id \"3f6d05eb161aae6f2efdf5c6ac84395b7d174e52d71ef22b2a8fb29789f9e5cb\"" May 14 01:04:23.075291 containerd[1486]: time="2025-05-14T01:04:23.075264210Z" level=info msg="CreateContainer within sandbox \"3f6d05eb161aae6f2efdf5c6ac84395b7d174e52d71ef22b2a8fb29789f9e5cb\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 14 01:04:23.097169 containerd[1486]: time="2025-05-14T01:04:23.097137016Z" level=info msg="Container 4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825: CDI devices from CRI Config.CDIDevices: []" May 14 01:04:23.104514 containerd[1486]: time="2025-05-14T01:04:23.104392217Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal,Uid:7df2f3ce5247e92ba25e31fddc339c0a,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"c4bcc2ad08da6615f4a910e6e1a55e5b630d8fd6ed385c55dbe2610ea30ccb84\"" May 14 01:04:23.110211 containerd[1486]: time="2025-05-14T01:04:23.110103077Z" level=info msg="CreateContainer within sandbox \"c4bcc2ad08da6615f4a910e6e1a55e5b630d8fd6ed385c55dbe2610ea30ccb84\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 14 01:04:23.124497 containerd[1486]: time="2025-05-14T01:04:23.124451025Z" level=info msg="Container be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369: CDI devices from CRI Config.CDIDevices: []" May 14 01:04:23.126067 containerd[1486]: time="2025-05-14T01:04:23.126018961Z" level=info msg="CreateContainer within sandbox \"3f6d05eb161aae6f2efdf5c6ac84395b7d174e52d71ef22b2a8fb29789f9e5cb\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825\"" May 14 01:04:23.128212 containerd[1486]: time="2025-05-14T01:04:23.128176314Z" level=info msg="StartContainer for \"4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825\"" May 14 01:04:23.129253 containerd[1486]: time="2025-05-14T01:04:23.129222360Z" level=info msg="connecting to shim 4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825" address="unix:///run/containerd/s/372df0cb2e82c08921245b5f044f0d176c735d3489561a6626ffcc67fdeab9ed" protocol=ttrpc version=3 May 14 01:04:23.132867 containerd[1486]: time="2025-05-14T01:04:23.131200666Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal,Uid:62ef88b4a2822e6a77c821a89794081a,Namespace:kube-system,Attempt:0,} returns sandbox id \"9bae8999dedc873272a2055dd9b5ecedab6905a24ea807480817b9f883a53c86\"" May 14 01:04:23.136149 containerd[1486]: time="2025-05-14T01:04:23.136113267Z" level=info msg="CreateContainer within sandbox \"9bae8999dedc873272a2055dd9b5ecedab6905a24ea807480817b9f883a53c86\" for container 
&ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 14 01:04:23.140574 containerd[1486]: time="2025-05-14T01:04:23.140540134Z" level=info msg="CreateContainer within sandbox \"c4bcc2ad08da6615f4a910e6e1a55e5b630d8fd6ed385c55dbe2610ea30ccb84\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369\"" May 14 01:04:23.141284 containerd[1486]: time="2025-05-14T01:04:23.141254707Z" level=info msg="StartContainer for \"be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369\"" May 14 01:04:23.142497 containerd[1486]: time="2025-05-14T01:04:23.142332702Z" level=info msg="connecting to shim be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369" address="unix:///run/containerd/s/356b4262aaa41475936d99dddb971c67e853aeaa1532e2ebd1fe829f157a8519" protocol=ttrpc version=3 May 14 01:04:23.157375 systemd[1]: Started cri-containerd-4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825.scope - libcontainer container 4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825. May 14 01:04:23.158150 containerd[1486]: time="2025-05-14T01:04:23.157854435Z" level=info msg="Container 285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e: CDI devices from CRI Config.CDIDevices: []" May 14 01:04:23.167468 systemd[1]: Started cri-containerd-be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369.scope - libcontainer container be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369. 
May 14 01:04:23.171599 containerd[1486]: time="2025-05-14T01:04:23.171498160Z" level=info msg="CreateContainer within sandbox \"9bae8999dedc873272a2055dd9b5ecedab6905a24ea807480817b9f883a53c86\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e\"" May 14 01:04:23.172320 containerd[1486]: time="2025-05-14T01:04:23.172141810Z" level=info msg="StartContainer for \"285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e\"" May 14 01:04:23.174429 containerd[1486]: time="2025-05-14T01:04:23.173348958Z" level=info msg="connecting to shim 285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e" address="unix:///run/containerd/s/c8fad915d2318ee45a28dcc9be6d4ad9d34a88d0c89a182d3228c9a2cd60e62c" protocol=ttrpc version=3 May 14 01:04:23.186481 kubelet[2337]: I0514 01:04:23.186138 2337 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:23.186481 kubelet[2337]: E0514 01:04:23.186451 2337 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.24.4.64:6443/api/v1/nodes\": dial tcp 172.24.4.64:6443: connect: connection refused" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:23.200186 systemd[1]: Started cri-containerd-285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e.scope - libcontainer container 285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e. 
May 14 01:04:23.252174 containerd[1486]: time="2025-05-14T01:04:23.251385598Z" level=info msg="StartContainer for \"4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825\" returns successfully" May 14 01:04:23.268407 containerd[1486]: time="2025-05-14T01:04:23.268360682Z" level=info msg="StartContainer for \"be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369\" returns successfully" May 14 01:04:23.296684 containerd[1486]: time="2025-05-14T01:04:23.296343979Z" level=info msg="StartContainer for \"285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e\" returns successfully" May 14 01:04:23.434884 kubelet[2337]: E0514 01:04:23.434796 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:23.439058 kubelet[2337]: E0514 01:04:23.437848 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:23.443335 kubelet[2337]: E0514 01:04:23.443212 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:23.989027 kubelet[2337]: I0514 01:04:23.988727 2337 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:24.445807 kubelet[2337]: E0514 01:04:24.445486 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:24.446137 kubelet[2337]: E0514 01:04:24.446116 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the 
cluster" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:25.308431 kubelet[2337]: E0514 01:04:25.308394 2337 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:25.365244 kubelet[2337]: I0514 01:04:25.365094 2337 apiserver.go:52] "Watching apiserver" May 14 01:04:25.377313 kubelet[2337]: I0514 01:04:25.377119 2337 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:25.389800 kubelet[2337]: I0514 01:04:25.389607 2337 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:25.391122 kubelet[2337]: I0514 01:04:25.391088 2337 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 14 01:04:25.438432 kubelet[2337]: E0514 01:04:25.438275 2337 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:25.438432 kubelet[2337]: I0514 01:04:25.438298 2337 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:25.447174 kubelet[2337]: E0514 01:04:25.447156 2337 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:25.447428 kubelet[2337]: I0514 01:04:25.447301 2337 kubelet.go:3200] "Creating a mirror pod for static pod" 
pod="kube-system/kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:25.459198 kubelet[2337]: E0514 01:04:25.459170 2337 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:27.400761 kubelet[2337]: I0514 01:04:27.400364 2337 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:27.426823 kubelet[2337]: W0514 01:04:27.425933 2337 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 01:04:27.763848 systemd[1]: Reload requested from client PID 2601 ('systemctl') (unit session-11.scope)... May 14 01:04:27.763886 systemd[1]: Reloading... May 14 01:04:27.883068 zram_generator::config[2643]: No configuration found. May 14 01:04:28.026248 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 14 01:04:28.162845 systemd[1]: Reloading finished in 397 ms. May 14 01:04:28.190177 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:04:28.211276 systemd[1]: kubelet.service: Deactivated successfully. May 14 01:04:28.211557 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 14 01:04:28.211634 systemd[1]: kubelet.service: Consumed 1.284s CPU time, 124.4M memory peak. May 14 01:04:28.214309 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 14 01:04:28.487736 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 14 01:04:28.496534 (kubelet)[2710]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 14 01:04:28.538222 kubelet[2710]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 01:04:28.538548 kubelet[2710]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 14 01:04:28.538611 kubelet[2710]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 14 01:04:28.538760 kubelet[2710]: I0514 01:04:28.538733 2710 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 14 01:04:28.545967 kubelet[2710]: I0514 01:04:28.545937 2710 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" May 14 01:04:28.547055 kubelet[2710]: I0514 01:04:28.546121 2710 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 14 01:04:28.547055 kubelet[2710]: I0514 01:04:28.546708 2710 server.go:954] "Client rotation is on, will bootstrap in background" May 14 01:04:28.550835 kubelet[2710]: I0514 01:04:28.550818 2710 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
May 14 01:04:28.553555 kubelet[2710]: I0514 01:04:28.553539 2710 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 14 01:04:28.562759 kubelet[2710]: I0514 01:04:28.562738 2710 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 14 01:04:28.567952 kubelet[2710]: I0514 01:04:28.567924 2710 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 14 01:04:28.570254 kubelet[2710]: I0514 01:04:28.570192 2710 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 14 01:04:28.570563 kubelet[2710]: I0514 01:04:28.570361 2710 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-4a8b92fa55.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManage
rPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 14 01:04:28.570710 kubelet[2710]: I0514 01:04:28.570699 2710 topology_manager.go:138] "Creating topology manager with none policy" May 14 01:04:28.570817 kubelet[2710]: I0514 01:04:28.570790 2710 container_manager_linux.go:304] "Creating device plugin manager" May 14 01:04:28.570927 kubelet[2710]: I0514 01:04:28.570918 2710 state_mem.go:36] "Initialized new in-memory state store" May 14 01:04:28.571178 kubelet[2710]: I0514 01:04:28.571168 2710 kubelet.go:446] "Attempting to sync node with API server" May 14 01:04:28.571281 kubelet[2710]: I0514 01:04:28.571271 2710 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 14 01:04:28.571429 kubelet[2710]: I0514 01:04:28.571419 2710 kubelet.go:352] "Adding apiserver pod source" May 14 01:04:28.571581 kubelet[2710]: I0514 01:04:28.571508 2710 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 14 01:04:28.579422 kubelet[2710]: I0514 01:04:28.579378 2710 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 14 01:04:28.580614 kubelet[2710]: I0514 01:04:28.580420 2710 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 14 01:04:28.586357 kubelet[2710]: I0514 01:04:28.585400 2710 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 14 01:04:28.586357 kubelet[2710]: I0514 01:04:28.585473 2710 server.go:1287] "Started kubelet" May 14 01:04:28.592668 kubelet[2710]: I0514 01:04:28.591374 2710 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 14 01:04:28.600099 kubelet[2710]: 
I0514 01:04:28.598549 2710 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 14 01:04:28.602083 kubelet[2710]: I0514 01:04:28.601484 2710 server.go:490] "Adding debug handlers to kubelet server" May 14 01:04:28.602672 kubelet[2710]: I0514 01:04:28.602634 2710 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 14 01:04:28.602887 kubelet[2710]: I0514 01:04:28.602871 2710 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 14 01:04:28.604109 kubelet[2710]: I0514 01:04:28.603507 2710 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 14 01:04:28.605683 kubelet[2710]: I0514 01:04:28.605542 2710 volume_manager.go:297] "Starting Kubelet Volume Manager" May 14 01:04:28.605918 kubelet[2710]: E0514 01:04:28.605901 2710 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-4a8b92fa55.novalocal\" not found" May 14 01:04:28.607535 kubelet[2710]: I0514 01:04:28.607520 2710 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 14 01:04:28.607686 kubelet[2710]: I0514 01:04:28.607675 2710 reconciler.go:26] "Reconciler: start to sync state" May 14 01:04:28.610134 kubelet[2710]: I0514 01:04:28.610111 2710 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 14 01:04:28.611633 kubelet[2710]: I0514 01:04:28.611617 2710 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 14 01:04:28.611729 kubelet[2710]: I0514 01:04:28.611720 2710 status_manager.go:227] "Starting to sync pod status with apiserver" May 14 01:04:28.611798 kubelet[2710]: I0514 01:04:28.611788 2710 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 14 01:04:28.611851 kubelet[2710]: I0514 01:04:28.611843 2710 kubelet.go:2388] "Starting kubelet main sync loop" May 14 01:04:28.611950 kubelet[2710]: E0514 01:04:28.611934 2710 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 14 01:04:28.623072 kubelet[2710]: I0514 01:04:28.623045 2710 factory.go:221] Registration of the containerd container factory successfully May 14 01:04:28.623072 kubelet[2710]: I0514 01:04:28.623064 2710 factory.go:221] Registration of the systemd container factory successfully May 14 01:04:28.623226 kubelet[2710]: I0514 01:04:28.623149 2710 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 14 01:04:28.671141 kubelet[2710]: I0514 01:04:28.671115 2710 cpu_manager.go:221] "Starting CPU manager" policy="none" May 14 01:04:28.671296 kubelet[2710]: I0514 01:04:28.671284 2710 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 14 01:04:28.672054 kubelet[2710]: I0514 01:04:28.671362 2710 state_mem.go:36] "Initialized new in-memory state store" May 14 01:04:28.672054 kubelet[2710]: I0514 01:04:28.671560 2710 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 14 01:04:28.672054 kubelet[2710]: I0514 01:04:28.671577 2710 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 14 01:04:28.672054 kubelet[2710]: I0514 01:04:28.671597 2710 policy_none.go:49] "None policy: Start" May 14 01:04:28.672054 kubelet[2710]: I0514 01:04:28.671605 2710 memory_manager.go:186] "Starting memorymanager" policy="None" May 14 01:04:28.672054 kubelet[2710]: I0514 01:04:28.671616 2710 state_mem.go:35] "Initializing new in-memory state store" May 14 01:04:28.672054 kubelet[2710]: I0514 01:04:28.671750 2710 state_mem.go:75] "Updated machine memory state" May 14 01:04:28.677116 kubelet[2710]: 
I0514 01:04:28.677098 2710 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 14 01:04:28.677508 kubelet[2710]: I0514 01:04:28.677495 2710 eviction_manager.go:189] "Eviction manager: starting control loop" May 14 01:04:28.677646 kubelet[2710]: I0514 01:04:28.677618 2710 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 14 01:04:28.677915 kubelet[2710]: I0514 01:04:28.677904 2710 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 14 01:04:28.679007 kubelet[2710]: E0514 01:04:28.678990 2710 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 14 01:04:28.713373 kubelet[2710]: I0514 01:04:28.713347 2710 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.713568 kubelet[2710]: I0514 01:04:28.713534 2710 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.713754 kubelet[2710]: I0514 01:04:28.713388 2710 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.725735 kubelet[2710]: W0514 01:04:28.725693 2710 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 01:04:28.726227 kubelet[2710]: W0514 01:04:28.726071 2710 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 14 01:04:28.728535 kubelet[2710]: W0514 01:04:28.728509 2710 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a 
DNS label is recommended: [must not contain dots] May 14 01:04:28.728607 kubelet[2710]: E0514 01:04:28.728590 2710 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal\" already exists" pod="kube-system/kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.782367 kubelet[2710]: I0514 01:04:28.780960 2710 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.794534 kubelet[2710]: I0514 01:04:28.794245 2710 kubelet_node_status.go:125] "Node was previously registered" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.794534 kubelet[2710]: I0514 01:04:28.794319 2710 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.910204 kubelet[2710]: I0514 01:04:28.909565 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.910204 kubelet[2710]: I0514 01:04:28.909669 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7df2f3ce5247e92ba25e31fddc339c0a-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"7df2f3ce5247e92ba25e31fddc339c0a\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.910204 kubelet[2710]: I0514 01:04:28.909716 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/baca503596635a962767924c91d74317-ca-certs\") pod 
\"kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"baca503596635a962767924c91d74317\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.910204 kubelet[2710]: I0514 01:04:28.909768 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/baca503596635a962767924c91d74317-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"baca503596635a962767924c91d74317\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.910605 kubelet[2710]: I0514 01:04:28.909816 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.910605 kubelet[2710]: I0514 01:04:28.909858 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:04:28.910605 kubelet[2710]: I0514 01:04:28.909899 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 
01:04:28.910605 kubelet[2710]: I0514 01:04:28.909948 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/baca503596635a962767924c91d74317-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"baca503596635a962767924c91d74317\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal"
May 14 01:04:28.910854 kubelet[2710]: I0514 01:04:28.909993 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/62ef88b4a2822e6a77c821a89794081a-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" (UID: \"62ef88b4a2822e6a77c821a89794081a\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal"
May 14 01:04:29.577131 kubelet[2710]: I0514 01:04:29.575211 2710 apiserver.go:52] "Watching apiserver"
May 14 01:04:29.607805 kubelet[2710]: I0514 01:04:29.607715 2710 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world"
May 14 01:04:29.648611 kubelet[2710]: I0514 01:04:29.648572 2710 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal"
May 14 01:04:29.651079 kubelet[2710]: I0514 01:04:29.650450 2710 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal"
May 14 01:04:29.664305 kubelet[2710]: W0514 01:04:29.664269 2710 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 14 01:04:29.664305 kubelet[2710]: E0514 01:04:29.664324 2710 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal"
May 14 01:04:29.665788 kubelet[2710]: W0514 01:04:29.665770 2710 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]
May 14 01:04:29.665853 kubelet[2710]: E0514 01:04:29.665807 2710 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal\" already exists" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal"
May 14 01:04:29.682770 kubelet[2710]: I0514 01:04:29.682707 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-4a8b92fa55.novalocal" podStartSLOduration=1.6826751789999999 podStartE2EDuration="1.682675179s" podCreationTimestamp="2025-05-14 01:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 01:04:29.682118865 +0000 UTC m=+1.181690339" watchObservedRunningTime="2025-05-14 01:04:29.682675179 +0000 UTC m=+1.182246653"
May 14 01:04:29.709226 kubelet[2710]: I0514 01:04:29.709138 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-n-4a8b92fa55.novalocal" podStartSLOduration=1.709120889 podStartE2EDuration="1.709120889s" podCreationTimestamp="2025-05-14 01:04:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 01:04:29.69688901 +0000 UTC m=+1.196460484" watchObservedRunningTime="2025-05-14 01:04:29.709120889 +0000 UTC m=+1.208692363"
May 14 01:04:34.232516 kubelet[2710]: I0514 01:04:34.232157 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-n-4a8b92fa55.novalocal" podStartSLOduration=7.232141563 podStartE2EDuration="7.232141563s" podCreationTimestamp="2025-05-14 01:04:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 01:04:29.709390536 +0000 UTC m=+1.208962010" watchObservedRunningTime="2025-05-14 01:04:34.232141563 +0000 UTC m=+5.731713037"
May 14 01:04:34.249831 kubelet[2710]: I0514 01:04:34.249802 2710 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
May 14 01:04:34.250152 containerd[1486]: time="2025-05-14T01:04:34.250116364Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
May 14 01:04:34.250645 kubelet[2710]: I0514 01:04:34.250316 2710 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
May 14 01:04:34.482939 sudo[1754]: pam_unix(sudo:session): session closed for user root
May 14 01:04:34.681126 sshd[1753]: Connection closed by 172.24.4.1 port 47414
May 14 01:04:34.683310 sshd-session[1750]: pam_unix(sshd:session): session closed for user core
May 14 01:04:34.691167 systemd[1]: sshd@8-172.24.4.64:22-172.24.4.1:47414.service: Deactivated successfully.
May 14 01:04:34.698399 systemd[1]: session-11.scope: Deactivated successfully.
May 14 01:04:34.698819 systemd[1]: session-11.scope: Consumed 7.358s CPU time, 229.4M memory peak.
May 14 01:04:34.704151 systemd-logind[1459]: Session 11 logged out. Waiting for processes to exit.
May 14 01:04:34.707899 systemd-logind[1459]: Removed session 11.
May 14 01:04:35.130115 kubelet[2710]: I0514 01:04:35.128766 2710 status_manager.go:890] "Failed to get status for pod" podUID="690c0157-750e-431c-a732-8886678cb104" pod="kube-system/kube-proxy-z4qkp" err="pods \"kube-proxy-z4qkp\" is forbidden: User \"system:node:ci-4284-0-0-n-4a8b92fa55.novalocal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4284-0-0-n-4a8b92fa55.novalocal' and this object"
May 14 01:04:35.139837 systemd[1]: Created slice kubepods-besteffort-pod690c0157_750e_431c_a732_8886678cb104.slice - libcontainer container kubepods-besteffort-pod690c0157_750e_431c_a732_8886678cb104.slice.
May 14 01:04:35.150768 kubelet[2710]: I0514 01:04:35.150712 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/690c0157-750e-431c-a732-8886678cb104-kube-proxy\") pod \"kube-proxy-z4qkp\" (UID: \"690c0157-750e-431c-a732-8886678cb104\") " pod="kube-system/kube-proxy-z4qkp"
May 14 01:04:35.150768 kubelet[2710]: I0514 01:04:35.150751 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/690c0157-750e-431c-a732-8886678cb104-xtables-lock\") pod \"kube-proxy-z4qkp\" (UID: \"690c0157-750e-431c-a732-8886678cb104\") " pod="kube-system/kube-proxy-z4qkp"
May 14 01:04:35.151016 kubelet[2710]: I0514 01:04:35.150877 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/690c0157-750e-431c-a732-8886678cb104-lib-modules\") pod \"kube-proxy-z4qkp\" (UID: \"690c0157-750e-431c-a732-8886678cb104\") " pod="kube-system/kube-proxy-z4qkp"
May 14 01:04:35.151016 kubelet[2710]: I0514 01:04:35.150899 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sf7r\" (UniqueName: \"kubernetes.io/projected/690c0157-750e-431c-a732-8886678cb104-kube-api-access-4sf7r\") pod \"kube-proxy-z4qkp\" (UID: \"690c0157-750e-431c-a732-8886678cb104\") " pod="kube-system/kube-proxy-z4qkp"
May 14 01:04:35.346601 systemd[1]: Created slice kubepods-besteffort-pod29c2f6c6_dbe5_4423_b5e6_4327db3ac573.slice - libcontainer container kubepods-besteffort-pod29c2f6c6_dbe5_4423_b5e6_4327db3ac573.slice.
May 14 01:04:35.353593 kubelet[2710]: I0514 01:04:35.352529 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/29c2f6c6-dbe5-4423-b5e6-4327db3ac573-var-lib-calico\") pod \"tigera-operator-789496d6f5-f2vlv\" (UID: \"29c2f6c6-dbe5-4423-b5e6-4327db3ac573\") " pod="tigera-operator/tigera-operator-789496d6f5-f2vlv"
May 14 01:04:35.353593 kubelet[2710]: I0514 01:04:35.352582 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnbgf\" (UniqueName: \"kubernetes.io/projected/29c2f6c6-dbe5-4423-b5e6-4327db3ac573-kube-api-access-nnbgf\") pod \"tigera-operator-789496d6f5-f2vlv\" (UID: \"29c2f6c6-dbe5-4423-b5e6-4327db3ac573\") " pod="tigera-operator/tigera-operator-789496d6f5-f2vlv"
May 14 01:04:35.448869 containerd[1486]: time="2025-05-14T01:04:35.448721746Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-z4qkp,Uid:690c0157-750e-431c-a732-8886678cb104,Namespace:kube-system,Attempt:0,}"
May 14 01:04:35.492018 containerd[1486]: time="2025-05-14T01:04:35.490651651Z" level=info msg="connecting to shim 14e59e108bebd1328d6a70c1db23da2935acce37188be5ec4ee813f234a6d6bd" address="unix:///run/containerd/s/56d35e4d711b74a6f100d31debef5bb9899a93d5170d2f738b42d3dd38e9749c" namespace=k8s.io protocol=ttrpc version=3
May 14 01:04:35.542261 systemd[1]: Started cri-containerd-14e59e108bebd1328d6a70c1db23da2935acce37188be5ec4ee813f234a6d6bd.scope - libcontainer container 14e59e108bebd1328d6a70c1db23da2935acce37188be5ec4ee813f234a6d6bd.
May 14 01:04:35.570966 containerd[1486]: time="2025-05-14T01:04:35.570922373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-z4qkp,Uid:690c0157-750e-431c-a732-8886678cb104,Namespace:kube-system,Attempt:0,} returns sandbox id \"14e59e108bebd1328d6a70c1db23da2935acce37188be5ec4ee813f234a6d6bd\""
May 14 01:04:35.574465 containerd[1486]: time="2025-05-14T01:04:35.574146845Z" level=info msg="CreateContainer within sandbox \"14e59e108bebd1328d6a70c1db23da2935acce37188be5ec4ee813f234a6d6bd\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
May 14 01:04:35.590069 containerd[1486]: time="2025-05-14T01:04:35.590020431Z" level=info msg="Container 12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58: CDI devices from CRI Config.CDIDevices: []"
May 14 01:04:35.604971 containerd[1486]: time="2025-05-14T01:04:35.604922373Z" level=info msg="CreateContainer within sandbox \"14e59e108bebd1328d6a70c1db23da2935acce37188be5ec4ee813f234a6d6bd\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58\""
May 14 01:04:35.606732 containerd[1486]: time="2025-05-14T01:04:35.605704090Z" level=info msg="StartContainer for \"12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58\""
May 14 01:04:35.607389 containerd[1486]: time="2025-05-14T01:04:35.607366400Z" level=info msg="connecting to shim 12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58" address="unix:///run/containerd/s/56d35e4d711b74a6f100d31debef5bb9899a93d5170d2f738b42d3dd38e9749c" protocol=ttrpc version=3
May 14 01:04:35.629173 systemd[1]: Started cri-containerd-12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58.scope - libcontainer container 12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58.
May 14 01:04:35.658066 containerd[1486]: time="2025-05-14T01:04:35.657707162Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-f2vlv,Uid:29c2f6c6-dbe5-4423-b5e6-4327db3ac573,Namespace:tigera-operator,Attempt:0,}"
May 14 01:04:35.681237 containerd[1486]: time="2025-05-14T01:04:35.680466153Z" level=info msg="StartContainer for \"12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58\" returns successfully"
May 14 01:04:35.693064 containerd[1486]: time="2025-05-14T01:04:35.691679971Z" level=info msg="connecting to shim 23918fde1468a24d72bf4f5e0000188dfdc233ca55c60eb25cbd95be6a72430d" address="unix:///run/containerd/s/1c7825712189e0aeb15b4ad748271e5095139e01cd2f8c1ca63e530400d627a1" namespace=k8s.io protocol=ttrpc version=3
May 14 01:04:35.717494 systemd[1]: Started cri-containerd-23918fde1468a24d72bf4f5e0000188dfdc233ca55c60eb25cbd95be6a72430d.scope - libcontainer container 23918fde1468a24d72bf4f5e0000188dfdc233ca55c60eb25cbd95be6a72430d.
May 14 01:04:35.771274 containerd[1486]: time="2025-05-14T01:04:35.771232836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-f2vlv,Uid:29c2f6c6-dbe5-4423-b5e6-4327db3ac573,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"23918fde1468a24d72bf4f5e0000188dfdc233ca55c60eb25cbd95be6a72430d\""
May 14 01:04:35.773162 containerd[1486]: time="2025-05-14T01:04:35.773137532Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\""
May 14 01:04:36.281638 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount59973340.mount: Deactivated successfully.
May 14 01:04:36.701237 kubelet[2710]: I0514 01:04:36.701123 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-z4qkp" podStartSLOduration=1.7010903480000001 podStartE2EDuration="1.701090348s" podCreationTimestamp="2025-05-14 01:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 01:04:36.700633001 +0000 UTC m=+8.200204525" watchObservedRunningTime="2025-05-14 01:04:36.701090348 +0000 UTC m=+8.200661872"
May 14 01:04:37.355375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount424905872.mount: Deactivated successfully.
May 14 01:04:37.950899 containerd[1486]: time="2025-05-14T01:04:37.950864312Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 01:04:37.952139 containerd[1486]: time="2025-05-14T01:04:37.952090833Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662"
May 14 01:04:37.953106 containerd[1486]: time="2025-05-14T01:04:37.953083697Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 01:04:37.956593 containerd[1486]: time="2025-05-14T01:04:37.955864776Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 01:04:37.956593 containerd[1486]: time="2025-05-14T01:04:37.956492304Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.183231521s"
May 14 01:04:37.956593 containerd[1486]: time="2025-05-14T01:04:37.956519024Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\""
May 14 01:04:37.959844 containerd[1486]: time="2025-05-14T01:04:37.959821293Z" level=info msg="CreateContainer within sandbox \"23918fde1468a24d72bf4f5e0000188dfdc233ca55c60eb25cbd95be6a72430d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
May 14 01:04:37.981226 containerd[1486]: time="2025-05-14T01:04:37.981193816Z" level=info msg="Container 25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c: CDI devices from CRI Config.CDIDevices: []"
May 14 01:04:37.989599 containerd[1486]: time="2025-05-14T01:04:37.989538387Z" level=info msg="CreateContainer within sandbox \"23918fde1468a24d72bf4f5e0000188dfdc233ca55c60eb25cbd95be6a72430d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c\""
May 14 01:04:37.991337 containerd[1486]: time="2025-05-14T01:04:37.991294473Z" level=info msg="StartContainer for \"25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c\""
May 14 01:04:37.993551 containerd[1486]: time="2025-05-14T01:04:37.993448476Z" level=info msg="connecting to shim 25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c" address="unix:///run/containerd/s/1c7825712189e0aeb15b4ad748271e5095139e01cd2f8c1ca63e530400d627a1" protocol=ttrpc version=3
May 14 01:04:38.022325 systemd[1]: Started cri-containerd-25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c.scope - libcontainer container 25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c.
May 14 01:04:38.062866 containerd[1486]: time="2025-05-14T01:04:38.062809652Z" level=info msg="StartContainer for \"25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c\" returns successfully"
May 14 01:04:38.719512 kubelet[2710]: I0514 01:04:38.719142 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-789496d6f5-f2vlv" podStartSLOduration=1.533696816 podStartE2EDuration="3.718968387s" podCreationTimestamp="2025-05-14 01:04:35 +0000 UTC" firstStartedPulling="2025-05-14 01:04:35.772560869 +0000 UTC m=+7.272132353" lastFinishedPulling="2025-05-14 01:04:37.95783245 +0000 UTC m=+9.457403924" observedRunningTime="2025-05-14 01:04:38.717827305 +0000 UTC m=+10.217398839" watchObservedRunningTime="2025-05-14 01:04:38.718968387 +0000 UTC m=+10.218539911"
May 14 01:04:41.301682 systemd[1]: Created slice kubepods-besteffort-podd3580e71_ae25_4110_aa9a_83cb8c4d4f03.slice - libcontainer container kubepods-besteffort-podd3580e71_ae25_4110_aa9a_83cb8c4d4f03.slice.
May 14 01:04:41.392773 kubelet[2710]: I0514 01:04:41.392670 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d3580e71-ae25-4110-aa9a-83cb8c4d4f03-tigera-ca-bundle\") pod \"calico-typha-7c8fdf5689-494r6\" (UID: \"d3580e71-ae25-4110-aa9a-83cb8c4d4f03\") " pod="calico-system/calico-typha-7c8fdf5689-494r6"
May 14 01:04:41.392773 kubelet[2710]: I0514 01:04:41.392718 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d3580e71-ae25-4110-aa9a-83cb8c4d4f03-typha-certs\") pod \"calico-typha-7c8fdf5689-494r6\" (UID: \"d3580e71-ae25-4110-aa9a-83cb8c4d4f03\") " pod="calico-system/calico-typha-7c8fdf5689-494r6"
May 14 01:04:41.392773 kubelet[2710]: I0514 01:04:41.392742 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8h88\" (UniqueName: \"kubernetes.io/projected/d3580e71-ae25-4110-aa9a-83cb8c4d4f03-kube-api-access-x8h88\") pod \"calico-typha-7c8fdf5689-494r6\" (UID: \"d3580e71-ae25-4110-aa9a-83cb8c4d4f03\") " pod="calico-system/calico-typha-7c8fdf5689-494r6"
May 14 01:04:41.448573 systemd[1]: Created slice kubepods-besteffort-pod8f476380_749c_4fc0_8ad4_c0c3bf32390d.slice - libcontainer container kubepods-besteffort-pod8f476380_749c_4fc0_8ad4_c0c3bf32390d.slice.
May 14 01:04:41.493232 kubelet[2710]: I0514 01:04:41.493195 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-lib-modules\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493232 kubelet[2710]: I0514 01:04:41.493234 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-var-run-calico\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493388 kubelet[2710]: I0514 01:04:41.493253 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-cni-bin-dir\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493388 kubelet[2710]: I0514 01:04:41.493299 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-policysync\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493388 kubelet[2710]: I0514 01:04:41.493317 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-xtables-lock\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493388 kubelet[2710]: I0514 01:04:41.493335 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8f476380-749c-4fc0-8ad4-c0c3bf32390d-tigera-ca-bundle\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493388 kubelet[2710]: I0514 01:04:41.493352 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-flexvol-driver-host\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493517 kubelet[2710]: I0514 01:04:41.493372 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-cni-net-dir\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493517 kubelet[2710]: I0514 01:04:41.493388 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-cni-log-dir\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493517 kubelet[2710]: I0514 01:04:41.493422 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/8f476380-749c-4fc0-8ad4-c0c3bf32390d-node-certs\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493517 kubelet[2710]: I0514 01:04:41.493441 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhslr\" (UniqueName: \"kubernetes.io/projected/8f476380-749c-4fc0-8ad4-c0c3bf32390d-kube-api-access-lhslr\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.493517 kubelet[2710]: I0514 01:04:41.493462 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/8f476380-749c-4fc0-8ad4-c0c3bf32390d-var-lib-calico\") pod \"calico-node-8sdfk\" (UID: \"8f476380-749c-4fc0-8ad4-c0c3bf32390d\") " pod="calico-system/calico-node-8sdfk"
May 14 01:04:41.557932 kubelet[2710]: E0514 01:04:41.557821 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca"
May 14 01:04:41.593750 kubelet[2710]: I0514 01:04:41.593714 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kdxn\" (UniqueName: \"kubernetes.io/projected/4b92a8a5-04c9-4b4e-b1c3-606570f923ca-kube-api-access-5kdxn\") pod \"csi-node-driver-b8vbb\" (UID: \"4b92a8a5-04c9-4b4e-b1c3-606570f923ca\") " pod="calico-system/csi-node-driver-b8vbb"
May 14 01:04:41.593882 kubelet[2710]: I0514 01:04:41.593793 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/4b92a8a5-04c9-4b4e-b1c3-606570f923ca-varrun\") pod \"csi-node-driver-b8vbb\" (UID: \"4b92a8a5-04c9-4b4e-b1c3-606570f923ca\") " pod="calico-system/csi-node-driver-b8vbb"
May 14 01:04:41.593882 kubelet[2710]: I0514 01:04:41.593815 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/4b92a8a5-04c9-4b4e-b1c3-606570f923ca-socket-dir\") pod \"csi-node-driver-b8vbb\" (UID: \"4b92a8a5-04c9-4b4e-b1c3-606570f923ca\") " pod="calico-system/csi-node-driver-b8vbb"
May 14 01:04:41.593882 kubelet[2710]: I0514 01:04:41.593864 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4b92a8a5-04c9-4b4e-b1c3-606570f923ca-kubelet-dir\") pod \"csi-node-driver-b8vbb\" (UID: \"4b92a8a5-04c9-4b4e-b1c3-606570f923ca\") " pod="calico-system/csi-node-driver-b8vbb"
May 14 01:04:41.593970 kubelet[2710]: I0514 01:04:41.593883 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/4b92a8a5-04c9-4b4e-b1c3-606570f923ca-registration-dir\") pod \"csi-node-driver-b8vbb\" (UID: \"4b92a8a5-04c9-4b4e-b1c3-606570f923ca\") " pod="calico-system/csi-node-driver-b8vbb"
May 14 01:04:41.598053 kubelet[2710]: E0514 01:04:41.595919 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.598053 kubelet[2710]: W0514 01:04:41.595940 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.598053 kubelet[2710]: E0514 01:04:41.595958 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.598053 kubelet[2710]: E0514 01:04:41.596189 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.598053 kubelet[2710]: W0514 01:04:41.596199 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.598053 kubelet[2710]: E0514 01:04:41.596210 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.598538 kubelet[2710]: E0514 01:04:41.598415 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.598538 kubelet[2710]: W0514 01:04:41.598431 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.598538 kubelet[2710]: E0514 01:04:41.598451 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.599399 kubelet[2710]: E0514 01:04:41.599299 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.599399 kubelet[2710]: W0514 01:04:41.599312 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.599399 kubelet[2710]: E0514 01:04:41.599352 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.600110 kubelet[2710]: E0514 01:04:41.600096 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.600177 kubelet[2710]: W0514 01:04:41.600165 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.600314 kubelet[2710]: E0514 01:04:41.600254 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.600461 kubelet[2710]: E0514 01:04:41.600450 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.600596 kubelet[2710]: W0514 01:04:41.600521 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.600596 kubelet[2710]: E0514 01:04:41.600574 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.601111 kubelet[2710]: E0514 01:04:41.600754 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.601111 kubelet[2710]: W0514 01:04:41.600765 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.601111 kubelet[2710]: E0514 01:04:41.600796 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.609557 containerd[1486]: time="2025-05-14T01:04:41.608265185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c8fdf5689-494r6,Uid:d3580e71-ae25-4110-aa9a-83cb8c4d4f03,Namespace:calico-system,Attempt:0,}"
May 14 01:04:41.609854 kubelet[2710]: E0514 01:04:41.609047 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.609854 kubelet[2710]: W0514 01:04:41.609088 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.609926 kubelet[2710]: E0514 01:04:41.609909 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.610898 kubelet[2710]: E0514 01:04:41.610123 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.610898 kubelet[2710]: W0514 01:04:41.610137 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.610898 kubelet[2710]: E0514 01:04:41.610168 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.611718 kubelet[2710]: E0514 01:04:41.611229 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.611718 kubelet[2710]: W0514 01:04:41.611242 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.611962 kubelet[2710]: E0514 01:04:41.611853 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.613788 kubelet[2710]: E0514 01:04:41.612761 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.613788 kubelet[2710]: W0514 01:04:41.612775 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.613788 kubelet[2710]: E0514 01:04:41.612875 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.614315 kubelet[2710]: E0514 01:04:41.614132 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.614315 kubelet[2710]: W0514 01:04:41.614145 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.614315 kubelet[2710]: E0514 01:04:41.614179 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.614735 kubelet[2710]: E0514 01:04:41.614665 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.614735 kubelet[2710]: W0514 01:04:41.614679 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.614809 kubelet[2710]: E0514 01:04:41.614759 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.616022 kubelet[2710]: E0514 01:04:41.615873 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.616022 kubelet[2710]: W0514 01:04:41.615887 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.616022 kubelet[2710]: E0514 01:04:41.615918 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.617334 kubelet[2710]: E0514 01:04:41.617187 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.617334 kubelet[2710]: W0514 01:04:41.617200 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.617334 kubelet[2710]: E0514 01:04:41.617229 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.617679 kubelet[2710]: E0514 01:04:41.617603 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.617679 kubelet[2710]: W0514 01:04:41.617615 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.617679 kubelet[2710]: E0514 01:04:41.617643 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.618556 kubelet[2710]: E0514 01:04:41.618438 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.618556 kubelet[2710]: W0514 01:04:41.618451 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.618556 kubelet[2710]: E0514 01:04:41.618493 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.619134 kubelet[2710]: E0514 01:04:41.619121 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.620421 kubelet[2710]: W0514 01:04:41.619184 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.620421 kubelet[2710]: E0514 01:04:41.619201 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
May 14 01:04:41.650370 kubelet[2710]: E0514 01:04:41.650347 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
May 14 01:04:41.650529 kubelet[2710]: W0514 01:04:41.650478 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
May 14 01:04:41.650529 kubelet[2710]: E0514 01:04:41.650500 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.658697 containerd[1486]: time="2025-05-14T01:04:41.658185564Z" level=info msg="connecting to shim 4eb6dd64e59c40c04079f9e39c0a0cebf240485d226df97f828dd09bb7b92eda" address="unix:///run/containerd/s/5e3e69ed3ed5ad2f4878ff77c7d99520ad97904a2feebdc0d035c37a74ed7d50" namespace=k8s.io protocol=ttrpc version=3 May 14 01:04:41.685886 kubelet[2710]: E0514 01:04:41.685865 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.686019 kubelet[2710]: W0514 01:04:41.686005 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.686122 kubelet[2710]: E0514 01:04:41.686108 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.686448 kubelet[2710]: E0514 01:04:41.686436 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.686528 kubelet[2710]: W0514 01:04:41.686517 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.686597 kubelet[2710]: E0514 01:04:41.686586 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.686810 kubelet[2710]: E0514 01:04:41.686798 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.686879 kubelet[2710]: W0514 01:04:41.686868 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.686939 kubelet[2710]: E0514 01:04:41.686929 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.687183 kubelet[2710]: E0514 01:04:41.687172 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.687373 kubelet[2710]: W0514 01:04:41.687277 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.687373 kubelet[2710]: E0514 01:04:41.687292 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.687602 kubelet[2710]: E0514 01:04:41.687590 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.687753 kubelet[2710]: W0514 01:04:41.687652 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.687753 kubelet[2710]: E0514 01:04:41.687666 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.688266 systemd[1]: Started cri-containerd-4eb6dd64e59c40c04079f9e39c0a0cebf240485d226df97f828dd09bb7b92eda.scope - libcontainer container 4eb6dd64e59c40c04079f9e39c0a0cebf240485d226df97f828dd09bb7b92eda. May 14 01:04:41.690091 kubelet[2710]: E0514 01:04:41.689543 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.690091 kubelet[2710]: W0514 01:04:41.689589 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.690091 kubelet[2710]: E0514 01:04:41.689611 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.690204 kubelet[2710]: E0514 01:04:41.690106 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.690204 kubelet[2710]: W0514 01:04:41.690117 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.690204 kubelet[2710]: E0514 01:04:41.690126 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.690966 kubelet[2710]: E0514 01:04:41.690947 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.690966 kubelet[2710]: W0514 01:04:41.690960 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.691066 kubelet[2710]: E0514 01:04:41.690970 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.691463 kubelet[2710]: E0514 01:04:41.691399 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.691463 kubelet[2710]: W0514 01:04:41.691412 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.691463 kubelet[2710]: E0514 01:04:41.691422 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.692011 kubelet[2710]: E0514 01:04:41.691990 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.692011 kubelet[2710]: W0514 01:04:41.692003 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.692011 kubelet[2710]: E0514 01:04:41.692013 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.692632 kubelet[2710]: E0514 01:04:41.692328 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.692632 kubelet[2710]: W0514 01:04:41.692338 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.692632 kubelet[2710]: E0514 01:04:41.692347 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.692727 kubelet[2710]: E0514 01:04:41.692681 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.692727 kubelet[2710]: W0514 01:04:41.692691 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.692727 kubelet[2710]: E0514 01:04:41.692700 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.694171 kubelet[2710]: E0514 01:04:41.693349 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.694171 kubelet[2710]: W0514 01:04:41.693362 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.694171 kubelet[2710]: E0514 01:04:41.694129 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.694408 kubelet[2710]: E0514 01:04:41.694354 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.694408 kubelet[2710]: W0514 01:04:41.694371 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.694408 kubelet[2710]: E0514 01:04:41.694382 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.695352 kubelet[2710]: E0514 01:04:41.695261 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.695352 kubelet[2710]: W0514 01:04:41.695275 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.695352 kubelet[2710]: E0514 01:04:41.695285 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.696186 kubelet[2710]: E0514 01:04:41.696129 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.696186 kubelet[2710]: W0514 01:04:41.696142 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.696186 kubelet[2710]: E0514 01:04:41.696183 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.697459 kubelet[2710]: E0514 01:04:41.697443 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.697609 kubelet[2710]: W0514 01:04:41.697520 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.697609 kubelet[2710]: E0514 01:04:41.697538 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.697982 kubelet[2710]: E0514 01:04:41.697891 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.697982 kubelet[2710]: W0514 01:04:41.697901 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.697982 kubelet[2710]: E0514 01:04:41.697911 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.698759 kubelet[2710]: E0514 01:04:41.698161 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.698759 kubelet[2710]: W0514 01:04:41.698171 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.698759 kubelet[2710]: E0514 01:04:41.698180 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.699124 kubelet[2710]: E0514 01:04:41.699099 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.699124 kubelet[2710]: W0514 01:04:41.699113 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.699124 kubelet[2710]: E0514 01:04:41.699122 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.699747 kubelet[2710]: E0514 01:04:41.699711 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.699747 kubelet[2710]: W0514 01:04:41.699725 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.699821 kubelet[2710]: E0514 01:04:41.699767 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.700064 kubelet[2710]: E0514 01:04:41.700043 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.700064 kubelet[2710]: W0514 01:04:41.700057 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.700141 kubelet[2710]: E0514 01:04:41.700066 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.700336 kubelet[2710]: E0514 01:04:41.700263 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.700336 kubelet[2710]: W0514 01:04:41.700300 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.700336 kubelet[2710]: E0514 01:04:41.700311 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.700530 kubelet[2710]: E0514 01:04:41.700505 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.700530 kubelet[2710]: W0514 01:04:41.700518 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.700530 kubelet[2710]: E0514 01:04:41.700528 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.700833 kubelet[2710]: E0514 01:04:41.700740 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.700833 kubelet[2710]: W0514 01:04:41.700755 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.700833 kubelet[2710]: E0514 01:04:41.700765 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.701176 kubelet[2710]: E0514 01:04:41.701111 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.701176 kubelet[2710]: W0514 01:04:41.701121 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.701176 kubelet[2710]: E0514 01:04:41.701130 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.701561 kubelet[2710]: E0514 01:04:41.701440 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.701561 kubelet[2710]: W0514 01:04:41.701453 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.701561 kubelet[2710]: E0514 01:04:41.701470 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.701747 kubelet[2710]: E0514 01:04:41.701723 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.701747 kubelet[2710]: W0514 01:04:41.701738 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.701747 kubelet[2710]: E0514 01:04:41.701747 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.702119 kubelet[2710]: E0514 01:04:41.701984 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.702119 kubelet[2710]: W0514 01:04:41.701993 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.702119 kubelet[2710]: E0514 01:04:41.702115 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.702375 kubelet[2710]: E0514 01:04:41.702357 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.702375 kubelet[2710]: W0514 01:04:41.702369 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.702473 kubelet[2710]: E0514 01:04:41.702383 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.702669 kubelet[2710]: E0514 01:04:41.702629 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.702669 kubelet[2710]: W0514 01:04:41.702642 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.702778 kubelet[2710]: E0514 01:04:41.702694 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.703062 kubelet[2710]: E0514 01:04:41.702909 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.703062 kubelet[2710]: W0514 01:04:41.702922 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.703349 kubelet[2710]: E0514 01:04:41.703154 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.703349 kubelet[2710]: W0514 01:04:41.703167 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.703349 kubelet[2710]: E0514 01:04:41.703155 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.703349 kubelet[2710]: E0514 01:04:41.703203 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.703349 kubelet[2710]: E0514 01:04:41.703346 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.703588 kubelet[2710]: W0514 01:04:41.703356 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.703708 kubelet[2710]: E0514 01:04:41.703407 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.704023 kubelet[2710]: E0514 01:04:41.703732 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.704023 kubelet[2710]: W0514 01:04:41.703741 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.704023 kubelet[2710]: E0514 01:04:41.703843 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.704023 kubelet[2710]: E0514 01:04:41.703949 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.704023 kubelet[2710]: W0514 01:04:41.703984 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.704190 kubelet[2710]: E0514 01:04:41.704154 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.704370 kubelet[2710]: E0514 01:04:41.704295 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.704370 kubelet[2710]: W0514 01:04:41.704346 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.704563 kubelet[2710]: E0514 01:04:41.704376 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.704892 kubelet[2710]: E0514 01:04:41.704873 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.704892 kubelet[2710]: W0514 01:04:41.704889 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.705081 kubelet[2710]: E0514 01:04:41.704901 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.705623 kubelet[2710]: E0514 01:04:41.705395 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.705623 kubelet[2710]: W0514 01:04:41.705410 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.705623 kubelet[2710]: E0514 01:04:41.705420 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.705799 kubelet[2710]: E0514 01:04:41.705781 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.705799 kubelet[2710]: W0514 01:04:41.705794 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.706122 kubelet[2710]: E0514 01:04:41.706084 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.706193 kubelet[2710]: E0514 01:04:41.706176 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.706193 kubelet[2710]: W0514 01:04:41.706192 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.706452 kubelet[2710]: E0514 01:04:41.706294 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.706452 kubelet[2710]: E0514 01:04:41.706419 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.706452 kubelet[2710]: W0514 01:04:41.706428 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.706813 kubelet[2710]: E0514 01:04:41.706617 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.706813 kubelet[2710]: W0514 01:04:41.706629 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.706813 kubelet[2710]: E0514 01:04:41.706646 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.706813 kubelet[2710]: E0514 01:04:41.706672 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.706813 kubelet[2710]: E0514 01:04:41.706813 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.706951 kubelet[2710]: W0514 01:04:41.706823 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.706951 kubelet[2710]: E0514 01:04:41.706874 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.707249 kubelet[2710]: E0514 01:04:41.707227 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.707249 kubelet[2710]: W0514 01:04:41.707240 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.707667 kubelet[2710]: E0514 01:04:41.707253 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.707667 kubelet[2710]: E0514 01:04:41.707466 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.707667 kubelet[2710]: W0514 01:04:41.707475 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.707667 kubelet[2710]: E0514 01:04:41.707486 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.707766 kubelet[2710]: E0514 01:04:41.707703 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.707766 kubelet[2710]: W0514 01:04:41.707712 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.707766 kubelet[2710]: E0514 01:04:41.707729 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.708305 kubelet[2710]: E0514 01:04:41.707930 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.708305 kubelet[2710]: W0514 01:04:41.707943 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.708305 kubelet[2710]: E0514 01:04:41.707971 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.708632 kubelet[2710]: E0514 01:04:41.708344 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.708632 kubelet[2710]: W0514 01:04:41.708353 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.708632 kubelet[2710]: E0514 01:04:41.708371 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.708632 kubelet[2710]: E0514 01:04:41.708599 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.708632 kubelet[2710]: W0514 01:04:41.708608 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.708632 kubelet[2710]: E0514 01:04:41.708617 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:41.717477 kubelet[2710]: E0514 01:04:41.717452 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:41.717569 kubelet[2710]: W0514 01:04:41.717471 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:41.717569 kubelet[2710]: E0514 01:04:41.717502 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:41.755580 containerd[1486]: time="2025-05-14T01:04:41.755295024Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8sdfk,Uid:8f476380-749c-4fc0-8ad4-c0c3bf32390d,Namespace:calico-system,Attempt:0,}" May 14 01:04:41.768362 containerd[1486]: time="2025-05-14T01:04:41.768300199Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c8fdf5689-494r6,Uid:d3580e71-ae25-4110-aa9a-83cb8c4d4f03,Namespace:calico-system,Attempt:0,} returns sandbox id \"4eb6dd64e59c40c04079f9e39c0a0cebf240485d226df97f828dd09bb7b92eda\"" May 14 01:04:41.770314 containerd[1486]: time="2025-05-14T01:04:41.770019907Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 14 01:04:41.790645 containerd[1486]: time="2025-05-14T01:04:41.790271840Z" level=info msg="connecting to shim 0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7" address="unix:///run/containerd/s/302d53b9edb19c14101bb47026182e64cce4e8d0e7195df087b0c8330a6684e7" namespace=k8s.io protocol=ttrpc version=3 May 14 01:04:41.817577 systemd[1]: Started cri-containerd-0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7.scope - libcontainer container 0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7. 
May 14 01:04:41.860324 containerd[1486]: time="2025-05-14T01:04:41.860180758Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-8sdfk,Uid:8f476380-749c-4fc0-8ad4-c0c3bf32390d,Namespace:calico-system,Attempt:0,} returns sandbox id \"0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7\"" May 14 01:04:43.614494 kubelet[2710]: E0514 01:04:43.613970 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca" May 14 01:04:44.909981 containerd[1486]: time="2025-05-14T01:04:44.909942377Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:44.911081 containerd[1486]: time="2025-05-14T01:04:44.910991756Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 14 01:04:44.912095 containerd[1486]: time="2025-05-14T01:04:44.912022100Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:44.914328 containerd[1486]: time="2025-05-14T01:04:44.914279235Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:44.915057 containerd[1486]: time="2025-05-14T01:04:44.914905469Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest 
\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.144612891s" May 14 01:04:44.915057 containerd[1486]: time="2025-05-14T01:04:44.914951696Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 14 01:04:44.916677 containerd[1486]: time="2025-05-14T01:04:44.916515310Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 14 01:04:44.931310 containerd[1486]: time="2025-05-14T01:04:44.931268081Z" level=info msg="CreateContainer within sandbox \"4eb6dd64e59c40c04079f9e39c0a0cebf240485d226df97f828dd09bb7b92eda\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 14 01:04:44.942388 containerd[1486]: time="2025-05-14T01:04:44.942218960Z" level=info msg="Container 089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9: CDI devices from CRI Config.CDIDevices: []" May 14 01:04:44.955345 containerd[1486]: time="2025-05-14T01:04:44.955213963Z" level=info msg="CreateContainer within sandbox \"4eb6dd64e59c40c04079f9e39c0a0cebf240485d226df97f828dd09bb7b92eda\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9\"" May 14 01:04:44.955966 containerd[1486]: time="2025-05-14T01:04:44.955859815Z" level=info msg="StartContainer for \"089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9\"" May 14 01:04:44.956957 containerd[1486]: time="2025-05-14T01:04:44.956895048Z" level=info msg="connecting to shim 089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9" address="unix:///run/containerd/s/5e3e69ed3ed5ad2f4878ff77c7d99520ad97904a2feebdc0d035c37a74ed7d50" protocol=ttrpc version=3 May 14 01:04:44.981178 systemd[1]: Started cri-containerd-089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9.scope - libcontainer 
container 089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9. May 14 01:04:45.035101 containerd[1486]: time="2025-05-14T01:04:45.034978028Z" level=info msg="StartContainer for \"089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9\" returns successfully" May 14 01:04:45.612627 kubelet[2710]: E0514 01:04:45.612430 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca" May 14 01:04:45.730934 kubelet[2710]: E0514 01:04:45.730864 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.730934 kubelet[2710]: W0514 01:04:45.730888 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.730934 kubelet[2710]: E0514 01:04:45.730905 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.732560 kubelet[2710]: E0514 01:04:45.731081 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.732560 kubelet[2710]: W0514 01:04:45.731091 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.732560 kubelet[2710]: E0514 01:04:45.731113 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.732560 kubelet[2710]: E0514 01:04:45.731259 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.732560 kubelet[2710]: W0514 01:04:45.731268 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.732560 kubelet[2710]: E0514 01:04:45.731277 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.732560 kubelet[2710]: E0514 01:04:45.731455 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.732560 kubelet[2710]: W0514 01:04:45.731464 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.732560 kubelet[2710]: E0514 01:04:45.731473 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.732560 kubelet[2710]: E0514 01:04:45.731619 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.733249 kubelet[2710]: W0514 01:04:45.731628 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.733249 kubelet[2710]: E0514 01:04:45.731636 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.733249 kubelet[2710]: E0514 01:04:45.731775 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.733249 kubelet[2710]: W0514 01:04:45.731783 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.733249 kubelet[2710]: E0514 01:04:45.731792 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.733249 kubelet[2710]: E0514 01:04:45.731955 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.733249 kubelet[2710]: W0514 01:04:45.731965 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.733249 kubelet[2710]: E0514 01:04:45.731974 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.733249 kubelet[2710]: E0514 01:04:45.733104 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.733249 kubelet[2710]: W0514 01:04:45.733116 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.733884 kubelet[2710]: E0514 01:04:45.733126 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.733884 kubelet[2710]: E0514 01:04:45.733296 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.733884 kubelet[2710]: W0514 01:04:45.733305 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.733884 kubelet[2710]: E0514 01:04:45.733315 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.733884 kubelet[2710]: E0514 01:04:45.733462 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.733884 kubelet[2710]: W0514 01:04:45.733472 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.733884 kubelet[2710]: E0514 01:04:45.733480 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.733884 kubelet[2710]: E0514 01:04:45.733625 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.733884 kubelet[2710]: W0514 01:04:45.733634 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.733884 kubelet[2710]: E0514 01:04:45.733643 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.736618 kubelet[2710]: E0514 01:04:45.733790 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.736618 kubelet[2710]: W0514 01:04:45.733799 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.736618 kubelet[2710]: E0514 01:04:45.733807 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.736618 kubelet[2710]: E0514 01:04:45.733961 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.736618 kubelet[2710]: W0514 01:04:45.733970 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.736618 kubelet[2710]: E0514 01:04:45.733979 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.736618 kubelet[2710]: E0514 01:04:45.734179 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.736618 kubelet[2710]: W0514 01:04:45.734188 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.736618 kubelet[2710]: E0514 01:04:45.734198 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.736618 kubelet[2710]: E0514 01:04:45.734336 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.736978 kubelet[2710]: W0514 01:04:45.734346 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.736978 kubelet[2710]: E0514 01:04:45.734354 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.737671 kubelet[2710]: E0514 01:04:45.737389 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.737671 kubelet[2710]: W0514 01:04:45.737418 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.737671 kubelet[2710]: E0514 01:04:45.737432 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.738237 kubelet[2710]: E0514 01:04:45.738115 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.738237 kubelet[2710]: W0514 01:04:45.738128 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.738237 kubelet[2710]: E0514 01:04:45.738148 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.738940 kubelet[2710]: E0514 01:04:45.738896 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.738940 kubelet[2710]: W0514 01:04:45.738911 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.739161 kubelet[2710]: E0514 01:04:45.738921 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.739372 kubelet[2710]: E0514 01:04:45.739349 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.739372 kubelet[2710]: W0514 01:04:45.739360 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.739931 kubelet[2710]: E0514 01:04:45.739803 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.739931 kubelet[2710]: E0514 01:04:45.739893 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.740224 kubelet[2710]: W0514 01:04:45.740062 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.740224 kubelet[2710]: E0514 01:04:45.740157 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.740721 kubelet[2710]: E0514 01:04:45.740698 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.740721 kubelet[2710]: W0514 01:04:45.740709 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.741114 kubelet[2710]: E0514 01:04:45.741069 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.741450 kubelet[2710]: E0514 01:04:45.741377 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.741450 kubelet[2710]: W0514 01:04:45.741388 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.741698 kubelet[2710]: E0514 01:04:45.741655 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.742063 kubelet[2710]: E0514 01:04:45.742005 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.742063 kubelet[2710]: W0514 01:04:45.742018 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.742317 kubelet[2710]: E0514 01:04:45.742241 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.742511 kubelet[2710]: E0514 01:04:45.742437 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.742511 kubelet[2710]: W0514 01:04:45.742469 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.742626 kubelet[2710]: E0514 01:04:45.742551 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.742908 kubelet[2710]: E0514 01:04:45.742801 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.742908 kubelet[2710]: W0514 01:04:45.742811 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.742908 kubelet[2710]: E0514 01:04:45.742846 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.743270 kubelet[2710]: E0514 01:04:45.743154 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.743270 kubelet[2710]: W0514 01:04:45.743164 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.743270 kubelet[2710]: E0514 01:04:45.743177 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.743509 kubelet[2710]: E0514 01:04:45.743416 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.743509 kubelet[2710]: W0514 01:04:45.743427 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.743790 kubelet[2710]: E0514 01:04:45.743604 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.744097 kubelet[2710]: E0514 01:04:45.744086 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.744170 kubelet[2710]: W0514 01:04:45.744159 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.744297 kubelet[2710]: E0514 01:04:45.744284 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.744541 kubelet[2710]: E0514 01:04:45.744519 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.744692 kubelet[2710]: W0514 01:04:45.744616 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.744806 kubelet[2710]: E0514 01:04:45.744765 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.744961 kubelet[2710]: E0514 01:04:45.744927 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.744961 kubelet[2710]: W0514 01:04:45.744938 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.745169 kubelet[2710]: E0514 01:04:45.745139 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.745411 kubelet[2710]: E0514 01:04:45.745267 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.745411 kubelet[2710]: W0514 01:04:45.745275 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.745411 kubelet[2710]: E0514 01:04:45.745288 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:45.745579 kubelet[2710]: E0514 01:04:45.745569 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.745719 kubelet[2710]: W0514 01:04:45.745645 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.745719 kubelet[2710]: E0514 01:04:45.745660 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:45.746281 kubelet[2710]: E0514 01:04:45.746271 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:45.746356 kubelet[2710]: W0514 01:04:45.746330 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:45.746428 kubelet[2710]: E0514 01:04:45.746406 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.720068 kubelet[2710]: I0514 01:04:46.720027 2710 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 01:04:46.741177 kubelet[2710]: E0514 01:04:46.741152 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.741177 kubelet[2710]: W0514 01:04:46.741173 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.741318 kubelet[2710]: E0514 01:04:46.741191 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.741394 kubelet[2710]: E0514 01:04:46.741378 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.741394 kubelet[2710]: W0514 01:04:46.741391 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.741603 kubelet[2710]: E0514 01:04:46.741401 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.741603 kubelet[2710]: E0514 01:04:46.741571 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.741603 kubelet[2710]: W0514 01:04:46.741581 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.741603 kubelet[2710]: E0514 01:04:46.741590 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.741751 kubelet[2710]: E0514 01:04:46.741737 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.741751 kubelet[2710]: W0514 01:04:46.741746 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.741819 kubelet[2710]: E0514 01:04:46.741754 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.741971 kubelet[2710]: E0514 01:04:46.741958 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.741971 kubelet[2710]: W0514 01:04:46.741970 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.742079 kubelet[2710]: E0514 01:04:46.741979 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.742158 kubelet[2710]: E0514 01:04:46.742145 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.742158 kubelet[2710]: W0514 01:04:46.742157 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.742246 kubelet[2710]: E0514 01:04:46.742166 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.742386 kubelet[2710]: E0514 01:04:46.742372 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.742386 kubelet[2710]: W0514 01:04:46.742385 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.742461 kubelet[2710]: E0514 01:04:46.742394 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.742552 kubelet[2710]: E0514 01:04:46.742540 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.742552 kubelet[2710]: W0514 01:04:46.742551 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.742651 kubelet[2710]: E0514 01:04:46.742559 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.744090 kubelet[2710]: E0514 01:04:46.744076 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.744090 kubelet[2710]: W0514 01:04:46.744089 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.744171 kubelet[2710]: E0514 01:04:46.744099 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.744280 kubelet[2710]: E0514 01:04:46.744268 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.744280 kubelet[2710]: W0514 01:04:46.744279 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.744369 kubelet[2710]: E0514 01:04:46.744288 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.744510 kubelet[2710]: E0514 01:04:46.744496 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.744510 kubelet[2710]: W0514 01:04:46.744508 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.744576 kubelet[2710]: E0514 01:04:46.744517 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.744674 kubelet[2710]: E0514 01:04:46.744662 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.744674 kubelet[2710]: W0514 01:04:46.744674 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.744743 kubelet[2710]: E0514 01:04:46.744682 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.744911 kubelet[2710]: E0514 01:04:46.744897 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.744911 kubelet[2710]: W0514 01:04:46.744909 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.744985 kubelet[2710]: E0514 01:04:46.744918 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.745158 kubelet[2710]: E0514 01:04:46.745145 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.745158 kubelet[2710]: W0514 01:04:46.745157 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.745246 kubelet[2710]: E0514 01:04:46.745166 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.745550 kubelet[2710]: E0514 01:04:46.745537 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.745550 kubelet[2710]: W0514 01:04:46.745549 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.745612 kubelet[2710]: E0514 01:04:46.745558 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.749810 kubelet[2710]: E0514 01:04:46.749689 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.749810 kubelet[2710]: W0514 01:04:46.749707 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.749810 kubelet[2710]: E0514 01:04:46.749721 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.750014 kubelet[2710]: E0514 01:04:46.749980 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.750014 kubelet[2710]: W0514 01:04:46.749992 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.750204 kubelet[2710]: E0514 01:04:46.750192 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.750481 kubelet[2710]: E0514 01:04:46.750373 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.750481 kubelet[2710]: W0514 01:04:46.750384 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.750481 kubelet[2710]: E0514 01:04:46.750401 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.750737 kubelet[2710]: E0514 01:04:46.750675 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.750737 kubelet[2710]: W0514 01:04:46.750686 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.750737 kubelet[2710]: E0514 01:04:46.750702 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.751064 kubelet[2710]: E0514 01:04:46.750959 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.751064 kubelet[2710]: W0514 01:04:46.750970 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.751064 kubelet[2710]: E0514 01:04:46.750985 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.751421 kubelet[2710]: E0514 01:04:46.751285 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.751421 kubelet[2710]: W0514 01:04:46.751296 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.751421 kubelet[2710]: E0514 01:04:46.751312 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.752220 kubelet[2710]: E0514 01:04:46.752099 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.752220 kubelet[2710]: W0514 01:04:46.752109 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.752220 kubelet[2710]: E0514 01:04:46.752121 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.752415 kubelet[2710]: E0514 01:04:46.752393 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.752415 kubelet[2710]: W0514 01:04:46.752403 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.752618 kubelet[2710]: E0514 01:04:46.752565 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.752737 kubelet[2710]: E0514 01:04:46.752727 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.752862 kubelet[2710]: W0514 01:04:46.752802 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.753005 kubelet[2710]: E0514 01:04:46.752934 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.753141 kubelet[2710]: E0514 01:04:46.753131 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.753205 kubelet[2710]: W0514 01:04:46.753195 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.753356 kubelet[2710]: E0514 01:04:46.753329 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.753575 kubelet[2710]: E0514 01:04:46.753495 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.753575 kubelet[2710]: W0514 01:04:46.753505 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.753575 kubelet[2710]: E0514 01:04:46.753517 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.753783 kubelet[2710]: E0514 01:04:46.753734 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.753783 kubelet[2710]: W0514 01:04:46.753745 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.753982 kubelet[2710]: E0514 01:04:46.753915 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.754109 kubelet[2710]: E0514 01:04:46.754099 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.754188 kubelet[2710]: W0514 01:04:46.754162 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.754388 kubelet[2710]: E0514 01:04:46.754355 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.754744 kubelet[2710]: E0514 01:04:46.754525 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.754744 kubelet[2710]: W0514 01:04:46.754536 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.754744 kubelet[2710]: E0514 01:04:46.754548 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.754904 kubelet[2710]: E0514 01:04:46.754893 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.754986 kubelet[2710]: W0514 01:04:46.754976 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.755092 kubelet[2710]: E0514 01:04:46.755081 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.755365 kubelet[2710]: E0514 01:04:46.755354 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.755430 kubelet[2710]: W0514 01:04:46.755419 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.755533 kubelet[2710]: E0514 01:04:46.755477 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.757315 kubelet[2710]: E0514 01:04:46.757127 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.757315 kubelet[2710]: W0514 01:04:46.757138 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.757315 kubelet[2710]: E0514 01:04:46.757147 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 14 01:04:46.757489 kubelet[2710]: E0514 01:04:46.757479 2710 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 14 01:04:46.757550 kubelet[2710]: W0514 01:04:46.757539 2710 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 14 01:04:46.757607 kubelet[2710]: E0514 01:04:46.757597 2710 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 14 01:04:46.948436 containerd[1486]: time="2025-05-14T01:04:46.948400540Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:46.949701 containerd[1486]: time="2025-05-14T01:04:46.949637100Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 14 01:04:46.951127 containerd[1486]: time="2025-05-14T01:04:46.951075178Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:46.953959 containerd[1486]: time="2025-05-14T01:04:46.953914556Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:46.955116 containerd[1486]: time="2025-05-14T01:04:46.954689870Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.038144394s" May 14 01:04:46.955116 containerd[1486]: time="2025-05-14T01:04:46.954736638Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 14 01:04:46.958150 containerd[1486]: time="2025-05-14T01:04:46.958078358Z" level=info msg="CreateContainer within sandbox \"0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 14 01:04:46.971089 containerd[1486]: time="2025-05-14T01:04:46.969488616Z" level=info msg="Container e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106: CDI devices from CRI Config.CDIDevices: []" May 14 01:04:46.985188 containerd[1486]: time="2025-05-14T01:04:46.985109295Z" level=info msg="CreateContainer within sandbox \"0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106\"" May 14 01:04:46.985889 containerd[1486]: time="2025-05-14T01:04:46.985792628Z" level=info msg="StartContainer for \"e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106\"" May 14 01:04:46.987805 containerd[1486]: time="2025-05-14T01:04:46.987773213Z" level=info msg="connecting to shim e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106" address="unix:///run/containerd/s/302d53b9edb19c14101bb47026182e64cce4e8d0e7195df087b0c8330a6684e7" protocol=ttrpc version=3 May 14 01:04:47.012354 systemd[1]: Started cri-containerd-e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106.scope - libcontainer container e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106. May 14 01:04:47.062233 containerd[1486]: time="2025-05-14T01:04:47.062027832Z" level=info msg="StartContainer for \"e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106\" returns successfully" May 14 01:04:47.069870 systemd[1]: cri-containerd-e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106.scope: Deactivated successfully. 
May 14 01:04:47.073643 containerd[1486]: time="2025-05-14T01:04:47.073491150Z" level=info msg="received exit event container_id:\"e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106\" id:\"e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106\" pid:3383 exited_at:{seconds:1747184687 nanos:73101989}" May 14 01:04:47.073793 containerd[1486]: time="2025-05-14T01:04:47.073755626Z" level=info msg="TaskExit event in podsandbox handler container_id:\"e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106\" id:\"e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106\" pid:3383 exited_at:{seconds:1747184687 nanos:73101989}" May 14 01:04:47.095814 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106-rootfs.mount: Deactivated successfully. May 14 01:04:47.612938 kubelet[2710]: E0514 01:04:47.612686 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca" May 14 01:04:47.759490 kubelet[2710]: I0514 01:04:47.759365 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7c8fdf5689-494r6" podStartSLOduration=3.613204391 podStartE2EDuration="6.75932913s" podCreationTimestamp="2025-05-14 01:04:41 +0000 UTC" firstStartedPulling="2025-05-14 01:04:41.769794122 +0000 UTC m=+13.269365596" lastFinishedPulling="2025-05-14 01:04:44.915918851 +0000 UTC m=+16.415490335" observedRunningTime="2025-05-14 01:04:45.748726824 +0000 UTC m=+17.248298308" watchObservedRunningTime="2025-05-14 01:04:47.75932913 +0000 UTC m=+19.258900654" May 14 01:04:48.743251 containerd[1486]: time="2025-05-14T01:04:48.743165704Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 14 01:04:49.612534 kubelet[2710]: E0514 01:04:49.612416 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca" May 14 01:04:51.612961 kubelet[2710]: E0514 01:04:51.612903 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca" May 14 01:04:53.612926 kubelet[2710]: E0514 01:04:53.612886 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca" May 14 01:04:54.750044 containerd[1486]: time="2025-05-14T01:04:54.749973969Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:54.751320 containerd[1486]: time="2025-05-14T01:04:54.751266094Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 14 01:04:54.752856 containerd[1486]: time="2025-05-14T01:04:54.752811894Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:54.758639 containerd[1486]: time="2025-05-14T01:04:54.758047104Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:04:54.760760 containerd[1486]: time="2025-05-14T01:04:54.760734576Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 6.017500624s" May 14 01:04:54.760852 containerd[1486]: time="2025-05-14T01:04:54.760835415Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 14 01:04:54.764220 containerd[1486]: time="2025-05-14T01:04:54.764142509Z" level=info msg="CreateContainer within sandbox \"0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 14 01:04:54.773057 containerd[1486]: time="2025-05-14T01:04:54.772577032Z" level=info msg="Container f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10: CDI devices from CRI Config.CDIDevices: []" May 14 01:04:54.790084 containerd[1486]: time="2025-05-14T01:04:54.790006859Z" level=info msg="CreateContainer within sandbox \"0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10\"" May 14 01:04:54.790796 containerd[1486]: time="2025-05-14T01:04:54.790601756Z" level=info msg="StartContainer for \"f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10\"" May 14 01:04:54.793195 containerd[1486]: time="2025-05-14T01:04:54.793143033Z" level=info msg="connecting to shim 
f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10" address="unix:///run/containerd/s/302d53b9edb19c14101bb47026182e64cce4e8d0e7195df087b0c8330a6684e7" protocol=ttrpc version=3 May 14 01:04:54.821189 systemd[1]: Started cri-containerd-f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10.scope - libcontainer container f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10. May 14 01:04:54.863534 containerd[1486]: time="2025-05-14T01:04:54.863443383Z" level=info msg="StartContainer for \"f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10\" returns successfully" May 14 01:04:55.613946 kubelet[2710]: E0514 01:04:55.613785 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca" May 14 01:04:56.009440 containerd[1486]: time="2025-05-14T01:04:56.008279454Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 14 01:04:56.011247 systemd[1]: cri-containerd-f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10.scope: Deactivated successfully. May 14 01:04:56.012215 systemd[1]: cri-containerd-f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10.scope: Consumed 688ms CPU time, 181.9M memory peak, 154M written to disk. 
May 14 01:04:56.017258 containerd[1486]: time="2025-05-14T01:04:56.017201762Z" level=info msg="TaskExit event in podsandbox handler container_id:\"f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10\" id:\"f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10\" pid:3446 exited_at:{seconds:1747184696 nanos:16460962}" May 14 01:04:56.017258 containerd[1486]: time="2025-05-14T01:04:56.017223252Z" level=info msg="received exit event container_id:\"f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10\" id:\"f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10\" pid:3446 exited_at:{seconds:1747184696 nanos:16460962}" May 14 01:04:56.055312 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10-rootfs.mount: Deactivated successfully. May 14 01:04:56.071322 kubelet[2710]: I0514 01:04:56.069506 2710 kubelet_node_status.go:502] "Fast updating node status as it just became ready" May 14 01:04:56.464848 systemd[1]: Created slice kubepods-burstable-podf66aa28a_9c44_4949_9dcb_9724952e0c5a.slice - libcontainer container kubepods-burstable-podf66aa28a_9c44_4949_9dcb_9724952e0c5a.slice. 
May 14 01:04:56.527403 kubelet[2710]: I0514 01:04:56.527199 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f66aa28a-9c44-4949-9dcb-9724952e0c5a-config-volume\") pod \"coredns-668d6bf9bc-w7vk9\" (UID: \"f66aa28a-9c44-4949-9dcb-9724952e0c5a\") " pod="kube-system/coredns-668d6bf9bc-w7vk9" May 14 01:04:56.527403 kubelet[2710]: I0514 01:04:56.527306 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsqhn\" (UniqueName: \"kubernetes.io/projected/f66aa28a-9c44-4949-9dcb-9724952e0c5a-kube-api-access-rsqhn\") pod \"coredns-668d6bf9bc-w7vk9\" (UID: \"f66aa28a-9c44-4949-9dcb-9724952e0c5a\") " pod="kube-system/coredns-668d6bf9bc-w7vk9" May 14 01:04:56.661815 systemd[1]: Created slice kubepods-burstable-poda153f68e_8a69_4c2e_ba25_43d30d1b76a3.slice - libcontainer container kubepods-burstable-poda153f68e_8a69_4c2e_ba25_43d30d1b76a3.slice. May 14 01:04:56.691102 systemd[1]: Created slice kubepods-besteffort-pod004765cd_cdab_49fa_8da7_cb602693d444.slice - libcontainer container kubepods-besteffort-pod004765cd_cdab_49fa_8da7_cb602693d444.slice. May 14 01:04:56.703376 systemd[1]: Created slice kubepods-besteffort-pod499e8660_f8a6_4547_8ea8_b27c8bf74cbc.slice - libcontainer container kubepods-besteffort-pod499e8660_f8a6_4547_8ea8_b27c8bf74cbc.slice. May 14 01:04:56.709678 systemd[1]: Created slice kubepods-besteffort-pod223f11c4_486e_4257_8c51_e91d10d982e7.slice - libcontainer container kubepods-besteffort-pod223f11c4_486e_4257_8c51_e91d10d982e7.slice. 
May 14 01:04:56.730190 kubelet[2710]: I0514 01:04:56.730102 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wtx9\" (UniqueName: \"kubernetes.io/projected/004765cd-cdab-49fa-8da7-cb602693d444-kube-api-access-2wtx9\") pod \"calico-apiserver-7bc7fb4ff7-qm8vw\" (UID: \"004765cd-cdab-49fa-8da7-cb602693d444\") " pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-qm8vw" May 14 01:04:56.730692 kubelet[2710]: I0514 01:04:56.730502 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfxlb\" (UniqueName: \"kubernetes.io/projected/a153f68e-8a69-4c2e-ba25-43d30d1b76a3-kube-api-access-rfxlb\") pod \"coredns-668d6bf9bc-zngx5\" (UID: \"a153f68e-8a69-4c2e-ba25-43d30d1b76a3\") " pod="kube-system/coredns-668d6bf9bc-zngx5" May 14 01:04:56.730692 kubelet[2710]: I0514 01:04:56.730535 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/223f11c4-486e-4257-8c51-e91d10d982e7-tigera-ca-bundle\") pod \"calico-kube-controllers-76985557f9-m9brf\" (UID: \"223f11c4-486e-4257-8c51-e91d10d982e7\") " pod="calico-system/calico-kube-controllers-76985557f9-m9brf" May 14 01:04:56.730692 kubelet[2710]: I0514 01:04:56.730558 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/004765cd-cdab-49fa-8da7-cb602693d444-calico-apiserver-certs\") pod \"calico-apiserver-7bc7fb4ff7-qm8vw\" (UID: \"004765cd-cdab-49fa-8da7-cb602693d444\") " pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-qm8vw" May 14 01:04:56.730692 kubelet[2710]: I0514 01:04:56.730575 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a153f68e-8a69-4c2e-ba25-43d30d1b76a3-config-volume\") pod 
\"coredns-668d6bf9bc-zngx5\" (UID: \"a153f68e-8a69-4c2e-ba25-43d30d1b76a3\") " pod="kube-system/coredns-668d6bf9bc-zngx5" May 14 01:04:56.730692 kubelet[2710]: I0514 01:04:56.730598 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sdvr\" (UniqueName: \"kubernetes.io/projected/499e8660-f8a6-4547-8ea8-b27c8bf74cbc-kube-api-access-5sdvr\") pod \"calico-apiserver-7bc7fb4ff7-4sg5w\" (UID: \"499e8660-f8a6-4547-8ea8-b27c8bf74cbc\") " pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-4sg5w" May 14 01:04:56.730865 kubelet[2710]: I0514 01:04:56.730679 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/499e8660-f8a6-4547-8ea8-b27c8bf74cbc-calico-apiserver-certs\") pod \"calico-apiserver-7bc7fb4ff7-4sg5w\" (UID: \"499e8660-f8a6-4547-8ea8-b27c8bf74cbc\") " pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-4sg5w" May 14 01:04:56.730865 kubelet[2710]: I0514 01:04:56.730718 2710 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtc48\" (UniqueName: \"kubernetes.io/projected/223f11c4-486e-4257-8c51-e91d10d982e7-kube-api-access-jtc48\") pod \"calico-kube-controllers-76985557f9-m9brf\" (UID: \"223f11c4-486e-4257-8c51-e91d10d982e7\") " pod="calico-system/calico-kube-controllers-76985557f9-m9brf" May 14 01:04:56.987838 containerd[1486]: time="2025-05-14T01:04:56.987308665Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zngx5,Uid:a153f68e-8a69-4c2e-ba25-43d30d1b76a3,Namespace:kube-system,Attempt:0,}" May 14 01:04:57.001711 containerd[1486]: time="2025-05-14T01:04:57.001329586Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bc7fb4ff7-qm8vw,Uid:004765cd-cdab-49fa-8da7-cb602693d444,Namespace:calico-apiserver,Attempt:0,}" May 14 01:04:57.011009 containerd[1486]: 
time="2025-05-14T01:04:57.010946326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bc7fb4ff7-4sg5w,Uid:499e8660-f8a6-4547-8ea8-b27c8bf74cbc,Namespace:calico-apiserver,Attempt:0,}" May 14 01:04:57.013485 containerd[1486]: time="2025-05-14T01:04:57.013304861Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76985557f9-m9brf,Uid:223f11c4-486e-4257-8c51-e91d10d982e7,Namespace:calico-system,Attempt:0,}" May 14 01:04:57.074608 containerd[1486]: time="2025-05-14T01:04:57.074535662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-w7vk9,Uid:f66aa28a-9c44-4949-9dcb-9724952e0c5a,Namespace:kube-system,Attempt:0,}" May 14 01:04:57.195856 containerd[1486]: time="2025-05-14T01:04:57.195575708Z" level=error msg="Failed to destroy network for sandbox \"3a400805e1cedcca20966be6863a149f8cb3c18195412524b97673322611cfcd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.200091 containerd[1486]: time="2025-05-14T01:04:57.199891714Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zngx5,Uid:a153f68e-8a69-4c2e-ba25-43d30d1b76a3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a400805e1cedcca20966be6863a149f8cb3c18195412524b97673322611cfcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.200945 systemd[1]: run-netns-cni\x2ddab429ed\x2dca3d\x2d6a26\x2d9f14\x2d729f200b41b0.mount: Deactivated successfully. 
May 14 01:04:57.202661 kubelet[2710]: E0514 01:04:57.202578 2710 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a400805e1cedcca20966be6863a149f8cb3c18195412524b97673322611cfcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.202727 kubelet[2710]: E0514 01:04:57.202672 2710 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a400805e1cedcca20966be6863a149f8cb3c18195412524b97673322611cfcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-zngx5" May 14 01:04:57.202727 kubelet[2710]: E0514 01:04:57.202705 2710 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a400805e1cedcca20966be6863a149f8cb3c18195412524b97673322611cfcd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-zngx5" May 14 01:04:57.203067 kubelet[2710]: E0514 01:04:57.202775 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-zngx5_kube-system(a153f68e-8a69-4c2e-ba25-43d30d1b76a3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-zngx5_kube-system(a153f68e-8a69-4c2e-ba25-43d30d1b76a3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a400805e1cedcca20966be6863a149f8cb3c18195412524b97673322611cfcd\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-zngx5" podUID="a153f68e-8a69-4c2e-ba25-43d30d1b76a3" May 14 01:04:57.217056 containerd[1486]: time="2025-05-14T01:04:57.216967978Z" level=error msg="Failed to destroy network for sandbox \"ea3aee8745c46bc8039832cc24fa203c6c47684c1bae142535f3e531872a5ae3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.221147 systemd[1]: run-netns-cni\x2d15f1d412\x2dd3d4\x2d80f2\x2dcf2a\x2d3720e910ef93.mount: Deactivated successfully. May 14 01:04:57.224106 containerd[1486]: time="2025-05-14T01:04:57.223913707Z" level=error msg="Failed to destroy network for sandbox \"9a03da73a77fc5d6c56069840106f43bd1db4ff8b3687cd90e44aa2991a205c0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.225175 containerd[1486]: time="2025-05-14T01:04:57.225133766Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76985557f9-m9brf,Uid:223f11c4-486e-4257-8c51-e91d10d982e7,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea3aee8745c46bc8039832cc24fa203c6c47684c1bae142535f3e531872a5ae3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.227093 kubelet[2710]: E0514 01:04:57.226988 2710 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ea3aee8745c46bc8039832cc24fa203c6c47684c1bae142535f3e531872a5ae3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.228605 kubelet[2710]: E0514 01:04:57.228534 2710 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea3aee8745c46bc8039832cc24fa203c6c47684c1bae142535f3e531872a5ae3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76985557f9-m9brf" May 14 01:04:57.229181 kubelet[2710]: E0514 01:04:57.228607 2710 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ea3aee8745c46bc8039832cc24fa203c6c47684c1bae142535f3e531872a5ae3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-76985557f9-m9brf" May 14 01:04:57.229160 systemd[1]: run-netns-cni\x2d15010f1d\x2d3620\x2d4234\x2d09c4\x2d14f20d3d593d.mount: Deactivated successfully. 
May 14 01:04:57.230584 kubelet[2710]: E0514 01:04:57.228832 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-76985557f9-m9brf_calico-system(223f11c4-486e-4257-8c51-e91d10d982e7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-76985557f9-m9brf_calico-system(223f11c4-486e-4257-8c51-e91d10d982e7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ea3aee8745c46bc8039832cc24fa203c6c47684c1bae142535f3e531872a5ae3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-76985557f9-m9brf" podUID="223f11c4-486e-4257-8c51-e91d10d982e7" May 14 01:04:57.230884 containerd[1486]: time="2025-05-14T01:04:57.230675932Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bc7fb4ff7-qm8vw,Uid:004765cd-cdab-49fa-8da7-cb602693d444,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a03da73a77fc5d6c56069840106f43bd1db4ff8b3687cd90e44aa2991a205c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.232016 kubelet[2710]: E0514 01:04:57.231160 2710 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a03da73a77fc5d6c56069840106f43bd1db4ff8b3687cd90e44aa2991a205c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.232016 kubelet[2710]: E0514 01:04:57.231252 2710 kuberuntime_sandbox.go:72] "Failed to create 
sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a03da73a77fc5d6c56069840106f43bd1db4ff8b3687cd90e44aa2991a205c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-qm8vw" May 14 01:04:57.232016 kubelet[2710]: E0514 01:04:57.231282 2710 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9a03da73a77fc5d6c56069840106f43bd1db4ff8b3687cd90e44aa2991a205c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-qm8vw" May 14 01:04:57.232217 kubelet[2710]: E0514 01:04:57.231345 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7bc7fb4ff7-qm8vw_calico-apiserver(004765cd-cdab-49fa-8da7-cb602693d444)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7bc7fb4ff7-qm8vw_calico-apiserver(004765cd-cdab-49fa-8da7-cb602693d444)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9a03da73a77fc5d6c56069840106f43bd1db4ff8b3687cd90e44aa2991a205c0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-qm8vw" podUID="004765cd-cdab-49fa-8da7-cb602693d444" May 14 01:04:57.246255 containerd[1486]: time="2025-05-14T01:04:57.245876446Z" level=error msg="Failed to destroy network for sandbox \"1bd566651ba311b1a23497924288faedcdb772a6ebad4ed32b025677e3d0a979\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.247938 containerd[1486]: time="2025-05-14T01:04:57.247896064Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bc7fb4ff7-4sg5w,Uid:499e8660-f8a6-4547-8ea8-b27c8bf74cbc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bd566651ba311b1a23497924288faedcdb772a6ebad4ed32b025677e3d0a979\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.248397 kubelet[2710]: E0514 01:04:57.248235 2710 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bd566651ba311b1a23497924288faedcdb772a6ebad4ed32b025677e3d0a979\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.248397 kubelet[2710]: E0514 01:04:57.248305 2710 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bd566651ba311b1a23497924288faedcdb772a6ebad4ed32b025677e3d0a979\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-4sg5w" May 14 01:04:57.248397 kubelet[2710]: E0514 01:04:57.248328 2710 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1bd566651ba311b1a23497924288faedcdb772a6ebad4ed32b025677e3d0a979\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-4sg5w" May 14 01:04:57.248687 kubelet[2710]: E0514 01:04:57.248389 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-7bc7fb4ff7-4sg5w_calico-apiserver(499e8660-f8a6-4547-8ea8-b27c8bf74cbc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-7bc7fb4ff7-4sg5w_calico-apiserver(499e8660-f8a6-4547-8ea8-b27c8bf74cbc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1bd566651ba311b1a23497924288faedcdb772a6ebad4ed32b025677e3d0a979\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-4sg5w" podUID="499e8660-f8a6-4547-8ea8-b27c8bf74cbc" May 14 01:04:57.261700 containerd[1486]: time="2025-05-14T01:04:57.261633694Z" level=error msg="Failed to destroy network for sandbox \"0a8a0a197ba8f3e02a4854114d8281f324b9d1d7e79ab196a9acb3e22104a8cf\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.263512 containerd[1486]: time="2025-05-14T01:04:57.263466413Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-w7vk9,Uid:f66aa28a-9c44-4949-9dcb-9724952e0c5a,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a8a0a197ba8f3e02a4854114d8281f324b9d1d7e79ab196a9acb3e22104a8cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
May 14 01:04:57.263802 kubelet[2710]: E0514 01:04:57.263747 2710 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a8a0a197ba8f3e02a4854114d8281f324b9d1d7e79ab196a9acb3e22104a8cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.263922 kubelet[2710]: E0514 01:04:57.263828 2710 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a8a0a197ba8f3e02a4854114d8281f324b9d1d7e79ab196a9acb3e22104a8cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-w7vk9" May 14 01:04:57.263922 kubelet[2710]: E0514 01:04:57.263862 2710 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0a8a0a197ba8f3e02a4854114d8281f324b9d1d7e79ab196a9acb3e22104a8cf\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-w7vk9" May 14 01:04:57.264016 kubelet[2710]: E0514 01:04:57.263982 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-w7vk9_kube-system(f66aa28a-9c44-4949-9dcb-9724952e0c5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-w7vk9_kube-system(f66aa28a-9c44-4949-9dcb-9724952e0c5a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"0a8a0a197ba8f3e02a4854114d8281f324b9d1d7e79ab196a9acb3e22104a8cf\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-w7vk9" podUID="f66aa28a-9c44-4949-9dcb-9724952e0c5a" May 14 01:04:57.632555 systemd[1]: Created slice kubepods-besteffort-pod4b92a8a5_04c9_4b4e_b1c3_606570f923ca.slice - libcontainer container kubepods-besteffort-pod4b92a8a5_04c9_4b4e_b1c3_606570f923ca.slice. May 14 01:04:57.638284 containerd[1486]: time="2025-05-14T01:04:57.637683968Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b8vbb,Uid:4b92a8a5-04c9-4b4e-b1c3-606570f923ca,Namespace:calico-system,Attempt:0,}" May 14 01:04:57.751243 containerd[1486]: time="2025-05-14T01:04:57.751143333Z" level=error msg="Failed to destroy network for sandbox \"3af011066fab9df3a5847cd358c7fbc8405c486c990a635b181e2363140adc2a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.753539 containerd[1486]: time="2025-05-14T01:04:57.753465510Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b8vbb,Uid:4b92a8a5-04c9-4b4e-b1c3-606570f923ca,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3af011066fab9df3a5847cd358c7fbc8405c486c990a635b181e2363140adc2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.754084 kubelet[2710]: E0514 01:04:57.753810 2710 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3af011066fab9df3a5847cd358c7fbc8405c486c990a635b181e2363140adc2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" May 14 01:04:57.754084 kubelet[2710]: E0514 01:04:57.753874 2710 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3af011066fab9df3a5847cd358c7fbc8405c486c990a635b181e2363140adc2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-b8vbb" May 14 01:04:57.754084 kubelet[2710]: E0514 01:04:57.753902 2710 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3af011066fab9df3a5847cd358c7fbc8405c486c990a635b181e2363140adc2a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-b8vbb" May 14 01:04:57.754423 kubelet[2710]: E0514 01:04:57.753961 2710 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-b8vbb_calico-system(4b92a8a5-04c9-4b4e-b1c3-606570f923ca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-b8vbb_calico-system(4b92a8a5-04c9-4b4e-b1c3-606570f923ca)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3af011066fab9df3a5847cd358c7fbc8405c486c990a635b181e2363140adc2a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-b8vbb" podUID="4b92a8a5-04c9-4b4e-b1c3-606570f923ca" May 14 01:04:57.786182 containerd[1486]: time="2025-05-14T01:04:57.785565645Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.29.3\"" May 14 01:04:58.059586 systemd[1]: run-netns-cni\x2d1a3de567\x2d4ce8\x2daddd\x2d142f\x2d1c8a5295bd5f.mount: Deactivated successfully. May 14 01:04:58.060347 systemd[1]: run-netns-cni\x2d34dca15c\x2d1f8b\x2dc1c7\x2d373b\x2d4d05340ace62.mount: Deactivated successfully. May 14 01:05:06.683057 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2555143206.mount: Deactivated successfully. May 14 01:05:06.947207 containerd[1486]: time="2025-05-14T01:05:06.946822069Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:06.949594 containerd[1486]: time="2025-05-14T01:05:06.949473602Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 14 01:05:06.951866 containerd[1486]: time="2025-05-14T01:05:06.951732589Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:06.955786 containerd[1486]: time="2025-05-14T01:05:06.955686565Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:06.958120 containerd[1486]: time="2025-05-14T01:05:06.957256641Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 9.171631484s" May 14 01:05:06.958120 containerd[1486]: time="2025-05-14T01:05:06.957333054Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference 
\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 14 01:05:06.991646 containerd[1486]: time="2025-05-14T01:05:06.990421959Z" level=info msg="CreateContainer within sandbox \"0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 14 01:05:07.016098 containerd[1486]: time="2025-05-14T01:05:07.015621145Z" level=info msg="Container 333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25: CDI devices from CRI Config.CDIDevices: []" May 14 01:05:07.033940 containerd[1486]: time="2025-05-14T01:05:07.033862852Z" level=info msg="CreateContainer within sandbox \"0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\"" May 14 01:05:07.036205 containerd[1486]: time="2025-05-14T01:05:07.035160816Z" level=info msg="StartContainer for \"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\"" May 14 01:05:07.036943 containerd[1486]: time="2025-05-14T01:05:07.036905598Z" level=info msg="connecting to shim 333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25" address="unix:///run/containerd/s/302d53b9edb19c14101bb47026182e64cce4e8d0e7195df087b0c8330a6684e7" protocol=ttrpc version=3 May 14 01:05:07.073192 systemd[1]: Started cri-containerd-333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25.scope - libcontainer container 333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25. May 14 01:05:07.138304 containerd[1486]: time="2025-05-14T01:05:07.138262553Z" level=info msg="StartContainer for \"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" returns successfully" May 14 01:05:07.212040 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 14 01:05:07.212142 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. May 14 01:05:07.884306 kubelet[2710]: I0514 01:05:07.882314 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-8sdfk" podStartSLOduration=1.784966107 podStartE2EDuration="26.88225659s" podCreationTimestamp="2025-05-14 01:04:41 +0000 UTC" firstStartedPulling="2025-05-14 01:04:41.862449746 +0000 UTC m=+13.362021220" lastFinishedPulling="2025-05-14 01:05:06.959740178 +0000 UTC m=+38.459311703" observedRunningTime="2025-05-14 01:05:07.880629619 +0000 UTC m=+39.380201173" watchObservedRunningTime="2025-05-14 01:05:07.88225659 +0000 UTC m=+39.381828115" May 14 01:05:08.617715 containerd[1486]: time="2025-05-14T01:05:08.617642294Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-w7vk9,Uid:f66aa28a-9c44-4949-9dcb-9724952e0c5a,Namespace:kube-system,Attempt:0,}" May 14 01:05:08.620953 containerd[1486]: time="2025-05-14T01:05:08.620149026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bc7fb4ff7-4sg5w,Uid:499e8660-f8a6-4547-8ea8-b27c8bf74cbc,Namespace:calico-apiserver,Attempt:0,}" May 14 01:05:08.928497 systemd-networkd[1384]: cali8d1995bbd14: Link UP May 14 01:05:08.930200 systemd-networkd[1384]: cali8d1995bbd14: Gained carrier May 14 01:05:08.954359 containerd[1486]: 2025-05-14 01:05:08.699 [INFO][3760] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 14 01:05:08.954359 containerd[1486]: 2025-05-14 01:05:08.723 [INFO][3760] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0 calico-apiserver-7bc7fb4ff7- calico-apiserver 499e8660-f8a6-4547-8ea8-b27c8bf74cbc 683 0 2025-05-14 01:04:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7bc7fb4ff7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s 
projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-4a8b92fa55.novalocal calico-apiserver-7bc7fb4ff7-4sg5w eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8d1995bbd14 [] []}} ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-4sg5w" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-" May 14 01:05:08.954359 containerd[1486]: 2025-05-14 01:05:08.726 [INFO][3760] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-4sg5w" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" May 14 01:05:08.954359 containerd[1486]: 2025-05-14 01:05:08.797 [INFO][3803] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" HandleID="k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.827 [INFO][3803] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" HandleID="k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003b2f80), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-4a8b92fa55.novalocal", "pod":"calico-apiserver-7bc7fb4ff7-4sg5w", "timestamp":"2025-05-14 01:05:08.797399481 +0000 UTC"}, 
Hostname:"ci-4284-0-0-n-4a8b92fa55.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.827 [INFO][3803] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.827 [INFO][3803] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.827 [INFO][3803] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4a8b92fa55.novalocal' May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.834 [INFO][3803] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.851 [INFO][3803] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.864 [INFO][3803] ipam/ipam.go 489: Trying affinity for 192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.867 [INFO][3803] ipam/ipam.go 155: Attempting to load block cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954623 containerd[1486]: 2025-05-14 01:05:08.872 [INFO][3803] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954866 containerd[1486]: 2025-05-14 01:05:08.872 [INFO][3803] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.18.64/26 handle="k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" 
host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954866 containerd[1486]: 2025-05-14 01:05:08.876 [INFO][3803] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600 May 14 01:05:08.954866 containerd[1486]: 2025-05-14 01:05:08.886 [INFO][3803] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.18.64/26 handle="k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954866 containerd[1486]: 2025-05-14 01:05:08.897 [INFO][3803] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.18.65/26] block=192.168.18.64/26 handle="k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954866 containerd[1486]: 2025-05-14 01:05:08.897 [INFO][3803] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.18.65/26] handle="k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:08.954866 containerd[1486]: 2025-05-14 01:05:08.897 [INFO][3803] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 01:05:08.954866 containerd[1486]: 2025-05-14 01:05:08.897 [INFO][3803] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.18.65/26] IPv6=[] ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" HandleID="k8s-pod-network.9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" May 14 01:05:08.955027 containerd[1486]: 2025-05-14 01:05:08.905 [INFO][3760] cni-plugin/k8s.go 386: Populated endpoint ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-4sg5w" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0", GenerateName:"calico-apiserver-7bc7fb4ff7-", Namespace:"calico-apiserver", SelfLink:"", UID:"499e8660-f8a6-4547-8ea8-b27c8bf74cbc", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7bc7fb4ff7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"", Pod:"calico-apiserver-7bc7fb4ff7-4sg5w", Endpoint:"eth0", 
ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.18.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8d1995bbd14", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:08.955141 containerd[1486]: 2025-05-14 01:05:08.906 [INFO][3760] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.18.65/32] ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-4sg5w" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" May 14 01:05:08.955141 containerd[1486]: 2025-05-14 01:05:08.906 [INFO][3760] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8d1995bbd14 ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-4sg5w" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" May 14 01:05:08.955141 containerd[1486]: 2025-05-14 01:05:08.929 [INFO][3760] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-4sg5w" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" May 14 01:05:08.955222 containerd[1486]: 2025-05-14 01:05:08.931 [INFO][3760] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-4sg5w" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0", GenerateName:"calico-apiserver-7bc7fb4ff7-", Namespace:"calico-apiserver", SelfLink:"", UID:"499e8660-f8a6-4547-8ea8-b27c8bf74cbc", ResourceVersion:"683", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7bc7fb4ff7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600", Pod:"calico-apiserver-7bc7fb4ff7-4sg5w", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.18.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8d1995bbd14", MAC:"86:cd:c7:36:66:33", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:08.955347 containerd[1486]: 2025-05-14 01:05:08.949 [INFO][3760] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-4sg5w" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--4sg5w-eth0" May 14 01:05:09.019723 
systemd-networkd[1384]: calid21fe658f88: Link UP May 14 01:05:09.022011 systemd-networkd[1384]: calid21fe658f88: Gained carrier May 14 01:05:09.055721 containerd[1486]: 2025-05-14 01:05:08.719 [INFO][3756] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 14 01:05:09.055721 containerd[1486]: 2025-05-14 01:05:08.748 [INFO][3756] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0 coredns-668d6bf9bc- kube-system f66aa28a-9c44-4949-9dcb-9724952e0c5a 676 0 2025-05-14 01:04:35 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-4a8b92fa55.novalocal coredns-668d6bf9bc-w7vk9 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calid21fe658f88 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Namespace="kube-system" Pod="coredns-668d6bf9bc-w7vk9" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-" May 14 01:05:09.055721 containerd[1486]: 2025-05-14 01:05:08.748 [INFO][3756] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Namespace="kube-system" Pod="coredns-668d6bf9bc-w7vk9" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" May 14 01:05:09.055721 containerd[1486]: 2025-05-14 01:05:08.810 [INFO][3815] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" HandleID="k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" 
Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.830 [INFO][3815] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" HandleID="k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031b510), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-4a8b92fa55.novalocal", "pod":"coredns-668d6bf9bc-w7vk9", "timestamp":"2025-05-14 01:05:08.810299839 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4a8b92fa55.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.830 [INFO][3815] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.898 [INFO][3815] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.898 [INFO][3815] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4a8b92fa55.novalocal' May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.933 [INFO][3815] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.951 [INFO][3815] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.963 [INFO][3815] ipam/ipam.go 489: Trying affinity for 192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.968 [INFO][3815] ipam/ipam.go 155: Attempting to load block cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056613 containerd[1486]: 2025-05-14 01:05:08.972 [INFO][3815] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056878 containerd[1486]: 2025-05-14 01:05:08.977 [INFO][3815] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.18.64/26 handle="k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056878 containerd[1486]: 2025-05-14 01:05:08.980 [INFO][3815] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb May 14 01:05:09.056878 containerd[1486]: 2025-05-14 01:05:08.994 [INFO][3815] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.18.64/26 handle="k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056878 
containerd[1486]: 2025-05-14 01:05:09.002 [INFO][3815] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.18.66/26] block=192.168.18.64/26 handle="k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056878 containerd[1486]: 2025-05-14 01:05:09.002 [INFO][3815] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.18.66/26] handle="k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.056878 containerd[1486]: 2025-05-14 01:05:09.002 [INFO][3815] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 01:05:09.056878 containerd[1486]: 2025-05-14 01:05:09.002 [INFO][3815] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.18.66/26] IPv6=[] ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" HandleID="k8s-pod-network.4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" May 14 01:05:09.059887 containerd[1486]: 2025-05-14 01:05:09.007 [INFO][3756] cni-plugin/k8s.go 386: Populated endpoint ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Namespace="kube-system" Pod="coredns-668d6bf9bc-w7vk9" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f66aa28a-9c44-4949-9dcb-9724952e0c5a", ResourceVersion:"676", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"", Pod:"coredns-668d6bf9bc-w7vk9", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.18.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid21fe658f88", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:09.059887 containerd[1486]: 2025-05-14 01:05:09.009 [INFO][3756] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.18.66/32] ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Namespace="kube-system" Pod="coredns-668d6bf9bc-w7vk9" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" May 14 01:05:09.059887 containerd[1486]: 2025-05-14 01:05:09.009 [INFO][3756] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid21fe658f88 ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Namespace="kube-system" Pod="coredns-668d6bf9bc-w7vk9" 
WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" May 14 01:05:09.059887 containerd[1486]: 2025-05-14 01:05:09.020 [INFO][3756] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Namespace="kube-system" Pod="coredns-668d6bf9bc-w7vk9" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" May 14 01:05:09.059887 containerd[1486]: 2025-05-14 01:05:09.024 [INFO][3756] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Namespace="kube-system" Pod="coredns-668d6bf9bc-w7vk9" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f66aa28a-9c44-4949-9dcb-9724952e0c5a", ResourceVersion:"676", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb", Pod:"coredns-668d6bf9bc-w7vk9", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.18.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calid21fe658f88", MAC:"7a:cf:07:da:b1:69", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:09.059887 containerd[1486]: 2025-05-14 01:05:09.049 [INFO][3756] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" Namespace="kube-system" Pod="coredns-668d6bf9bc-w7vk9" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--w7vk9-eth0" May 14 01:05:09.092424 containerd[1486]: time="2025-05-14T01:05:09.091452021Z" level=info msg="connecting to shim 9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600" address="unix:///run/containerd/s/e0923a3400ee2a1d0aaddd6e486624d4476f397b86b572f3627b67cd1109f46d" namespace=k8s.io protocol=ttrpc version=3 May 14 01:05:09.134983 containerd[1486]: time="2025-05-14T01:05:09.134926264Z" level=info msg="connecting to shim 4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb" address="unix:///run/containerd/s/ed824d0eccace1f4e1a57c53b42a61360378c10d0f5b83b0b6b1e8176c9c572c" namespace=k8s.io protocol=ttrpc version=3 May 14 01:05:09.234583 systemd[1]: Started cri-containerd-4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb.scope - libcontainer container 4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb. 
May 14 01:05:09.246228 systemd[1]: Started cri-containerd-9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600.scope - libcontainer container 9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600. May 14 01:05:09.350453 containerd[1486]: time="2025-05-14T01:05:09.350269989Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-w7vk9,Uid:f66aa28a-9c44-4949-9dcb-9724952e0c5a,Namespace:kube-system,Attempt:0,} returns sandbox id \"4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb\"" May 14 01:05:09.363504 containerd[1486]: time="2025-05-14T01:05:09.363450781Z" level=info msg="CreateContainer within sandbox \"4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 14 01:05:09.387525 containerd[1486]: time="2025-05-14T01:05:09.387482166Z" level=info msg="Container 6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37: CDI devices from CRI Config.CDIDevices: []" May 14 01:05:09.398556 containerd[1486]: time="2025-05-14T01:05:09.398473973Z" level=info msg="CreateContainer within sandbox \"4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37\"" May 14 01:05:09.401085 containerd[1486]: time="2025-05-14T01:05:09.399765996Z" level=info msg="StartContainer for \"6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37\"" May 14 01:05:09.402343 containerd[1486]: time="2025-05-14T01:05:09.402273901Z" level=info msg="connecting to shim 6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37" address="unix:///run/containerd/s/ed824d0eccace1f4e1a57c53b42a61360378c10d0f5b83b0b6b1e8176c9c572c" protocol=ttrpc version=3 May 14 01:05:09.442299 systemd[1]: Started cri-containerd-6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37.scope - libcontainer container 
6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37. May 14 01:05:09.448806 containerd[1486]: time="2025-05-14T01:05:09.447400803Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bc7fb4ff7-4sg5w,Uid:499e8660-f8a6-4547-8ea8-b27c8bf74cbc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600\"" May 14 01:05:09.449206 kernel: bpftool[4013]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 14 01:05:09.451427 containerd[1486]: time="2025-05-14T01:05:09.451369197Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 14 01:05:09.505587 containerd[1486]: time="2025-05-14T01:05:09.504795607Z" level=info msg="StartContainer for \"6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37\" returns successfully" May 14 01:05:09.615272 containerd[1486]: time="2025-05-14T01:05:09.614667731Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76985557f9-m9brf,Uid:223f11c4-486e-4257-8c51-e91d10d982e7,Namespace:calico-system,Attempt:0,}" May 14 01:05:09.789492 systemd-networkd[1384]: cali986f005d93f: Link UP May 14 01:05:09.790509 systemd-networkd[1384]: cali986f005d93f: Gained carrier May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.687 [INFO][4034] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0 calico-kube-controllers-76985557f9- calico-system 223f11c4-486e-4257-8c51-e91d10d982e7 684 0 2025-05-14 01:04:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:76985557f9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-4a8b92fa55.novalocal 
calico-kube-controllers-76985557f9-m9brf eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali986f005d93f [] []}} ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Namespace="calico-system" Pod="calico-kube-controllers-76985557f9-m9brf" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.687 [INFO][4034] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Namespace="calico-system" Pod="calico-kube-controllers-76985557f9-m9brf" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.725 [INFO][4047] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" HandleID="k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.739 [INFO][4047] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" HandleID="k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000303710), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-4a8b92fa55.novalocal", "pod":"calico-kube-controllers-76985557f9-m9brf", "timestamp":"2025-05-14 01:05:09.725845966 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4a8b92fa55.novalocal", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.739 [INFO][4047] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.739 [INFO][4047] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.739 [INFO][4047] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4a8b92fa55.novalocal' May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.745 [INFO][4047] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.754 [INFO][4047] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.760 [INFO][4047] ipam/ipam.go 489: Trying affinity for 192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.763 [INFO][4047] ipam/ipam.go 155: Attempting to load block cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.766 [INFO][4047] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.767 [INFO][4047] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.18.64/26 handle="k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 
2025-05-14 01:05:09.769 [INFO][4047] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322 May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.775 [INFO][4047] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.18.64/26 handle="k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.784 [INFO][4047] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.18.67/26] block=192.168.18.64/26 handle="k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.784 [INFO][4047] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.18.67/26] handle="k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.784 [INFO][4047] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 14 01:05:09.814574 containerd[1486]: 2025-05-14 01:05:09.784 [INFO][4047] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.18.67/26] IPv6=[] ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" HandleID="k8s-pod-network.f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" May 14 01:05:09.819159 containerd[1486]: 2025-05-14 01:05:09.786 [INFO][4034] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Namespace="calico-system" Pod="calico-kube-controllers-76985557f9-m9brf" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0", GenerateName:"calico-kube-controllers-76985557f9-", Namespace:"calico-system", SelfLink:"", UID:"223f11c4-486e-4257-8c51-e91d10d982e7", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76985557f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"", Pod:"calico-kube-controllers-76985557f9-m9brf", 
Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.18.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali986f005d93f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:09.819159 containerd[1486]: 2025-05-14 01:05:09.786 [INFO][4034] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.18.67/32] ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Namespace="calico-system" Pod="calico-kube-controllers-76985557f9-m9brf" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" May 14 01:05:09.819159 containerd[1486]: 2025-05-14 01:05:09.786 [INFO][4034] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali986f005d93f ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Namespace="calico-system" Pod="calico-kube-controllers-76985557f9-m9brf" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" May 14 01:05:09.819159 containerd[1486]: 2025-05-14 01:05:09.790 [INFO][4034] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Namespace="calico-system" Pod="calico-kube-controllers-76985557f9-m9brf" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" May 14 01:05:09.819159 containerd[1486]: 2025-05-14 01:05:09.792 [INFO][4034] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Namespace="calico-system" Pod="calico-kube-controllers-76985557f9-m9brf" 
WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0", GenerateName:"calico-kube-controllers-76985557f9-", Namespace:"calico-system", SelfLink:"", UID:"223f11c4-486e-4257-8c51-e91d10d982e7", ResourceVersion:"684", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"76985557f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322", Pod:"calico-kube-controllers-76985557f9-m9brf", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.18.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali986f005d93f", MAC:"9e:9c:9f:d9:cf:1a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:09.819159 containerd[1486]: 2025-05-14 01:05:09.808 [INFO][4034] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" Namespace="calico-system" 
Pod="calico-kube-controllers-76985557f9-m9brf" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--kube--controllers--76985557f9--m9brf-eth0" May 14 01:05:09.868780 kubelet[2710]: I0514 01:05:09.868383 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-w7vk9" podStartSLOduration=34.868356759 podStartE2EDuration="34.868356759s" podCreationTimestamp="2025-05-14 01:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 01:05:09.863667084 +0000 UTC m=+41.363238558" watchObservedRunningTime="2025-05-14 01:05:09.868356759 +0000 UTC m=+41.367928263" May 14 01:05:09.877065 containerd[1486]: time="2025-05-14T01:05:09.875218270Z" level=info msg="connecting to shim f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322" address="unix:///run/containerd/s/127fc8bb229e9e7d1967b69e2ce98a1ce728c177a8b6c1192251cc89dd76a31f" namespace=k8s.io protocol=ttrpc version=3 May 14 01:05:09.928244 systemd[1]: Started cri-containerd-f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322.scope - libcontainer container f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322. 
May 14 01:05:09.954823 systemd-networkd[1384]: vxlan.calico: Link UP May 14 01:05:09.954833 systemd-networkd[1384]: vxlan.calico: Gained carrier May 14 01:05:10.059652 containerd[1486]: time="2025-05-14T01:05:10.059609847Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-76985557f9-m9brf,Uid:223f11c4-486e-4257-8c51-e91d10d982e7,Namespace:calico-system,Attempt:0,} returns sandbox id \"f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322\"" May 14 01:05:10.355183 systemd-networkd[1384]: cali8d1995bbd14: Gained IPv6LL May 14 01:05:10.615162 containerd[1486]: time="2025-05-14T01:05:10.614003492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b8vbb,Uid:4b92a8a5-04c9-4b4e-b1c3-606570f923ca,Namespace:calico-system,Attempt:0,}" May 14 01:05:10.828310 systemd-networkd[1384]: cali8bfa79b2f9b: Link UP May 14 01:05:10.830753 systemd-networkd[1384]: cali8bfa79b2f9b: Gained carrier May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.726 [INFO][4184] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0 csi-node-driver- calico-system 4b92a8a5-04c9-4b4e-b1c3-606570f923ca 579 0 2025-05-14 01:04:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5b5cc68cd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-n-4a8b92fa55.novalocal csi-node-driver-b8vbb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8bfa79b2f9b [] []}} ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Namespace="calico-system" Pod="csi-node-driver-b8vbb" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-" 
May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.726 [INFO][4184] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Namespace="calico-system" Pod="csi-node-driver-b8vbb" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.768 [INFO][4198] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" HandleID="k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.780 [INFO][4198] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" HandleID="k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002916a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-4a8b92fa55.novalocal", "pod":"csi-node-driver-b8vbb", "timestamp":"2025-05-14 01:05:10.768835665 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4a8b92fa55.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.780 [INFO][4198] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.780 [INFO][4198] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.780 [INFO][4198] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4a8b92fa55.novalocal' May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.782 [INFO][4198] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.786 [INFO][4198] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.791 [INFO][4198] ipam/ipam.go 489: Trying affinity for 192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.793 [INFO][4198] ipam/ipam.go 155: Attempting to load block cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.796 [INFO][4198] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.796 [INFO][4198] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.18.64/26 handle="k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.798 [INFO][4198] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9 May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.806 [INFO][4198] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.18.64/26 handle="k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 
containerd[1486]: 2025-05-14 01:05:10.818 [INFO][4198] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.18.68/26] block=192.168.18.64/26 handle="k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.820 [INFO][4198] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.18.68/26] handle="k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.820 [INFO][4198] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 01:05:10.859672 containerd[1486]: 2025-05-14 01:05:10.820 [INFO][4198] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.18.68/26] IPv6=[] ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" HandleID="k8s-pod-network.b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" May 14 01:05:10.862389 containerd[1486]: 2025-05-14 01:05:10.823 [INFO][4184] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Namespace="calico-system" Pod="csi-node-driver-b8vbb" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4b92a8a5-04c9-4b4e-b1c3-606570f923ca", ResourceVersion:"579", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"", Pod:"csi-node-driver-b8vbb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.18.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8bfa79b2f9b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:10.862389 containerd[1486]: 2025-05-14 01:05:10.823 [INFO][4184] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.18.68/32] ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Namespace="calico-system" Pod="csi-node-driver-b8vbb" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" May 14 01:05:10.862389 containerd[1486]: 2025-05-14 01:05:10.823 [INFO][4184] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8bfa79b2f9b ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Namespace="calico-system" Pod="csi-node-driver-b8vbb" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" May 14 01:05:10.862389 containerd[1486]: 2025-05-14 01:05:10.831 [INFO][4184] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Namespace="calico-system" 
Pod="csi-node-driver-b8vbb" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" May 14 01:05:10.862389 containerd[1486]: 2025-05-14 01:05:10.832 [INFO][4184] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Namespace="calico-system" Pod="csi-node-driver-b8vbb" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"4b92a8a5-04c9-4b4e-b1c3-606570f923ca", ResourceVersion:"579", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9", Pod:"csi-node-driver-b8vbb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.18.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8bfa79b2f9b", 
MAC:"c2:4f:a8:97:19:4e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:10.862389 containerd[1486]: 2025-05-14 01:05:10.856 [INFO][4184] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" Namespace="calico-system" Pod="csi-node-driver-b8vbb" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-csi--node--driver--b8vbb-eth0" May 14 01:05:10.867233 systemd-networkd[1384]: cali986f005d93f: Gained IPv6LL May 14 01:05:10.995588 systemd-networkd[1384]: calid21fe658f88: Gained IPv6LL May 14 01:05:11.396418 containerd[1486]: time="2025-05-14T01:05:11.396345458Z" level=info msg="connecting to shim b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9" address="unix:///run/containerd/s/e8e61c7e54ff784fe8253e32abac13e5aca4a4fa6a72e4c0fd4eb8298c308c91" namespace=k8s.io protocol=ttrpc version=3 May 14 01:05:11.471201 systemd[1]: Started cri-containerd-b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9.scope - libcontainer container b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9. 
May 14 01:05:11.528743 containerd[1486]: time="2025-05-14T01:05:11.528617374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-b8vbb,Uid:4b92a8a5-04c9-4b4e-b1c3-606570f923ca,Namespace:calico-system,Attempt:0,} returns sandbox id \"b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9\"" May 14 01:05:11.615070 containerd[1486]: time="2025-05-14T01:05:11.614778983Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zngx5,Uid:a153f68e-8a69-4c2e-ba25-43d30d1b76a3,Namespace:kube-system,Attempt:0,}" May 14 01:05:11.615466 containerd[1486]: time="2025-05-14T01:05:11.615298207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bc7fb4ff7-qm8vw,Uid:004765cd-cdab-49fa-8da7-cb602693d444,Namespace:calico-apiserver,Attempt:0,}" May 14 01:05:11.955320 systemd-networkd[1384]: vxlan.calico: Gained IPv6LL May 14 01:05:11.982735 systemd-networkd[1384]: caliaeed27ae243: Link UP May 14 01:05:11.983446 systemd-networkd[1384]: caliaeed27ae243: Gained carrier May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.727 [INFO][4266] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0 calico-apiserver-7bc7fb4ff7- calico-apiserver 004765cd-cdab-49fa-8da7-cb602693d444 682 0 2025-05-14 01:04:41 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:7bc7fb4ff7 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-4a8b92fa55.novalocal calico-apiserver-7bc7fb4ff7-qm8vw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliaeed27ae243 [] []}} ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Namespace="calico-apiserver" 
Pod="calico-apiserver-7bc7fb4ff7-qm8vw" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.727 [INFO][4266] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-qm8vw" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.834 [INFO][4289] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" HandleID="k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.864 [INFO][4289] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" HandleID="k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00039bc40), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-4a8b92fa55.novalocal", "pod":"calico-apiserver-7bc7fb4ff7-qm8vw", "timestamp":"2025-05-14 01:05:11.834623284 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4a8b92fa55.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.865 [INFO][4289] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM 
lock. May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.865 [INFO][4289] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.868 [INFO][4289] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4a8b92fa55.novalocal' May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.875 [INFO][4289] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.883 [INFO][4289] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.895 [INFO][4289] ipam/ipam.go 489: Trying affinity for 192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.899 [INFO][4289] ipam/ipam.go 155: Attempting to load block cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.904 [INFO][4289] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.904 [INFO][4289] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.18.64/26 handle="k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.910 [INFO][4289] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5 May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.940 [INFO][4289] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.18.64/26 
handle="k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.968 [INFO][4289] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.18.69/26] block=192.168.18.64/26 handle="k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.968 [INFO][4289] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.18.69/26] handle="k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.969 [INFO][4289] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 01:05:12.038248 containerd[1486]: 2025-05-14 01:05:11.969 [INFO][4289] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.18.69/26] IPv6=[] ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" HandleID="k8s-pod-network.21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" May 14 01:05:12.041053 containerd[1486]: 2025-05-14 01:05:11.975 [INFO][4266] cni-plugin/k8s.go 386: Populated endpoint ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-qm8vw" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0", GenerateName:"calico-apiserver-7bc7fb4ff7-", Namespace:"calico-apiserver", SelfLink:"", 
UID:"004765cd-cdab-49fa-8da7-cb602693d444", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7bc7fb4ff7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"", Pod:"calico-apiserver-7bc7fb4ff7-qm8vw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.18.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaeed27ae243", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:12.041053 containerd[1486]: 2025-05-14 01:05:11.975 [INFO][4266] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.18.69/32] ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-qm8vw" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" May 14 01:05:12.041053 containerd[1486]: 2025-05-14 01:05:11.975 [INFO][4266] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliaeed27ae243 ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-qm8vw" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" 
May 14 01:05:12.041053 containerd[1486]: 2025-05-14 01:05:11.986 [INFO][4266] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-qm8vw" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" May 14 01:05:12.041053 containerd[1486]: 2025-05-14 01:05:11.990 [INFO][4266] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-qm8vw" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0", GenerateName:"calico-apiserver-7bc7fb4ff7-", Namespace:"calico-apiserver", SelfLink:"", UID:"004765cd-cdab-49fa-8da7-cb602693d444", ResourceVersion:"682", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"7bc7fb4ff7", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5", 
Pod:"calico-apiserver-7bc7fb4ff7-qm8vw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.18.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliaeed27ae243", MAC:"2a:77:8b:d1:41:6a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:12.041053 containerd[1486]: 2025-05-14 01:05:12.033 [INFO][4266] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" Namespace="calico-apiserver" Pod="calico-apiserver-7bc7fb4ff7-qm8vw" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-calico--apiserver--7bc7fb4ff7--qm8vw-eth0" May 14 01:05:12.109334 systemd-networkd[1384]: cali5f82c6519b3: Link UP May 14 01:05:12.110858 systemd-networkd[1384]: cali5f82c6519b3: Gained carrier May 14 01:05:12.144142 containerd[1486]: time="2025-05-14T01:05:12.143498946Z" level=info msg="connecting to shim 21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5" address="unix:///run/containerd/s/730934082fc9643d3bf70996e8d3a5540aea197ea82376f1a1362bd6ef377760" namespace=k8s.io protocol=ttrpc version=3 May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:11.742 [INFO][4264] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0 coredns-668d6bf9bc- kube-system a153f68e-8a69-4c2e-ba25-43d30d1b76a3 681 0 2025-05-14 01:04:35 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-4a8b92fa55.novalocal coredns-668d6bf9bc-zngx5 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5f82c6519b3 [{dns UDP 53 0 } {dns-tcp TCP 
53 0 } {metrics TCP 9153 0 }] []}} ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Namespace="kube-system" Pod="coredns-668d6bf9bc-zngx5" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:11.742 [INFO][4264] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Namespace="kube-system" Pod="coredns-668d6bf9bc-zngx5" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:11.835 [INFO][4295] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" HandleID="k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:11.870 [INFO][4295] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" HandleID="k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002927e0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-4a8b92fa55.novalocal", "pod":"coredns-668d6bf9bc-zngx5", "timestamp":"2025-05-14 01:05:11.835241163 +0000 UTC"}, Hostname:"ci-4284-0-0-n-4a8b92fa55.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:11.871 
[INFO][4295] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:11.969 [INFO][4295] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:11.969 [INFO][4295] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-4a8b92fa55.novalocal' May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:11.980 [INFO][4295] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.003 [INFO][4295] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.030 [INFO][4295] ipam/ipam.go 489: Trying affinity for 192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.036 [INFO][4295] ipam/ipam.go 155: Attempting to load block cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.045 [INFO][4295] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.18.64/26 host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.045 [INFO][4295] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.18.64/26 handle="k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.054 [INFO][4295] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.064 [INFO][4295] ipam/ipam.go 1203: Writing 
block in order to claim IPs block=192.168.18.64/26 handle="k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.092 [INFO][4295] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.18.70/26] block=192.168.18.64/26 handle="k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.093 [INFO][4295] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.18.70/26] handle="k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" host="ci-4284-0-0-n-4a8b92fa55.novalocal" May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.093 [INFO][4295] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 14 01:05:12.159986 containerd[1486]: 2025-05-14 01:05:12.093 [INFO][4295] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.18.70/26] IPv6=[] ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" HandleID="k8s-pod-network.3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Workload="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" May 14 01:05:12.161972 containerd[1486]: 2025-05-14 01:05:12.100 [INFO][4264] cni-plugin/k8s.go 386: Populated endpoint ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Namespace="kube-system" Pod="coredns-668d6bf9bc-zngx5" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", 
UID:"a153f68e-8a69-4c2e-ba25-43d30d1b76a3", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", ContainerID:"", Pod:"coredns-668d6bf9bc-zngx5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.18.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5f82c6519b3", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:12.161972 containerd[1486]: 2025-05-14 01:05:12.100 [INFO][4264] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.18.70/32] ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Namespace="kube-system" Pod="coredns-668d6bf9bc-zngx5" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" May 14 01:05:12.161972 containerd[1486]: 2025-05-14 01:05:12.100 [INFO][4264] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to 
cali5f82c6519b3 ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Namespace="kube-system" Pod="coredns-668d6bf9bc-zngx5" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" May 14 01:05:12.161972 containerd[1486]: 2025-05-14 01:05:12.114 [INFO][4264] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Namespace="kube-system" Pod="coredns-668d6bf9bc-zngx5" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" May 14 01:05:12.161972 containerd[1486]: 2025-05-14 01:05:12.117 [INFO][4264] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Namespace="kube-system" Pod="coredns-668d6bf9bc-zngx5" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"a153f68e-8a69-4c2e-ba25-43d30d1b76a3", ResourceVersion:"681", Generation:0, CreationTimestamp:time.Date(2025, time.May, 14, 1, 4, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-4a8b92fa55.novalocal", 
ContainerID:"3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff", Pod:"coredns-668d6bf9bc-zngx5", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.18.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5f82c6519b3", MAC:"1e:6c:83:cc:1b:b1", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 14 01:05:12.161972 containerd[1486]: 2025-05-14 01:05:12.153 [INFO][4264] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" Namespace="kube-system" Pod="coredns-668d6bf9bc-zngx5" WorkloadEndpoint="ci--4284--0--0--n--4a8b92fa55.novalocal-k8s-coredns--668d6bf9bc--zngx5-eth0" May 14 01:05:12.245150 systemd[1]: Started cri-containerd-21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5.scope - libcontainer container 21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5. 
May 14 01:05:12.265372 containerd[1486]: time="2025-05-14T01:05:12.264705477Z" level=info msg="connecting to shim 3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff" address="unix:///run/containerd/s/fa0ad2c7915ef4a906fab762d2b8e240bd4aefc1888ceb0e05fbd311bd50bd2a" namespace=k8s.io protocol=ttrpc version=3 May 14 01:05:12.396230 systemd[1]: Started cri-containerd-3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff.scope - libcontainer container 3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff. May 14 01:05:12.497542 containerd[1486]: time="2025-05-14T01:05:12.497230721Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-7bc7fb4ff7-qm8vw,Uid:004765cd-cdab-49fa-8da7-cb602693d444,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5\"" May 14 01:05:12.506276 containerd[1486]: time="2025-05-14T01:05:12.506115545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-zngx5,Uid:a153f68e-8a69-4c2e-ba25-43d30d1b76a3,Namespace:kube-system,Attempt:0,} returns sandbox id \"3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff\"" May 14 01:05:12.510907 containerd[1486]: time="2025-05-14T01:05:12.510845447Z" level=info msg="CreateContainer within sandbox \"3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 14 01:05:12.544155 containerd[1486]: time="2025-05-14T01:05:12.542121469Z" level=info msg="Container d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296: CDI devices from CRI Config.CDIDevices: []" May 14 01:05:12.559172 containerd[1486]: time="2025-05-14T01:05:12.559100967Z" level=info msg="CreateContainer within sandbox \"3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296\"" May 14 01:05:12.560508 containerd[1486]: time="2025-05-14T01:05:12.560360329Z" level=info msg="StartContainer for \"d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296\"" May 14 01:05:12.562072 containerd[1486]: time="2025-05-14T01:05:12.561993623Z" level=info msg="connecting to shim d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296" address="unix:///run/containerd/s/fa0ad2c7915ef4a906fab762d2b8e240bd4aefc1888ceb0e05fbd311bd50bd2a" protocol=ttrpc version=3 May 14 01:05:12.604388 systemd[1]: Started cri-containerd-d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296.scope - libcontainer container d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296. May 14 01:05:12.679016 containerd[1486]: time="2025-05-14T01:05:12.678909034Z" level=info msg="StartContainer for \"d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296\" returns successfully" May 14 01:05:12.854076 systemd-networkd[1384]: cali8bfa79b2f9b: Gained IPv6LL May 14 01:05:12.897960 kubelet[2710]: I0514 01:05:12.897847 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-zngx5" podStartSLOduration=37.89781868 podStartE2EDuration="37.89781868s" podCreationTimestamp="2025-05-14 01:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-14 01:05:12.897198527 +0000 UTC m=+44.396770051" watchObservedRunningTime="2025-05-14 01:05:12.89781868 +0000 UTC m=+44.397390164" May 14 01:05:13.812192 systemd-networkd[1384]: cali5f82c6519b3: Gained IPv6LL May 14 01:05:13.877268 systemd-networkd[1384]: caliaeed27ae243: Gained IPv6LL May 14 01:05:17.222915 containerd[1486]: time="2025-05-14T01:05:17.221310370Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" 
value:\"managed\"}" May 14 01:05:17.222915 containerd[1486]: time="2025-05-14T01:05:17.222802078Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 14 01:05:17.224066 containerd[1486]: time="2025-05-14T01:05:17.223973632Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:17.227157 containerd[1486]: time="2025-05-14T01:05:17.227094596Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:17.228170 containerd[1486]: time="2025-05-14T01:05:17.228123303Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 7.776679655s" May 14 01:05:17.228306 containerd[1486]: time="2025-05-14T01:05:17.228284596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 14 01:05:17.236105 containerd[1486]: time="2025-05-14T01:05:17.236052224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 14 01:05:17.239134 containerd[1486]: time="2025-05-14T01:05:17.239092486Z" level=info msg="CreateContainer within sandbox \"9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 14 01:05:17.253153 containerd[1486]: time="2025-05-14T01:05:17.253102634Z" level=info msg="Container 
781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9: CDI devices from CRI Config.CDIDevices: []" May 14 01:05:17.273803 containerd[1486]: time="2025-05-14T01:05:17.273714986Z" level=info msg="CreateContainer within sandbox \"9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9\"" May 14 01:05:17.274841 containerd[1486]: time="2025-05-14T01:05:17.274791703Z" level=info msg="StartContainer for \"781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9\"" May 14 01:05:17.279887 containerd[1486]: time="2025-05-14T01:05:17.279811731Z" level=info msg="connecting to shim 781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9" address="unix:///run/containerd/s/e0923a3400ee2a1d0aaddd6e486624d4476f397b86b572f3627b67cd1109f46d" protocol=ttrpc version=3 May 14 01:05:17.322298 systemd[1]: Started cri-containerd-781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9.scope - libcontainer container 781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9. 
May 14 01:05:17.392395 containerd[1486]: time="2025-05-14T01:05:17.390353793Z" level=info msg="StartContainer for \"781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9\" returns successfully" May 14 01:05:18.917612 kubelet[2710]: I0514 01:05:18.917524 2710 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 01:05:21.370374 containerd[1486]: time="2025-05-14T01:05:21.370145748Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:21.373697 containerd[1486]: time="2025-05-14T01:05:21.373480301Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 14 01:05:21.373871 containerd[1486]: time="2025-05-14T01:05:21.373839367Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:21.378171 containerd[1486]: time="2025-05-14T01:05:21.378120741Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:21.379074 containerd[1486]: time="2025-05-14T01:05:21.378957436Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 4.142838255s" May 14 01:05:21.379269 containerd[1486]: time="2025-05-14T01:05:21.379247522Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference 
\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 14 01:05:21.390602 containerd[1486]: time="2025-05-14T01:05:21.390001503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 14 01:05:21.420971 containerd[1486]: time="2025-05-14T01:05:21.420916104Z" level=info msg="CreateContainer within sandbox \"f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 14 01:05:21.442855 containerd[1486]: time="2025-05-14T01:05:21.441637137Z" level=info msg="Container fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b: CDI devices from CRI Config.CDIDevices: []" May 14 01:05:21.448508 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount460066045.mount: Deactivated successfully. May 14 01:05:21.467738 containerd[1486]: time="2025-05-14T01:05:21.467648954Z" level=info msg="CreateContainer within sandbox \"f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\"" May 14 01:05:21.469365 containerd[1486]: time="2025-05-14T01:05:21.469260675Z" level=info msg="StartContainer for \"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\"" May 14 01:05:21.474766 containerd[1486]: time="2025-05-14T01:05:21.474693056Z" level=info msg="connecting to shim fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b" address="unix:///run/containerd/s/127fc8bb229e9e7d1967b69e2ce98a1ce728c177a8b6c1192251cc89dd76a31f" protocol=ttrpc version=3 May 14 01:05:21.516222 systemd[1]: Started cri-containerd-fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b.scope - libcontainer container fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b. 
May 14 01:05:21.600881 containerd[1486]: time="2025-05-14T01:05:21.600819111Z" level=info msg="StartContainer for \"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" returns successfully" May 14 01:05:22.003875 kubelet[2710]: I0514 01:05:22.003702 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-4sg5w" podStartSLOduration=33.217899115 podStartE2EDuration="41.003611977s" podCreationTimestamp="2025-05-14 01:04:41 +0000 UTC" firstStartedPulling="2025-05-14 01:05:09.449195961 +0000 UTC m=+40.948767435" lastFinishedPulling="2025-05-14 01:05:17.234908823 +0000 UTC m=+48.734480297" observedRunningTime="2025-05-14 01:05:17.981990021 +0000 UTC m=+49.481561515" watchObservedRunningTime="2025-05-14 01:05:22.003611977 +0000 UTC m=+53.503183451" May 14 01:05:22.064088 containerd[1486]: time="2025-05-14T01:05:22.063812314Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"3b85a7bf0c7478c3e39e6eaea5b2974593e9893f08440847781ef104c55cee61\" pid:4572 exited_at:{seconds:1747184722 nanos:63348151}" May 14 01:05:22.086692 kubelet[2710]: I0514 01:05:22.086611 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-76985557f9-m9brf" podStartSLOduration=29.766383591 podStartE2EDuration="41.086591345s" podCreationTimestamp="2025-05-14 01:04:41 +0000 UTC" firstStartedPulling="2025-05-14 01:05:10.062766017 +0000 UTC m=+41.562337491" lastFinishedPulling="2025-05-14 01:05:21.382973771 +0000 UTC m=+52.882545245" observedRunningTime="2025-05-14 01:05:22.006370145 +0000 UTC m=+53.505941639" watchObservedRunningTime="2025-05-14 01:05:22.086591345 +0000 UTC m=+53.586162819" May 14 01:05:22.698906 kubelet[2710]: I0514 01:05:22.697869 2710 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 14 01:05:23.721093 containerd[1486]: 
time="2025-05-14T01:05:23.720897356Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:23.722502 containerd[1486]: time="2025-05-14T01:05:23.722147698Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 14 01:05:23.724090 containerd[1486]: time="2025-05-14T01:05:23.723572908Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:23.726991 containerd[1486]: time="2025-05-14T01:05:23.726952406Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 14 01:05:23.728293 containerd[1486]: time="2025-05-14T01:05:23.728248934Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 2.338169855s" May 14 01:05:23.728405 containerd[1486]: time="2025-05-14T01:05:23.728291745Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 14 01:05:23.730840 containerd[1486]: time="2025-05-14T01:05:23.729943312Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 14 01:05:23.732906 containerd[1486]: time="2025-05-14T01:05:23.732657127Z" level=info msg="CreateContainer within sandbox \"b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 14 
01:05:23.769635 containerd[1486]: time="2025-05-14T01:05:23.769580430Z" level=info msg="Container 2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115: CDI devices from CRI Config.CDIDevices: []" May 14 01:05:23.778610 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1807770601.mount: Deactivated successfully. May 14 01:05:23.792685 containerd[1486]: time="2025-05-14T01:05:23.792634304Z" level=info msg="CreateContainer within sandbox \"b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115\"" May 14 01:05:23.794376 containerd[1486]: time="2025-05-14T01:05:23.793567780Z" level=info msg="StartContainer for \"2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115\"" May 14 01:05:23.796866 containerd[1486]: time="2025-05-14T01:05:23.796824316Z" level=info msg="connecting to shim 2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115" address="unix:///run/containerd/s/e8e61c7e54ff784fe8253e32abac13e5aca4a4fa6a72e4c0fd4eb8298c308c91" protocol=ttrpc version=3 May 14 01:05:23.828232 systemd[1]: Started cri-containerd-2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115.scope - libcontainer container 2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115. 
May 14 01:05:23.903159 containerd[1486]: time="2025-05-14T01:05:23.903117155Z" level=info msg="StartContainer for \"2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115\" returns successfully"
May 14 01:05:24.260096 containerd[1486]: time="2025-05-14T01:05:24.258186058Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 01:05:24.260096 containerd[1486]: time="2025-05-14T01:05:24.259586612Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77"
May 14 01:05:24.267488 containerd[1486]: time="2025-05-14T01:05:24.267401111Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 537.387076ms"
May 14 01:05:24.267899 containerd[1486]: time="2025-05-14T01:05:24.267853381Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\""
May 14 01:05:24.271846 containerd[1486]: time="2025-05-14T01:05:24.271752434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\""
May 14 01:05:24.273967 containerd[1486]: time="2025-05-14T01:05:24.273621329Z" level=info msg="CreateContainer within sandbox \"21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 14 01:05:24.298525 containerd[1486]: time="2025-05-14T01:05:24.298007257Z" level=info msg="Container b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915: CDI devices from CRI Config.CDIDevices: []"
May 14 01:05:24.339128 containerd[1486]: time="2025-05-14T01:05:24.338989248Z" level=info msg="CreateContainer within sandbox \"21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915\""
May 14 01:05:24.342322 containerd[1486]: time="2025-05-14T01:05:24.342242768Z" level=info msg="StartContainer for \"b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915\""
May 14 01:05:24.347353 containerd[1486]: time="2025-05-14T01:05:24.347106646Z" level=info msg="connecting to shim b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915" address="unix:///run/containerd/s/730934082fc9643d3bf70996e8d3a5540aea197ea82376f1a1362bd6ef377760" protocol=ttrpc version=3
May 14 01:05:24.401394 systemd[1]: Started cri-containerd-b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915.scope - libcontainer container b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915.
May 14 01:05:24.483538 containerd[1486]: time="2025-05-14T01:05:24.483490928Z" level=info msg="StartContainer for \"b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915\" returns successfully"
May 14 01:05:25.031533 kubelet[2710]: I0514 01:05:25.029223 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-7bc7fb4ff7-qm8vw" podStartSLOduration=32.261697534 podStartE2EDuration="44.029131794s" podCreationTimestamp="2025-05-14 01:04:41 +0000 UTC" firstStartedPulling="2025-05-14 01:05:12.502113549 +0000 UTC m=+44.001685033" lastFinishedPulling="2025-05-14 01:05:24.269547789 +0000 UTC m=+55.769119293" observedRunningTime="2025-05-14 01:05:25.023236456 +0000 UTC m=+56.522807990" watchObservedRunningTime="2025-05-14 01:05:25.029131794 +0000 UTC m=+56.528703328"
May 14 01:05:27.693829 containerd[1486]: time="2025-05-14T01:05:27.693697893Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 01:05:27.696503 containerd[1486]: time="2025-05-14T01:05:27.696435010Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773"
May 14 01:05:27.698282 containerd[1486]: time="2025-05-14T01:05:27.698001577Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 01:05:27.702506 containerd[1486]: time="2025-05-14T01:05:27.702289560Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 14 01:05:27.703403 containerd[1486]: time="2025-05-14T01:05:27.703352979Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 3.430863629s"
May 14 01:05:27.703476 containerd[1486]: time="2025-05-14T01:05:27.703430995Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\""
May 14 01:05:27.707752 containerd[1486]: time="2025-05-14T01:05:27.707697339Z" level=info msg="CreateContainer within sandbox \"b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 14 01:05:27.721877 containerd[1486]: time="2025-05-14T01:05:27.718809975Z" level=info msg="Container 579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4: CDI devices from CRI Config.CDIDevices: []"
May 14 01:05:27.738725 containerd[1486]: time="2025-05-14T01:05:27.738683376Z" level=info msg="CreateContainer within sandbox \"b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4\""
May 14 01:05:27.741127 containerd[1486]: time="2025-05-14T01:05:27.740210608Z" level=info msg="StartContainer for \"579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4\""
May 14 01:05:27.743716 containerd[1486]: time="2025-05-14T01:05:27.743670554Z" level=info msg="connecting to shim 579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4" address="unix:///run/containerd/s/e8e61c7e54ff784fe8253e32abac13e5aca4a4fa6a72e4c0fd4eb8298c308c91" protocol=ttrpc version=3
May 14 01:05:27.778198 systemd[1]: Started cri-containerd-579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4.scope - libcontainer container 579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4.
May 14 01:05:27.843392 containerd[1486]: time="2025-05-14T01:05:27.843341852Z" level=info msg="StartContainer for \"579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4\" returns successfully"
May 14 01:05:28.057580 kubelet[2710]: I0514 01:05:28.057446 2710 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-b8vbb" podStartSLOduration=30.885408895 podStartE2EDuration="47.05741597s" podCreationTimestamp="2025-05-14 01:04:41 +0000 UTC" firstStartedPulling="2025-05-14 01:05:11.532758861 +0000 UTC m=+43.032330335" lastFinishedPulling="2025-05-14 01:05:27.704765926 +0000 UTC m=+59.204337410" observedRunningTime="2025-05-14 01:05:28.054660379 +0000 UTC m=+59.554231883" watchObservedRunningTime="2025-05-14 01:05:28.05741597 +0000 UTC m=+59.556987474"
May 14 01:05:28.736822 kubelet[2710]: I0514 01:05:28.736734 2710 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
May 14 01:05:28.737123 kubelet[2710]: I0514 01:05:28.737067 2710 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
May 14 01:05:35.969864 containerd[1486]: time="2025-05-14T01:05:35.969391955Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"e71faffe0e721b7b680b85739385ada8290e6076d8bb281e605e6ad9b70a490d\" pid:4718 exited_at:{seconds:1747184735 nanos:957321198}"
May 14 01:05:37.985339 containerd[1486]: time="2025-05-14T01:05:37.985239007Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"e4469fb39029bba94fb811ecd82ec4d2e60dd842db88b7436b3d08403b88c28b\" pid:4739 exited_at:{seconds:1747184737 nanos:984485913}"
May 14 01:05:38.109756 containerd[1486]: time="2025-05-14T01:05:38.109698456Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"5516ff387cc89a83808a2db2b2f207934eb654afe01cf36de76f6eb9b3e1926e\" pid:4762 exited_at:{seconds:1747184738 nanos:109355160}"
May 14 01:05:52.098119 containerd[1486]: time="2025-05-14T01:05:52.097822547Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"3e2f0da0ce126f2199bfb078c989b4a17accf8ffa13c29710721e228cc9104eb\" pid:4792 exited_at:{seconds:1747184752 nanos:96936762}"
May 14 01:06:08.191982 containerd[1486]: time="2025-05-14T01:06:08.191879598Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"b96b4a46621c7a402da4294288b1738a64d686faa63f62ce3b39652e1d6d5b37\" pid:4818 exited_at:{seconds:1747184768 nanos:189596753}"
May 14 01:06:22.081260 containerd[1486]: time="2025-05-14T01:06:22.081167363Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"80679ee24ce4f81a864273dfd7f17fef3f30c3efdc2b7298f166238e67f85529\" pid:4843 exited_at:{seconds:1747184782 nanos:79251307}"
May 14 01:06:35.954345 containerd[1486]: time="2025-05-14T01:06:35.953500429Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"8135c3d22f2f40b0ad4d09d17655f01dd5a94953badc67e209d584a46396eac8\" pid:4878 exited_at:{seconds:1747184795 nanos:952898218}"
May 14 01:06:38.099221 containerd[1486]: time="2025-05-14T01:06:38.099164753Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"0cd513c351f9a6208176c73c6684ac3cf548b232ad138996e48310b76a387952\" pid:4898 exited_at:{seconds:1747184798 nanos:98217475}"
May 14 01:06:52.072925 containerd[1486]: time="2025-05-14T01:06:52.072863268Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"9d0cf7b303199a29c66336d8d894b3fc463d95d3cdf074e01bc1a36b32ac1ba8\" pid:4938 exited_at:{seconds:1747184812 nanos:72326942}"
May 14 01:07:08.134097 containerd[1486]: time="2025-05-14T01:07:08.134003972Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"572f2317b995fd0609445adfe58b3abfd9fa8d4e12bcd1285129ad26bc71d759\" pid:4963 exited_at:{seconds:1747184828 nanos:133422461}"
May 14 01:07:22.060407 containerd[1486]: time="2025-05-14T01:07:22.059872979Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"f604617cd355e1398fa6d03aeb1d5d14629a629fc560bd584fce811dd5fbb8f5\" pid:4989 exited_at:{seconds:1747184842 nanos:59173787}"
May 14 01:07:35.925264 containerd[1486]: time="2025-05-14T01:07:35.924694054Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"4b0f394907a21a43dd936b93562df2c6bb8fce7cec3784dae94fd2b622df07d9\" pid:5014 exited_at:{seconds:1747184855 nanos:921911243}"
May 14 01:07:38.116195 containerd[1486]: time="2025-05-14T01:07:38.115971424Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"39ac2af38a9b82bedf767b970b303b7c892379b0b32ea39c81ceb7ad8f7d0bce\" pid:5035 exited_at:{seconds:1747184858 nanos:115229883}"
May 14 01:07:52.088368 containerd[1486]: time="2025-05-14T01:07:52.088289730Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"14acb9f81741ec4df209435d8e5655a93b921e179a59e026663be486f8ef9960\" pid:5065 exited_at:{seconds:1747184872 nanos:87736482}"
May 14 01:08:08.138408 containerd[1486]: time="2025-05-14T01:08:08.137806481Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"34ef89374a77e47ace8b9d495d6b4e3d092e443552705546064faf6e34c30da9\" pid:5090 exited_at:{seconds:1747184888 nanos:136828326}"
May 14 01:08:20.319906 systemd[1]: Started sshd@9-172.24.4.64:22-172.24.4.1:56454.service - OpenSSH per-connection server daemon (172.24.4.1:56454).
May 14 01:08:21.816806 sshd[5111]: Accepted publickey for core from 172.24.4.1 port 56454 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:08:21.825405 sshd-session[5111]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:08:21.852112 systemd-logind[1459]: New session 12 of user core.
May 14 01:08:21.862438 systemd[1]: Started session-12.scope - Session 12 of User core.
May 14 01:08:22.063124 containerd[1486]: time="2025-05-14T01:08:22.062128104Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"ea31bdd5db86df83c8a051208a67769c8384043515bad55708ce3b1e78fdd538\" pid:5127 exited_at:{seconds:1747184902 nanos:61321321}"
May 14 01:08:22.671120 sshd[5113]: Connection closed by 172.24.4.1 port 56454
May 14 01:08:22.673899 sshd-session[5111]: pam_unix(sshd:session): session closed for user core
May 14 01:08:22.695370 systemd[1]: sshd@9-172.24.4.64:22-172.24.4.1:56454.service: Deactivated successfully.
May 14 01:08:22.705506 systemd[1]: session-12.scope: Deactivated successfully.
May 14 01:08:22.710687 systemd-logind[1459]: Session 12 logged out. Waiting for processes to exit.
May 14 01:08:22.714682 systemd-logind[1459]: Removed session 12.
May 14 01:08:27.698937 systemd[1]: Started sshd@10-172.24.4.64:22-172.24.4.1:44078.service - OpenSSH per-connection server daemon (172.24.4.1:44078).
May 14 01:08:28.958211 sshd[5163]: Accepted publickey for core from 172.24.4.1 port 44078 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:08:28.962657 sshd-session[5163]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:08:28.980547 systemd-logind[1459]: New session 13 of user core.
May 14 01:08:28.988564 systemd[1]: Started session-13.scope - Session 13 of User core.
May 14 01:08:29.792657 sshd[5170]: Connection closed by 172.24.4.1 port 44078
May 14 01:08:29.791336 sshd-session[5163]: pam_unix(sshd:session): session closed for user core
May 14 01:08:29.798446 systemd[1]: sshd@10-172.24.4.64:22-172.24.4.1:44078.service: Deactivated successfully.
May 14 01:08:29.798503 systemd-logind[1459]: Session 13 logged out. Waiting for processes to exit.
May 14 01:08:29.801666 systemd[1]: session-13.scope: Deactivated successfully.
May 14 01:08:29.804699 systemd-logind[1459]: Removed session 13.
May 14 01:08:34.820579 systemd[1]: Started sshd@11-172.24.4.64:22-172.24.4.1:52322.service - OpenSSH per-connection server daemon (172.24.4.1:52322).
May 14 01:08:35.946620 containerd[1486]: time="2025-05-14T01:08:35.946499217Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"6254125bf456124e354c34654cdba069f171a24dd20eca3f270b12adc863f989\" pid:5200 exited_at:{seconds:1747184915 nanos:945933636}"
May 14 01:08:36.015669 sshd[5183]: Accepted publickey for core from 172.24.4.1 port 52322 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:08:36.016704 sshd-session[5183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:08:36.031479 systemd-logind[1459]: New session 14 of user core.
May 14 01:08:36.041400 systemd[1]: Started session-14.scope - Session 14 of User core.
May 14 01:08:36.854076 sshd[5209]: Connection closed by 172.24.4.1 port 52322
May 14 01:08:36.855644 sshd-session[5183]: pam_unix(sshd:session): session closed for user core
May 14 01:08:36.863875 systemd[1]: sshd@11-172.24.4.64:22-172.24.4.1:52322.service: Deactivated successfully.
May 14 01:08:36.869419 systemd[1]: session-14.scope: Deactivated successfully.
May 14 01:08:36.874949 systemd-logind[1459]: Session 14 logged out. Waiting for processes to exit.
May 14 01:08:36.877756 systemd-logind[1459]: Removed session 14.
May 14 01:08:38.146478 containerd[1486]: time="2025-05-14T01:08:38.146263697Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"6c41da0e90c41b9d04d7c756f2e3e519d5925efdfc42ed2e89e85c54cd5fa17f\" pid:5234 exited_at:{seconds:1747184918 nanos:145650908}"
May 14 01:08:41.872081 systemd[1]: Started sshd@12-172.24.4.64:22-172.24.4.1:52332.service - OpenSSH per-connection server daemon (172.24.4.1:52332).
May 14 01:08:43.114074 sshd[5248]: Accepted publickey for core from 172.24.4.1 port 52332 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:08:43.117566 sshd-session[5248]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:08:43.132147 systemd-logind[1459]: New session 15 of user core.
May 14 01:08:43.140444 systemd[1]: Started session-15.scope - Session 15 of User core.
May 14 01:08:43.907499 sshd[5250]: Connection closed by 172.24.4.1 port 52332
May 14 01:08:43.908643 sshd-session[5248]: pam_unix(sshd:session): session closed for user core
May 14 01:08:43.924664 systemd[1]: sshd@12-172.24.4.64:22-172.24.4.1:52332.service: Deactivated successfully.
May 14 01:08:43.931605 systemd[1]: session-15.scope: Deactivated successfully.
May 14 01:08:43.938071 systemd-logind[1459]: Session 15 logged out. Waiting for processes to exit.
May 14 01:08:43.944624 systemd[1]: Started sshd@13-172.24.4.64:22-172.24.4.1:52488.service - OpenSSH per-connection server daemon (172.24.4.1:52488).
May 14 01:08:43.951522 systemd-logind[1459]: Removed session 15.
May 14 01:08:45.538112 sshd[5263]: Accepted publickey for core from 172.24.4.1 port 52488 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:08:45.552124 sshd-session[5263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:08:45.566614 systemd-logind[1459]: New session 16 of user core.
May 14 01:08:45.579437 systemd[1]: Started session-16.scope - Session 16 of User core.
May 14 01:08:46.686793 sshd[5266]: Connection closed by 172.24.4.1 port 52488
May 14 01:08:46.688672 sshd-session[5263]: pam_unix(sshd:session): session closed for user core
May 14 01:08:46.705716 systemd[1]: sshd@13-172.24.4.64:22-172.24.4.1:52488.service: Deactivated successfully.
May 14 01:08:46.709948 systemd[1]: session-16.scope: Deactivated successfully.
May 14 01:08:46.712482 systemd-logind[1459]: Session 16 logged out. Waiting for processes to exit.
May 14 01:08:46.717397 systemd[1]: Started sshd@14-172.24.4.64:22-172.24.4.1:52500.service - OpenSSH per-connection server daemon (172.24.4.1:52500).
May 14 01:08:46.721990 systemd-logind[1459]: Removed session 16.
May 14 01:08:47.695530 sshd[5275]: Accepted publickey for core from 172.24.4.1 port 52500 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:08:47.699304 sshd-session[5275]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:08:47.712728 systemd-logind[1459]: New session 17 of user core.
May 14 01:08:47.718355 systemd[1]: Started session-17.scope - Session 17 of User core.
May 14 01:08:48.577821 sshd[5278]: Connection closed by 172.24.4.1 port 52500
May 14 01:08:48.577602 sshd-session[5275]: pam_unix(sshd:session): session closed for user core
May 14 01:08:48.613556 systemd[1]: sshd@14-172.24.4.64:22-172.24.4.1:52500.service: Deactivated successfully.
May 14 01:08:48.619314 systemd[1]: session-17.scope: Deactivated successfully.
May 14 01:08:48.622663 systemd-logind[1459]: Session 17 logged out. Waiting for processes to exit.
May 14 01:08:48.626155 systemd-logind[1459]: Removed session 17.
May 14 01:08:52.034657 containerd[1486]: time="2025-05-14T01:08:52.034564045Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"ef9ba4638e82c03c9ac1913ec40faaaef4f4c1c9f16ecfe431b60f586fd593fc\" pid:5304 exited_at:{seconds:1747184932 nanos:33972634}"
May 14 01:08:53.598696 systemd[1]: Started sshd@15-172.24.4.64:22-172.24.4.1:40322.service - OpenSSH per-connection server daemon (172.24.4.1:40322).
May 14 01:08:54.945758 sshd[5314]: Accepted publickey for core from 172.24.4.1 port 40322 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:08:54.949748 sshd-session[5314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:08:54.963665 systemd-logind[1459]: New session 18 of user core.
May 14 01:08:54.970414 systemd[1]: Started session-18.scope - Session 18 of User core.
May 14 01:08:55.727645 sshd[5316]: Connection closed by 172.24.4.1 port 40322
May 14 01:08:55.728575 sshd-session[5314]: pam_unix(sshd:session): session closed for user core
May 14 01:08:55.739774 systemd[1]: sshd@15-172.24.4.64:22-172.24.4.1:40322.service: Deactivated successfully.
May 14 01:08:55.744856 systemd[1]: session-18.scope: Deactivated successfully.
May 14 01:08:55.747674 systemd-logind[1459]: Session 18 logged out. Waiting for processes to exit.
May 14 01:08:55.750746 systemd-logind[1459]: Removed session 18.
May 14 01:09:00.745395 systemd[1]: Started sshd@16-172.24.4.64:22-172.24.4.1:40326.service - OpenSSH per-connection server daemon (172.24.4.1:40326).
May 14 01:09:02.082467 sshd[5328]: Accepted publickey for core from 172.24.4.1 port 40326 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:09:02.085396 sshd-session[5328]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:09:02.099170 systemd-logind[1459]: New session 19 of user core.
May 14 01:09:02.108802 systemd[1]: Started session-19.scope - Session 19 of User core.
May 14 01:09:02.959096 sshd[5330]: Connection closed by 172.24.4.1 port 40326
May 14 01:09:02.960791 sshd-session[5328]: pam_unix(sshd:session): session closed for user core
May 14 01:09:02.967595 systemd-logind[1459]: Session 19 logged out. Waiting for processes to exit.
May 14 01:09:02.969627 systemd[1]: sshd@16-172.24.4.64:22-172.24.4.1:40326.service: Deactivated successfully.
May 14 01:09:02.976130 systemd[1]: session-19.scope: Deactivated successfully.
May 14 01:09:02.984012 systemd-logind[1459]: Removed session 19.
May 14 01:09:07.981365 systemd[1]: Started sshd@17-172.24.4.64:22-172.24.4.1:36228.service - OpenSSH per-connection server daemon (172.24.4.1:36228).
May 14 01:09:08.101860 containerd[1486]: time="2025-05-14T01:09:08.101497732Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"8fd905ede384f29f42f33ac8cdfab2aa5c2e2d14c9f207480dd98549fbc98bf2\" pid:5357 exited_at:{seconds:1747184948 nanos:100628180}"
May 14 01:09:09.253244 sshd[5344]: Accepted publickey for core from 172.24.4.1 port 36228 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:09:09.257956 sshd-session[5344]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:09:09.272153 systemd-logind[1459]: New session 20 of user core.
May 14 01:09:09.277388 systemd[1]: Started session-20.scope - Session 20 of User core.
May 14 01:09:10.026088 sshd[5370]: Connection closed by 172.24.4.1 port 36228
May 14 01:09:10.027549 sshd-session[5344]: pam_unix(sshd:session): session closed for user core
May 14 01:09:10.036744 systemd[1]: sshd@17-172.24.4.64:22-172.24.4.1:36228.service: Deactivated successfully.
May 14 01:09:10.044730 systemd[1]: session-20.scope: Deactivated successfully.
May 14 01:09:10.049597 systemd-logind[1459]: Session 20 logged out. Waiting for processes to exit.
May 14 01:09:10.052791 systemd-logind[1459]: Removed session 20.
May 14 01:09:15.043621 systemd[1]: Started sshd@18-172.24.4.64:22-172.24.4.1:48292.service - OpenSSH per-connection server daemon (172.24.4.1:48292).
May 14 01:09:16.105958 sshd[5383]: Accepted publickey for core from 172.24.4.1 port 48292 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:09:16.109962 sshd-session[5383]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:09:16.126249 systemd-logind[1459]: New session 21 of user core.
May 14 01:09:16.135459 systemd[1]: Started session-21.scope - Session 21 of User core.
May 14 01:09:16.950379 sshd[5385]: Connection closed by 172.24.4.1 port 48292
May 14 01:09:16.952133 sshd-session[5383]: pam_unix(sshd:session): session closed for user core
May 14 01:09:16.960904 systemd[1]: sshd@18-172.24.4.64:22-172.24.4.1:48292.service: Deactivated successfully.
May 14 01:09:16.966241 systemd[1]: session-21.scope: Deactivated successfully.
May 14 01:09:16.968952 systemd-logind[1459]: Session 21 logged out. Waiting for processes to exit.
May 14 01:09:16.972172 systemd-logind[1459]: Removed session 21.
May 14 01:09:21.986838 systemd[1]: Started sshd@19-172.24.4.64:22-172.24.4.1:48302.service - OpenSSH per-connection server daemon (172.24.4.1:48302).
May 14 01:09:22.080448 containerd[1486]: time="2025-05-14T01:09:22.080299861Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"7890ff470cd53d07fa203b2d08193511f04cf8a47e0bdc7ad8798dec62308806\" pid:5410 exited_at:{seconds:1747184962 nanos:79912895}"
May 14 01:09:23.070744 containerd[1486]: time="2025-05-14T01:09:23.070375847Z" level=warning msg="container event discarded" container=3f6d05eb161aae6f2efdf5c6ac84395b7d174e52d71ef22b2a8fb29789f9e5cb type=CONTAINER_CREATED_EVENT
May 14 01:09:23.070744 containerd[1486]: time="2025-05-14T01:09:23.070667806Z" level=warning msg="container event discarded" container=3f6d05eb161aae6f2efdf5c6ac84395b7d174e52d71ef22b2a8fb29789f9e5cb type=CONTAINER_STARTED_EVENT
May 14 01:09:23.115181 containerd[1486]: time="2025-05-14T01:09:23.115068073Z" level=warning msg="container event discarded" container=c4bcc2ad08da6615f4a910e6e1a55e5b630d8fd6ed385c55dbe2610ea30ccb84 type=CONTAINER_CREATED_EVENT
May 14 01:09:23.115181 containerd[1486]: time="2025-05-14T01:09:23.115185944Z" level=warning msg="container event discarded" container=c4bcc2ad08da6615f4a910e6e1a55e5b630d8fd6ed385c55dbe2610ea30ccb84 type=CONTAINER_STARTED_EVENT
May 14 01:09:23.128502 containerd[1486]: time="2025-05-14T01:09:23.128377943Z" level=warning msg="container event discarded" container=4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825 type=CONTAINER_CREATED_EVENT
May 14 01:09:23.141853 containerd[1486]: time="2025-05-14T01:09:23.141693194Z" level=warning msg="container event discarded" container=9bae8999dedc873272a2055dd9b5ecedab6905a24ea807480817b9f883a53c86 type=CONTAINER_CREATED_EVENT
May 14 01:09:23.141853 containerd[1486]: time="2025-05-14T01:09:23.141762864Z" level=warning msg="container event discarded" container=9bae8999dedc873272a2055dd9b5ecedab6905a24ea807480817b9f883a53c86 type=CONTAINER_STARTED_EVENT
May 14 01:09:23.141853 containerd[1486]: time="2025-05-14T01:09:23.141790426Z" level=warning msg="container event discarded" container=be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369 type=CONTAINER_CREATED_EVENT
May 14 01:09:23.181203 containerd[1486]: time="2025-05-14T01:09:23.181083362Z" level=warning msg="container event discarded" container=285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e type=CONTAINER_CREATED_EVENT
May 14 01:09:23.256610 containerd[1486]: time="2025-05-14T01:09:23.256468829Z" level=warning msg="container event discarded" container=4e62a3962755be74043e90f6e9ebb9ae421e31cd1429e35c44c520d121847825 type=CONTAINER_STARTED_EVENT
May 14 01:09:23.276214 containerd[1486]: time="2025-05-14T01:09:23.275916824Z" level=warning msg="container event discarded" container=be49a500fc4cd97f594a37dff2cd5bbd9a0895555bbd603ec35f8b347c265369 type=CONTAINER_STARTED_EVENT
May 14 01:09:23.286708 sshd[5402]: Accepted publickey for core from 172.24.4.1 port 48302 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:09:23.289859 sshd-session[5402]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:09:23.302979 systemd-logind[1459]: New session 22 of user core.
May 14 01:09:23.316444 systemd[1]: Started session-22.scope - Session 22 of User core.
May 14 01:09:23.317604 containerd[1486]: time="2025-05-14T01:09:23.306400755Z" level=warning msg="container event discarded" container=285963784a4c66bc7147f4b86d54541c2d8a932c4743f5dd596ff882663f167e type=CONTAINER_STARTED_EVENT
May 14 01:09:24.198914 sshd[5419]: Connection closed by 172.24.4.1 port 48302
May 14 01:09:24.202287 sshd-session[5402]: pam_unix(sshd:session): session closed for user core
May 14 01:09:24.211141 systemd[1]: sshd@19-172.24.4.64:22-172.24.4.1:48302.service: Deactivated successfully.
May 14 01:09:24.218444 systemd[1]: session-22.scope: Deactivated successfully.
May 14 01:09:24.221217 systemd-logind[1459]: Session 22 logged out. Waiting for processes to exit.
May 14 01:09:24.224668 systemd-logind[1459]: Removed session 22.
May 14 01:09:29.222730 systemd[1]: Started sshd@20-172.24.4.64:22-172.24.4.1:34582.service - OpenSSH per-connection server daemon (172.24.4.1:34582).
May 14 01:09:30.771312 sshd[5433]: Accepted publickey for core from 172.24.4.1 port 34582 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:09:30.773314 sshd-session[5433]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:09:30.789984 systemd-logind[1459]: New session 23 of user core.
May 14 01:09:30.799479 systemd[1]: Started session-23.scope - Session 23 of User core.
May 14 01:09:31.516104 sshd[5435]: Connection closed by 172.24.4.1 port 34582
May 14 01:09:31.515572 sshd-session[5433]: pam_unix(sshd:session): session closed for user core
May 14 01:09:31.522701 systemd[1]: sshd@20-172.24.4.64:22-172.24.4.1:34582.service: Deactivated successfully.
May 14 01:09:31.529548 systemd[1]: session-23.scope: Deactivated successfully.
May 14 01:09:31.534144 systemd-logind[1459]: Session 23 logged out. Waiting for processes to exit.
May 14 01:09:31.536688 systemd-logind[1459]: Removed session 23.
May 14 01:09:35.581536 containerd[1486]: time="2025-05-14T01:09:35.581249344Z" level=warning msg="container event discarded" container=14e59e108bebd1328d6a70c1db23da2935acce37188be5ec4ee813f234a6d6bd type=CONTAINER_CREATED_EVENT
May 14 01:09:35.581536 containerd[1486]: time="2025-05-14T01:09:35.581464688Z" level=warning msg="container event discarded" container=14e59e108bebd1328d6a70c1db23da2935acce37188be5ec4ee813f234a6d6bd type=CONTAINER_STARTED_EVENT
May 14 01:09:35.615066 containerd[1486]: time="2025-05-14T01:09:35.614835183Z" level=warning msg="container event discarded" container=12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58 type=CONTAINER_CREATED_EVENT
May 14 01:09:35.689281 containerd[1486]: time="2025-05-14T01:09:35.688182608Z" level=warning msg="container event discarded" container=12d5b66841d1fb8fe63f151a7a31536a231b9cd944587cb05777069719e6ea58 type=CONTAINER_STARTED_EVENT
May 14 01:09:35.781650 containerd[1486]: time="2025-05-14T01:09:35.781504688Z" level=warning msg="container event discarded" container=23918fde1468a24d72bf4f5e0000188dfdc233ca55c60eb25cbd95be6a72430d type=CONTAINER_CREATED_EVENT
May 14 01:09:35.781650 containerd[1486]: time="2025-05-14T01:09:35.781572194Z" level=warning msg="container event discarded" container=23918fde1468a24d72bf4f5e0000188dfdc233ca55c60eb25cbd95be6a72430d type=CONTAINER_STARTED_EVENT
May 14 01:09:35.916754 containerd[1486]: time="2025-05-14T01:09:35.916348064Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"2f8b28cd5dfe801bb7b058b778c12c3330154003e2d7711b165bd0205d3396e2\" pid:5460 exited_at:{seconds:1747184975 nanos:912520453}"
May 14 01:09:36.542543 systemd[1]: Started sshd@21-172.24.4.64:22-172.24.4.1:60668.service - OpenSSH per-connection server daemon (172.24.4.1:60668).
May 14 01:09:37.832092 sshd[5470]: Accepted publickey for core from 172.24.4.1 port 60668 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:09:37.837511 sshd-session[5470]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:09:37.852280 systemd-logind[1459]: New session 24 of user core.
May 14 01:09:37.860167 systemd[1]: Started session-24.scope - Session 24 of User core.
May 14 01:09:37.999112 containerd[1486]: time="2025-05-14T01:09:37.998519882Z" level=warning msg="container event discarded" container=25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c type=CONTAINER_CREATED_EVENT
May 14 01:09:38.072997 containerd[1486]: time="2025-05-14T01:09:38.072853868Z" level=warning msg="container event discarded" container=25bd407667dfe36cb03332d03fcc38e84018fbb7b0d7049c006c1b5d3e0c978c type=CONTAINER_STARTED_EVENT
May 14 01:09:38.134933 containerd[1486]: time="2025-05-14T01:09:38.134720735Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"bc0cb82cc628b132eee7db58c3beb998e8cb035e9f0c406fd9f24a7f2958da64\" pid:5486 exited_at:{seconds:1747184978 nanos:129648830}"
May 14 01:09:38.764600 sshd[5472]: Connection closed by 172.24.4.1 port 60668
May 14 01:09:38.765936 sshd-session[5470]: pam_unix(sshd:session): session closed for user core
May 14 01:09:38.780717 systemd[1]: sshd@21-172.24.4.64:22-172.24.4.1:60668.service: Deactivated successfully.
May 14 01:09:38.788341 systemd[1]: session-24.scope: Deactivated successfully.
May 14 01:09:38.791661 systemd-logind[1459]: Session 24 logged out. Waiting for processes to exit.
May 14 01:09:38.795195 systemd-logind[1459]: Removed session 24.
May 14 01:09:41.779161 containerd[1486]: time="2025-05-14T01:09:41.778891130Z" level=warning msg="container event discarded" container=4eb6dd64e59c40c04079f9e39c0a0cebf240485d226df97f828dd09bb7b92eda type=CONTAINER_CREATED_EVENT May 14 01:09:41.779161 containerd[1486]: time="2025-05-14T01:09:41.779016004Z" level=warning msg="container event discarded" container=4eb6dd64e59c40c04079f9e39c0a0cebf240485d226df97f828dd09bb7b92eda type=CONTAINER_STARTED_EVENT May 14 01:09:41.870579 containerd[1486]: time="2025-05-14T01:09:41.870453628Z" level=warning msg="container event discarded" container=0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7 type=CONTAINER_CREATED_EVENT May 14 01:09:41.870579 containerd[1486]: time="2025-05-14T01:09:41.870555078Z" level=warning msg="container event discarded" container=0b3cea7e51f96f150314b05c979af0bbdc58578df2f52372223186e1fcf3d1f7 type=CONTAINER_STARTED_EVENT May 14 01:09:43.790119 systemd[1]: Started sshd@22-172.24.4.64:22-172.24.4.1:55450.service - OpenSSH per-connection server daemon (172.24.4.1:55450). May 14 01:09:44.965356 containerd[1486]: time="2025-05-14T01:09:44.965133288Z" level=warning msg="container event discarded" container=089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9 type=CONTAINER_CREATED_EVENT May 14 01:09:45.024119 sshd[5509]: Accepted publickey for core from 172.24.4.1 port 55450 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:09:45.026291 sshd-session[5509]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:09:45.041161 systemd-logind[1459]: New session 25 of user core. May 14 01:09:45.044517 containerd[1486]: time="2025-05-14T01:09:45.043991994Z" level=warning msg="container event discarded" container=089e1d97ed79d54a06da204a609496762cf86a0ea60f8be28840ff6ac37abfc9 type=CONTAINER_STARTED_EVENT May 14 01:09:45.049404 systemd[1]: Started session-25.scope - Session 25 of User core. 
May 14 01:09:45.779823 sshd[5511]: Connection closed by 172.24.4.1 port 55450 May 14 01:09:45.781235 sshd-session[5509]: pam_unix(sshd:session): session closed for user core May 14 01:09:45.790422 systemd[1]: sshd@22-172.24.4.64:22-172.24.4.1:55450.service: Deactivated successfully. May 14 01:09:45.795345 systemd[1]: session-25.scope: Deactivated successfully. May 14 01:09:45.798694 systemd-logind[1459]: Session 25 logged out. Waiting for processes to exit. May 14 01:09:45.802942 systemd-logind[1459]: Removed session 25. May 14 01:09:46.994595 containerd[1486]: time="2025-05-14T01:09:46.994474276Z" level=warning msg="container event discarded" container=e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106 type=CONTAINER_CREATED_EVENT May 14 01:09:47.070166 containerd[1486]: time="2025-05-14T01:09:47.069960976Z" level=warning msg="container event discarded" container=e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106 type=CONTAINER_STARTED_EVENT May 14 01:09:47.796586 containerd[1486]: time="2025-05-14T01:09:47.796458522Z" level=warning msg="container event discarded" container=e8d1f90a9fd4c0a5f62034c25d3d35e074213525456ed657dab4f60826b2b106 type=CONTAINER_STOPPED_EVENT May 14 01:09:50.804570 systemd[1]: Started sshd@23-172.24.4.64:22-172.24.4.1:55458.service - OpenSSH per-connection server daemon (172.24.4.1:55458). May 14 01:09:51.973051 sshd[5523]: Accepted publickey for core from 172.24.4.1 port 55458 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:09:51.975812 sshd-session[5523]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:09:51.987196 systemd-logind[1459]: New session 26 of user core. May 14 01:09:51.997145 systemd[1]: Started session-26.scope - Session 26 of User core. 
May 14 01:09:52.045019 containerd[1486]: time="2025-05-14T01:09:52.044963035Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"d5801b55a5609015d3aba2bc9a7bfb232f5439fe4deef966f718a787648c220e\" pid:5536 exited_at:{seconds:1747184992 nanos:44642522}" May 14 01:09:52.581516 sshd[5542]: Connection closed by 172.24.4.1 port 55458 May 14 01:09:52.583548 sshd-session[5523]: pam_unix(sshd:session): session closed for user core May 14 01:09:52.592692 systemd[1]: sshd@23-172.24.4.64:22-172.24.4.1:55458.service: Deactivated successfully. May 14 01:09:52.598841 systemd[1]: session-26.scope: Deactivated successfully. May 14 01:09:52.601247 systemd-logind[1459]: Session 26 logged out. Waiting for processes to exit. May 14 01:09:52.603962 systemd-logind[1459]: Removed session 26. May 14 01:09:54.800097 containerd[1486]: time="2025-05-14T01:09:54.799942471Z" level=warning msg="container event discarded" container=f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10 type=CONTAINER_CREATED_EVENT May 14 01:09:54.873378 containerd[1486]: time="2025-05-14T01:09:54.873257738Z" level=warning msg="container event discarded" container=f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10 type=CONTAINER_STARTED_EVENT May 14 01:09:56.929743 containerd[1486]: time="2025-05-14T01:09:56.929597362Z" level=warning msg="container event discarded" container=f7bd2c705183ef2ec877328d7bb7641a5d4037f379107f8fcedf9ecc8f22ee10 type=CONTAINER_STOPPED_EVENT May 14 01:09:57.603924 systemd[1]: Started sshd@24-172.24.4.64:22-172.24.4.1:57824.service - OpenSSH per-connection server daemon (172.24.4.1:57824). 
May 14 01:09:58.736931 sshd[5562]: Accepted publickey for core from 172.24.4.1 port 57824 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:09:58.741387 sshd-session[5562]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:09:58.754470 systemd-logind[1459]: New session 27 of user core. May 14 01:09:58.762444 systemd[1]: Started session-27.scope - Session 27 of User core. May 14 01:09:59.500097 sshd[5564]: Connection closed by 172.24.4.1 port 57824 May 14 01:09:59.501567 sshd-session[5562]: pam_unix(sshd:session): session closed for user core May 14 01:09:59.509594 systemd[1]: sshd@24-172.24.4.64:22-172.24.4.1:57824.service: Deactivated successfully. May 14 01:09:59.515489 systemd[1]: session-27.scope: Deactivated successfully. May 14 01:09:59.521281 systemd-logind[1459]: Session 27 logged out. Waiting for processes to exit. May 14 01:09:59.524077 systemd-logind[1459]: Removed session 27. May 14 01:10:04.526705 systemd[1]: Started sshd@25-172.24.4.64:22-172.24.4.1:39650.service - OpenSSH per-connection server daemon (172.24.4.1:39650). May 14 01:10:05.819989 sshd[5587]: Accepted publickey for core from 172.24.4.1 port 39650 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:10:05.828849 sshd-session[5587]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:10:05.842657 systemd-logind[1459]: New session 28 of user core. May 14 01:10:05.854434 systemd[1]: Started session-28.scope - Session 28 of User core. May 14 01:10:06.642823 sshd[5591]: Connection closed by 172.24.4.1 port 39650 May 14 01:10:06.644367 sshd-session[5587]: pam_unix(sshd:session): session closed for user core May 14 01:10:06.652746 systemd[1]: sshd@25-172.24.4.64:22-172.24.4.1:39650.service: Deactivated successfully. May 14 01:10:06.659358 systemd[1]: session-28.scope: Deactivated successfully. May 14 01:10:06.661995 systemd-logind[1459]: Session 28 logged out. Waiting for processes to exit.
May 14 01:10:06.665366 systemd-logind[1459]: Removed session 28. May 14 01:10:07.043321 containerd[1486]: time="2025-05-14T01:10:07.042905913Z" level=warning msg="container event discarded" container=333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25 type=CONTAINER_CREATED_EVENT May 14 01:10:07.147544 containerd[1486]: time="2025-05-14T01:10:07.147407930Z" level=warning msg="container event discarded" container=333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25 type=CONTAINER_STARTED_EVENT May 14 01:10:08.115063 containerd[1486]: time="2025-05-14T01:10:08.114847037Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"f88ba241d67deb15fee63fe06ff7b3dd58fce0c5c81cb029be01bf84f996a9d3\" pid:5615 exited_at:{seconds:1747185008 nanos:114397714}" May 14 01:10:09.361343 containerd[1486]: time="2025-05-14T01:10:09.361179291Z" level=warning msg="container event discarded" container=4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb type=CONTAINER_CREATED_EVENT May 14 01:10:09.361343 containerd[1486]: time="2025-05-14T01:10:09.361265593Z" level=warning msg="container event discarded" container=4a5e9a384dc5c49ed15c2124aa8da75449e213ec4ec5bba00b362f652cb038bb type=CONTAINER_STARTED_EVENT May 14 01:10:09.408366 containerd[1486]: time="2025-05-14T01:10:09.407938766Z" level=warning msg="container event discarded" container=6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37 type=CONTAINER_CREATED_EVENT May 14 01:10:09.457634 containerd[1486]: time="2025-05-14T01:10:09.457484861Z" level=warning msg="container event discarded" container=9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600 type=CONTAINER_CREATED_EVENT May 14 01:10:09.457634 containerd[1486]: time="2025-05-14T01:10:09.457612200Z" level=warning msg="container event discarded" container=9be397c8d8b19c6c6f3180cc69f7bfaebb18e93b804363d9aca42ba401c74600 type=CONTAINER_STARTED_EVENT
May 14 01:10:09.511113 containerd[1486]: time="2025-05-14T01:10:09.510880738Z" level=warning msg="container event discarded" container=6ca1439f843f6820d18f6add1516b1cfd8b8b054be954aaa4b6abc8599f7ee37 type=CONTAINER_STARTED_EVENT May 14 01:10:10.070279 containerd[1486]: time="2025-05-14T01:10:10.070071865Z" level=warning msg="container event discarded" container=f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322 type=CONTAINER_CREATED_EVENT May 14 01:10:10.070279 containerd[1486]: time="2025-05-14T01:10:10.070167054Z" level=warning msg="container event discarded" container=f3313e3fe39a559b1eb71121c828ad2034b3beef221ce6206a81be52ab9e6322 type=CONTAINER_STARTED_EVENT May 14 01:10:11.539466 containerd[1486]: time="2025-05-14T01:10:11.539307534Z" level=warning msg="container event discarded" container=b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9 type=CONTAINER_CREATED_EVENT May 14 01:10:11.539466 containerd[1486]: time="2025-05-14T01:10:11.539406841Z" level=warning msg="container event discarded" container=b44dc0b9ec272ae077b45c38e31bd9ac2845963c4a223bc92d949062a1c0b5f9 type=CONTAINER_STARTED_EVENT May 14 01:10:11.668256 systemd[1]: Started sshd@26-172.24.4.64:22-172.24.4.1:39654.service - OpenSSH per-connection server daemon (172.24.4.1:39654).
May 14 01:10:12.506887 containerd[1486]: time="2025-05-14T01:10:12.506754849Z" level=warning msg="container event discarded" container=21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5 type=CONTAINER_CREATED_EVENT May 14 01:10:12.506887 containerd[1486]: time="2025-05-14T01:10:12.506841743Z" level=warning msg="container event discarded" container=21055ccaaf46f08fa2ce30ba5619d26b1e156fa34fe866aa6ec464c8a29f3cc5 type=CONTAINER_STARTED_EVENT May 14 01:10:12.506887 containerd[1486]: time="2025-05-14T01:10:12.506863013Z" level=warning msg="container event discarded" container=3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff type=CONTAINER_CREATED_EVENT May 14 01:10:12.506887 containerd[1486]: time="2025-05-14T01:10:12.506882911Z" level=warning msg="container event discarded" container=3dde7b3369ae5f880a0006d762a930448abc9bc3b92c6e534b3de3cbeec5fdff type=CONTAINER_STARTED_EVENT May 14 01:10:12.568444 containerd[1486]: time="2025-05-14T01:10:12.568276302Z" level=warning msg="container event discarded" container=d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296 type=CONTAINER_CREATED_EVENT May 14 01:10:12.687916 containerd[1486]: time="2025-05-14T01:10:12.687718089Z" level=warning msg="container event discarded" container=d7fbc1035570f9ec1fdfd51d98ceaecb047d8b0c1550bef7ab4e4bf88939a296 type=CONTAINER_STARTED_EVENT May 14 01:10:12.820775 sshd[5627]: Accepted publickey for core from 172.24.4.1 port 39654 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:10:12.823875 sshd-session[5627]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:10:12.840343 systemd-logind[1459]: New session 29 of user core. May 14 01:10:12.852421 systemd[1]: Started session-29.scope - Session 29 of User core. 
May 14 01:10:13.834392 sshd[5629]: Connection closed by 172.24.4.1 port 39654 May 14 01:10:13.836255 sshd-session[5627]: pam_unix(sshd:session): session closed for user core May 14 01:10:13.845680 systemd[1]: sshd@26-172.24.4.64:22-172.24.4.1:39654.service: Deactivated successfully. May 14 01:10:13.851022 systemd[1]: session-29.scope: Deactivated successfully. May 14 01:10:13.854553 systemd-logind[1459]: Session 29 logged out. Waiting for processes to exit. May 14 01:10:13.857920 systemd-logind[1459]: Removed session 29. May 14 01:10:17.282532 containerd[1486]: time="2025-05-14T01:10:17.282378377Z" level=warning msg="container event discarded" container=781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9 type=CONTAINER_CREATED_EVENT May 14 01:10:17.395621 containerd[1486]: time="2025-05-14T01:10:17.395449762Z" level=warning msg="container event discarded" container=781da29970c61c2e0b17a27a30e7fad7e1129ab5abd6465d320cd9ee5b87d8d9 type=CONTAINER_STARTED_EVENT May 14 01:10:18.861406 systemd[1]: Started sshd@27-172.24.4.64:22-172.24.4.1:43324.service - OpenSSH per-connection server daemon (172.24.4.1:43324). May 14 01:10:20.013634 sshd[5640]: Accepted publickey for core from 172.24.4.1 port 43324 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:10:20.017440 sshd-session[5640]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:10:20.043080 systemd-logind[1459]: New session 30 of user core. May 14 01:10:20.052829 systemd[1]: Started session-30.scope - Session 30 of User core. May 14 01:10:20.809208 sshd[5643]: Connection closed by 172.24.4.1 port 43324 May 14 01:10:20.810478 sshd-session[5640]: pam_unix(sshd:session): session closed for user core May 14 01:10:20.818795 systemd[1]: sshd@27-172.24.4.64:22-172.24.4.1:43324.service: Deactivated successfully. May 14 01:10:20.825576 systemd[1]: session-30.scope: Deactivated successfully. May 14 01:10:20.828364 systemd-logind[1459]: Session 30 logged out. Waiting for processes to exit.
May 14 01:10:20.831314 systemd-logind[1459]: Removed session 30. May 14 01:10:21.476449 containerd[1486]: time="2025-05-14T01:10:21.476356129Z" level=warning msg="container event discarded" container=fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b type=CONTAINER_CREATED_EVENT May 14 01:10:21.610126 containerd[1486]: time="2025-05-14T01:10:21.610003949Z" level=warning msg="container event discarded" container=fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b type=CONTAINER_STARTED_EVENT May 14 01:10:22.042541 containerd[1486]: time="2025-05-14T01:10:22.042454701Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"c4646074462f93a93911657c7f0bbce3c1c98cb9125bf729bfce9a72219c0e70\" pid:5668 exited_at:{seconds:1747185022 nanos:41820841}" May 14 01:10:23.802414 containerd[1486]: time="2025-05-14T01:10:23.802208221Z" level=warning msg="container event discarded" container=2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115 type=CONTAINER_CREATED_EVENT May 14 01:10:23.911185 containerd[1486]: time="2025-05-14T01:10:23.910983909Z" level=warning msg="container event discarded" container=2a5f7043f2e46d452c7d8667b17facd94252f01728a4b5059dc0bc636c476115 type=CONTAINER_STARTED_EVENT May 14 01:10:24.347216 containerd[1486]: time="2025-05-14T01:10:24.347080642Z" level=warning msg="container event discarded" container=b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915 type=CONTAINER_CREATED_EVENT May 14 01:10:24.491895 containerd[1486]: time="2025-05-14T01:10:24.491758937Z" level=warning msg="container event discarded" container=b152770239c33d349ccc07b7e26a615185a879f88d32bebb7cde9473c8bc3915 type=CONTAINER_STARTED_EVENT May 14 01:10:25.835160 systemd[1]: Started sshd@28-172.24.4.64:22-172.24.4.1:46066.service - OpenSSH per-connection server daemon (172.24.4.1:46066).
May 14 01:10:26.987454 sshd[5677]: Accepted publickey for core from 172.24.4.1 port 46066 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:10:26.990905 sshd-session[5677]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:10:27.003750 systemd-logind[1459]: New session 31 of user core. May 14 01:10:27.014378 systemd[1]: Started session-31.scope - Session 31 of User core. May 14 01:10:27.748181 containerd[1486]: time="2025-05-14T01:10:27.748008251Z" level=warning msg="container event discarded" container=579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4 type=CONTAINER_CREATED_EVENT May 14 01:10:27.829182 sshd[5679]: Connection closed by 172.24.4.1 port 46066 May 14 01:10:27.831401 sshd-session[5677]: pam_unix(sshd:session): session closed for user core May 14 01:10:27.841610 systemd[1]: sshd@28-172.24.4.64:22-172.24.4.1:46066.service: Deactivated successfully. May 14 01:10:27.848674 systemd[1]: session-31.scope: Deactivated successfully. May 14 01:10:27.851726 containerd[1486]: time="2025-05-14T01:10:27.851548707Z" level=warning msg="container event discarded" container=579bab10ab4e4b5259f774f10486efdc4195b890db558789da1fb0cd116d31b4 type=CONTAINER_STARTED_EVENT May 14 01:10:27.857868 systemd-logind[1459]: Session 31 logged out. Waiting for processes to exit. May 14 01:10:27.861640 systemd-logind[1459]: Removed session 31. May 14 01:10:32.850602 systemd[1]: Started sshd@29-172.24.4.64:22-172.24.4.1:46072.service - OpenSSH per-connection server daemon (172.24.4.1:46072). May 14 01:10:34.009735 sshd[5701]: Accepted publickey for core from 172.24.4.1 port 46072 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:10:34.013799 sshd-session[5701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:10:34.027237 systemd-logind[1459]: New session 32 of user core. May 14 01:10:34.034412 systemd[1]: Started session-32.scope - Session 32 of User core. 
May 14 01:10:34.850699 sshd[5703]: Connection closed by 172.24.4.1 port 46072 May 14 01:10:34.851480 sshd-session[5701]: pam_unix(sshd:session): session closed for user core May 14 01:10:34.855655 systemd-logind[1459]: Session 32 logged out. Waiting for processes to exit. May 14 01:10:34.858361 systemd[1]: sshd@29-172.24.4.64:22-172.24.4.1:46072.service: Deactivated successfully. May 14 01:10:34.860467 systemd[1]: session-32.scope: Deactivated successfully. May 14 01:10:34.862536 systemd-logind[1459]: Removed session 32. May 14 01:10:35.941914 containerd[1486]: time="2025-05-14T01:10:35.941827186Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"d7928769a28595b15d7fbd0c423a244c7ebe97dadc4f004d0ba21780ba37817e\" pid:5729 exited_at:{seconds:1747185035 nanos:941528105}" May 14 01:10:38.117009 containerd[1486]: time="2025-05-14T01:10:38.116798396Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"15abcfbb611fbc9064d2a80eab04b98f4d6d332d30df28cee0746b00ac27adab\" pid:5750 exited_at:{seconds:1747185038 nanos:116224420}" May 14 01:10:39.875093 systemd[1]: Started sshd@30-172.24.4.64:22-172.24.4.1:43966.service - OpenSSH per-connection server daemon (172.24.4.1:43966). May 14 01:10:40.929509 sshd[5762]: Accepted publickey for core from 172.24.4.1 port 43966 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:10:40.937187 sshd-session[5762]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:10:40.955535 systemd-logind[1459]: New session 33 of user core. May 14 01:10:40.962279 systemd[1]: Started session-33.scope - Session 33 of User core. 
May 14 01:10:41.569511 sshd[5764]: Connection closed by 172.24.4.1 port 43966 May 14 01:10:41.570383 sshd-session[5762]: pam_unix(sshd:session): session closed for user core May 14 01:10:41.584690 systemd[1]: sshd@30-172.24.4.64:22-172.24.4.1:43966.service: Deactivated successfully. May 14 01:10:41.596324 systemd[1]: session-33.scope: Deactivated successfully. May 14 01:10:41.600181 systemd-logind[1459]: Session 33 logged out. Waiting for processes to exit. May 14 01:10:41.604185 systemd-logind[1459]: Removed session 33. May 14 01:10:46.594815 systemd[1]: Started sshd@31-172.24.4.64:22-172.24.4.1:36330.service - OpenSSH per-connection server daemon (172.24.4.1:36330). May 14 01:10:47.722909 sshd[5776]: Accepted publickey for core from 172.24.4.1 port 36330 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:10:47.727535 sshd-session[5776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:10:47.743810 systemd-logind[1459]: New session 34 of user core. May 14 01:10:47.753384 systemd[1]: Started session-34.scope - Session 34 of User core. May 14 01:10:48.610887 sshd[5778]: Connection closed by 172.24.4.1 port 36330 May 14 01:10:48.610603 sshd-session[5776]: pam_unix(sshd:session): session closed for user core May 14 01:10:48.619025 systemd[1]: sshd@31-172.24.4.64:22-172.24.4.1:36330.service: Deactivated successfully. May 14 01:10:48.622431 systemd[1]: session-34.scope: Deactivated successfully. May 14 01:10:48.624764 systemd-logind[1459]: Session 34 logged out. Waiting for processes to exit. May 14 01:10:48.626438 systemd-logind[1459]: Removed session 34. 
May 14 01:10:52.067638 containerd[1486]: time="2025-05-14T01:10:52.067542217Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"89e8a7e6f6fa1cd5dd00596baae18a3f6106e4f5eecc4d6968f8006823cbc7d9\" pid:5801 exited_at:{seconds:1747185052 nanos:66965105}" May 14 01:10:53.630269 systemd[1]: Started sshd@32-172.24.4.64:22-172.24.4.1:50158.service - OpenSSH per-connection server daemon (172.24.4.1:50158). May 14 01:10:54.767201 sshd[5811]: Accepted publickey for core from 172.24.4.1 port 50158 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:10:54.768662 sshd-session[5811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:10:54.782002 systemd-logind[1459]: New session 35 of user core. May 14 01:10:54.788530 systemd[1]: Started session-35.scope - Session 35 of User core. May 14 01:10:55.542090 sshd[5813]: Connection closed by 172.24.4.1 port 50158 May 14 01:10:55.543329 sshd-session[5811]: pam_unix(sshd:session): session closed for user core May 14 01:10:55.553100 systemd[1]: sshd@32-172.24.4.64:22-172.24.4.1:50158.service: Deactivated successfully. May 14 01:10:55.559886 systemd[1]: session-35.scope: Deactivated successfully. May 14 01:10:55.562234 systemd-logind[1459]: Session 35 logged out. Waiting for processes to exit. May 14 01:10:55.565217 systemd-logind[1459]: Removed session 35. May 14 01:11:00.569477 systemd[1]: Started sshd@33-172.24.4.64:22-172.24.4.1:50170.service - OpenSSH per-connection server daemon (172.24.4.1:50170). May 14 01:11:01.733556 sshd[5825]: Accepted publickey for core from 172.24.4.1 port 50170 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:11:01.738592 sshd-session[5825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:11:01.765229 systemd-logind[1459]: New session 36 of user core. 
May 14 01:11:01.774384 systemd[1]: Started session-36.scope - Session 36 of User core. May 14 01:11:02.616510 sshd[5827]: Connection closed by 172.24.4.1 port 50170 May 14 01:11:02.619151 sshd-session[5825]: pam_unix(sshd:session): session closed for user core May 14 01:11:02.630774 systemd[1]: sshd@33-172.24.4.64:22-172.24.4.1:50170.service: Deactivated successfully. May 14 01:11:02.637276 systemd[1]: session-36.scope: Deactivated successfully. May 14 01:11:02.640731 systemd-logind[1459]: Session 36 logged out. Waiting for processes to exit. May 14 01:11:02.643846 systemd-logind[1459]: Removed session 36. May 14 01:11:07.650937 systemd[1]: Started sshd@34-172.24.4.64:22-172.24.4.1:41064.service - OpenSSH per-connection server daemon (172.24.4.1:41064). May 14 01:11:08.125546 containerd[1486]: time="2025-05-14T01:11:08.125364796Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"0e52215b8e73ffa7738a4470cb933b208a07d6ee87d7433609342d85b9d01a63\" pid:5854 exited_at:{seconds:1747185068 nanos:123306184}" May 14 01:11:08.882560 sshd[5841]: Accepted publickey for core from 172.24.4.1 port 41064 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:11:08.887178 sshd-session[5841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:11:08.905840 systemd-logind[1459]: New session 37 of user core. May 14 01:11:08.917413 systemd[1]: Started session-37.scope - Session 37 of User core. May 14 01:11:09.705394 sshd[5865]: Connection closed by 172.24.4.1 port 41064 May 14 01:11:09.707248 sshd-session[5841]: pam_unix(sshd:session): session closed for user core May 14 01:11:09.714825 systemd[1]: sshd@34-172.24.4.64:22-172.24.4.1:41064.service: Deactivated successfully. May 14 01:11:09.724536 systemd[1]: session-37.scope: Deactivated successfully. May 14 01:11:09.732240 systemd-logind[1459]: Session 37 logged out. Waiting for processes to exit.
May 14 01:11:09.736524 systemd-logind[1459]: Removed session 37. May 14 01:11:14.737914 systemd[1]: Started sshd@35-172.24.4.64:22-172.24.4.1:58306.service - OpenSSH per-connection server daemon (172.24.4.1:58306). May 14 01:11:15.845267 sshd[5876]: Accepted publickey for core from 172.24.4.1 port 58306 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:11:15.850493 sshd-session[5876]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:11:15.867508 systemd-logind[1459]: New session 38 of user core. May 14 01:11:15.878934 systemd[1]: Started session-38.scope - Session 38 of User core. May 14 01:11:16.670667 sshd[5878]: Connection closed by 172.24.4.1 port 58306 May 14 01:11:16.670464 sshd-session[5876]: pam_unix(sshd:session): session closed for user core May 14 01:11:16.679463 systemd-logind[1459]: Session 38 logged out. Waiting for processes to exit. May 14 01:11:16.681688 systemd[1]: sshd@35-172.24.4.64:22-172.24.4.1:58306.service: Deactivated successfully. May 14 01:11:16.691420 systemd[1]: session-38.scope: Deactivated successfully. May 14 01:11:16.699605 systemd-logind[1459]: Removed session 38. May 14 01:11:21.695496 systemd[1]: Started sshd@36-172.24.4.64:22-172.24.4.1:58318.service - OpenSSH per-connection server daemon (172.24.4.1:58318).
May 14 01:11:22.089043 containerd[1486]: time="2025-05-14T01:11:22.088977846Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"a9e4eb662c2e7cd4d4c1027023df65e54fd9468c719e797475020782d232b400\" pid:5904 exited_at:{seconds:1747185082 nanos:88499078}" May 14 01:11:23.246244 sshd[5890]: Accepted publickey for core from 172.24.4.1 port 58318 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:11:23.246439 sshd-session[5890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:11:23.259313 systemd-logind[1459]: New session 39 of user core. May 14 01:11:23.264335 systemd[1]: Started session-39.scope - Session 39 of User core. May 14 01:11:24.074633 sshd[5913]: Connection closed by 172.24.4.1 port 58318 May 14 01:11:24.076554 sshd-session[5890]: pam_unix(sshd:session): session closed for user core May 14 01:11:24.088536 systemd[1]: sshd@36-172.24.4.64:22-172.24.4.1:58318.service: Deactivated successfully. May 14 01:11:24.100277 systemd[1]: session-39.scope: Deactivated successfully. May 14 01:11:24.108527 systemd-logind[1459]: Session 39 logged out. Waiting for processes to exit. May 14 01:11:24.116647 systemd-logind[1459]: Removed session 39. May 14 01:11:29.106147 systemd[1]: Started sshd@37-172.24.4.64:22-172.24.4.1:51758.service - OpenSSH per-connection server daemon (172.24.4.1:51758). May 14 01:11:30.461993 sshd[5927]: Accepted publickey for core from 172.24.4.1 port 51758 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:11:30.466070 sshd-session[5927]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:11:30.480449 systemd-logind[1459]: New session 40 of user core. May 14 01:11:30.500443 systemd[1]: Started session-40.scope - Session 40 of User core. 
May 14 01:11:31.252982 sshd[5929]: Connection closed by 172.24.4.1 port 51758 May 14 01:11:31.252832 sshd-session[5927]: pam_unix(sshd:session): session closed for user core May 14 01:11:31.258243 systemd[1]: sshd@37-172.24.4.64:22-172.24.4.1:51758.service: Deactivated successfully. May 14 01:11:31.262193 systemd[1]: session-40.scope: Deactivated successfully. May 14 01:11:31.265155 systemd-logind[1459]: Session 40 logged out. Waiting for processes to exit. May 14 01:11:31.266840 systemd-logind[1459]: Removed session 40. May 14 01:11:35.927722 containerd[1486]: time="2025-05-14T01:11:35.927283645Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"98af78e0308205b7df784fa6ddda37ab0ae7cde3dd5f3b36f5cc07698baff9d8\" pid:5959 exited_at:{seconds:1747185095 nanos:925819879}" May 14 01:11:36.283498 systemd[1]: Started sshd@38-172.24.4.64:22-172.24.4.1:56892.service - OpenSSH per-connection server daemon (172.24.4.1:56892). May 14 01:11:37.570079 sshd[5969]: Accepted publickey for core from 172.24.4.1 port 56892 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:11:37.573647 sshd-session[5969]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:11:37.589775 systemd-logind[1459]: New session 41 of user core. May 14 01:11:37.598368 systemd[1]: Started session-41.scope - Session 41 of User core. 
May 14 01:11:38.173979 containerd[1486]: time="2025-05-14T01:11:38.173915061Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"7dbb5af53a0d5c8fc0fa854dc526cdb554cbe662ee5294a1219f6f7a7104241e\" pid:6002 exited_at:{seconds:1747185098 nanos:173469536}" May 14 01:11:38.403877 sshd[5982]: Connection closed by 172.24.4.1 port 56892 May 14 01:11:38.406236 sshd-session[5969]: pam_unix(sshd:session): session closed for user core May 14 01:11:38.414664 systemd[1]: sshd@38-172.24.4.64:22-172.24.4.1:56892.service: Deactivated successfully. May 14 01:11:38.422244 systemd[1]: session-41.scope: Deactivated successfully. May 14 01:11:38.427535 systemd-logind[1459]: Session 41 logged out. Waiting for processes to exit. May 14 01:11:38.430314 systemd-logind[1459]: Removed session 41. May 14 01:11:43.435490 systemd[1]: Started sshd@39-172.24.4.64:22-172.24.4.1:56894.service - OpenSSH per-connection server daemon (172.24.4.1:56894). May 14 01:11:44.729821 sshd[6018]: Accepted publickey for core from 172.24.4.1 port 56894 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ May 14 01:11:44.735742 sshd-session[6018]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 14 01:11:44.757331 systemd-logind[1459]: New session 42 of user core. May 14 01:11:44.769305 systemd[1]: Started session-42.scope - Session 42 of User core. May 14 01:11:45.516758 sshd[6020]: Connection closed by 172.24.4.1 port 56894 May 14 01:11:45.516317 sshd-session[6018]: pam_unix(sshd:session): session closed for user core May 14 01:11:45.526550 systemd[1]: sshd@39-172.24.4.64:22-172.24.4.1:56894.service: Deactivated successfully. May 14 01:11:45.530229 systemd[1]: session-42.scope: Deactivated successfully. May 14 01:11:45.531795 systemd-logind[1459]: Session 42 logged out. Waiting for processes to exit. May 14 01:11:45.534201 systemd-logind[1459]: Removed session 42. 
May 14 01:11:50.552750 systemd[1]: Started sshd@40-172.24.4.64:22-172.24.4.1:40760.service - OpenSSH per-connection server daemon (172.24.4.1:40760).
May 14 01:11:51.729715 sshd[6033]: Accepted publickey for core from 172.24.4.1 port 40760 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:11:51.734383 sshd-session[6033]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:11:51.747519 systemd-logind[1459]: New session 43 of user core.
May 14 01:11:51.756417 systemd[1]: Started session-43.scope - Session 43 of User core.
May 14 01:11:52.118923 containerd[1486]: time="2025-05-14T01:11:52.118804750Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"d779a05bbdd7f3ff0e0c3924e6cdfb6a1aef8dbd57c9f1473e9daafb10fa0302\" pid:6050 exited_at:{seconds:1747185112 nanos:117924929}"
May 14 01:11:52.611109 sshd[6035]: Connection closed by 172.24.4.1 port 40760
May 14 01:11:52.612808 sshd-session[6033]: pam_unix(sshd:session): session closed for user core
May 14 01:11:52.620624 systemd[1]: sshd@40-172.24.4.64:22-172.24.4.1:40760.service: Deactivated successfully.
May 14 01:11:52.623492 systemd[1]: session-43.scope: Deactivated successfully.
May 14 01:11:52.624977 systemd-logind[1459]: Session 43 logged out. Waiting for processes to exit.
May 14 01:11:52.627118 systemd-logind[1459]: Removed session 43.
May 14 01:11:57.642693 systemd[1]: Started sshd@41-172.24.4.64:22-172.24.4.1:49060.service - OpenSSH per-connection server daemon (172.24.4.1:49060).
May 14 01:11:58.765343 sshd[6070]: Accepted publickey for core from 172.24.4.1 port 49060 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:11:58.769836 sshd-session[6070]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:11:58.784152 systemd-logind[1459]: New session 44 of user core.
May 14 01:11:58.795401 systemd[1]: Started session-44.scope - Session 44 of User core.
May 14 01:11:59.507436 sshd[6072]: Connection closed by 172.24.4.1 port 49060
May 14 01:11:59.508374 sshd-session[6070]: pam_unix(sshd:session): session closed for user core
May 14 01:11:59.517486 systemd[1]: sshd@41-172.24.4.64:22-172.24.4.1:49060.service: Deactivated successfully.
May 14 01:11:59.523491 systemd[1]: session-44.scope: Deactivated successfully.
May 14 01:11:59.530233 systemd-logind[1459]: Session 44 logged out. Waiting for processes to exit.
May 14 01:11:59.533353 systemd-logind[1459]: Removed session 44.
May 14 01:12:04.556451 systemd[1]: Started sshd@42-172.24.4.64:22-172.24.4.1:54640.service - OpenSSH per-connection server daemon (172.24.4.1:54640).
May 14 01:12:05.753574 sshd[6084]: Accepted publickey for core from 172.24.4.1 port 54640 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:05.758824 sshd-session[6084]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:05.777949 systemd-logind[1459]: New session 45 of user core.
May 14 01:12:05.788451 systemd[1]: Started session-45.scope - Session 45 of User core.
May 14 01:12:06.536314 sshd[6086]: Connection closed by 172.24.4.1 port 54640
May 14 01:12:06.537193 sshd-session[6084]: pam_unix(sshd:session): session closed for user core
May 14 01:12:06.548186 systemd-logind[1459]: Session 45 logged out. Waiting for processes to exit.
May 14 01:12:06.549880 systemd[1]: sshd@42-172.24.4.64:22-172.24.4.1:54640.service: Deactivated successfully.
May 14 01:12:06.556607 systemd[1]: session-45.scope: Deactivated successfully.
May 14 01:12:06.561854 systemd-logind[1459]: Removed session 45.
May 14 01:12:08.180082 containerd[1486]: time="2025-05-14T01:12:08.176475130Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"06c1b81507ed2233d273776c573aaee7e9824de23e623fca91ca05f6f6a198ee\" pid:6115 exited_at:{seconds:1747185128 nanos:174208115}"
May 14 01:12:11.555799 systemd[1]: Started sshd@43-172.24.4.64:22-172.24.4.1:54652.service - OpenSSH per-connection server daemon (172.24.4.1:54652).
May 14 01:12:12.741391 sshd[6128]: Accepted publickey for core from 172.24.4.1 port 54652 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:12.745217 sshd-session[6128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:12.768193 systemd-logind[1459]: New session 46 of user core.
May 14 01:12:12.786539 systemd[1]: Started session-46.scope - Session 46 of User core.
May 14 01:12:13.524636 sshd[6130]: Connection closed by 172.24.4.1 port 54652
May 14 01:12:13.524242 sshd-session[6128]: pam_unix(sshd:session): session closed for user core
May 14 01:12:13.534559 systemd[1]: sshd@43-172.24.4.64:22-172.24.4.1:54652.service: Deactivated successfully.
May 14 01:12:13.544842 systemd[1]: session-46.scope: Deactivated successfully.
May 14 01:12:13.547604 systemd-logind[1459]: Session 46 logged out. Waiting for processes to exit.
May 14 01:12:13.550603 systemd-logind[1459]: Removed session 46.
May 14 01:12:18.551377 systemd[1]: Started sshd@44-172.24.4.64:22-172.24.4.1:55040.service - OpenSSH per-connection server daemon (172.24.4.1:55040).
May 14 01:12:19.733454 sshd[6143]: Accepted publickey for core from 172.24.4.1 port 55040 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:19.736587 sshd-session[6143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:19.749923 systemd-logind[1459]: New session 47 of user core.
May 14 01:12:19.755416 systemd[1]: Started session-47.scope - Session 47 of User core.
May 14 01:12:20.515287 sshd[6145]: Connection closed by 172.24.4.1 port 55040
May 14 01:12:20.516559 sshd-session[6143]: pam_unix(sshd:session): session closed for user core
May 14 01:12:20.525643 systemd[1]: sshd@44-172.24.4.64:22-172.24.4.1:55040.service: Deactivated successfully.
May 14 01:12:20.531921 systemd[1]: session-47.scope: Deactivated successfully.
May 14 01:12:20.539666 systemd-logind[1459]: Session 47 logged out. Waiting for processes to exit.
May 14 01:12:20.542928 systemd-logind[1459]: Removed session 47.
May 14 01:12:22.058068 containerd[1486]: time="2025-05-14T01:12:22.057923373Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"cd3e2a1e2a78701aedd484b01cf55a9ab3c77d7da4d560a29745c6663914a64c\" pid:6168 exited_at:{seconds:1747185142 nanos:56275050}"
May 14 01:12:25.547596 systemd[1]: Started sshd@45-172.24.4.64:22-172.24.4.1:60520.service - OpenSSH per-connection server daemon (172.24.4.1:60520).
May 14 01:12:26.738797 sshd[6178]: Accepted publickey for core from 172.24.4.1 port 60520 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:26.742386 sshd-session[6178]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:26.766257 systemd-logind[1459]: New session 48 of user core.
May 14 01:12:26.777509 systemd[1]: Started session-48.scope - Session 48 of User core.
May 14 01:12:27.522420 sshd[6180]: Connection closed by 172.24.4.1 port 60520
May 14 01:12:27.521739 sshd-session[6178]: pam_unix(sshd:session): session closed for user core
May 14 01:12:27.531178 systemd[1]: sshd@45-172.24.4.64:22-172.24.4.1:60520.service: Deactivated successfully.
May 14 01:12:27.539401 systemd[1]: session-48.scope: Deactivated successfully.
May 14 01:12:27.542556 systemd-logind[1459]: Session 48 logged out. Waiting for processes to exit.
May 14 01:12:27.546365 systemd-logind[1459]: Removed session 48.
May 14 01:12:32.557715 systemd[1]: Started sshd@46-172.24.4.64:22-172.24.4.1:60522.service - OpenSSH per-connection server daemon (172.24.4.1:60522).
May 14 01:12:33.873235 sshd[6194]: Accepted publickey for core from 172.24.4.1 port 60522 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:33.877482 sshd-session[6194]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:33.916140 systemd-logind[1459]: New session 49 of user core.
May 14 01:12:33.935554 systemd[1]: Started session-49.scope - Session 49 of User core.
May 14 01:12:34.804310 sshd[6196]: Connection closed by 172.24.4.1 port 60522
May 14 01:12:34.806635 sshd-session[6194]: pam_unix(sshd:session): session closed for user core
May 14 01:12:34.818279 systemd-logind[1459]: Session 49 logged out. Waiting for processes to exit.
May 14 01:12:34.819595 systemd[1]: sshd@46-172.24.4.64:22-172.24.4.1:60522.service: Deactivated successfully.
May 14 01:12:34.831597 systemd[1]: session-49.scope: Deactivated successfully.
May 14 01:12:34.839086 systemd-logind[1459]: Removed session 49.
May 14 01:12:35.956558 containerd[1486]: time="2025-05-14T01:12:35.956417633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"70b86aaf46df4f29c0358cffdf217d866a3ab44e32b37479bd5d87f64ac008ef\" pid:6221 exited_at:{seconds:1747185155 nanos:955487457}"
May 14 01:12:38.154712 containerd[1486]: time="2025-05-14T01:12:38.154630528Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"da8bd309e4b5ee341e3c309795973b8107a431048feca310006bafcd9f499b99\" pid:6242 exited_at:{seconds:1747185158 nanos:153923211}"
May 14 01:12:39.835724 systemd[1]: Started sshd@47-172.24.4.64:22-172.24.4.1:39768.service - OpenSSH per-connection server daemon (172.24.4.1:39768).
May 14 01:12:40.986020 sshd[6255]: Accepted publickey for core from 172.24.4.1 port 39768 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:40.993789 sshd-session[6255]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:41.009509 systemd-logind[1459]: New session 50 of user core.
May 14 01:12:41.021436 systemd[1]: Started session-50.scope - Session 50 of User core.
May 14 01:12:41.741579 sshd[6257]: Connection closed by 172.24.4.1 port 39768
May 14 01:12:41.741443 sshd-session[6255]: pam_unix(sshd:session): session closed for user core
May 14 01:12:41.746366 systemd[1]: sshd@47-172.24.4.64:22-172.24.4.1:39768.service: Deactivated successfully.
May 14 01:12:41.749656 systemd[1]: session-50.scope: Deactivated successfully.
May 14 01:12:41.750943 systemd-logind[1459]: Session 50 logged out. Waiting for processes to exit.
May 14 01:12:41.752470 systemd-logind[1459]: Removed session 50.
May 14 01:12:46.765705 systemd[1]: Started sshd@48-172.24.4.64:22-172.24.4.1:48268.service - OpenSSH per-connection server daemon (172.24.4.1:48268).
May 14 01:12:48.163244 sshd[6270]: Accepted publickey for core from 172.24.4.1 port 48268 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:48.169248 sshd-session[6270]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:48.191626 systemd-logind[1459]: New session 51 of user core.
May 14 01:12:48.198501 systemd[1]: Started session-51.scope - Session 51 of User core.
May 14 01:12:48.985574 sshd[6272]: Connection closed by 172.24.4.1 port 48268
May 14 01:12:48.986325 sshd-session[6270]: pam_unix(sshd:session): session closed for user core
May 14 01:12:49.000861 systemd[1]: sshd@48-172.24.4.64:22-172.24.4.1:48268.service: Deactivated successfully.
May 14 01:12:49.007717 systemd[1]: session-51.scope: Deactivated successfully.
May 14 01:12:49.010349 systemd-logind[1459]: Session 51 logged out. Waiting for processes to exit.
May 14 01:12:49.013949 systemd-logind[1459]: Removed session 51.
May 14 01:12:52.067830 containerd[1486]: time="2025-05-14T01:12:52.067628606Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"91d6f61ffd37a6a2392981a880125e1bd36f13399472d9d48ff4a1169d006cc9\" pid:6295 exited_at:{seconds:1747185172 nanos:66798558}"
May 14 01:12:54.004827 systemd[1]: Started sshd@49-172.24.4.64:22-172.24.4.1:38286.service - OpenSSH per-connection server daemon (172.24.4.1:38286).
May 14 01:12:55.359272 sshd[6305]: Accepted publickey for core from 172.24.4.1 port 38286 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:55.362932 sshd-session[6305]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:55.377279 systemd-logind[1459]: New session 52 of user core.
May 14 01:12:55.381452 systemd[1]: Started session-52.scope - Session 52 of User core.
May 14 01:12:55.956741 update_engine[1460]: I20250514 01:12:55.956359 1460 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
May 14 01:12:55.958410 update_engine[1460]: I20250514 01:12:55.956750 1460 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
May 14 01:12:55.958465 update_engine[1460]: I20250514 01:12:55.958405 1460 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
May 14 01:12:55.963887 update_engine[1460]: I20250514 01:12:55.963794 1460 omaha_request_params.cc:62] Current group set to alpha
May 14 01:12:55.964574 update_engine[1460]: I20250514 01:12:55.964502 1460 update_attempter.cc:499] Already updated boot flags. Skipping.
May 14 01:12:55.964660 update_engine[1460]: I20250514 01:12:55.964563 1460 update_attempter.cc:643] Scheduling an action processor start.
May 14 01:12:55.964660 update_engine[1460]: I20250514 01:12:55.964635 1460 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 14 01:12:55.964892 update_engine[1460]: I20250514 01:12:55.964850 1460 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
May 14 01:12:55.966932 update_engine[1460]: I20250514 01:12:55.965574 1460 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 14 01:12:55.966932 update_engine[1460]: I20250514 01:12:55.965626 1460 omaha_request_action.cc:272] Request:
May 14 01:12:55.966932 update_engine[1460]:
May 14 01:12:55.966932 update_engine[1460]:
May 14 01:12:55.966932 update_engine[1460]:
May 14 01:12:55.966932 update_engine[1460]:
May 14 01:12:55.966932 update_engine[1460]:
May 14 01:12:55.966932 update_engine[1460]:
May 14 01:12:55.966932 update_engine[1460]:
May 14 01:12:55.966932 update_engine[1460]:
May 14 01:12:55.966932 update_engine[1460]: I20250514 01:12:55.965653 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 01:12:55.979133 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
May 14 01:12:55.981911 update_engine[1460]: I20250514 01:12:55.981781 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 01:12:55.983723 update_engine[1460]: I20250514 01:12:55.983117 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 01:12:55.991475 update_engine[1460]: E20250514 01:12:55.991386 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 01:12:55.991693 update_engine[1460]: I20250514 01:12:55.991647 1460 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
May 14 01:12:56.273169 sshd[6307]: Connection closed by 172.24.4.1 port 38286
May 14 01:12:56.274085 sshd-session[6305]: pam_unix(sshd:session): session closed for user core
May 14 01:12:56.287784 systemd[1]: sshd@49-172.24.4.64:22-172.24.4.1:38286.service: Deactivated successfully.
May 14 01:12:56.293079 systemd[1]: session-52.scope: Deactivated successfully.
May 14 01:12:56.295289 systemd-logind[1459]: Session 52 logged out. Waiting for processes to exit.
May 14 01:12:56.300614 systemd-logind[1459]: Removed session 52.
May 14 01:12:56.303732 systemd[1]: Started sshd@50-172.24.4.64:22-172.24.4.1:38290.service - OpenSSH per-connection server daemon (172.24.4.1:38290).
May 14 01:12:57.649537 sshd[6317]: Accepted publickey for core from 172.24.4.1 port 38290 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:12:57.653528 sshd-session[6317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:12:57.669610 systemd-logind[1459]: New session 53 of user core.
May 14 01:12:57.683469 systemd[1]: Started session-53.scope - Session 53 of User core.
May 14 01:12:58.809197 sshd[6320]: Connection closed by 172.24.4.1 port 38290
May 14 01:12:58.808831 sshd-session[6317]: pam_unix(sshd:session): session closed for user core
May 14 01:12:58.836643 systemd[1]: sshd@50-172.24.4.64:22-172.24.4.1:38290.service: Deactivated successfully.
May 14 01:12:58.844534 systemd[1]: session-53.scope: Deactivated successfully.
May 14 01:12:58.848385 systemd-logind[1459]: Session 53 logged out. Waiting for processes to exit.
May 14 01:12:58.856613 systemd[1]: Started sshd@51-172.24.4.64:22-172.24.4.1:38292.service - OpenSSH per-connection server daemon (172.24.4.1:38292).
May 14 01:12:58.864823 systemd-logind[1459]: Removed session 53.
May 14 01:13:00.030098 sshd[6329]: Accepted publickey for core from 172.24.4.1 port 38292 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:00.033342 sshd-session[6329]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:00.052519 systemd-logind[1459]: New session 54 of user core.
May 14 01:13:00.066431 systemd[1]: Started session-54.scope - Session 54 of User core.
May 14 01:13:02.185147 sshd[6332]: Connection closed by 172.24.4.1 port 38292
May 14 01:13:02.196420 sshd-session[6329]: pam_unix(sshd:session): session closed for user core
May 14 01:13:02.208646 systemd[1]: Started sshd@52-172.24.4.64:22-172.24.4.1:38298.service - OpenSSH per-connection server daemon (172.24.4.1:38298).
May 14 01:13:02.215789 systemd[1]: sshd@51-172.24.4.64:22-172.24.4.1:38292.service: Deactivated successfully.
May 14 01:13:02.224523 systemd[1]: session-54.scope: Deactivated successfully.
May 14 01:13:02.227129 systemd-logind[1459]: Session 54 logged out. Waiting for processes to exit.
May 14 01:13:02.230616 systemd-logind[1459]: Removed session 54.
May 14 01:13:03.379633 sshd[6354]: Accepted publickey for core from 172.24.4.1 port 38298 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:03.383325 sshd-session[6354]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:03.400482 systemd-logind[1459]: New session 55 of user core.
May 14 01:13:03.417856 systemd[1]: Started session-55.scope - Session 55 of User core.
May 14 01:13:04.617267 sshd[6360]: Connection closed by 172.24.4.1 port 38298
May 14 01:13:04.616413 sshd-session[6354]: pam_unix(sshd:session): session closed for user core
May 14 01:13:04.639111 systemd[1]: sshd@52-172.24.4.64:22-172.24.4.1:38298.service: Deactivated successfully.
May 14 01:13:04.645914 systemd[1]: session-55.scope: Deactivated successfully.
May 14 01:13:04.649355 systemd-logind[1459]: Session 55 logged out. Waiting for processes to exit.
May 14 01:13:04.657366 systemd[1]: Started sshd@53-172.24.4.64:22-172.24.4.1:47534.service - OpenSSH per-connection server daemon (172.24.4.1:47534).
May 14 01:13:04.661589 systemd-logind[1459]: Removed session 55.
May 14 01:13:05.955277 update_engine[1460]: I20250514 01:13:05.955111 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 01:13:05.956119 update_engine[1460]: I20250514 01:13:05.955569 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 01:13:05.956320 update_engine[1460]: I20250514 01:13:05.956224 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 01:13:05.959705 sshd[6369]: Accepted publickey for core from 172.24.4.1 port 47534 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:05.961271 sshd-session[6369]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:05.961809 update_engine[1460]: E20250514 01:13:05.961709 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 01:13:05.961915 update_engine[1460]: I20250514 01:13:05.961853 1460 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
May 14 01:13:05.977167 systemd-logind[1459]: New session 56 of user core.
May 14 01:13:05.983391 systemd[1]: Started session-56.scope - Session 56 of User core.
May 14 01:13:06.646603 sshd[6374]: Connection closed by 172.24.4.1 port 47534
May 14 01:13:06.648541 sshd-session[6369]: pam_unix(sshd:session): session closed for user core
May 14 01:13:06.655670 systemd[1]: sshd@53-172.24.4.64:22-172.24.4.1:47534.service: Deactivated successfully.
May 14 01:13:06.660472 systemd[1]: session-56.scope: Deactivated successfully.
May 14 01:13:06.663134 systemd-logind[1459]: Session 56 logged out. Waiting for processes to exit.
May 14 01:13:06.665867 systemd-logind[1459]: Removed session 56.
May 14 01:13:08.127875 containerd[1486]: time="2025-05-14T01:13:08.127762518Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"3022ceb689c952c570b0952d0f1e57bc4d8888f54f109c49c7e4ea7170d4ba3e\" pid:6408 exited_at:{seconds:1747185188 nanos:126819308}"
May 14 01:13:11.675678 systemd[1]: Started sshd@54-172.24.4.64:22-172.24.4.1:47538.service - OpenSSH per-connection server daemon (172.24.4.1:47538).
May 14 01:13:12.828059 sshd[6422]: Accepted publickey for core from 172.24.4.1 port 47538 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:12.830319 sshd-session[6422]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:12.840379 systemd-logind[1459]: New session 57 of user core.
May 14 01:13:12.845265 systemd[1]: Started session-57.scope - Session 57 of User core.
May 14 01:13:13.512067 sshd[6424]: Connection closed by 172.24.4.1 port 47538
May 14 01:13:13.511316 sshd-session[6422]: pam_unix(sshd:session): session closed for user core
May 14 01:13:13.517667 systemd[1]: sshd@54-172.24.4.64:22-172.24.4.1:47538.service: Deactivated successfully.
May 14 01:13:13.521918 systemd[1]: session-57.scope: Deactivated successfully.
May 14 01:13:13.525652 systemd-logind[1459]: Session 57 logged out. Waiting for processes to exit.
May 14 01:13:13.527419 systemd-logind[1459]: Removed session 57.
May 14 01:13:15.957172 update_engine[1460]: I20250514 01:13:15.956602 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 01:13:15.958325 update_engine[1460]: I20250514 01:13:15.957648 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 01:13:15.958713 update_engine[1460]: I20250514 01:13:15.958607 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 01:13:15.964498 update_engine[1460]: E20250514 01:13:15.964396 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 01:13:15.964669 update_engine[1460]: I20250514 01:13:15.964560 1460 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
May 14 01:13:18.535200 systemd[1]: Started sshd@55-172.24.4.64:22-172.24.4.1:59672.service - OpenSSH per-connection server daemon (172.24.4.1:59672).
May 14 01:13:19.790255 sshd[6435]: Accepted publickey for core from 172.24.4.1 port 59672 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:19.796136 sshd-session[6435]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:19.812020 systemd-logind[1459]: New session 58 of user core.
May 14 01:13:19.821438 systemd[1]: Started session-58.scope - Session 58 of User core.
May 14 01:13:20.472020 sshd[6437]: Connection closed by 172.24.4.1 port 59672
May 14 01:13:20.471607 sshd-session[6435]: pam_unix(sshd:session): session closed for user core
May 14 01:13:20.481346 systemd[1]: sshd@55-172.24.4.64:22-172.24.4.1:59672.service: Deactivated successfully.
May 14 01:13:20.487016 systemd[1]: session-58.scope: Deactivated successfully.
May 14 01:13:20.495900 systemd-logind[1459]: Session 58 logged out. Waiting for processes to exit.
May 14 01:13:20.498804 systemd-logind[1459]: Removed session 58.
May 14 01:13:22.078897 containerd[1486]: time="2025-05-14T01:13:22.078784033Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"927347737619959523d18a72ed3e28bd783d8f3d078187bad1ab0d8118da3b7d\" pid:6460 exited_at:{seconds:1747185202 nanos:78526450}"
May 14 01:13:25.500703 systemd[1]: Started sshd@56-172.24.4.64:22-172.24.4.1:57024.service - OpenSSH per-connection server daemon (172.24.4.1:57024).
May 14 01:13:25.958686 update_engine[1460]: I20250514 01:13:25.957374 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 01:13:25.958686 update_engine[1460]: I20250514 01:13:25.957967 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 01:13:25.958686 update_engine[1460]: I20250514 01:13:25.958575 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 01:13:25.965198 update_engine[1460]: E20250514 01:13:25.963764 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.963943 1460 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.963971 1460 omaha_request_action.cc:617] Omaha request response:
May 14 01:13:25.965198 update_engine[1460]: E20250514 01:13:25.964278 1460 omaha_request_action.cc:636] Omaha request network transfer failed.
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964383 1460 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964407 1460 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964421 1460 update_attempter.cc:306] Processing Done.
May 14 01:13:25.965198 update_engine[1460]: E20250514 01:13:25.964496 1460 update_attempter.cc:619] Update failed.
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964529 1460 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964542 1460 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964557 1460 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964718 1460 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964771 1460 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 14 01:13:25.965198 update_engine[1460]: I20250514 01:13:25.964788 1460 omaha_request_action.cc:272] Request:
May 14 01:13:25.965198 update_engine[1460]:
May 14 01:13:25.965198 update_engine[1460]:
May 14 01:13:25.966526 update_engine[1460]:
May 14 01:13:25.966526 update_engine[1460]:
May 14 01:13:25.966526 update_engine[1460]:
May 14 01:13:25.966526 update_engine[1460]:
May 14 01:13:25.966526 update_engine[1460]: I20250514 01:13:25.964803 1460 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 14 01:13:25.966888 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
May 14 01:13:25.968233 update_engine[1460]: I20250514 01:13:25.967118 1460 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 14 01:13:25.968233 update_engine[1460]: I20250514 01:13:25.967620 1460 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 14 01:13:25.973495 update_engine[1460]: E20250514 01:13:25.972986 1460 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 14 01:13:25.973495 update_engine[1460]: I20250514 01:13:25.973173 1460 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 14 01:13:25.973495 update_engine[1460]: I20250514 01:13:25.973204 1460 omaha_request_action.cc:617] Omaha request response:
May 14 01:13:25.973495 update_engine[1460]: I20250514 01:13:25.973218 1460 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 01:13:25.973495 update_engine[1460]: I20250514 01:13:25.973232 1460 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 14 01:13:25.973495 update_engine[1460]: I20250514 01:13:25.973243 1460 update_attempter.cc:306] Processing Done.
May 14 01:13:25.973495 update_engine[1460]: I20250514 01:13:25.973256 1460 update_attempter.cc:310] Error event sent.
May 14 01:13:25.973495 update_engine[1460]: I20250514 01:13:25.973294 1460 update_check_scheduler.cc:74] Next update check in 45m27s
May 14 01:13:25.974467 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
May 14 01:13:26.690118 sshd[6470]: Accepted publickey for core from 172.24.4.1 port 57024 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:26.695020 sshd-session[6470]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:26.708609 systemd-logind[1459]: New session 59 of user core.
May 14 01:13:26.716360 systemd[1]: Started session-59.scope - Session 59 of User core.
May 14 01:13:27.510567 sshd[6472]: Connection closed by 172.24.4.1 port 57024
May 14 01:13:27.512890 sshd-session[6470]: pam_unix(sshd:session): session closed for user core
May 14 01:13:27.523913 systemd-logind[1459]: Session 59 logged out. Waiting for processes to exit.
May 14 01:13:27.527581 systemd[1]: sshd@56-172.24.4.64:22-172.24.4.1:57024.service: Deactivated successfully.
May 14 01:13:27.539141 systemd[1]: session-59.scope: Deactivated successfully.
May 14 01:13:27.547430 systemd-logind[1459]: Removed session 59.
May 14 01:13:32.538853 systemd[1]: Started sshd@57-172.24.4.64:22-172.24.4.1:57032.service - OpenSSH per-connection server daemon (172.24.4.1:57032).
May 14 01:13:33.677769 sshd[6495]: Accepted publickey for core from 172.24.4.1 port 57032 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:33.681965 sshd-session[6495]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:33.698341 systemd-logind[1459]: New session 60 of user core.
May 14 01:13:33.708466 systemd[1]: Started session-60.scope - Session 60 of User core.
May 14 01:13:34.511185 sshd[6497]: Connection closed by 172.24.4.1 port 57032
May 14 01:13:34.513405 sshd-session[6495]: pam_unix(sshd:session): session closed for user core
May 14 01:13:34.527289 systemd[1]: sshd@57-172.24.4.64:22-172.24.4.1:57032.service: Deactivated successfully.
May 14 01:13:34.534626 systemd[1]: session-60.scope: Deactivated successfully.
May 14 01:13:34.537652 systemd-logind[1459]: Session 60 logged out. Waiting for processes to exit.
May 14 01:13:34.540854 systemd-logind[1459]: Removed session 60.
May 14 01:13:35.969357 containerd[1486]: time="2025-05-14T01:13:35.969113043Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"e8f8e0739809c17237325111705eb1472ab6aea0a704dec436adc119aa6a9ca3\" pid:6522 exited_at:{seconds:1747185215 nanos:968300438}"
May 14 01:13:38.131249 containerd[1486]: time="2025-05-14T01:13:38.131163166Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"fbc79b1b6533bd921e2140f272374e5694c5c73fdeffb3c5e56e1a1a28815e94\" pid:6544 exited_at:{seconds:1747185218 nanos:130752696}"
May 14 01:13:39.541686 systemd[1]: Started sshd@58-172.24.4.64:22-172.24.4.1:52740.service - OpenSSH per-connection server daemon (172.24.4.1:52740).
May 14 01:13:40.804015 sshd[6557]: Accepted publickey for core from 172.24.4.1 port 52740 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:40.806836 sshd-session[6557]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:40.821169 systemd-logind[1459]: New session 61 of user core.
May 14 01:13:40.828432 systemd[1]: Started session-61.scope - Session 61 of User core.
May 14 01:13:41.481288 sshd[6559]: Connection closed by 172.24.4.1 port 52740
May 14 01:13:41.482433 sshd-session[6557]: pam_unix(sshd:session): session closed for user core
May 14 01:13:41.488229 systemd-logind[1459]: Session 61 logged out. Waiting for processes to exit.
May 14 01:13:41.489650 systemd[1]: sshd@58-172.24.4.64:22-172.24.4.1:52740.service: Deactivated successfully.
May 14 01:13:41.493660 systemd[1]: session-61.scope: Deactivated successfully.
May 14 01:13:41.496149 systemd-logind[1459]: Removed session 61.
May 14 01:13:46.510726 systemd[1]: Started sshd@59-172.24.4.64:22-172.24.4.1:55926.service - OpenSSH per-connection server daemon (172.24.4.1:55926).
May 14 01:13:47.853846 sshd[6571]: Accepted publickey for core from 172.24.4.1 port 55926 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:47.856734 sshd-session[6571]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:47.865098 systemd-logind[1459]: New session 62 of user core.
May 14 01:13:47.876274 systemd[1]: Started session-62.scope - Session 62 of User core.
May 14 01:13:48.692148 sshd[6573]: Connection closed by 172.24.4.1 port 55926
May 14 01:13:48.693987 sshd-session[6571]: pam_unix(sshd:session): session closed for user core
May 14 01:13:48.698820 systemd[1]: sshd@59-172.24.4.64:22-172.24.4.1:55926.service: Deactivated successfully.
May 14 01:13:48.702061 systemd[1]: session-62.scope: Deactivated successfully.
May 14 01:13:48.705314 systemd-logind[1459]: Session 62 logged out. Waiting for processes to exit.
May 14 01:13:48.708809 systemd-logind[1459]: Removed session 62.
May 14 01:13:52.065860 containerd[1486]: time="2025-05-14T01:13:52.065774930Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"c434f319f7621824892dd1ec3b1c0201c53b78d1c25cfd071b0ef76c70db7367\" pid:6596 exited_at:{seconds:1747185232 nanos:65094333}"
May 14 01:13:53.719589 systemd[1]: Started sshd@60-172.24.4.64:22-172.24.4.1:44950.service - OpenSSH per-connection server daemon (172.24.4.1:44950).
May 14 01:13:54.957458 sshd[6606]: Accepted publickey for core from 172.24.4.1 port 44950 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:13:54.959837 sshd-session[6606]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:13:54.968996 systemd-logind[1459]: New session 63 of user core.
May 14 01:13:54.973408 systemd[1]: Started session-63.scope - Session 63 of User core.
May 14 01:13:55.728011 sshd[6608]: Connection closed by 172.24.4.1 port 44950
May 14 01:13:55.727832 sshd-session[6606]: pam_unix(sshd:session): session closed for user core
May 14 01:13:55.737530 systemd[1]: sshd@60-172.24.4.64:22-172.24.4.1:44950.service: Deactivated successfully.
May 14 01:13:55.741682 systemd[1]: session-63.scope: Deactivated successfully.
May 14 01:13:55.743083 systemd-logind[1459]: Session 63 logged out. Waiting for processes to exit.
May 14 01:13:55.744510 systemd-logind[1459]: Removed session 63.
May 14 01:14:00.746620 systemd[1]: Started sshd@61-172.24.4.64:22-172.24.4.1:44958.service - OpenSSH per-connection server daemon (172.24.4.1:44958).
May 14 01:14:01.993131 sshd[6620]: Accepted publickey for core from 172.24.4.1 port 44958 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:01.996918 sshd-session[6620]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:02.012205 systemd-logind[1459]: New session 64 of user core.
May 14 01:14:02.026564 systemd[1]: Started session-64.scope - Session 64 of User core.
May 14 01:14:02.735242 sshd[6622]: Connection closed by 172.24.4.1 port 44958
May 14 01:14:02.736616 sshd-session[6620]: pam_unix(sshd:session): session closed for user core
May 14 01:14:02.745224 systemd[1]: sshd@61-172.24.4.64:22-172.24.4.1:44958.service: Deactivated successfully.
May 14 01:14:02.752483 systemd[1]: session-64.scope: Deactivated successfully.
May 14 01:14:02.754648 systemd-logind[1459]: Session 64 logged out. Waiting for processes to exit.
May 14 01:14:02.758007 systemd-logind[1459]: Removed session 64.
May 14 01:14:07.758709 systemd[1]: Started sshd@62-172.24.4.64:22-172.24.4.1:55254.service - OpenSSH per-connection server daemon (172.24.4.1:55254).
May 14 01:14:08.127541 containerd[1486]: time="2025-05-14T01:14:08.127328403Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"a35f2f2ffdcd6e65aa35307d530f617fed95af600f2772e30ad8e4f0582cc787\" pid:6650 exited_at:{seconds:1747185248 nanos:126274365}"
May 14 01:14:08.952131 sshd[6636]: Accepted publickey for core from 172.24.4.1 port 55254 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:08.954951 sshd-session[6636]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:08.967208 systemd-logind[1459]: New session 65 of user core.
May 14 01:14:08.978446 systemd[1]: Started session-65.scope - Session 65 of User core.
May 14 01:14:09.821099 sshd[6662]: Connection closed by 172.24.4.1 port 55254
May 14 01:14:09.822359 sshd-session[6636]: pam_unix(sshd:session): session closed for user core
May 14 01:14:09.832393 systemd[1]: sshd@62-172.24.4.64:22-172.24.4.1:55254.service: Deactivated successfully.
May 14 01:14:09.840498 systemd[1]: session-65.scope: Deactivated successfully.
May 14 01:14:09.843378 systemd-logind[1459]: Session 65 logged out. Waiting for processes to exit.
May 14 01:14:09.846223 systemd-logind[1459]: Removed session 65.
May 14 01:14:14.851235 systemd[1]: Started sshd@63-172.24.4.64:22-172.24.4.1:53230.service - OpenSSH per-connection server daemon (172.24.4.1:53230).
May 14 01:14:15.938488 sshd[6673]: Accepted publickey for core from 172.24.4.1 port 53230 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:15.941937 sshd-session[6673]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:15.955975 systemd-logind[1459]: New session 66 of user core.
May 14 01:14:15.968360 systemd[1]: Started session-66.scope - Session 66 of User core.
May 14 01:14:16.750207 sshd[6675]: Connection closed by 172.24.4.1 port 53230
May 14 01:14:16.751464 sshd-session[6673]: pam_unix(sshd:session): session closed for user core
May 14 01:14:16.760337 systemd[1]: sshd@63-172.24.4.64:22-172.24.4.1:53230.service: Deactivated successfully.
May 14 01:14:16.767882 systemd[1]: session-66.scope: Deactivated successfully.
May 14 01:14:16.770534 systemd-logind[1459]: Session 66 logged out. Waiting for processes to exit.
May 14 01:14:16.773792 systemd-logind[1459]: Removed session 66.
May 14 01:14:21.772605 systemd[1]: Started sshd@64-172.24.4.64:22-172.24.4.1:53234.service - OpenSSH per-connection server daemon (172.24.4.1:53234).
May 14 01:14:22.065904 containerd[1486]: time="2025-05-14T01:14:22.065843756Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"380b3d742b19b076c2318eb600c96baf70054680ae3624a65316acd0e4240bb1\" pid:6701 exited_at:{seconds:1747185262 nanos:64971660}"
May 14 01:14:22.897949 sshd[6686]: Accepted publickey for core from 172.24.4.1 port 53234 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:22.900829 sshd-session[6686]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:22.917121 systemd-logind[1459]: New session 67 of user core.
May 14 01:14:22.924441 systemd[1]: Started session-67.scope - Session 67 of User core.
May 14 01:14:23.652012 sshd[6710]: Connection closed by 172.24.4.1 port 53234
May 14 01:14:23.652727 sshd-session[6686]: pam_unix(sshd:session): session closed for user core
May 14 01:14:23.656732 systemd[1]: sshd@64-172.24.4.64:22-172.24.4.1:53234.service: Deactivated successfully.
May 14 01:14:23.661898 systemd[1]: session-67.scope: Deactivated successfully.
May 14 01:14:23.679458 systemd-logind[1459]: Session 67 logged out. Waiting for processes to exit.
May 14 01:14:23.681970 systemd-logind[1459]: Removed session 67.
May 14 01:14:28.702697 systemd[1]: Started sshd@65-172.24.4.64:22-172.24.4.1:40748.service - OpenSSH per-connection server daemon (172.24.4.1:40748).
May 14 01:14:29.897239 sshd[6723]: Accepted publickey for core from 172.24.4.1 port 40748 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:29.900444 sshd-session[6723]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:29.916454 systemd-logind[1459]: New session 68 of user core.
May 14 01:14:29.929534 systemd[1]: Started session-68.scope - Session 68 of User core.
May 14 01:14:30.764126 sshd[6725]: Connection closed by 172.24.4.1 port 40748
May 14 01:14:30.765880 sshd-session[6723]: pam_unix(sshd:session): session closed for user core
May 14 01:14:30.772053 systemd[1]: sshd@65-172.24.4.64:22-172.24.4.1:40748.service: Deactivated successfully.
May 14 01:14:30.775812 systemd[1]: session-68.scope: Deactivated successfully.
May 14 01:14:30.779166 systemd-logind[1459]: Session 68 logged out. Waiting for processes to exit.
May 14 01:14:30.782104 systemd-logind[1459]: Removed session 68.
May 14 01:14:35.798795 systemd[1]: Started sshd@66-172.24.4.64:22-172.24.4.1:48072.service - OpenSSH per-connection server daemon (172.24.4.1:48072).
May 14 01:14:35.922794 containerd[1486]: time="2025-05-14T01:14:35.922417866Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"62e729bc9cc54402dc225ef0a72ab5a5e58346dc544700dd163863b75ab514d4\" pid:6752 exited_at:{seconds:1747185275 nanos:921184190}"
May 14 01:14:36.887954 sshd[6737]: Accepted publickey for core from 172.24.4.1 port 48072 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:36.892523 sshd-session[6737]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:36.907221 systemd-logind[1459]: New session 69 of user core.
May 14 01:14:36.917551 systemd[1]: Started session-69.scope - Session 69 of User core.
May 14 01:14:37.510373 sshd[6762]: Connection closed by 172.24.4.1 port 48072
May 14 01:14:37.512517 sshd-session[6737]: pam_unix(sshd:session): session closed for user core
May 14 01:14:37.525877 systemd[1]: sshd@66-172.24.4.64:22-172.24.4.1:48072.service: Deactivated successfully.
May 14 01:14:37.533545 systemd[1]: session-69.scope: Deactivated successfully.
May 14 01:14:37.540160 systemd-logind[1459]: Session 69 logged out. Waiting for processes to exit.
May 14 01:14:37.544104 systemd-logind[1459]: Removed session 69.
May 14 01:14:38.156081 containerd[1486]: time="2025-05-14T01:14:38.155910988Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"15d106cf69d1ff7c45f651741221de76e132c76137f5506fd3bb9e8d07202344\" pid:6785 exited_at:{seconds:1747185278 nanos:150776213}"
May 14 01:14:42.546660 systemd[1]: Started sshd@67-172.24.4.64:22-172.24.4.1:48074.service - OpenSSH per-connection server daemon (172.24.4.1:48074).
May 14 01:14:43.742434 sshd[6804]: Accepted publickey for core from 172.24.4.1 port 48074 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:43.743113 sshd-session[6804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:43.752414 systemd-logind[1459]: New session 70 of user core.
May 14 01:14:43.759695 systemd[1]: Started session-70.scope - Session 70 of User core.
May 14 01:14:44.510659 sshd[6806]: Connection closed by 172.24.4.1 port 48074
May 14 01:14:44.512519 sshd-session[6804]: pam_unix(sshd:session): session closed for user core
May 14 01:14:44.520411 systemd-logind[1459]: Session 70 logged out. Waiting for processes to exit.
May 14 01:14:44.521873 systemd[1]: sshd@67-172.24.4.64:22-172.24.4.1:48074.service: Deactivated successfully.
May 14 01:14:44.531192 systemd[1]: session-70.scope: Deactivated successfully.
May 14 01:14:44.538151 systemd-logind[1459]: Removed session 70.
May 14 01:14:49.537713 systemd[1]: Started sshd@68-172.24.4.64:22-172.24.4.1:48806.service - OpenSSH per-connection server daemon (172.24.4.1:48806).
May 14 01:14:50.799816 sshd[6829]: Accepted publickey for core from 172.24.4.1 port 48806 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:50.802952 sshd-session[6829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:50.817846 systemd-logind[1459]: New session 71 of user core.
May 14 01:14:50.826371 systemd[1]: Started session-71.scope - Session 71 of User core.
May 14 01:14:51.511254 sshd[6831]: Connection closed by 172.24.4.1 port 48806
May 14 01:14:51.512563 sshd-session[6829]: pam_unix(sshd:session): session closed for user core
May 14 01:14:51.520363 systemd[1]: sshd@68-172.24.4.64:22-172.24.4.1:48806.service: Deactivated successfully.
May 14 01:14:51.526316 systemd[1]: session-71.scope: Deactivated successfully.
May 14 01:14:51.531823 systemd-logind[1459]: Session 71 logged out. Waiting for processes to exit.
May 14 01:14:51.534366 systemd-logind[1459]: Removed session 71.
May 14 01:14:52.069069 containerd[1486]: time="2025-05-14T01:14:52.068993327Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"346efab1c84f49caf941ccf9ed97f36e16619bc189919a93edb7555420952ea5\" pid:6854 exited_at:{seconds:1747185292 nanos:68626078}"
May 14 01:14:56.536029 systemd[1]: Started sshd@69-172.24.4.64:22-172.24.4.1:41440.service - OpenSSH per-connection server daemon (172.24.4.1:41440).
May 14 01:14:57.804133 sshd[6864]: Accepted publickey for core from 172.24.4.1 port 41440 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:14:57.806820 sshd-session[6864]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:14:57.820206 systemd-logind[1459]: New session 72 of user core.
May 14 01:14:57.831350 systemd[1]: Started session-72.scope - Session 72 of User core.
May 14 01:14:58.511025 sshd[6866]: Connection closed by 172.24.4.1 port 41440
May 14 01:14:58.512517 sshd-session[6864]: pam_unix(sshd:session): session closed for user core
May 14 01:14:58.521916 systemd[1]: sshd@69-172.24.4.64:22-172.24.4.1:41440.service: Deactivated successfully.
May 14 01:14:58.527651 systemd[1]: session-72.scope: Deactivated successfully.
May 14 01:14:58.530621 systemd-logind[1459]: Session 72 logged out. Waiting for processes to exit.
May 14 01:14:58.533765 systemd-logind[1459]: Removed session 72.
May 14 01:15:03.536886 systemd[1]: Started sshd@70-172.24.4.64:22-172.24.4.1:59276.service - OpenSSH per-connection server daemon (172.24.4.1:59276).
May 14 01:15:04.763119 sshd[6878]: Accepted publickey for core from 172.24.4.1 port 59276 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:15:04.766417 sshd-session[6878]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:15:04.781840 systemd-logind[1459]: New session 73 of user core.
May 14 01:15:04.789982 systemd[1]: Started session-73.scope - Session 73 of User core.
May 14 01:15:05.582851 sshd[6880]: Connection closed by 172.24.4.1 port 59276
May 14 01:15:05.584969 sshd-session[6878]: pam_unix(sshd:session): session closed for user core
May 14 01:15:05.592958 systemd[1]: sshd@70-172.24.4.64:22-172.24.4.1:59276.service: Deactivated successfully.
May 14 01:15:05.600567 systemd[1]: session-73.scope: Deactivated successfully.
May 14 01:15:05.605816 systemd-logind[1459]: Session 73 logged out. Waiting for processes to exit.
May 14 01:15:05.608726 systemd-logind[1459]: Removed session 73.
May 14 01:15:08.201344 containerd[1486]: time="2025-05-14T01:15:08.200940148Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"062c2dc73a316f18679dc4718a8d61518a28b3ca31106caac62c5717f12b3073\" pid:6907 exited_at:{seconds:1747185308 nanos:198912523}"
May 14 01:15:10.624378 systemd[1]: Started sshd@71-172.24.4.64:22-172.24.4.1:59288.service - OpenSSH per-connection server daemon (172.24.4.1:59288).
May 14 01:15:11.828257 sshd[6920]: Accepted publickey for core from 172.24.4.1 port 59288 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:15:11.832007 sshd-session[6920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:15:11.847736 systemd-logind[1459]: New session 74 of user core.
May 14 01:15:11.856467 systemd[1]: Started session-74.scope - Session 74 of User core.
May 14 01:15:12.471947 sshd[6923]: Connection closed by 172.24.4.1 port 59288
May 14 01:15:12.473526 sshd-session[6920]: pam_unix(sshd:session): session closed for user core
May 14 01:15:12.482364 systemd[1]: sshd@71-172.24.4.64:22-172.24.4.1:59288.service: Deactivated successfully.
May 14 01:15:12.489440 systemd[1]: session-74.scope: Deactivated successfully.
May 14 01:15:12.492007 systemd-logind[1459]: Session 74 logged out. Waiting for processes to exit.
May 14 01:15:12.494879 systemd-logind[1459]: Removed session 74.
May 14 01:15:17.501243 systemd[1]: Started sshd@72-172.24.4.64:22-172.24.4.1:44506.service - OpenSSH per-connection server daemon (172.24.4.1:44506).
May 14 01:15:18.798954 sshd[6935]: Accepted publickey for core from 172.24.4.1 port 44506 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:15:18.803819 sshd-session[6935]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:15:18.817664 systemd-logind[1459]: New session 75 of user core.
May 14 01:15:18.825361 systemd[1]: Started session-75.scope - Session 75 of User core.
May 14 01:15:19.513671 sshd[6937]: Connection closed by 172.24.4.1 port 44506
May 14 01:15:19.513434 sshd-session[6935]: pam_unix(sshd:session): session closed for user core
May 14 01:15:19.522160 systemd-logind[1459]: Session 75 logged out. Waiting for processes to exit.
May 14 01:15:19.522432 systemd[1]: sshd@72-172.24.4.64:22-172.24.4.1:44506.service: Deactivated successfully.
May 14 01:15:19.529775 systemd[1]: session-75.scope: Deactivated successfully.
May 14 01:15:19.535293 systemd-logind[1459]: Removed session 75.
May 14 01:15:22.083894 containerd[1486]: time="2025-05-14T01:15:22.083767308Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"11ffb0559d489a1ee3cdc14e2c59229855246e45546afb086433c6838af86968\" pid:6960 exited_at:{seconds:1747185322 nanos:82700927}"
May 14 01:15:24.533642 systemd[1]: Started sshd@73-172.24.4.64:22-172.24.4.1:48220.service - OpenSSH per-connection server daemon (172.24.4.1:48220).
May 14 01:15:25.804976 sshd[6970]: Accepted publickey for core from 172.24.4.1 port 48220 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:15:25.808830 sshd-session[6970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:15:25.828494 systemd-logind[1459]: New session 76 of user core.
May 14 01:15:25.835391 systemd[1]: Started session-76.scope - Session 76 of User core.
May 14 01:15:26.510358 sshd[6972]: Connection closed by 172.24.4.1 port 48220
May 14 01:15:26.512118 sshd-session[6970]: pam_unix(sshd:session): session closed for user core
May 14 01:15:26.520160 systemd[1]: sshd@73-172.24.4.64:22-172.24.4.1:48220.service: Deactivated successfully.
May 14 01:15:26.527001 systemd[1]: session-76.scope: Deactivated successfully.
May 14 01:15:26.531213 systemd-logind[1459]: Session 76 logged out. Waiting for processes to exit.
May 14 01:15:26.534158 systemd-logind[1459]: Removed session 76.
May 14 01:15:31.534643 systemd[1]: Started sshd@74-172.24.4.64:22-172.24.4.1:48236.service - OpenSSH per-connection server daemon (172.24.4.1:48236).
May 14 01:15:32.801217 sshd[6985]: Accepted publickey for core from 172.24.4.1 port 48236 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:15:32.803957 sshd-session[6985]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:15:32.818387 systemd-logind[1459]: New session 77 of user core.
May 14 01:15:32.824417 systemd[1]: Started session-77.scope - Session 77 of User core.
May 14 01:15:33.511157 sshd[6987]: Connection closed by 172.24.4.1 port 48236
May 14 01:15:33.512555 sshd-session[6985]: pam_unix(sshd:session): session closed for user core
May 14 01:15:33.521922 systemd-logind[1459]: Session 77 logged out. Waiting for processes to exit.
May 14 01:15:33.524543 systemd[1]: sshd@74-172.24.4.64:22-172.24.4.1:48236.service: Deactivated successfully.
May 14 01:15:33.530238 systemd[1]: session-77.scope: Deactivated successfully.
May 14 01:15:33.533591 systemd-logind[1459]: Removed session 77.
May 14 01:15:35.936158 containerd[1486]: time="2025-05-14T01:15:35.936091022Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"90c00f23e5c56da5c028319503b0fcfa92b4680c0ebb60a09f26c20811b7c105\" pid:7014 exited_at:{seconds:1747185335 nanos:935669201}"
May 14 01:15:38.190370 containerd[1486]: time="2025-05-14T01:15:38.190219668Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"1a54c647e46b68155cd35828a412429213974418e2e4489c448a3d772c288f6c\" pid:7036 exited_at:{seconds:1747185338 nanos:189763873}"
May 14 01:15:38.548972 systemd[1]: Started sshd@75-172.24.4.64:22-172.24.4.1:44696.service - OpenSSH per-connection server daemon (172.24.4.1:44696).
May 14 01:15:39.802453 sshd[7049]: Accepted publickey for core from 172.24.4.1 port 44696 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:15:39.807427 sshd-session[7049]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:15:39.831781 systemd-logind[1459]: New session 78 of user core.
May 14 01:15:39.841441 systemd[1]: Started session-78.scope - Session 78 of User core.
May 14 01:15:40.510600 sshd[7052]: Connection closed by 172.24.4.1 port 44696
May 14 01:15:40.511319 sshd-session[7049]: pam_unix(sshd:session): session closed for user core
May 14 01:15:40.523676 systemd[1]: sshd@75-172.24.4.64:22-172.24.4.1:44696.service: Deactivated successfully.
May 14 01:15:40.524240 systemd-logind[1459]: Session 78 logged out. Waiting for processes to exit.
May 14 01:15:40.532200 systemd[1]: session-78.scope: Deactivated successfully.
May 14 01:15:40.535867 systemd-logind[1459]: Removed session 78.
May 14 01:15:45.551851 systemd[1]: Started sshd@76-172.24.4.64:22-172.24.4.1:47998.service - OpenSSH per-connection server daemon (172.24.4.1:47998).
May 14 01:15:46.760708 sshd[7065]: Accepted publickey for core from 172.24.4.1 port 47998 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:15:46.767162 sshd-session[7065]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:15:46.783184 systemd-logind[1459]: New session 79 of user core.
May 14 01:15:46.803424 systemd[1]: Started session-79.scope - Session 79 of User core.
May 14 01:15:47.510428 sshd[7067]: Connection closed by 172.24.4.1 port 47998
May 14 01:15:47.512837 sshd-session[7065]: pam_unix(sshd:session): session closed for user core
May 14 01:15:47.522939 systemd[1]: sshd@76-172.24.4.64:22-172.24.4.1:47998.service: Deactivated successfully.
May 14 01:15:47.530345 systemd[1]: session-79.scope: Deactivated successfully.
May 14 01:15:47.535699 systemd-logind[1459]: Session 79 logged out. Waiting for processes to exit.
May 14 01:15:47.539715 systemd-logind[1459]: Removed session 79.
May 14 01:15:52.078849 containerd[1486]: time="2025-05-14T01:15:52.078378067Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"0b87f01bbc23967cdd67827f1b152df6c9adc176f6bf3cd2a7b0121709f20c43\" pid:7097 exited_at:{seconds:1747185352 nanos:76829782}"
May 14 01:15:52.543405 systemd[1]: Started sshd@77-172.24.4.64:22-172.24.4.1:48012.service - OpenSSH per-connection server daemon (172.24.4.1:48012).
May 14 01:15:53.826707 sshd[7109]: Accepted publickey for core from 172.24.4.1 port 48012 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:15:53.829968 sshd-session[7109]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:15:53.843075 systemd-logind[1459]: New session 80 of user core.
May 14 01:15:53.855339 systemd[1]: Started session-80.scope - Session 80 of User core.
May 14 01:15:54.510326 sshd[7111]: Connection closed by 172.24.4.1 port 48012
May 14 01:15:54.511087 sshd-session[7109]: pam_unix(sshd:session): session closed for user core
May 14 01:15:54.514853 systemd[1]: sshd@77-172.24.4.64:22-172.24.4.1:48012.service: Deactivated successfully.
May 14 01:15:54.517916 systemd[1]: session-80.scope: Deactivated successfully.
May 14 01:15:54.521169 systemd-logind[1459]: Session 80 logged out. Waiting for processes to exit.
May 14 01:15:54.522754 systemd-logind[1459]: Removed session 80.
May 14 01:15:59.538724 systemd[1]: Started sshd@78-172.24.4.64:22-172.24.4.1:43830.service - OpenSSH per-connection server daemon (172.24.4.1:43830).
May 14 01:16:00.731170 sshd[7124]: Accepted publickey for core from 172.24.4.1 port 43830 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:00.734181 sshd-session[7124]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:00.747947 systemd-logind[1459]: New session 81 of user core.
May 14 01:16:00.756432 systemd[1]: Started session-81.scope - Session 81 of User core.
May 14 01:16:01.473510 sshd[7126]: Connection closed by 172.24.4.1 port 43830
May 14 01:16:01.474201 sshd-session[7124]: pam_unix(sshd:session): session closed for user core
May 14 01:16:01.484003 systemd[1]: sshd@78-172.24.4.64:22-172.24.4.1:43830.service: Deactivated successfully.
May 14 01:16:01.493237 systemd[1]: session-81.scope: Deactivated successfully.
May 14 01:16:01.496766 systemd-logind[1459]: Session 81 logged out. Waiting for processes to exit.
May 14 01:16:01.500159 systemd-logind[1459]: Removed session 81.
May 14 01:16:06.498937 systemd[1]: Started sshd@79-172.24.4.64:22-172.24.4.1:43300.service - OpenSSH per-connection server daemon (172.24.4.1:43300).
May 14 01:16:07.864120 sshd[7139]: Accepted publickey for core from 172.24.4.1 port 43300 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:07.866462 sshd-session[7139]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:07.882153 systemd-logind[1459]: New session 82 of user core.
May 14 01:16:07.889375 systemd[1]: Started session-82.scope - Session 82 of User core.
May 14 01:16:08.141785 containerd[1486]: time="2025-05-14T01:16:08.141642539Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"8bf85bfab61698a44e01b015fff47fd2a1dcf4c714b4db60153d31f95a8007a0\" pid:7154 exited_at:{seconds:1747185368 nanos:141213794}"
May 14 01:16:08.503267 sshd[7141]: Connection closed by 172.24.4.1 port 43300
May 14 01:16:08.502380 sshd-session[7139]: pam_unix(sshd:session): session closed for user core
May 14 01:16:08.512491 systemd[1]: sshd@79-172.24.4.64:22-172.24.4.1:43300.service: Deactivated successfully.
May 14 01:16:08.519270 systemd[1]: session-82.scope: Deactivated successfully.
May 14 01:16:08.528749 systemd-logind[1459]: Session 82 logged out. Waiting for processes to exit.
May 14 01:16:08.532439 systemd-logind[1459]: Removed session 82.
May 14 01:16:13.539718 systemd[1]: Started sshd@80-172.24.4.64:22-172.24.4.1:47600.service - OpenSSH per-connection server daemon (172.24.4.1:47600).
May 14 01:16:14.863119 sshd[7176]: Accepted publickey for core from 172.24.4.1 port 47600 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:14.868906 sshd-session[7176]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:14.891577 systemd-logind[1459]: New session 83 of user core.
May 14 01:16:14.902413 systemd[1]: Started session-83.scope - Session 83 of User core.
May 14 01:16:15.500764 sshd[7183]: Connection closed by 172.24.4.1 port 47600
May 14 01:16:15.502525 sshd-session[7176]: pam_unix(sshd:session): session closed for user core
May 14 01:16:15.513363 systemd[1]: sshd@80-172.24.4.64:22-172.24.4.1:47600.service: Deactivated successfully.
May 14 01:16:15.520713 systemd[1]: session-83.scope: Deactivated successfully.
May 14 01:16:15.522919 systemd-logind[1459]: Session 83 logged out. Waiting for processes to exit.
May 14 01:16:15.526117 systemd-logind[1459]: Removed session 83.
May 14 01:16:20.527275 systemd[1]: Started sshd@81-172.24.4.64:22-172.24.4.1:47612.service - OpenSSH per-connection server daemon (172.24.4.1:47612).
May 14 01:16:21.860373 sshd[7196]: Accepted publickey for core from 172.24.4.1 port 47612 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:21.863615 sshd-session[7196]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:21.877887 systemd-logind[1459]: New session 84 of user core.
May 14 01:16:21.888474 systemd[1]: Started session-84.scope - Session 84 of User core.
May 14 01:16:22.091653 containerd[1486]: time="2025-05-14T01:16:22.091522571Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"afe0c01a781047c5467644170c1e3fd983c98e2499b4987ae721f5aadbdd2159\" pid:7211 exited_at:{seconds:1747185382 nanos:90713764}"
May 14 01:16:22.443240 sshd[7198]: Connection closed by 172.24.4.1 port 47612
May 14 01:16:22.444782 sshd-session[7196]: pam_unix(sshd:session): session closed for user core
May 14 01:16:22.453463 systemd[1]: sshd@81-172.24.4.64:22-172.24.4.1:47612.service: Deactivated successfully.
May 14 01:16:22.461676 systemd[1]: session-84.scope: Deactivated successfully.
May 14 01:16:22.465247 systemd-logind[1459]: Session 84 logged out. Waiting for processes to exit.
May 14 01:16:22.468463 systemd-logind[1459]: Removed session 84.
May 14 01:16:27.474409 systemd[1]: Started sshd@82-172.24.4.64:22-172.24.4.1:37328.service - OpenSSH per-connection server daemon (172.24.4.1:37328).
May 14 01:16:28.862593 sshd[7242]: Accepted publickey for core from 172.24.4.1 port 37328 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:28.865607 sshd-session[7242]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:28.882428 systemd-logind[1459]: New session 85 of user core.
May 14 01:16:28.891222 systemd[1]: Started session-85.scope - Session 85 of User core.
May 14 01:16:29.503178 sshd[7246]: Connection closed by 172.24.4.1 port 37328
May 14 01:16:29.504553 sshd-session[7242]: pam_unix(sshd:session): session closed for user core
May 14 01:16:29.511994 systemd[1]: sshd@82-172.24.4.64:22-172.24.4.1:37328.service: Deactivated successfully.
May 14 01:16:29.519311 systemd[1]: session-85.scope: Deactivated successfully.
May 14 01:16:29.523748 systemd-logind[1459]: Session 85 logged out. Waiting for processes to exit.
May 14 01:16:29.527544 systemd-logind[1459]: Removed session 85.
May 14 01:16:34.524818 systemd[1]: Started sshd@83-172.24.4.64:22-172.24.4.1:58156.service - OpenSSH per-connection server daemon (172.24.4.1:58156).
May 14 01:16:35.869668 sshd[7258]: Accepted publickey for core from 172.24.4.1 port 58156 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:35.877918 sshd-session[7258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:35.904052 systemd-logind[1459]: New session 86 of user core.
May 14 01:16:35.910569 systemd[1]: Started session-86.scope - Session 86 of User core.
May 14 01:16:35.942136 containerd[1486]: time="2025-05-14T01:16:35.942073753Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"ed34f1c569772fef19cfb8a048212658987d20fe6aa3b11dd2be4cfbbe199773\" pid:7275 exited_at:{seconds:1747185395 nanos:941572833}"
May 14 01:16:36.498780 sshd[7280]: Connection closed by 172.24.4.1 port 58156
May 14 01:16:36.498310 sshd-session[7258]: pam_unix(sshd:session): session closed for user core
May 14 01:16:36.510651 systemd[1]: sshd@83-172.24.4.64:22-172.24.4.1:58156.service: Deactivated successfully.
May 14 01:16:36.517452 systemd[1]: session-86.scope: Deactivated successfully.
May 14 01:16:36.519945 systemd-logind[1459]: Session 86 logged out. Waiting for processes to exit.
May 14 01:16:36.523093 systemd-logind[1459]: Removed session 86.
May 14 01:16:38.164173 containerd[1486]: time="2025-05-14T01:16:38.163897361Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"32109c4edb5ab70732c0494ff681793ceb7550f4fb0947bac71f69058d33ca49\" pid:7306 exited_at:{seconds:1747185398 nanos:161751234}"
May 14 01:16:41.522791 systemd[1]: Started sshd@84-172.24.4.64:22-172.24.4.1:58160.service - OpenSSH per-connection server daemon (172.24.4.1:58160).
May 14 01:16:42.866814 sshd[7319]: Accepted publickey for core from 172.24.4.1 port 58160 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:42.871651 sshd-session[7319]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:42.891934 systemd-logind[1459]: New session 87 of user core.
May 14 01:16:42.904456 systemd[1]: Started session-87.scope - Session 87 of User core.
May 14 01:16:43.497350 sshd[7321]: Connection closed by 172.24.4.1 port 58160
May 14 01:16:43.499452 sshd-session[7319]: pam_unix(sshd:session): session closed for user core
May 14 01:16:43.508896 systemd[1]: sshd@84-172.24.4.64:22-172.24.4.1:58160.service: Deactivated successfully.
May 14 01:16:43.515806 systemd[1]: session-87.scope: Deactivated successfully.
May 14 01:16:43.519944 systemd-logind[1459]: Session 87 logged out. Waiting for processes to exit.
May 14 01:16:43.522907 systemd-logind[1459]: Removed session 87.
May 14 01:16:48.527424 systemd[1]: Started sshd@85-172.24.4.64:22-172.24.4.1:42024.service - OpenSSH per-connection server daemon (172.24.4.1:42024).
May 14 01:16:49.865611 sshd[7333]: Accepted publickey for core from 172.24.4.1 port 42024 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:49.868740 sshd-session[7333]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:49.884237 systemd-logind[1459]: New session 88 of user core.
May 14 01:16:49.892857 systemd[1]: Started session-88.scope - Session 88 of User core.
May 14 01:16:50.501106 sshd[7335]: Connection closed by 172.24.4.1 port 42024
May 14 01:16:50.502637 sshd-session[7333]: pam_unix(sshd:session): session closed for user core
May 14 01:16:50.515183 systemd[1]: sshd@85-172.24.4.64:22-172.24.4.1:42024.service: Deactivated successfully.
May 14 01:16:50.521530 systemd[1]: session-88.scope: Deactivated successfully.
May 14 01:16:50.524462 systemd-logind[1459]: Session 88 logged out. Waiting for processes to exit.
May 14 01:16:50.527277 systemd-logind[1459]: Removed session 88.
May 14 01:16:52.087265 containerd[1486]: time="2025-05-14T01:16:52.086964422Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fbd2f0df01276d9e0ab3f1bb7fcc04b75d9ceb93f4633143ac23e9b29cc7cd7b\" id:\"4464e7f6cf88b04ac8c4e75342537319af90c4518210ed857d2c01a84f54578a\" pid:7358 exited_at:{seconds:1747185412 nanos:85191275}"
May 14 01:16:55.524572 systemd[1]: Started sshd@86-172.24.4.64:22-172.24.4.1:52400.service - OpenSSH per-connection server daemon (172.24.4.1:52400).
May 14 01:16:56.863544 sshd[7367]: Accepted publickey for core from 172.24.4.1 port 52400 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:16:56.865311 sshd-session[7367]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:16:56.879343 systemd-logind[1459]: New session 89 of user core.
May 14 01:16:56.884434 systemd[1]: Started session-89.scope - Session 89 of User core.
May 14 01:16:57.502541 sshd[7369]: Connection closed by 172.24.4.1 port 52400
May 14 01:16:57.503330 sshd-session[7367]: pam_unix(sshd:session): session closed for user core
May 14 01:16:57.514253 systemd[1]: sshd@86-172.24.4.64:22-172.24.4.1:52400.service: Deactivated successfully.
May 14 01:16:57.522243 systemd[1]: session-89.scope: Deactivated successfully.
May 14 01:16:57.524875 systemd-logind[1459]: Session 89 logged out. Waiting for processes to exit.
May 14 01:16:57.528609 systemd-logind[1459]: Removed session 89.
May 14 01:17:02.554269 systemd[1]: Started sshd@87-172.24.4.64:22-172.24.4.1:52410.service - OpenSSH per-connection server daemon (172.24.4.1:52410).
May 14 01:17:03.808463 sshd[7381]: Accepted publickey for core from 172.24.4.1 port 52410 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:17:03.813617 sshd-session[7381]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:17:03.840604 systemd-logind[1459]: New session 90 of user core.
May 14 01:17:03.849002 systemd[1]: Started session-90.scope - Session 90 of User core.
May 14 01:17:04.511864 sshd[7383]: Connection closed by 172.24.4.1 port 52410
May 14 01:17:04.511515 sshd-session[7381]: pam_unix(sshd:session): session closed for user core
May 14 01:17:04.519329 systemd[1]: sshd@87-172.24.4.64:22-172.24.4.1:52410.service: Deactivated successfully.
May 14 01:17:04.526430 systemd[1]: session-90.scope: Deactivated successfully.
May 14 01:17:04.530365 systemd-logind[1459]: Session 90 logged out. Waiting for processes to exit.
May 14 01:17:04.532839 systemd-logind[1459]: Removed session 90.
May 14 01:17:08.131599 containerd[1486]: time="2025-05-14T01:17:08.130893169Z" level=info msg="TaskExit event in podsandbox handler container_id:\"333ab1cd365db74def51b32507a1eb376d68d8fdb1297213dc201047e9c4ed25\" id:\"6ae3b7472e8a5f793cebeb419335e4856ca795c83b235f7d42b214199b8de162\" pid:7408 exited_at:{seconds:1747185428 nanos:129370421}"
May 14 01:17:09.527725 systemd[1]: Started sshd@88-172.24.4.64:22-172.24.4.1:38640.service - OpenSSH per-connection server daemon (172.24.4.1:38640).
May 14 01:17:10.864941 sshd[7421]: Accepted publickey for core from 172.24.4.1 port 38640 ssh2: RSA SHA256:rM5XR5OurBP49wqsUeFk3q6kdddh3064hEvFZHudzRQ
May 14 01:17:10.869460 sshd-session[7421]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 14 01:17:10.887425 systemd-logind[1459]: New session 91 of user core.
May 14 01:17:10.896907 systemd[1]: Started session-91.scope - Session 91 of User core.
May 14 01:17:11.746532 sshd[7423]: Connection closed by 172.24.4.1 port 38640
May 14 01:17:11.749969 sshd-session[7421]: pam_unix(sshd:session): session closed for user core
May 14 01:17:11.760893 systemd[1]: sshd@88-172.24.4.64:22-172.24.4.1:38640.service: Deactivated successfully.
May 14 01:17:11.768003 systemd[1]: session-91.scope: Deactivated successfully.
May 14 01:17:11.772191 systemd-logind[1459]: Session 91 logged out. Waiting for processes to exit.
May 14 01:17:11.775929 systemd-logind[1459]: Removed session 91.