May 15 00:27:50.046751 kernel: Linux version 6.6.89-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Wed May 14 22:09:34 -00 2025 May 15 00:27:50.046780 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e0c956f61127e47bb23a2bdeb0592b0ff91bd857e2344d0bf321acb67c279f1a May 15 00:27:50.046791 kernel: BIOS-provided physical RAM map: May 15 00:27:50.046799 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable May 15 00:27:50.046807 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved May 15 00:27:50.046817 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved May 15 00:27:50.046827 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable May 15 00:27:50.046835 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved May 15 00:27:50.046844 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved May 15 00:27:50.046852 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved May 15 00:27:50.046860 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable May 15 00:27:50.046868 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved May 15 00:27:50.046876 kernel: NX (Execute Disable) protection: active May 15 00:27:50.046885 kernel: APIC: Static calls initialized May 15 00:27:50.046897 kernel: SMBIOS 3.0.0 present. May 15 00:27:50.046905 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014 May 15 00:27:50.046914 kernel: Hypervisor detected: KVM May 15 00:27:50.046923 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 May 15 00:27:50.046931 kernel: kvm-clock: using sched offset of 3583412875 cycles May 15 00:27:50.046940 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns May 15 00:27:50.046951 kernel: tsc: Detected 1996.249 MHz processor May 15 00:27:50.046960 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 15 00:27:50.046970 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 15 00:27:50.046979 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000 May 15 00:27:50.046988 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs May 15 00:27:50.046997 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 15 00:27:50.047006 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000 May 15 00:27:50.047015 kernel: ACPI: Early table checksum verification disabled May 15 00:27:50.047026 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS ) May 15 00:27:50.047035 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 15 00:27:50.047044 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 15 00:27:50.047053 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 15 00:27:50.047061 kernel: ACPI: FACS 0x00000000BFFE0000 000040 May 15 00:27:50.047070 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 15 00:27:50.047079 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 
BOCHS BXPC 00000001 BXPC 00000001) May 15 00:27:50.047088 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1a49-0xbffe1abc] May 15 00:27:50.047097 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48] May 15 00:27:50.047107 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] May 15 00:27:50.047116 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] May 15 00:27:50.047125 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] May 15 00:27:50.047137 kernel: No NUMA configuration found May 15 00:27:50.047147 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] May 15 00:27:50.047156 kernel: NODE_DATA(0) allocated [mem 0x13fff7000-0x13fffcfff] May 15 00:27:50.047165 kernel: Zone ranges: May 15 00:27:50.047176 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 15 00:27:50.047185 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] May 15 00:27:50.047194 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] May 15 00:27:50.047204 kernel: Movable zone start for each node May 15 00:27:50.047213 kernel: Early memory node ranges May 15 00:27:50.047222 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 15 00:27:50.047231 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] May 15 00:27:50.047240 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] May 15 00:27:50.047251 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] May 15 00:27:50.047275 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 15 00:27:50.047285 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 15 00:27:50.047294 kernel: On node 0, zone Normal: 35 pages in unavailable ranges May 15 00:27:50.047304 kernel: ACPI: PM-Timer IO Port: 0x608 May 15 00:27:50.047313 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 15 00:27:50.047322 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 15 00:27:50.047332 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 15 00:27:50.047341 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 15 00:27:50.047353 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 15 00:27:50.047363 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 15 00:27:50.047372 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 15 00:27:50.047381 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 15 00:27:50.047390 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs May 15 00:27:50.047400 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 15 00:27:50.047409 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices May 15 00:27:50.047418 kernel: Booting paravirtualized kernel on KVM May 15 00:27:50.047427 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 15 00:27:50.047440 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1 May 15 00:27:50.047449 kernel: percpu: Embedded 58 pages/cpu s197032 r8192 d32344 u1048576 May 15 00:27:50.047458 kernel: pcpu-alloc: s197032 r8192 d32344 u1048576 alloc=1*2097152 May 15 00:27:50.047467 kernel: pcpu-alloc: [0] 0 1 May 15 00:27:50.047476 kernel: kvm-guest: PV spinlocks disabled, no host support May 15 00:27:50.047487 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr 
verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e0c956f61127e47bb23a2bdeb0592b0ff91bd857e2344d0bf321acb67c279f1a May 15 00:27:50.047497 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. May 15 00:27:50.047507 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 15 00:27:50.047518 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 15 00:27:50.047527 kernel: Fallback order for Node 0: 0 May 15 00:27:50.047536 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1031901 May 15 00:27:50.047545 kernel: Policy zone: Normal May 15 00:27:50.047554 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 15 00:27:50.047564 kernel: software IO TLB: area num 2. May 15 00:27:50.047573 kernel: Memory: 3962108K/4193772K available (14336K kernel code, 2296K rwdata, 25068K rodata, 43604K init, 1468K bss, 231404K reserved, 0K cma-reserved) May 15 00:27:50.047583 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 May 15 00:27:50.047592 kernel: ftrace: allocating 37993 entries in 149 pages May 15 00:27:50.047603 kernel: ftrace: allocated 149 pages with 4 groups May 15 00:27:50.047612 kernel: Dynamic Preempt: voluntary May 15 00:27:50.047621 kernel: rcu: Preemptible hierarchical RCU implementation. May 15 00:27:50.047631 kernel: rcu: RCU event tracing is enabled. May 15 00:27:50.047641 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. May 15 00:27:50.047650 kernel: Trampoline variant of Tasks RCU enabled. May 15 00:27:50.047659 kernel: Rude variant of Tasks RCU enabled. May 15 00:27:50.047669 kernel: Tracing variant of Tasks RCU enabled. May 15 00:27:50.047678 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 15 00:27:50.047689 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 May 15 00:27:50.047699 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 May 15 00:27:50.047708 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 15 00:27:50.047718 kernel: Console: colour VGA+ 80x25 May 15 00:27:50.047727 kernel: printk: console [tty0] enabled May 15 00:27:50.047736 kernel: printk: console [ttyS0] enabled May 15 00:27:50.047745 kernel: ACPI: Core revision 20230628 May 15 00:27:50.047754 kernel: APIC: Switch to symmetric I/O mode setup May 15 00:27:50.047763 kernel: x2apic enabled May 15 00:27:50.047775 kernel: APIC: Switched APIC routing to: physical x2apic May 15 00:27:50.047784 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 15 00:27:50.047793 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized May 15 00:27:50.047803 kernel: Calibrating delay loop (skipped) preset value.. 
3992.49 BogoMIPS (lpj=1996249) May 15 00:27:50.047812 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 May 15 00:27:50.047821 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 May 15 00:27:50.047831 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 15 00:27:50.047840 kernel: Spectre V2 : Mitigation: Retpolines May 15 00:27:50.047849 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT May 15 00:27:50.047860 kernel: Speculative Store Bypass: Vulnerable May 15 00:27:50.047869 kernel: x86/fpu: x87 FPU will use FXSAVE May 15 00:27:50.047879 kernel: Freeing SMP alternatives memory: 32K May 15 00:27:50.047888 kernel: pid_max: default: 32768 minimum: 301 May 15 00:27:50.047904 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,integrity May 15 00:27:50.047916 kernel: landlock: Up and running. May 15 00:27:50.047925 kernel: SELinux: Initializing. May 15 00:27:50.047935 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 15 00:27:50.047945 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 15 00:27:50.047955 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) May 15 00:27:50.047964 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 15 00:27:50.047975 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 15 00:27:50.047986 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. May 15 00:27:50.047996 kernel: Performance Events: AMD PMU driver. May 15 00:27:50.048005 kernel: ... version: 0 May 15 00:27:50.048015 kernel: ... bit width: 48 May 15 00:27:50.048025 kernel: ... generic registers: 4 May 15 00:27:50.048036 kernel: ... value mask: 0000ffffffffffff May 15 00:27:50.048046 kernel: ... max period: 00007fffffffffff May 15 00:27:50.048055 kernel: ... fixed-purpose events: 0 May 15 00:27:50.048066 kernel: ... event mask: 000000000000000f May 15 00:27:50.048077 kernel: signal: max sigframe size: 1440 May 15 00:27:50.048086 kernel: rcu: Hierarchical SRCU implementation. May 15 00:27:50.048095 kernel: rcu: Max phase no-delay instances is 400. May 15 00:27:50.048104 kernel: smp: Bringing up secondary CPUs ... May 15 00:27:50.048113 kernel: smpboot: x86: Booting SMP configuration: May 15 00:27:50.048124 kernel: .... 
node #0, CPUs: #1 May 15 00:27:50.048133 kernel: smp: Brought up 1 node, 2 CPUs May 15 00:27:50.048142 kernel: smpboot: Max logical packages: 2 May 15 00:27:50.048151 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) May 15 00:27:50.048160 kernel: devtmpfs: initialized May 15 00:27:50.048168 kernel: x86/mm: Memory block size: 128MB May 15 00:27:50.048178 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 15 00:27:50.048187 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) May 15 00:27:50.048196 kernel: pinctrl core: initialized pinctrl subsystem May 15 00:27:50.048206 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 15 00:27:50.048215 kernel: audit: initializing netlink subsys (disabled) May 15 00:27:50.048225 kernel: audit: type=2000 audit(1747268868.974:1): state=initialized audit_enabled=0 res=1 May 15 00:27:50.048233 kernel: thermal_sys: Registered thermal governor 'step_wise' May 15 00:27:50.048242 kernel: thermal_sys: Registered thermal governor 'user_space' May 15 00:27:50.048252 kernel: cpuidle: using governor menu May 15 00:27:50.049032 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 15 00:27:50.049047 kernel: dca service started, version 1.12.1 May 15 00:27:50.049056 kernel: PCI: Using configuration type 1 for base access May 15 00:27:50.049069 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. May 15 00:27:50.049078 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 15 00:27:50.049087 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 15 00:27:50.049096 kernel: ACPI: Added _OSI(Module Device) May 15 00:27:50.049105 kernel: ACPI: Added _OSI(Processor Device) May 15 00:27:50.049114 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 15 00:27:50.049123 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 15 00:27:50.049132 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 15 00:27:50.049141 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC May 15 00:27:50.049153 kernel: ACPI: Interpreter enabled May 15 00:27:50.049162 kernel: ACPI: PM: (supports S0 S3 S5) May 15 00:27:50.049171 kernel: ACPI: Using IOAPIC for interrupt routing May 15 00:27:50.049180 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 15 00:27:50.049189 kernel: PCI: Using E820 reservations for host bridge windows May 15 00:27:50.049198 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F May 15 00:27:50.049207 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 15 00:27:50.049757 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] May 15 00:27:50.049864 kernel: acpi PNP0A03:00: _OSC: not requesting OS control; OS requires [ExtendedConfig ASPM ClockPM MSI] May 15 00:27:50.049958 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended configuration space under this bridge May 15 00:27:50.049972 kernel: acpiphp: Slot [3] registered May 15 00:27:50.049982 kernel: acpiphp: Slot [4] registered May 15 00:27:50.049991 kernel: acpiphp: Slot [5] registered May 15 00:27:50.050000 kernel: acpiphp: Slot [6] registered May 15 00:27:50.050008 kernel: acpiphp: Slot [7] registered May 15 00:27:50.050017 kernel: acpiphp: Slot [8] registered May 15 00:27:50.050030 kernel: acpiphp: Slot [9] registered May 15 00:27:50.050039 kernel: acpiphp: Slot [10] registered May 15 00:27:50.050048 
kernel: acpiphp: Slot [11] registered May 15 00:27:50.050057 kernel: acpiphp: Slot [12] registered May 15 00:27:50.050066 kernel: acpiphp: Slot [13] registered May 15 00:27:50.050075 kernel: acpiphp: Slot [14] registered May 15 00:27:50.050084 kernel: acpiphp: Slot [15] registered May 15 00:27:50.050093 kernel: acpiphp: Slot [16] registered May 15 00:27:50.050101 kernel: acpiphp: Slot [17] registered May 15 00:27:50.050110 kernel: acpiphp: Slot [18] registered May 15 00:27:50.050121 kernel: acpiphp: Slot [19] registered May 15 00:27:50.050130 kernel: acpiphp: Slot [20] registered May 15 00:27:50.050139 kernel: acpiphp: Slot [21] registered May 15 00:27:50.050148 kernel: acpiphp: Slot [22] registered May 15 00:27:50.050157 kernel: acpiphp: Slot [23] registered May 15 00:27:50.050166 kernel: acpiphp: Slot [24] registered May 15 00:27:50.050175 kernel: acpiphp: Slot [25] registered May 15 00:27:50.050184 kernel: acpiphp: Slot [26] registered May 15 00:27:50.050193 kernel: acpiphp: Slot [27] registered May 15 00:27:50.050203 kernel: acpiphp: Slot [28] registered May 15 00:27:50.050212 kernel: acpiphp: Slot [29] registered May 15 00:27:50.050221 kernel: acpiphp: Slot [30] registered May 15 00:27:50.050230 kernel: acpiphp: Slot [31] registered May 15 00:27:50.050239 kernel: PCI host bridge to bus 0000:00 May 15 00:27:50.050416 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 15 00:27:50.050503 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] May 15 00:27:50.050603 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 15 00:27:50.050693 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 15 00:27:50.050775 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] May 15 00:27:50.050858 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 15 00:27:50.050973 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 May 15 00:27:50.051078 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 May 15 00:27:50.051182 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 May 15 00:27:50.051309 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f] May 15 00:27:50.051407 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] May 15 00:27:50.051500 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] May 15 00:27:50.051595 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] May 15 00:27:50.051689 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] May 15 00:27:50.051793 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 May 15 00:27:50.051898 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI May 15 00:27:50.051997 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB May 15 00:27:50.052100 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 May 15 00:27:50.052196 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref] May 15 00:27:50.052310 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref] May 15 00:27:50.052407 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff] May 15 00:27:50.052503 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref] May 15 00:27:50.052599 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 15 00:27:50.052713 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 May 15 00:27:50.052810 kernel: pci 
0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf] May 15 00:27:50.052904 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff] May 15 00:27:50.052999 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref] May 15 00:27:50.053094 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref] May 15 00:27:50.053197 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 May 15 00:27:50.053318 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f] May 15 00:27:50.053421 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff] May 15 00:27:50.053514 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref] May 15 00:27:50.053624 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 May 15 00:27:50.053720 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff] May 15 00:27:50.053815 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref] May 15 00:27:50.053917 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 May 15 00:27:50.054014 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f] May 15 00:27:50.054115 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff] May 15 00:27:50.054212 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref] May 15 00:27:50.054226 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 15 00:27:50.054235 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 15 00:27:50.054245 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 15 00:27:50.054254 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 15 00:27:50.054284 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 May 15 00:27:50.054294 kernel: iommu: Default domain type: Translated May 15 00:27:50.054307 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 15 00:27:50.054317 kernel: PCI: Using ACPI for IRQ routing May 15 00:27:50.054326 kernel: PCI: pci_cache_line_size set to 64 bytes May 15 00:27:50.054335 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 15 00:27:50.054344 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] May 15 00:27:50.054441 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device May 15 00:27:50.054533 kernel: pci 0000:00:02.0: vgaarb: bridge control possible May 15 00:27:50.054644 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 15 00:27:50.054659 kernel: vgaarb: loaded May 15 00:27:50.054673 kernel: clocksource: Switched to clocksource kvm-clock May 15 00:27:50.054683 kernel: VFS: Disk quotas dquot_6.6.0 May 15 00:27:50.054693 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 15 00:27:50.054703 kernel: pnp: PnP ACPI init May 15 00:27:50.054808 kernel: pnp 00:03: [dma 2] May 15 00:27:50.054824 kernel: pnp: PnP ACPI: found 5 devices May 15 00:27:50.054834 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 15 00:27:50.054844 kernel: NET: Registered PF_INET protocol family May 15 00:27:50.054857 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 15 00:27:50.054867 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 15 00:27:50.054877 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 15 00:27:50.054886 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 15 00:27:50.054896 kernel: TCP bind hash table entries: 
32768 (order: 8, 1048576 bytes, linear) May 15 00:27:50.054906 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 15 00:27:50.054916 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 15 00:27:50.054926 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 15 00:27:50.054936 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 15 00:27:50.054948 kernel: NET: Registered PF_XDP protocol family May 15 00:27:50.055038 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 15 00:27:50.055128 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 15 00:27:50.055216 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 15 00:27:50.055337 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window] May 15 00:27:50.055427 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] May 15 00:27:50.055530 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release May 15 00:27:50.055633 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers May 15 00:27:50.055653 kernel: PCI: CLS 0 bytes, default 64 May 15 00:27:50.055663 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) May 15 00:27:50.055673 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) May 15 00:27:50.055683 kernel: Initialise system trusted keyrings May 15 00:27:50.055693 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 15 00:27:50.055702 kernel: Key type asymmetric registered May 15 00:27:50.055712 kernel: Asymmetric key parser 'x509' registered May 15 00:27:50.055722 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 251) May 15 00:27:50.055731 kernel: io scheduler mq-deadline registered May 15 00:27:50.055744 kernel: io scheduler kyber registered May 15 00:27:50.055754 kernel: io scheduler bfq registered May 15 00:27:50.055764 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 May 15 00:27:50.055774 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 May 15 00:27:50.055784 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 May 15 00:27:50.055794 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 May 15 00:27:50.055804 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 May 15 00:27:50.055814 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 15 00:27:50.055824 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 15 00:27:50.055836 kernel: random: crng init done May 15 00:27:50.055846 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 15 00:27:50.055856 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 15 00:27:50.055865 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 15 00:27:50.055875 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 May 15 00:27:50.055978 kernel: rtc_cmos 00:04: RTC can wake from S4 May 15 00:27:50.056075 kernel: rtc_cmos 00:04: registered as rtc0 May 15 00:27:50.056161 kernel: rtc_cmos 00:04: setting system clock to 2025-05-15T00:27:49 UTC (1747268869) May 15 00:27:50.056252 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram May 15 00:27:50.058292 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 15 00:27:50.058306 kernel: NET: Registered PF_INET6 protocol family May 15 00:27:50.058315 kernel: Segment Routing with IPv6 May 15 00:27:50.058325 kernel: In-situ OAM (IOAM) with IPv6 May 15 00:27:50.058334 kernel: NET: Registered PF_PACKET 
protocol family May 15 00:27:50.058343 kernel: Key type dns_resolver registered May 15 00:27:50.058352 kernel: IPI shorthand broadcast: enabled May 15 00:27:50.058362 kernel: sched_clock: Marking stable (941008317, 179855813)->(1161836484, -40972354) May 15 00:27:50.058374 kernel: registered taskstats version 1 May 15 00:27:50.058384 kernel: Loading compiled-in X.509 certificates May 15 00:27:50.058393 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.6.89-flatcar: 4f9bc5b8797c7efeb1fcd74892dea83a6cb9d390' May 15 00:27:50.058402 kernel: Key type .fscrypt registered May 15 00:27:50.058412 kernel: Key type fscrypt-provisioning registered May 15 00:27:50.058421 kernel: ima: No TPM chip found, activating TPM-bypass! May 15 00:27:50.058430 kernel: ima: Allocated hash algorithm: sha1 May 15 00:27:50.058439 kernel: ima: No architecture policies found May 15 00:27:50.058450 kernel: clk: Disabling unused clocks May 15 00:27:50.058460 kernel: Freeing unused kernel image (initmem) memory: 43604K May 15 00:27:50.058469 kernel: Write protecting the kernel read-only data: 40960k May 15 00:27:50.058478 kernel: Freeing unused kernel image (rodata/data gap) memory: 1556K May 15 00:27:50.058487 kernel: Run /init as init process May 15 00:27:50.058496 kernel: with arguments: May 15 00:27:50.058506 kernel: /init May 15 00:27:50.058514 kernel: with environment: May 15 00:27:50.058523 kernel: HOME=/ May 15 00:27:50.058532 kernel: TERM=linux May 15 00:27:50.058556 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 15 00:27:50.058567 systemd[1]: Successfully made /usr/ read-only. May 15 00:27:50.058581 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 15 00:27:50.058592 systemd[1]: Detected virtualization kvm. May 15 00:27:50.058602 systemd[1]: Detected architecture x86-64. May 15 00:27:50.058612 systemd[1]: Running in initrd. May 15 00:27:50.058625 systemd[1]: No hostname configured, using default hostname. May 15 00:27:50.058635 systemd[1]: Hostname set to . May 15 00:27:50.058644 systemd[1]: Initializing machine ID from VM UUID. May 15 00:27:50.058654 systemd[1]: Queued start job for default target initrd.target. May 15 00:27:50.058664 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 00:27:50.058674 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 00:27:50.058685 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 15 00:27:50.058704 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 15 00:27:50.058716 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 15 00:27:50.058727 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 15 00:27:50.058738 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 15 00:27:50.058749 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... 
May 15 00:27:50.058759 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 00:27:50.058771 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 15 00:27:50.058782 systemd[1]: Reached target paths.target - Path Units. May 15 00:27:50.058792 systemd[1]: Reached target slices.target - Slice Units. May 15 00:27:50.058802 systemd[1]: Reached target swap.target - Swaps. May 15 00:27:50.058812 systemd[1]: Reached target timers.target - Timer Units. May 15 00:27:50.058822 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 15 00:27:50.058832 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 15 00:27:50.058843 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). May 15 00:27:50.058853 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 15 00:27:50.058865 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 15 00:27:50.058875 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 15 00:27:50.058885 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 15 00:27:50.058896 systemd[1]: Reached target sockets.target - Socket Units. May 15 00:27:50.058906 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 15 00:27:50.058916 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 15 00:27:50.058926 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 15 00:27:50.058937 systemd[1]: Starting systemd-fsck-usr.service... May 15 00:27:50.058948 systemd[1]: Starting systemd-journald.service - Journal Service... May 15 00:27:50.058959 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 15 00:27:50.058994 systemd-journald[185]: Collecting audit messages is disabled. May 15 00:27:50.059019 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 00:27:50.059033 systemd-journald[185]: Journal started May 15 00:27:50.059056 systemd-journald[185]: Runtime Journal (/run/log/journal/720e1d30371745bb93b008afd0436d0b) is 8M, max 78.2M, 70.2M free. May 15 00:27:50.072728 systemd-modules-load[187]: Inserted module 'overlay' May 15 00:27:50.083592 systemd[1]: Started systemd-journald.service - Journal Service. May 15 00:27:50.086691 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 15 00:27:50.089719 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 15 00:27:50.093408 systemd[1]: Finished systemd-fsck-usr.service. May 15 00:27:50.107433 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 15 00:27:50.108410 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 15 00:27:50.112394 kernel: Bridge firewalling registered May 15 00:27:50.109482 systemd-modules-load[187]: Inserted module 'br_netfilter' May 15 00:27:50.113973 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 15 00:27:50.115305 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 15 00:27:50.164578 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 00:27:50.165881 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
May 15 00:27:50.173475 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 15 00:27:50.176361 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 15 00:27:50.178505 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 15 00:27:50.181081 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 00:27:50.193067 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 15 00:27:50.196416 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 15 00:27:50.208084 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 00:27:50.221159 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 00:27:50.224367 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 15 00:27:50.244979 dracut-cmdline[221]: dracut-dracut-053 May 15 00:27:50.247329 dracut-cmdline[221]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack flatcar.autologin verity.usrhash=e0c956f61127e47bb23a2bdeb0592b0ff91bd857e2344d0bf321acb67c279f1a May 15 00:27:50.250430 systemd-resolved[210]: Positive Trust Anchors: May 15 00:27:50.250438 systemd-resolved[210]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 15 00:27:50.250480 systemd-resolved[210]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 15 00:27:50.256919 systemd-resolved[210]: Defaulting to hostname 'linux'. May 15 00:27:50.257801 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 15 00:27:50.258591 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 15 00:27:50.319407 kernel: SCSI subsystem initialized May 15 00:27:50.329348 kernel: Loading iSCSI transport class v2.0-870. May 15 00:27:50.341325 kernel: iscsi: registered transport (tcp) May 15 00:27:50.364386 kernel: iscsi: registered transport (qla4xxx) May 15 00:27:50.364450 kernel: QLogic iSCSI HBA Driver May 15 00:27:50.424133 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 15 00:27:50.427635 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 15 00:27:50.489400 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. 
May 15 00:27:50.489488 kernel: device-mapper: uevent: version 1.0.3 May 15 00:27:50.492412 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@redhat.com May 15 00:27:50.554315 kernel: raid6: sse2x4 gen() 5157 MB/s May 15 00:27:50.573318 kernel: raid6: sse2x2 gen() 5976 MB/s May 15 00:27:50.591800 kernel: raid6: sse2x1 gen() 9155 MB/s May 15 00:27:50.591883 kernel: raid6: using algorithm sse2x1 gen() 9155 MB/s May 15 00:27:50.610879 kernel: raid6: .... xor() 7406 MB/s, rmw enabled May 15 00:27:50.610949 kernel: raid6: using ssse3x2 recovery algorithm May 15 00:27:50.633647 kernel: xor: measuring software checksum speed May 15 00:27:50.633722 kernel: prefetch64-sse : 18241 MB/sec May 15 00:27:50.634940 kernel: generic_sse : 16652 MB/sec May 15 00:27:50.634982 kernel: xor: using function: prefetch64-sse (18241 MB/sec) May 15 00:27:50.810339 kernel: Btrfs loaded, zoned=no, fsverity=no May 15 00:27:50.828224 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 15 00:27:50.833583 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 00:27:50.857852 systemd-udevd[403]: Using default interface naming scheme 'v255'. May 15 00:27:50.862829 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 00:27:50.871017 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 15 00:27:50.897503 dracut-pre-trigger[414]: rd.md=0: removing MD RAID activation May 15 00:27:50.942504 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 15 00:27:50.945688 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 15 00:27:51.033607 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 15 00:27:51.040804 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 15 00:27:51.075159 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 15 00:27:51.077328 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 15 00:27:51.079635 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 00:27:51.081932 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 15 00:27:51.087673 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 15 00:27:51.114670 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 15 00:27:51.122736 kernel: virtio_blk virtio2: 2/0/0 default/read/poll queues May 15 00:27:51.128392 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB) May 15 00:27:51.139531 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 15 00:27:51.139558 kernel: GPT:17805311 != 20971519 May 15 00:27:51.140456 kernel: GPT:Alternate GPT header not at the end of the disk. May 15 00:27:51.142472 kernel: GPT:17805311 != 20971519 May 15 00:27:51.142495 kernel: GPT: Use GNU Parted to correct GPT errors. May 15 00:27:51.144797 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 15 00:27:51.170439 kernel: libata version 3.00 loaded. May 15 00:27:51.172158 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 15 00:27:51.172329 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 00:27:51.174447 systemd[1]: Stopping dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
May 15 00:27:51.175199 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 00:27:51.175415 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 15 00:27:51.178215 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 15 00:27:51.181501 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 00:27:51.187888 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. May 15 00:27:51.193678 kernel: ata_piix 0000:00:01.1: version 2.13 May 15 00:27:51.197285 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by (udev-worker) (449) May 15 00:27:51.209314 kernel: scsi host0: ata_piix May 15 00:27:51.225698 kernel: BTRFS: device fsid 267fa270-7a71-43aa-9209-0280512688b5 devid 1 transid 41 /dev/vda3 scanned by (udev-worker) (460) May 15 00:27:51.225748 kernel: scsi host1: ata_piix May 15 00:27:51.225892 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 May 15 00:27:51.225907 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 May 15 00:27:51.238202 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 15 00:27:51.271762 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 00:27:51.285806 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 15 00:27:51.305532 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. May 15 00:27:51.314103 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 15 00:27:51.314700 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 15 00:27:51.319378 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 15 00:27:51.322426 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 15 00:27:51.334778 disk-uuid[506]: Primary Header is updated. May 15 00:27:51.334778 disk-uuid[506]: Secondary Entries is updated. May 15 00:27:51.334778 disk-uuid[506]: Secondary Header is updated. May 15 00:27:51.343164 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 00:27:51.346125 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 15 00:27:52.366508 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 15 00:27:52.368713 disk-uuid[511]: The operation has completed successfully. May 15 00:27:52.452923 systemd[1]: disk-uuid.service: Deactivated successfully. May 15 00:27:52.453055 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 15 00:27:52.498747 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 15 00:27:52.518658 sh[527]: Success May 15 00:27:52.540293 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3" May 15 00:27:52.616103 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 15 00:27:52.619353 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 15 00:27:52.631165 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 15 00:27:52.642326 kernel: BTRFS info (device dm-0): first mount of filesystem 267fa270-7a71-43aa-9209-0280512688b5 May 15 00:27:52.646587 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 15 00:27:52.646650 kernel: BTRFS warning (device dm-0): 'nologreplay' is deprecated, use 'rescue=nologreplay' instead May 15 00:27:52.646682 kernel: BTRFS info (device dm-0): disabling log replay at mount time May 15 00:27:52.648187 kernel: BTRFS info (device dm-0): using free space tree May 15 00:27:52.666313 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 15 00:27:52.668190 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 15 00:27:52.671431 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 15 00:27:52.676490 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 15 00:27:52.711651 kernel: BTRFS info (device vda6): first mount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 15 00:27:52.711717 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 15 00:27:52.711730 kernel: BTRFS info (device vda6): using free space tree May 15 00:27:52.719924 kernel: BTRFS info (device vda6): auto enabling async discard May 15 00:27:52.725323 kernel: BTRFS info (device vda6): last unmount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 15 00:27:52.739695 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 15 00:27:52.743539 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... May 15 00:27:52.826341 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 15 00:27:52.829326 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 15 00:27:52.893141 systemd-networkd[707]: lo: Link UP May 15 00:27:52.893885 systemd-networkd[707]: lo: Gained carrier May 15 00:27:52.895115 systemd-networkd[707]: Enumeration completed May 15 00:27:52.895337 systemd[1]: Started systemd-networkd.service - Network Configuration. May 15 00:27:52.896202 systemd-networkd[707]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 00:27:52.896206 systemd-networkd[707]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 00:27:52.897185 systemd[1]: Reached target network.target - Network. May 15 00:27:52.901144 ignition[626]: Ignition 2.20.0 May 15 00:27:52.897572 systemd-networkd[707]: eth0: Link UP May 15 00:27:52.901157 ignition[626]: Stage: fetch-offline May 15 00:27:52.897576 systemd-networkd[707]: eth0: Gained carrier May 15 00:27:52.901201 ignition[626]: no configs at "/usr/lib/ignition/base.d" May 15 00:27:52.897583 systemd-networkd[707]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 00:27:52.901212 ignition[626]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 15 00:27:52.902638 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 15 00:27:52.901348 ignition[626]: parsed url from cmdline: "" May 15 00:27:52.905474 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
May 15 00:27:52.901356 ignition[626]: no config URL provided May 15 00:27:52.906318 systemd-networkd[707]: eth0: DHCPv4 address 172.24.4.125/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 15 00:27:52.901362 ignition[626]: reading system config file "/usr/lib/ignition/user.ign" May 15 00:27:52.901371 ignition[626]: no config at "/usr/lib/ignition/user.ign" May 15 00:27:52.901377 ignition[626]: failed to fetch config: resource requires networking May 15 00:27:52.901614 ignition[626]: Ignition finished successfully May 15 00:27:52.927538 ignition[715]: Ignition 2.20.0 May 15 00:27:52.928644 ignition[715]: Stage: fetch May 15 00:27:52.928955 ignition[715]: no configs at "/usr/lib/ignition/base.d" May 15 00:27:52.928969 ignition[715]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 15 00:27:52.929074 ignition[715]: parsed url from cmdline: "" May 15 00:27:52.929079 ignition[715]: no config URL provided May 15 00:27:52.929085 ignition[715]: reading system config file "/usr/lib/ignition/user.ign" May 15 00:27:52.929094 ignition[715]: no config at "/usr/lib/ignition/user.ign" May 15 00:27:52.929235 ignition[715]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 May 15 00:27:52.929496 ignition[715]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... May 15 00:27:52.929521 ignition[715]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... May 15 00:27:53.201509 ignition[715]: GET result: OK May 15 00:27:53.201691 ignition[715]: parsing config with SHA512: cf5bc0ef5c2918334d54a20650f0bf14333c5232b694c2174b4286af56b92774846bd944d3627dbee41360a55c0882b41fcf7f7dace00e0fe6a85d5db4467d83 May 15 00:27:53.211547 unknown[715]: fetched base config from "system" May 15 00:27:53.211571 unknown[715]: fetched base config from "system" May 15 00:27:53.212449 ignition[715]: fetch: fetch complete May 15 00:27:53.211586 unknown[715]: fetched user config from "openstack" May 15 00:27:53.212460 ignition[715]: fetch: fetch passed May 15 00:27:53.215742 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 15 00:27:53.212544 ignition[715]: Ignition finished successfully May 15 00:27:53.221554 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 15 00:27:53.267458 ignition[723]: Ignition 2.20.0 May 15 00:27:53.267476 ignition[723]: Stage: kargs May 15 00:27:53.267865 ignition[723]: no configs at "/usr/lib/ignition/base.d" May 15 00:27:53.267892 ignition[723]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 15 00:27:53.272461 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 15 00:27:53.270185 ignition[723]: kargs: kargs passed May 15 00:27:53.270349 ignition[723]: Ignition finished successfully May 15 00:27:53.279418 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 15 00:27:53.315106 ignition[729]: Ignition 2.20.0 May 15 00:27:53.315132 ignition[729]: Stage: disks May 15 00:27:53.315573 ignition[729]: no configs at "/usr/lib/ignition/base.d" May 15 00:27:53.315599 ignition[729]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 15 00:27:53.317655 ignition[729]: disks: disks passed May 15 00:27:53.320835 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 15 00:27:53.317750 ignition[729]: Ignition finished successfully May 15 00:27:53.324616 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 15 00:27:53.326321 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
May 15 00:27:53.328824 systemd[1]: Reached target local-fs.target - Local File Systems. May 15 00:27:53.331136 systemd[1]: Reached target sysinit.target - System Initialization. May 15 00:27:53.333910 systemd[1]: Reached target basic.target - Basic System. May 15 00:27:53.339523 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 15 00:27:53.386684 systemd-fsck[738]: ROOT: clean, 14/1628000 files, 120691/1617920 blocks May 15 00:27:53.398313 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 15 00:27:53.402615 systemd[1]: Mounting sysroot.mount - /sysroot... May 15 00:27:53.569309 kernel: EXT4-fs (vda9): mounted filesystem 81735587-bac5-4d9e-ae49-5642e655af7f r/w with ordered data mode. Quota mode: none. May 15 00:27:53.569789 systemd[1]: Mounted sysroot.mount - /sysroot. May 15 00:27:53.571449 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 15 00:27:53.574099 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 15 00:27:53.576356 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 15 00:27:53.577592 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. May 15 00:27:53.582318 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... May 15 00:27:53.584139 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 15 00:27:53.585181 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 15 00:27:53.590115 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 15 00:27:53.595396 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 15 00:27:53.605576 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (746) May 15 00:27:53.617274 kernel: BTRFS info (device vda6): first mount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 15 00:27:53.617299 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 15 00:27:53.617317 kernel: BTRFS info (device vda6): using free space tree May 15 00:27:53.632280 kernel: BTRFS info (device vda6): auto enabling async discard May 15 00:27:53.638442 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 15 00:27:53.721944 initrd-setup-root[774]: cut: /sysroot/etc/passwd: No such file or directory May 15 00:27:53.729581 initrd-setup-root[781]: cut: /sysroot/etc/group: No such file or directory May 15 00:27:53.736703 initrd-setup-root[788]: cut: /sysroot/etc/shadow: No such file or directory May 15 00:27:53.744371 initrd-setup-root[795]: cut: /sysroot/etc/gshadow: No such file or directory May 15 00:27:53.847445 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 15 00:27:53.850856 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 15 00:27:53.854444 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 15 00:27:53.865796 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
May 15 00:27:53.867389 kernel: BTRFS info (device vda6): last unmount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 15 00:27:53.894301 ignition[862]: INFO : Ignition 2.20.0 May 15 00:27:53.894301 ignition[862]: INFO : Stage: mount May 15 00:27:53.894301 ignition[862]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 00:27:53.894301 ignition[862]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 15 00:27:53.902353 ignition[862]: INFO : mount: mount passed May 15 00:27:53.902353 ignition[862]: INFO : Ignition finished successfully May 15 00:27:53.895834 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 15 00:27:53.907477 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 15 00:27:54.378653 systemd-networkd[707]: eth0: Gained IPv6LL May 15 00:28:00.773785 coreos-metadata[748]: May 15 00:28:00.773 WARN failed to locate config-drive, using the metadata service API instead May 15 00:28:00.815393 coreos-metadata[748]: May 15 00:28:00.815 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 15 00:28:00.835504 coreos-metadata[748]: May 15 00:28:00.835 INFO Fetch successful May 15 00:28:00.838714 coreos-metadata[748]: May 15 00:28:00.837 INFO wrote hostname ci-4284-0-0-n-019843d4bb.novalocal to /sysroot/etc/hostname May 15 00:28:00.841769 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. May 15 00:28:00.842013 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. May 15 00:28:00.850334 systemd[1]: Starting ignition-files.service - Ignition (files)... May 15 00:28:00.875954 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 15 00:28:00.908406 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/vda6 scanned by mount (880) May 15 00:28:00.917473 kernel: BTRFS info (device vda6): first mount of filesystem 4c949817-d4f4-485b-8019-80887ee5206f May 15 00:28:00.917560 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 15 00:28:00.921616 kernel: BTRFS info (device vda6): using free space tree May 15 00:28:00.932355 kernel: BTRFS info (device vda6): auto enabling async discard May 15 00:28:00.938213 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 15 00:28:00.987718 ignition[898]: INFO : Ignition 2.20.0 May 15 00:28:00.987718 ignition[898]: INFO : Stage: files May 15 00:28:00.990670 ignition[898]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 00:28:00.990670 ignition[898]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 15 00:28:00.990670 ignition[898]: DEBUG : files: compiled without relabeling support, skipping May 15 00:28:00.996122 ignition[898]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 15 00:28:00.996122 ignition[898]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 15 00:28:01.000475 ignition[898]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 15 00:28:01.000475 ignition[898]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 15 00:28:01.004710 ignition[898]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 15 00:28:01.004710 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" May 15 00:28:01.004710 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 May 15 00:28:01.000624 unknown[898]: wrote ssh authorized keys file for user: core May 15 00:28:01.092692 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 15 00:28:01.697850 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" May 15 00:28:01.697850 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link 
"/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" May 15 00:28:01.701179 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" May 15 00:28:01.711829 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.32.0-x86-64.raw: attempt #1 May 15 00:28:02.409399 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 15 00:28:04.814623 ignition[898]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.0-x86-64.raw" May 15 00:28:04.814623 ignition[898]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 15 00:28:04.821456 ignition[898]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 15 00:28:04.822831 ignition[898]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 15 00:28:04.824742 ignition[898]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 15 00:28:04.824742 ignition[898]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 15 00:28:04.824742 ignition[898]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 15 00:28:04.824742 ignition[898]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 15 00:28:04.824742 ignition[898]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 15 00:28:04.824742 ignition[898]: INFO : files: files passed May 15 00:28:04.838685 ignition[898]: INFO : Ignition finished successfully May 15 00:28:04.826143 systemd[1]: Finished ignition-files.service - Ignition (files). May 15 00:28:04.833851 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 15 00:28:04.836317 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 15 00:28:04.857491 systemd[1]: ignition-quench.service: Deactivated successfully. May 15 00:28:04.857608 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 15 00:28:04.861981 initrd-setup-root-after-ignition[928]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 15 00:28:04.861981 initrd-setup-root-after-ignition[928]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 15 00:28:04.868163 initrd-setup-root-after-ignition[932]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 15 00:28:04.870986 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. May 15 00:28:04.872057 systemd[1]: Reached target ignition-complete.target - Ignition Complete. May 15 00:28:04.874986 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... May 15 00:28:04.931028 systemd[1]: initrd-parse-etc.service: Deactivated successfully. May 15 00:28:04.931154 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. May 15 00:28:04.932088 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
May 15 00:28:04.933836 systemd[1]: Reached target initrd.target - Initrd Default Target. May 15 00:28:04.945596 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. May 15 00:28:04.948425 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... May 15 00:28:04.973842 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 15 00:28:04.978676 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... May 15 00:28:05.008790 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. May 15 00:28:05.010426 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 00:28:05.012515 systemd[1]: Stopped target timers.target - Timer Units. May 15 00:28:05.014379 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. May 15 00:28:05.014794 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. May 15 00:28:05.017375 systemd[1]: Stopped target initrd.target - Initrd Default Target. May 15 00:28:05.019402 systemd[1]: Stopped target basic.target - Basic System. May 15 00:28:05.021254 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. May 15 00:28:05.023427 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. May 15 00:28:05.025619 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. May 15 00:28:05.027783 systemd[1]: Stopped target remote-fs.target - Remote File Systems. May 15 00:28:05.029840 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. May 15 00:28:05.031896 systemd[1]: Stopped target sysinit.target - System Initialization. May 15 00:28:05.032992 systemd[1]: Stopped target local-fs.target - Local File Systems. May 15 00:28:05.034574 systemd[1]: Stopped target swap.target - Swaps. May 15 00:28:05.035651 systemd[1]: dracut-pre-mount.service: Deactivated successfully. May 15 00:28:05.035805 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. May 15 00:28:05.036958 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. May 15 00:28:05.037726 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 00:28:05.038825 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. May 15 00:28:05.038928 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 00:28:05.039972 systemd[1]: dracut-initqueue.service: Deactivated successfully. May 15 00:28:05.040125 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. May 15 00:28:05.041411 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. May 15 00:28:05.041577 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. May 15 00:28:05.042969 systemd[1]: ignition-files.service: Deactivated successfully. May 15 00:28:05.043079 systemd[1]: Stopped ignition-files.service - Ignition (files). May 15 00:28:05.046458 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... May 15 00:28:05.049490 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... May 15 00:28:05.050024 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. May 15 00:28:05.050193 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. May 15 00:28:05.053523 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. 
May 15 00:28:05.053646 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. May 15 00:28:05.059671 systemd[1]: initrd-cleanup.service: Deactivated successfully. May 15 00:28:05.061307 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. May 15 00:28:05.073093 ignition[952]: INFO : Ignition 2.20.0 May 15 00:28:05.073093 ignition[952]: INFO : Stage: umount May 15 00:28:05.073093 ignition[952]: INFO : no configs at "/usr/lib/ignition/base.d" May 15 00:28:05.073093 ignition[952]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" May 15 00:28:05.073093 ignition[952]: INFO : umount: umount passed May 15 00:28:05.073093 ignition[952]: INFO : Ignition finished successfully May 15 00:28:05.075343 systemd[1]: ignition-mount.service: Deactivated successfully. May 15 00:28:05.075449 systemd[1]: Stopped ignition-mount.service - Ignition (mount). May 15 00:28:05.080127 systemd[1]: sysroot-boot.mount: Deactivated successfully. May 15 00:28:05.081599 systemd[1]: sysroot-boot.service: Deactivated successfully. May 15 00:28:05.081689 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. May 15 00:28:05.083088 systemd[1]: ignition-disks.service: Deactivated successfully. May 15 00:28:05.083155 systemd[1]: Stopped ignition-disks.service - Ignition (disks). May 15 00:28:05.084431 systemd[1]: ignition-kargs.service: Deactivated successfully. May 15 00:28:05.084476 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). May 15 00:28:05.085403 systemd[1]: ignition-fetch.service: Deactivated successfully. May 15 00:28:05.085445 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). May 15 00:28:05.086377 systemd[1]: Stopped target network.target - Network. May 15 00:28:05.087295 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. May 15 00:28:05.087340 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). May 15 00:28:05.088366 systemd[1]: Stopped target paths.target - Path Units. May 15 00:28:05.089321 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. May 15 00:28:05.089366 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 00:28:05.090393 systemd[1]: Stopped target slices.target - Slice Units. May 15 00:28:05.091526 systemd[1]: Stopped target sockets.target - Socket Units. May 15 00:28:05.092696 systemd[1]: iscsid.socket: Deactivated successfully. May 15 00:28:05.092734 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. May 15 00:28:05.093845 systemd[1]: iscsiuio.socket: Deactivated successfully. May 15 00:28:05.093878 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 15 00:28:05.094797 systemd[1]: ignition-setup.service: Deactivated successfully. May 15 00:28:05.094841 systemd[1]: Stopped ignition-setup.service - Ignition (setup). May 15 00:28:05.095919 systemd[1]: ignition-setup-pre.service: Deactivated successfully. May 15 00:28:05.095959 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. May 15 00:28:05.097155 systemd[1]: initrd-setup-root.service: Deactivated successfully. May 15 00:28:05.097196 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. May 15 00:28:05.098249 systemd[1]: Stopping systemd-networkd.service - Network Configuration... May 15 00:28:05.099345 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... May 15 00:28:05.101898 systemd[1]: systemd-resolved.service: Deactivated successfully. 
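
The Ignition "files" and "umount" stages above are driven entirely by the config the platform hands to the machine. As a minimal sketch of how such a config is authored (this is not the config used on this node: the ssh key, the unit body, and the Type/ExecStart lines are placeholder assumptions, while the helm URL, the target paths, and the prepare-helm.service name come from the log), a Butane file transpiled with the butane tool would look roughly like this:

# Hypothetical Butane sketch; transpile to Ignition JSON before use.
cat > provision.bu <<'EOF'
variant: flatcar
version: 1.0.0
passwd:
  users:
    - name: core
      ssh_authorized_keys:
        - ssh-ed25519 AAAA...placeholder
storage:
  files:
    - path: /opt/helm-v3.17.0-linux-amd64.tar.gz
      contents:
        source: https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz
systemd:
  units:
    - name: prepare-helm.service
      enabled: true
      contents: |
        [Unit]
        Description=Unpack helm to /opt/bin
        [Service]
        Type=oneshot
        ExecStartPre=/usr/bin/mkdir -p /opt/bin
        ExecStart=/usr/bin/tar -C /opt/bin --strip-components=1 -xzf /opt/helm-v3.17.0-linux-amd64.tar.gz linux-amd64/helm
        [Install]
        WantedBy=multi-user.target
EOF
butane --pretty --strict provision.bu > provision.ign

Whatever config was actually supplied, its outcome is summarized in /etc/.ignition-result.json as written above, and the stage output can be retrieved later with journalctl -t ignition.
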
May 15 00:28:05.101998 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. May 15 00:28:05.105514 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. May 15 00:28:05.105903 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. May 15 00:28:05.105980 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 00:28:05.108214 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. May 15 00:28:05.110588 systemd[1]: systemd-networkd.service: Deactivated successfully. May 15 00:28:05.110825 systemd[1]: Stopped systemd-networkd.service - Network Configuration. May 15 00:28:05.112985 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. May 15 00:28:05.113156 systemd[1]: systemd-networkd.socket: Deactivated successfully. May 15 00:28:05.113190 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. May 15 00:28:05.115357 systemd[1]: Stopping network-cleanup.service - Network Cleanup... May 15 00:28:05.119755 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. May 15 00:28:05.119814 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 15 00:28:05.121020 systemd[1]: systemd-sysctl.service: Deactivated successfully. May 15 00:28:05.121063 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. May 15 00:28:05.122089 systemd[1]: systemd-modules-load.service: Deactivated successfully. May 15 00:28:05.122131 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. May 15 00:28:05.123172 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 00:28:05.126470 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. May 15 00:28:05.130665 systemd[1]: systemd-udevd.service: Deactivated successfully. May 15 00:28:05.130815 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 00:28:05.132467 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. May 15 00:28:05.132525 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. May 15 00:28:05.133407 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. May 15 00:28:05.133440 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. May 15 00:28:05.134332 systemd[1]: dracut-pre-udev.service: Deactivated successfully. May 15 00:28:05.134377 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. May 15 00:28:05.136019 systemd[1]: dracut-cmdline.service: Deactivated successfully. May 15 00:28:05.136062 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. May 15 00:28:05.138293 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. May 15 00:28:05.138339 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 15 00:28:05.140393 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... May 15 00:28:05.141915 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. May 15 00:28:05.141963 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 00:28:05.144008 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. 
May 15 00:28:05.144053 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 15 00:28:05.145625 systemd[1]: kmod-static-nodes.service: Deactivated successfully. May 15 00:28:05.145670 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. May 15 00:28:05.147232 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 15 00:28:05.147323 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 15 00:28:05.153533 systemd[1]: network-cleanup.service: Deactivated successfully. May 15 00:28:05.153618 systemd[1]: Stopped network-cleanup.service - Network Cleanup. May 15 00:28:05.157663 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. May 15 00:28:05.157764 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. May 15 00:28:05.159163 systemd[1]: Reached target initrd-switch-root.target - Switch Root. May 15 00:28:05.160997 systemd[1]: Starting initrd-switch-root.service - Switch Root... May 15 00:28:05.178197 systemd[1]: Switching root. May 15 00:28:05.213391 systemd-journald[185]: Journal stopped May 15 00:28:06.721472 systemd-journald[185]: Received SIGTERM from PID 1 (systemd). May 15 00:28:06.721554 kernel: SELinux: policy capability network_peer_controls=1 May 15 00:28:06.721573 kernel: SELinux: policy capability open_perms=1 May 15 00:28:06.721590 kernel: SELinux: policy capability extended_socket_class=1 May 15 00:28:06.721601 kernel: SELinux: policy capability always_check_network=0 May 15 00:28:06.721613 kernel: SELinux: policy capability cgroup_seclabel=1 May 15 00:28:06.721627 kernel: SELinux: policy capability nnp_nosuid_transition=1 May 15 00:28:06.721639 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 May 15 00:28:06.721650 kernel: SELinux: policy capability ioctl_skip_cloexec=0 May 15 00:28:06.721661 kernel: audit: type=1403 audit(1747268885.615:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 May 15 00:28:06.721673 systemd[1]: Successfully loaded SELinux policy in 69.311ms. May 15 00:28:06.721693 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 23.643ms. May 15 00:28:06.721706 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 15 00:28:06.721721 systemd[1]: Detected virtualization kvm. May 15 00:28:06.721733 systemd[1]: Detected architecture x86-64. May 15 00:28:06.721745 systemd[1]: Detected first boot. May 15 00:28:06.721758 systemd[1]: Hostname set to . May 15 00:28:06.721770 systemd[1]: Initializing machine ID from VM UUID. May 15 00:28:06.721782 zram_generator::config[998]: No configuration found. May 15 00:28:06.721795 kernel: Guest personality initialized and is inactive May 15 00:28:06.721807 kernel: VMCI host device registered (name=vmci, major=10, minor=125) May 15 00:28:06.721821 kernel: Initialized host personality May 15 00:28:06.721832 kernel: NET: Registered PF_VSOCK protocol family May 15 00:28:06.721843 systemd[1]: Populated /etc with preset unit settings. May 15 00:28:06.721858 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. May 15 00:28:06.721871 systemd[1]: initrd-switch-root.service: Deactivated successfully. 
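
After the switch to the real root, journald is restarted, the SELinux policy is loaded (69.311ms here), and systemd 256.8 detects a KVM first boot. A few standard commands, assuming the usual systemd and SELinux userland is present, reproduce this information on the running host:

journalctl --list-boots     # boots recorded in the journal, including this first boot
journalctl -b -p warning    # warnings and errors from the current boot only
getenforce                  # SELinux mode corresponding to the policy load logged above
systemd-analyze time        # time spent in the kernel, initrd, and userspace
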
May 15 00:28:06.721883 systemd[1]: Stopped initrd-switch-root.service - Switch Root. May 15 00:28:06.721896 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. May 15 00:28:06.721908 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. May 15 00:28:06.721923 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. May 15 00:28:06.721936 systemd[1]: Created slice system-getty.slice - Slice /system/getty. May 15 00:28:06.721948 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. May 15 00:28:06.721961 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. May 15 00:28:06.721974 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. May 15 00:28:06.721986 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. May 15 00:28:06.721998 systemd[1]: Created slice user.slice - User and Session Slice. May 15 00:28:06.722011 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 15 00:28:06.722023 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 15 00:28:06.722037 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. May 15 00:28:06.722050 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. May 15 00:28:06.722062 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. May 15 00:28:06.722079 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 15 00:28:06.722092 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0... May 15 00:28:06.722108 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 15 00:28:06.722123 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. May 15 00:28:06.722136 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. May 15 00:28:06.722149 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. May 15 00:28:06.722161 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. May 15 00:28:06.722173 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 15 00:28:06.722185 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 15 00:28:06.722198 systemd[1]: Reached target slices.target - Slice Units. May 15 00:28:06.722210 systemd[1]: Reached target swap.target - Swaps. May 15 00:28:06.722222 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. May 15 00:28:06.722236 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. May 15 00:28:06.722249 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. May 15 00:28:06.723854 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 15 00:28:06.723873 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 15 00:28:06.723886 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 15 00:28:06.723899 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. May 15 00:28:06.723912 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... 
May 15 00:28:06.723924 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... May 15 00:28:06.723936 systemd[1]: Mounting media.mount - External Media Directory... May 15 00:28:06.723952 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 00:28:06.723965 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... May 15 00:28:06.723978 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... May 15 00:28:06.723991 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... May 15 00:28:06.724003 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). May 15 00:28:06.724015 systemd[1]: Reached target machines.target - Containers. May 15 00:28:06.724028 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... May 15 00:28:06.724040 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 00:28:06.724052 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 15 00:28:06.724069 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... May 15 00:28:06.724082 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 15 00:28:06.724094 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 15 00:28:06.724107 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 15 00:28:06.724119 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... May 15 00:28:06.724132 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 15 00:28:06.724144 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). May 15 00:28:06.724156 systemd[1]: systemd-fsck-root.service: Deactivated successfully. May 15 00:28:06.724171 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. May 15 00:28:06.724183 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. May 15 00:28:06.724195 systemd[1]: Stopped systemd-fsck-usr.service. May 15 00:28:06.724208 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 00:28:06.724226 systemd[1]: Starting systemd-journald.service - Journal Service... May 15 00:28:06.724239 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 15 00:28:06.724251 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 15 00:28:06.724278 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... May 15 00:28:06.724292 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... May 15 00:28:06.724308 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 15 00:28:06.724320 systemd[1]: verity-setup.service: Deactivated successfully. May 15 00:28:06.724333 systemd[1]: Stopped verity-setup.service. 
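
The modprobe@*.service units started above are instances of a single template whose only job is to run modprobe on the instance name, which is why each module (configfs, dm_mod, drm, efi_pstore, fuse, loop) appears as its own start/finish pair. The same mechanism can be exercised by hand (generic sketch, nothing node-specific):

systemctl cat modprobe@fuse.service     # the template: ExecStart runs modprobe on %i
systemctl start modprobe@fuse.service   # equivalent in effect to: modprobe fuse
lsmod | grep -E 'fuse|dm_mod|loop'      # confirm the modules referenced in the log are loaded
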
May 15 00:28:06.724346 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 00:28:06.724358 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. May 15 00:28:06.724370 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. May 15 00:28:06.724385 systemd[1]: Mounted media.mount - External Media Directory. May 15 00:28:06.724397 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. May 15 00:28:06.724409 kernel: fuse: init (API version 7.39) May 15 00:28:06.724421 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. May 15 00:28:06.724436 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. May 15 00:28:06.724448 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 15 00:28:06.724461 systemd[1]: modprobe@configfs.service: Deactivated successfully. May 15 00:28:06.724473 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. May 15 00:28:06.724485 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 15 00:28:06.724498 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 15 00:28:06.724511 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 15 00:28:06.724522 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 15 00:28:06.724536 systemd[1]: modprobe@fuse.service: Deactivated successfully. May 15 00:28:06.724549 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. May 15 00:28:06.724562 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 15 00:28:06.724574 kernel: loop: module loaded May 15 00:28:06.724586 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. May 15 00:28:06.724619 systemd-journald[1081]: Collecting audit messages is disabled. May 15 00:28:06.724647 systemd[1]: modprobe@loop.service: Deactivated successfully. May 15 00:28:06.724660 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 15 00:28:06.724675 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 15 00:28:06.724689 systemd-journald[1081]: Journal started May 15 00:28:06.724713 systemd-journald[1081]: Runtime Journal (/run/log/journal/720e1d30371745bb93b008afd0436d0b) is 8M, max 78.2M, 70.2M free. May 15 00:28:06.381915 systemd[1]: Queued start job for default target multi-user.target. May 15 00:28:06.390331 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6. May 15 00:28:06.390769 systemd[1]: systemd-journald.service: Deactivated successfully. May 15 00:28:06.728731 systemd[1]: Started systemd-journald.service - Journal Service. May 15 00:28:06.742664 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. May 15 00:28:06.751699 systemd[1]: Reached target network-pre.target - Preparation for Network. May 15 00:28:06.757369 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... May 15 00:28:06.785411 kernel: ACPI: bus type drm_connector registered May 15 00:28:06.780361 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... May 15 00:28:06.783772 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). 
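
The journald lines report the volatile runtime journal in /run (8M used, 78.2M cap on this host); once systemd-journal-flush.service runs a little later in this boot, entries move to the persistent journal under /var/log/journal. Sizing and location can be checked directly with generic journalctl commands:

journalctl --disk-usage      # space used by all archived and active journal files
journalctl --header | head   # metadata of the active journal file, including its limits
ls /var/log/journal/         # one directory per machine ID once persistent storage is in use
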
May 15 00:28:06.783810 systemd[1]: Reached target local-fs.target - Local File Systems. May 15 00:28:06.786567 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. May 15 00:28:06.791495 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 15 00:28:06.797438 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... May 15 00:28:06.798101 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 00:28:06.800799 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... May 15 00:28:06.803381 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... May 15 00:28:06.803947 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 15 00:28:06.810897 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... May 15 00:28:06.811972 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 15 00:28:06.813846 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 15 00:28:06.823517 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... May 15 00:28:06.826831 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 15 00:28:06.831320 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. May 15 00:28:06.832585 systemd[1]: modprobe@drm.service: Deactivated successfully. May 15 00:28:06.833538 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 15 00:28:06.834322 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. May 15 00:28:06.836408 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. May 15 00:28:06.837183 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 15 00:28:06.856397 systemd-journald[1081]: Time spent on flushing to /var/log/journal/720e1d30371745bb93b008afd0436d0b is 70.810ms for 957 entries. May 15 00:28:06.856397 systemd-journald[1081]: System Journal (/var/log/journal/720e1d30371745bb93b008afd0436d0b) is 8M, max 584.8M, 576.8M free. May 15 00:28:06.959286 systemd-journald[1081]: Received client request to flush runtime journal. May 15 00:28:06.959339 kernel: loop0: detected capacity change from 0 to 151640 May 15 00:28:06.866334 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 15 00:28:06.870694 systemd[1]: Starting systemd-udev-settle.service - Wait for udev To Complete Device Initialization... May 15 00:28:06.880920 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. May 15 00:28:06.882533 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. May 15 00:28:06.900594 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... May 15 00:28:06.901625 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 15 00:28:06.926585 systemd-tmpfiles[1136]: ACLs are not supported, ignoring. May 15 00:28:06.926599 systemd-tmpfiles[1136]: ACLs are not supported, ignoring. May 15 00:28:06.928521 udevadm[1144]: systemd-udev-settle.service is deprecated. 
Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. May 15 00:28:06.938438 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. May 15 00:28:06.940448 systemd[1]: Starting systemd-sysusers.service - Create System Users... May 15 00:28:06.962298 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. May 15 00:28:06.999371 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. May 15 00:28:07.014531 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher May 15 00:28:07.038299 kernel: loop1: detected capacity change from 0 to 218376 May 15 00:28:07.042681 systemd[1]: Finished systemd-sysusers.service - Create System Users. May 15 00:28:07.046289 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 15 00:28:07.089644 systemd-tmpfiles[1160]: ACLs are not supported, ignoring. May 15 00:28:07.089667 systemd-tmpfiles[1160]: ACLs are not supported, ignoring. May 15 00:28:07.099239 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 15 00:28:07.102830 kernel: loop2: detected capacity change from 0 to 109808 May 15 00:28:07.168293 kernel: loop3: detected capacity change from 0 to 8 May 15 00:28:07.199420 kernel: loop4: detected capacity change from 0 to 151640 May 15 00:28:07.274342 kernel: loop5: detected capacity change from 0 to 218376 May 15 00:28:07.350282 kernel: loop6: detected capacity change from 0 to 109808 May 15 00:28:07.393394 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. May 15 00:28:07.416297 kernel: loop7: detected capacity change from 0 to 8 May 15 00:28:07.417430 (sd-merge)[1166]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-openstack'. May 15 00:28:07.417967 (sd-merge)[1166]: Merged extensions into '/usr'. May 15 00:28:07.426385 systemd[1]: Reload requested from client PID 1135 ('systemd-sysext') (unit systemd-sysext.service)... May 15 00:28:07.426521 systemd[1]: Reloading... May 15 00:28:07.532324 zram_generator::config[1190]: No configuration found. May 15 00:28:07.799164 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 00:28:07.869307 ldconfig[1130]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. May 15 00:28:07.882113 systemd[1]: Reloading finished in 455 ms. May 15 00:28:07.897583 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. May 15 00:28:07.898639 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. May 15 00:28:07.901332 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. May 15 00:28:07.915561 systemd[1]: Starting ensure-sysext.service... May 15 00:28:07.919395 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 15 00:28:07.922222 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 15 00:28:07.950690 systemd[1]: Reload requested from client PID 1251 ('systemctl') (unit ensure-sysext.service)... May 15 00:28:07.950707 systemd[1]: Reloading... May 15 00:28:07.959758 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
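
The (sd-merge) lines are systemd-sysext overlaying the extension images named there (containerd-flatcar, docker-flatcar, kubernetes, oem-openstack) onto /usr, which is what triggers the daemon reload and the ensure-sysext follow-up. The merge can be inspected or redone at runtime with the standard systemd-sysext commands:

systemd-sysext status    # which extension images are merged and over which hierarchies
ls -l /etc/extensions/   # symlinks such as kubernetes.raw selecting the images to merge
systemd-sysext refresh   # unmerge and re-merge after adding or removing an image
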
May 15 00:28:07.960395 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. May 15 00:28:07.963064 systemd-tmpfiles[1253]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. May 15 00:28:07.963994 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. May 15 00:28:07.964362 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. May 15 00:28:07.978224 systemd-tmpfiles[1253]: Detected autofs mount point /boot during canonicalization of boot. May 15 00:28:07.978246 systemd-tmpfiles[1253]: Skipping /boot May 15 00:28:07.987770 systemd-udevd[1254]: Using default interface naming scheme 'v255'. May 15 00:28:08.000285 systemd-tmpfiles[1253]: Detected autofs mount point /boot during canonicalization of boot. May 15 00:28:08.000297 systemd-tmpfiles[1253]: Skipping /boot May 15 00:28:08.078324 zram_generator::config[1304]: No configuration found. May 15 00:28:08.210328 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 41 scanned by (udev-worker) (1278) May 15 00:28:08.240322 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 May 15 00:28:08.254294 kernel: ACPI: button: Power Button [PWRF] May 15 00:28:08.292294 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 May 15 00:28:08.305582 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 15 00:28:08.345959 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 00:28:08.380424 kernel: mousedev: PS/2 mouse device common for all mice May 15 00:28:08.389395 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0 May 15 00:28:08.389450 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console May 15 00:28:08.393907 kernel: Console: switching to colour dummy device 80x25 May 15 00:28:08.396908 kernel: [drm] features: -virgl +edid -resource_blob -host_visible May 15 00:28:08.396950 kernel: [drm] features: -context_init May 15 00:28:08.400292 kernel: [drm] number of scanouts: 1 May 15 00:28:08.400334 kernel: [drm] number of cap sets: 0 May 15 00:28:08.404289 kernel: [drm] Initialized virtio_gpu 0.1.0 0 for 0000:00:02.0 on minor 0 May 15 00:28:08.410298 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device May 15 00:28:08.416788 kernel: Console: switching to colour frame buffer device 160x50 May 15 00:28:08.425312 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device May 15 00:28:08.458632 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 15 00:28:08.461102 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped. May 15 00:28:08.461573 systemd[1]: Reloading finished in 510 ms. May 15 00:28:08.474904 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 15 00:28:08.481000 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 15 00:28:08.527818 systemd[1]: Finished systemd-udev-settle.service - Wait for udev To Complete Device Initialization. May 15 00:28:08.534591 systemd[1]: Finished ensure-sysext.service. May 15 00:28:08.545763 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). 
May 15 00:28:08.547873 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 15 00:28:08.563256 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... May 15 00:28:08.563832 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 15 00:28:08.566468 systemd[1]: Starting lvm2-activation-early.service - Activation of LVM2 logical volumes... May 15 00:28:08.574671 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 15 00:28:08.590239 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 15 00:28:08.595656 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 15 00:28:08.615647 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 15 00:28:08.616749 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 15 00:28:08.621667 lvm[1375]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 15 00:28:08.622120 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 15 00:28:08.622282 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 15 00:28:08.628591 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... May 15 00:28:08.634464 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 15 00:28:08.644012 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 15 00:28:08.653470 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... May 15 00:28:08.667821 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... May 15 00:28:08.676545 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 15 00:28:08.677385 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 15 00:28:08.678392 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 15 00:28:08.678947 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 15 00:28:08.681410 systemd[1]: modprobe@drm.service: Deactivated successfully. May 15 00:28:08.681673 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. May 15 00:28:08.683143 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 15 00:28:08.683489 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 15 00:28:08.685461 systemd[1]: modprobe@loop.service: Deactivated successfully. May 15 00:28:08.686212 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 15 00:28:08.695868 systemd[1]: Finished lvm2-activation-early.service - Activation of LVM2 logical volumes. May 15 00:28:08.704028 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 15 00:28:08.714467 systemd[1]: Starting lvm2-activation.service - Activation of LVM2 logical volumes... 
May 15 00:28:08.715256 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). May 15 00:28:08.718079 augenrules[1413]: No rules May 15 00:28:08.715350 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. May 15 00:28:08.720745 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 15 00:28:08.723073 systemd[1]: audit-rules.service: Deactivated successfully. May 15 00:28:08.724842 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 15 00:28:08.729196 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 15 00:28:08.745714 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 15 00:28:08.753807 lvm[1411]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. May 15 00:28:08.757341 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 15 00:28:08.764111 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 15 00:28:08.794763 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 15 00:28:08.798914 systemd[1]: Finished lvm2-activation.service - Activation of LVM2 logical volumes. May 15 00:28:08.808231 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 15 00:28:08.851569 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 15 00:28:08.856939 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). May 15 00:28:08.909151 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 15 00:28:08.933647 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 15 00:28:08.936519 systemd[1]: Reached target time-set.target - System Time Set. May 15 00:28:08.936694 systemd-resolved[1399]: Positive Trust Anchors: May 15 00:28:08.936989 systemd-resolved[1399]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 15 00:28:08.937035 systemd-resolved[1399]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 15 00:28:08.947296 systemd-networkd[1398]: lo: Link UP May 15 00:28:08.947588 systemd-networkd[1398]: lo: Gained carrier May 15 00:28:08.949474 systemd-resolved[1399]: Using system hostname 'ci-4284-0-0-n-019843d4bb.novalocal'. May 15 00:28:08.950574 systemd-networkd[1398]: Enumeration completed May 15 00:28:08.951438 systemd[1]: Started systemd-networkd.service - Network Configuration. May 15 00:28:08.951779 systemd-networkd[1398]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
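
systemd-resolved is now up with the root DNSSEC trust anchor and the negative trust anchors listed above, and has taken the hostname ci-4284-0-0-n-019843d4bb.novalocal from the system. Its runtime state is queryable with resolvectl (standard commands):

resolvectl status              # per-link DNS servers, DNSSEC state, search domains
resolvectl query example.com   # resolve a name through the local stub resolver
hostnamectl                    # the hostname resolved picked up for this node
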
May 15 00:28:08.951784 systemd-networkd[1398]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 15 00:28:08.955920 systemd-networkd[1398]: eth0: Link UP May 15 00:28:08.956037 systemd-networkd[1398]: eth0: Gained carrier May 15 00:28:08.956113 systemd-networkd[1398]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 15 00:28:08.957064 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 15 00:28:08.961187 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 15 00:28:08.962031 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 15 00:28:08.965277 systemd[1]: Reached target network.target - Network. May 15 00:28:08.965924 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 15 00:28:08.969114 systemd[1]: Reached target sysinit.target - System Initialization. May 15 00:28:08.971218 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 15 00:28:08.973004 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 15 00:28:08.974909 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 15 00:28:08.976488 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 15 00:28:08.978914 systemd-networkd[1398]: eth0: DHCPv4 address 172.24.4.125/24, gateway 172.24.4.1 acquired from 172.24.4.1 May 15 00:28:08.979404 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 15 00:28:08.982229 systemd-timesyncd[1401]: Network configuration changed, trying to establish connection. May 15 00:28:08.983384 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 15 00:28:08.983433 systemd[1]: Reached target paths.target - Path Units. May 15 00:28:08.985135 systemd[1]: Reached target timers.target - Timer Units. May 15 00:28:08.993245 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 15 00:28:09.000327 systemd[1]: Starting docker.socket - Docker Socket for the API... May 15 00:28:09.005296 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 15 00:28:09.008605 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 15 00:28:09.010165 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 15 00:28:09.023844 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 15 00:28:09.025040 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 15 00:28:09.029133 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 15 00:28:09.029967 systemd[1]: Reached target sockets.target - Socket Units. May 15 00:28:09.030700 systemd[1]: Reached target basic.target - Basic System. May 15 00:28:09.031506 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. May 15 00:28:09.031534 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 15 00:28:09.038290 systemd[1]: Starting containerd.service - containerd container runtime... 
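
eth0 is matched by Flatcar's catch-all zz-default.network and receives 172.24.4.125/24 with gateway 172.24.4.1 over DHCP, after which timesyncd re-evaluates its time sources. The same state can be read back at runtime with standard networkctl and timedatectl commands:

networkctl status eth0                            # carrier, DHCP lease, gateway, DNS for the link
cat /usr/lib/systemd/network/zz-default.network   # the catch-all match that configured eth0
timedatectl timesync-status                       # NTP server systemd-timesyncd settled on
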
May 15 00:28:09.043958 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... May 15 00:28:09.054512 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 15 00:28:09.062382 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 15 00:28:09.066957 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 15 00:28:09.069233 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 15 00:28:09.072901 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 15 00:28:09.075298 jq[1451]: false May 15 00:28:09.083517 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 15 00:28:09.089053 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 15 00:28:09.096974 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... May 15 00:28:09.105233 systemd[1]: Starting systemd-logind.service - User Login Management... May 15 00:28:09.108231 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 15 00:28:09.114621 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 15 00:28:09.119726 systemd[1]: Starting update-engine.service - Update Engine... May 15 00:28:09.125074 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 15 00:28:09.133322 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 15 00:28:09.139163 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. May 15 00:28:09.139391 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 15 00:28:09.142328 extend-filesystems[1452]: Found loop4 May 15 00:28:09.142328 extend-filesystems[1452]: Found loop5 May 15 00:28:09.142328 extend-filesystems[1452]: Found loop6 May 15 00:28:09.142328 extend-filesystems[1452]: Found loop7 May 15 00:28:09.142328 extend-filesystems[1452]: Found vda May 15 00:28:09.142328 extend-filesystems[1452]: Found vda1 May 15 00:28:09.142328 extend-filesystems[1452]: Found vda2 May 15 00:28:09.142328 extend-filesystems[1452]: Found vda3 May 15 00:28:09.142328 extend-filesystems[1452]: Found usr May 15 00:28:09.263496 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks May 15 00:28:09.263528 kernel: EXT4-fs (vda9): resized filesystem to 2014203 May 15 00:28:09.272666 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 41 scanned by (udev-worker) (1298) May 15 00:28:09.222793 dbus-daemon[1448]: [system] SELinux support is enabled May 15 00:28:09.154776 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 15 00:28:09.273162 extend-filesystems[1452]: Found vda4 May 15 00:28:09.273162 extend-filesystems[1452]: Found vda6 May 15 00:28:09.273162 extend-filesystems[1452]: Found vda7 May 15 00:28:09.273162 extend-filesystems[1452]: Found vda9 May 15 00:28:09.273162 extend-filesystems[1452]: Checking size of /dev/vda9 May 15 00:28:09.273162 extend-filesystems[1452]: Resized partition /dev/vda9 May 15 00:28:09.154995 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
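
extend-filesystems.service grows the root filesystem to fill its partition; the kernel lines above show ext4 on /dev/vda9 going from 1617920 to 2014203 4 KiB blocks (2014203 x 4 KiB, roughly 7.7 GiB). Done by hand, the equivalent online resize is roughly this (sketch; the device name is the one in this log):

lsblk /dev/vda        # check that partition 9 spans the enlarged disk
resize2fs /dev/vda9   # grow the mounted ext4 root filesystem online
df -h /               # confirm the new size is visible to userspace
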
May 15 00:28:09.326988 update_engine[1461]: I20250515 00:28:09.204783 1461 main.cc:92] Flatcar Update Engine starting May 15 00:28:09.326988 update_engine[1461]: I20250515 00:28:09.244498 1461 update_check_scheduler.cc:74] Next update check in 7m6s May 15 00:28:09.327328 extend-filesystems[1481]: resize2fs 1.47.2 (1-Jan-2025) May 15 00:28:09.327328 extend-filesystems[1481]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required May 15 00:28:09.327328 extend-filesystems[1481]: old_desc_blocks = 1, new_desc_blocks = 1 May 15 00:28:09.327328 extend-filesystems[1481]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long. May 15 00:28:09.166671 systemd[1]: motdgen.service: Deactivated successfully. May 15 00:28:09.341254 jq[1462]: true May 15 00:28:09.348755 extend-filesystems[1452]: Resized filesystem in /dev/vda9 May 15 00:28:09.166895 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 15 00:28:09.223494 systemd[1]: Started dbus.service - D-Bus System Message Bus. May 15 00:28:09.366650 tar[1470]: linux-amd64/LICENSE May 15 00:28:09.366650 tar[1470]: linux-amd64/helm May 15 00:28:09.267888 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 15 00:28:09.267923 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 15 00:28:09.272039 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 15 00:28:09.370438 jq[1474]: true May 15 00:28:09.272062 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 15 00:28:09.289753 systemd[1]: extend-filesystems.service: Deactivated successfully. May 15 00:28:09.290939 (ntainerd)[1483]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 15 00:28:09.291982 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. May 15 00:28:09.317880 systemd[1]: Started update-engine.service - Update Engine. May 15 00:28:09.351960 systemd-logind[1458]: New seat seat0. May 15 00:28:09.359422 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 15 00:28:09.377951 systemd-logind[1458]: Watching system buttons on /dev/input/event1 (Power Button) May 15 00:28:09.377971 systemd-logind[1458]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) May 15 00:28:09.381213 systemd[1]: Started systemd-logind.service - User Login Management. May 15 00:28:09.444835 bash[1507]: Updated "/home/core/.ssh/authorized_keys" May 15 00:28:09.445427 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 15 00:28:09.455014 systemd[1]: Starting sshkeys.service... May 15 00:28:09.502456 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. May 15 00:28:09.507365 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... May 15 00:28:09.642852 locksmithd[1490]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 15 00:28:09.792832 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
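
update_engine schedules its first update check (7m6s out) and locksmithd starts with the reboot strategy, which is how Flatcar coordinates automatic updates and reboots. Both ship small status CLIs, shown generically here; exact flags may vary by release:

update_engine_client -status   # current update state, version, and channel
locksmithctl status            # reboot strategy and any held reboot locks
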
May 15 00:28:09.883399 containerd[1483]: time="2025-05-15T00:28:09Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 May 15 00:28:09.883684 containerd[1483]: time="2025-05-15T00:28:09.883524775Z" level=info msg="starting containerd" revision=88aa2f531d6c2922003cc7929e51daf1c14caa0a version=v2.0.1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.900687774Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="6.812µs" May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.900722590Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.900743769Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.900904210Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.900927975Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.900956478Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.901024125Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.901038913Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.901288611Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.901307256Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.901319319Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 May 15 00:28:09.902117 containerd[1483]: time="2025-05-15T00:28:09.901329107Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 May 15 00:28:09.902446 containerd[1483]: time="2025-05-15T00:28:09.901406442Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 May 15 00:28:09.902446 containerd[1483]: time="2025-05-15T00:28:09.901599524Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 May 15 00:28:09.902446 containerd[1483]: time="2025-05-15T00:28:09.901633789Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs 
type=io.containerd.snapshotter.v1 May 15 00:28:09.902446 containerd[1483]: time="2025-05-15T00:28:09.901649588Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 May 15 00:28:09.902446 containerd[1483]: time="2025-05-15T00:28:09.901677841Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 May 15 00:28:09.902446 containerd[1483]: time="2025-05-15T00:28:09.901887515Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 May 15 00:28:09.902446 containerd[1483]: time="2025-05-15T00:28:09.901947617Z" level=info msg="metadata content store policy set" policy=shared May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918589088Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918667746Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918686401Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918700337Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918713602Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918726987Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918742316Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918756001Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918768124Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918788783Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918800735Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918814451Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918955546Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 May 15 00:28:09.920276 containerd[1483]: time="2025-05-15T00:28:09.918978689Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.918993407Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919014657Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff 
type=io.containerd.grpc.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919027481Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919040475Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919058168Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919069930Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919082774Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919094556Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919108803Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919167132Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919182271Z" level=info msg="Start snapshots syncer" May 15 00:28:09.921105 containerd[1483]: time="2025-05-15T00:28:09.919218359Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 May 15 00:28:09.922217 containerd[1483]: time="2025-05-15T00:28:09.921686357Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 15 00:28:09.922412 containerd[1483]: time="2025-05-15T00:28:09.922394195Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 15 00:28:09.922815 containerd[1483]: time="2025-05-15T00:28:09.922595863Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 15 00:28:09.922815 containerd[1483]: time="2025-05-15T00:28:09.922748740Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 15 00:28:09.922815 containerd[1483]: time="2025-05-15T00:28:09.922774719Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 15 00:28:09.922815 containerd[1483]: time="2025-05-15T00:28:09.922789647Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 15 00:28:09.922975 containerd[1483]: time="2025-05-15T00:28:09.922958744Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 15 00:28:09.923056 containerd[1483]: time="2025-05-15T00:28:09.923041689Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 15 00:28:09.923361 containerd[1483]: time="2025-05-15T00:28:09.923291277Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 15 00:28:09.923361 containerd[1483]: time="2025-05-15T00:28:09.923311205Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 15 00:28:09.923361 containerd[1483]: time="2025-05-15T00:28:09.923336863Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 15 00:28:09.925270 containerd[1483]: 
time="2025-05-15T00:28:09.923460124Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 15 00:28:09.925270 containerd[1483]: time="2025-05-15T00:28:09.923479180Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 15 00:28:09.925270 containerd[1483]: time="2025-05-15T00:28:09.923514045Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 15 00:28:09.925422 containerd[1483]: time="2025-05-15T00:28:09.925402056Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 15 00:28:09.925496 containerd[1483]: time="2025-05-15T00:28:09.925482006Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 15 00:28:09.925574 containerd[1483]: time="2025-05-15T00:28:09.925541177Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 15 00:28:09.925652 containerd[1483]: time="2025-05-15T00:28:09.925620566Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 15 00:28:09.925726 containerd[1483]: time="2025-05-15T00:28:09.925697841Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 15 00:28:09.925797 containerd[1483]: time="2025-05-15T00:28:09.925781177Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 15 00:28:09.925878 containerd[1483]: time="2025-05-15T00:28:09.925865676Z" level=info msg="runtime interface created" May 15 00:28:09.925939 containerd[1483]: time="2025-05-15T00:28:09.925917112Z" level=info msg="created NRI interface" May 15 00:28:09.926007 containerd[1483]: time="2025-05-15T00:28:09.925991401Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 15 00:28:09.926083 containerd[1483]: time="2025-05-15T00:28:09.926070410Z" level=info msg="Connect containerd service" May 15 00:28:09.926194 containerd[1483]: time="2025-05-15T00:28:09.926178562Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 15 00:28:09.927161 containerd[1483]: time="2025-05-15T00:28:09.927141669Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 15 00:28:10.105558 sshd_keygen[1473]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 May 15 00:28:10.160788 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 15 00:28:10.168049 systemd[1]: Starting issuegen.service - Generate /run/issue... May 15 00:28:10.174861 systemd[1]: Started sshd@0-172.24.4.125:22-172.24.4.1:43196.service - OpenSSH per-connection server daemon (172.24.4.1:43196). May 15 00:28:10.182909 containerd[1483]: time="2025-05-15T00:28:10.182817990Z" level=info msg="Start subscribing containerd event" May 15 00:28:10.182909 containerd[1483]: time="2025-05-15T00:28:10.182869186Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 15 00:28:10.183026 containerd[1483]: time="2025-05-15T00:28:10.182944056Z" level=info msg=serving... 
address=/run/containerd/containerd.sock May 15 00:28:10.183026 containerd[1483]: time="2025-05-15T00:28:10.182873734Z" level=info msg="Start recovering state" May 15 00:28:10.183074 containerd[1483]: time="2025-05-15T00:28:10.183045927Z" level=info msg="Start event monitor" May 15 00:28:10.183074 containerd[1483]: time="2025-05-15T00:28:10.183065093Z" level=info msg="Start cni network conf syncer for default" May 15 00:28:10.183127 containerd[1483]: time="2025-05-15T00:28:10.183076745Z" level=info msg="Start streaming server" May 15 00:28:10.183127 containerd[1483]: time="2025-05-15T00:28:10.183087705Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 15 00:28:10.183127 containerd[1483]: time="2025-05-15T00:28:10.183097213Z" level=info msg="runtime interface starting up..." May 15 00:28:10.183127 containerd[1483]: time="2025-05-15T00:28:10.183104938Z" level=info msg="starting plugins..." May 15 00:28:10.183127 containerd[1483]: time="2025-05-15T00:28:10.183125536Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 15 00:28:10.189305 containerd[1483]: time="2025-05-15T00:28:10.183248748Z" level=info msg="containerd successfully booted in 0.301046s" May 15 00:28:10.183333 systemd[1]: Started containerd.service - containerd container runtime. May 15 00:28:10.191619 tar[1470]: linux-amd64/README.md May 15 00:28:10.204728 systemd[1]: issuegen.service: Deactivated successfully. May 15 00:28:10.204992 systemd[1]: Finished issuegen.service - Generate /run/issue. May 15 00:28:10.215401 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... May 15 00:28:10.219660 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 15 00:28:10.231103 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 15 00:28:10.237593 systemd[1]: Started getty@tty1.service - Getty on tty1. May 15 00:28:10.243078 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 15 00:28:10.248436 systemd[1]: Reached target getty.target - Login Prompts. May 15 00:28:10.762716 systemd-networkd[1398]: eth0: Gained IPv6LL May 15 00:28:10.764575 systemd-timesyncd[1401]: Network configuration changed, trying to establish connection. May 15 00:28:10.770478 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 15 00:28:10.779392 systemd[1]: Reached target network-online.target - Network is Online. May 15 00:28:10.787839 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:28:10.797983 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 15 00:28:10.860260 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 15 00:28:11.034951 sshd[1546]: Accepted publickey for core from 172.24.4.1 port 43196 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:11.040695 sshd-session[1546]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:11.077542 systemd-logind[1458]: New session 1 of user core. May 15 00:28:11.079818 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 15 00:28:11.090048 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 15 00:28:11.121681 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 15 00:28:11.129053 systemd[1]: Starting user@500.service - User Manager for UID 500... 
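When containerd brought up its cri plugin above it could not load any CNI network config ("no network config found in /etc/cni/net.d"); that is the normal first-boot state, and the "Start cni network conf syncer for default" line shows the plugin will pick up a config as soon as one appears in the confDir it printed in its config dump (/etc/cni/net.d, with binaries under /opt/cni/bin). Purely as a minimal sketch, with a made-up network name and subnet that are not taken from this host, a conflist that would satisfy that loader could look like:

    /etc/cni/net.d/10-containerd-net.conflist (illustrative values only)
    {
      "cniVersion": "1.0.0",
      "name": "containerd-net",
      "plugins": [
        {
          "type": "bridge",
          "bridge": "cni0",
          "isGateway": true,
          "ipMasq": true,
          "ipam": {
            "type": "host-local",
            "ranges": [[{"subnet": "10.88.0.0/16"}]],
            "routes": [{"dst": "0.0.0.0/0"}]
          }
        },
        {"type": "portmap", "capabilities": {"portMappings": true}}
      ]
    }

Since the dumped cri config has maxConfNum=1, only the first config file found in that directory would be used.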
May 15 00:28:11.144922 (systemd)[1573]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 15 00:28:11.150797 systemd-logind[1458]: New session c1 of user core. May 15 00:28:11.320691 systemd[1573]: Queued start job for default target default.target. May 15 00:28:11.328360 systemd[1573]: Created slice app.slice - User Application Slice. May 15 00:28:11.328387 systemd[1573]: Reached target paths.target - Paths. May 15 00:28:11.328500 systemd[1573]: Reached target timers.target - Timers. May 15 00:28:11.332354 systemd[1573]: Starting dbus.socket - D-Bus User Message Bus Socket... May 15 00:28:11.341941 systemd[1573]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 15 00:28:11.342060 systemd[1573]: Reached target sockets.target - Sockets. May 15 00:28:11.342102 systemd[1573]: Reached target basic.target - Basic System. May 15 00:28:11.342138 systemd[1573]: Reached target default.target - Main User Target. May 15 00:28:11.342164 systemd[1573]: Startup finished in 182ms. May 15 00:28:11.342294 systemd[1]: Started user@500.service - User Manager for UID 500. May 15 00:28:11.351519 systemd[1]: Started session-1.scope - Session 1 of User core. May 15 00:28:11.769974 systemd[1]: Started sshd@1-172.24.4.125:22-172.24.4.1:43204.service - OpenSSH per-connection server daemon (172.24.4.1:43204). May 15 00:28:12.577871 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:28:12.591216 (kubelet)[1592]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 00:28:13.596316 sshd[1584]: Accepted publickey for core from 172.24.4.1 port 43204 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:13.599625 sshd-session[1584]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:13.612067 systemd-logind[1458]: New session 2 of user core. May 15 00:28:13.620712 systemd[1]: Started session-2.scope - Session 2 of User core. May 15 00:28:13.899249 kubelet[1592]: E0515 00:28:13.899045 1592 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 00:28:13.904228 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 00:28:13.904622 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 00:28:13.906056 systemd[1]: kubelet.service: Consumed 2.018s CPU time, 256.4M memory peak. May 15 00:28:14.280342 sshd[1598]: Connection closed by 172.24.4.1 port 43204 May 15 00:28:14.280054 sshd-session[1584]: pam_unix(sshd:session): session closed for user core May 15 00:28:14.299192 systemd[1]: sshd@1-172.24.4.125:22-172.24.4.1:43204.service: Deactivated successfully. May 15 00:28:14.302627 systemd[1]: session-2.scope: Deactivated successfully. May 15 00:28:14.306702 systemd-logind[1458]: Session 2 logged out. Waiting for processes to exit. May 15 00:28:14.309416 systemd[1]: Started sshd@2-172.24.4.125:22-172.24.4.1:34306.service - OpenSSH per-connection server daemon (172.24.4.1:34306). May 15 00:28:14.318035 systemd-logind[1458]: Removed session 2. 
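The kubelet exit a few entries up is the expected pre-bootstrap state on this image: /var/lib/kubelet/config.yaml is normally written later by kubeadm during init or join, and until it exists every restart attempt fails with the same "no such file or directory". Purely as a sketch, with field values assumed rather than taken from this node, a minimal KubeletConfiguration consistent with the systemd cgroup driver and static pod path that appear later in this log would be:

    # /var/lib/kubelet/config.yaml (illustrative; normally generated by kubeadm)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: systemd
    staticPodPath: /etc/kubernetes/manifests
    rotateCertificates: true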
May 15 00:28:15.318230 login[1556]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 15 00:28:15.324897 login[1555]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 15 00:28:15.331352 systemd-logind[1458]: New session 3 of user core. May 15 00:28:15.338071 systemd[1]: Started session-3.scope - Session 3 of User core. May 15 00:28:15.347363 systemd-logind[1458]: New session 4 of user core. May 15 00:28:15.355774 systemd[1]: Started session-4.scope - Session 4 of User core. May 15 00:28:15.580365 sshd[1605]: Accepted publickey for core from 172.24.4.1 port 34306 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:15.583425 sshd-session[1605]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:15.593134 systemd-logind[1458]: New session 5 of user core. May 15 00:28:15.602699 systemd[1]: Started session-5.scope - Session 5 of User core. May 15 00:28:16.147079 coreos-metadata[1447]: May 15 00:28:16.146 WARN failed to locate config-drive, using the metadata service API instead May 15 00:28:16.196015 coreos-metadata[1447]: May 15 00:28:16.195 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 May 15 00:28:16.220990 sshd[1634]: Connection closed by 172.24.4.1 port 34306 May 15 00:28:16.220765 sshd-session[1605]: pam_unix(sshd:session): session closed for user core May 15 00:28:16.227883 systemd[1]: sshd@2-172.24.4.125:22-172.24.4.1:34306.service: Deactivated successfully. May 15 00:28:16.232036 systemd[1]: session-5.scope: Deactivated successfully. May 15 00:28:16.236327 systemd-logind[1458]: Session 5 logged out. Waiting for processes to exit. May 15 00:28:16.238874 systemd-logind[1458]: Removed session 5. May 15 00:28:16.484107 coreos-metadata[1447]: May 15 00:28:16.483 INFO Fetch successful May 15 00:28:16.484107 coreos-metadata[1447]: May 15 00:28:16.484 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 May 15 00:28:16.498708 coreos-metadata[1447]: May 15 00:28:16.498 INFO Fetch successful May 15 00:28:16.498819 coreos-metadata[1447]: May 15 00:28:16.498 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 May 15 00:28:16.513070 coreos-metadata[1447]: May 15 00:28:16.512 INFO Fetch successful May 15 00:28:16.513216 coreos-metadata[1447]: May 15 00:28:16.513 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 May 15 00:28:16.528252 coreos-metadata[1447]: May 15 00:28:16.528 INFO Fetch successful May 15 00:28:16.528394 coreos-metadata[1447]: May 15 00:28:16.528 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 May 15 00:28:16.542855 coreos-metadata[1447]: May 15 00:28:16.542 INFO Fetch successful May 15 00:28:16.542963 coreos-metadata[1447]: May 15 00:28:16.542 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 May 15 00:28:16.556902 coreos-metadata[1447]: May 15 00:28:16.556 INFO Fetch successful May 15 00:28:16.605565 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. May 15 00:28:16.606997 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
May 15 00:28:16.635672 coreos-metadata[1510]: May 15 00:28:16.635 WARN failed to locate config-drive, using the metadata service API instead May 15 00:28:16.677480 coreos-metadata[1510]: May 15 00:28:16.677 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 May 15 00:28:16.694023 coreos-metadata[1510]: May 15 00:28:16.693 INFO Fetch successful May 15 00:28:16.694133 coreos-metadata[1510]: May 15 00:28:16.694 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 May 15 00:28:16.710694 coreos-metadata[1510]: May 15 00:28:16.710 INFO Fetch successful May 15 00:28:16.715331 unknown[1510]: wrote ssh authorized keys file for user: core May 15 00:28:16.760817 update-ssh-keys[1649]: Updated "/home/core/.ssh/authorized_keys" May 15 00:28:16.761841 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). May 15 00:28:16.764764 systemd[1]: Finished sshkeys.service. May 15 00:28:16.770100 systemd[1]: Reached target multi-user.target - Multi-User System. May 15 00:28:16.771627 systemd[1]: Startup finished in 1.180s (kernel) + 15.784s (initrd) + 11.223s (userspace) = 28.188s. May 15 00:28:24.125373 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 15 00:28:24.128936 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:28:24.525208 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:28:24.539813 (kubelet)[1660]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 00:28:24.667417 kubelet[1660]: E0515 00:28:24.667191 1660 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 00:28:24.671961 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 00:28:24.672339 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 00:28:24.673180 systemd[1]: kubelet.service: Consumed 319ms CPU time, 103.4M memory peak. May 15 00:28:26.241594 systemd[1]: Started sshd@3-172.24.4.125:22-172.24.4.1:55502.service - OpenSSH per-connection server daemon (172.24.4.1:55502). May 15 00:28:27.541527 sshd[1668]: Accepted publickey for core from 172.24.4.1 port 55502 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:27.544094 sshd-session[1668]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:27.556915 systemd-logind[1458]: New session 6 of user core. May 15 00:28:27.563584 systemd[1]: Started session-6.scope - Session 6 of User core. May 15 00:28:28.292109 sshd[1670]: Connection closed by 172.24.4.1 port 55502 May 15 00:28:28.293156 sshd-session[1668]: pam_unix(sshd:session): session closed for user core May 15 00:28:28.309230 systemd[1]: sshd@3-172.24.4.125:22-172.24.4.1:55502.service: Deactivated successfully. May 15 00:28:28.312593 systemd[1]: session-6.scope: Deactivated successfully. May 15 00:28:28.316662 systemd-logind[1458]: Session 6 logged out. Waiting for processes to exit. May 15 00:28:28.318828 systemd[1]: Started sshd@4-172.24.4.125:22-172.24.4.1:55518.service - OpenSSH per-connection server daemon (172.24.4.1:55518). 
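Both coreos-metadata units warn that no config-drive is present and fall back to the OpenStack/EC2-compatible metadata API at 169.254.169.254; one fills in host metadata, the other writes the fetched public key into /home/core/.ssh/authorized_keys. The same endpoints can be queried by hand when debugging metadata problems, for example (illustrative check only):

    curl -s http://169.254.169.254/latest/meta-data/hostname
    curl -s http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key
    curl -s http://169.254.169.254/openstack/2012-08-10/meta_data.json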
May 15 00:28:28.322011 systemd-logind[1458]: Removed session 6. May 15 00:28:29.633484 sshd[1675]: Accepted publickey for core from 172.24.4.1 port 55518 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:29.635863 sshd-session[1675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:29.648907 systemd-logind[1458]: New session 7 of user core. May 15 00:28:29.655634 systemd[1]: Started session-7.scope - Session 7 of User core. May 15 00:28:30.495064 sshd[1678]: Connection closed by 172.24.4.1 port 55518 May 15 00:28:30.496003 sshd-session[1675]: pam_unix(sshd:session): session closed for user core May 15 00:28:30.512675 systemd[1]: sshd@4-172.24.4.125:22-172.24.4.1:55518.service: Deactivated successfully. May 15 00:28:30.515855 systemd[1]: session-7.scope: Deactivated successfully. May 15 00:28:30.519586 systemd-logind[1458]: Session 7 logged out. Waiting for processes to exit. May 15 00:28:30.523860 systemd[1]: Started sshd@5-172.24.4.125:22-172.24.4.1:55534.service - OpenSSH per-connection server daemon (172.24.4.1:55534). May 15 00:28:30.526866 systemd-logind[1458]: Removed session 7. May 15 00:28:31.970168 sshd[1683]: Accepted publickey for core from 172.24.4.1 port 55534 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:31.972723 sshd-session[1683]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:31.983463 systemd-logind[1458]: New session 8 of user core. May 15 00:28:31.992567 systemd[1]: Started session-8.scope - Session 8 of User core. May 15 00:28:32.736596 sshd[1686]: Connection closed by 172.24.4.1 port 55534 May 15 00:28:32.737233 sshd-session[1683]: pam_unix(sshd:session): session closed for user core May 15 00:28:32.749749 systemd[1]: sshd@5-172.24.4.125:22-172.24.4.1:55534.service: Deactivated successfully. May 15 00:28:32.753535 systemd[1]: session-8.scope: Deactivated successfully. May 15 00:28:32.757682 systemd-logind[1458]: Session 8 logged out. Waiting for processes to exit. May 15 00:28:32.759909 systemd[1]: Started sshd@6-172.24.4.125:22-172.24.4.1:55536.service - OpenSSH per-connection server daemon (172.24.4.1:55536). May 15 00:28:32.762922 systemd-logind[1458]: Removed session 8. May 15 00:28:34.127313 sshd[1691]: Accepted publickey for core from 172.24.4.1 port 55536 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:34.129844 sshd-session[1691]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:34.141910 systemd-logind[1458]: New session 9 of user core. May 15 00:28:34.151606 systemd[1]: Started session-9.scope - Session 9 of User core. May 15 00:28:34.709946 sudo[1695]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 15 00:28:34.710687 sudo[1695]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 00:28:34.712889 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. May 15 00:28:34.717619 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:28:34.727622 sudo[1695]: pam_unix(sudo:session): session closed for user root May 15 00:28:34.993094 sshd[1694]: Connection closed by 172.24.4.1 port 55536 May 15 00:28:34.989572 sshd-session[1691]: pam_unix(sshd:session): session closed for user core May 15 00:28:35.014016 systemd[1]: sshd@6-172.24.4.125:22-172.24.4.1:55536.service: Deactivated successfully. 
May 15 00:28:35.021635 systemd[1]: session-9.scope: Deactivated successfully. May 15 00:28:35.029227 systemd-logind[1458]: Session 9 logged out. Waiting for processes to exit. May 15 00:28:35.033897 systemd-logind[1458]: Removed session 9. May 15 00:28:35.035839 systemd[1]: Started sshd@7-172.24.4.125:22-172.24.4.1:53428.service - OpenSSH per-connection server daemon (172.24.4.1:53428). May 15 00:28:35.057610 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:28:35.077000 (kubelet)[1710]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 00:28:35.136494 kubelet[1710]: E0515 00:28:35.136414 1710 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 00:28:35.138847 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 00:28:35.139012 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 00:28:35.139338 systemd[1]: kubelet.service: Consumed 264ms CPU time, 106.1M memory peak. May 15 00:28:36.242891 sshd[1705]: Accepted publickey for core from 172.24.4.1 port 53428 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:36.245767 sshd-session[1705]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:36.257372 systemd-logind[1458]: New session 10 of user core. May 15 00:28:36.260579 systemd[1]: Started session-10.scope - Session 10 of User core. May 15 00:28:36.778613 sudo[1720]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 15 00:28:36.779224 sudo[1720]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 00:28:36.786359 sudo[1720]: pam_unix(sudo:session): session closed for user root May 15 00:28:36.798017 sudo[1719]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 15 00:28:36.798730 sudo[1719]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 00:28:36.820337 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 15 00:28:36.892770 augenrules[1742]: No rules May 15 00:28:36.895608 systemd[1]: audit-rules.service: Deactivated successfully. May 15 00:28:36.896125 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 15 00:28:36.898052 sudo[1719]: pam_unix(sudo:session): session closed for user root May 15 00:28:37.154305 sshd[1718]: Connection closed by 172.24.4.1 port 53428 May 15 00:28:37.154962 sshd-session[1705]: pam_unix(sshd:session): session closed for user core May 15 00:28:37.171099 systemd[1]: sshd@7-172.24.4.125:22-172.24.4.1:53428.service: Deactivated successfully. May 15 00:28:37.174748 systemd[1]: session-10.scope: Deactivated successfully. May 15 00:28:37.178609 systemd-logind[1458]: Session 10 logged out. Waiting for processes to exit. May 15 00:28:37.181790 systemd[1]: Started sshd@8-172.24.4.125:22-172.24.4.1:53434.service - OpenSSH per-connection server daemon (172.24.4.1:53434). May 15 00:28:37.185053 systemd-logind[1458]: Removed session 10. 
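The augenrules "No rules" message follows directly from the two sudo commands just before it: with 80-selinux.rules and 99-default.rules removed, /etc/audit/rules.d/ is empty, so restarting audit-rules simply loads an empty compiled rule set. As a hypothetical example only (the watch path and key are made up for illustration), any *.rules fragment dropped back into that directory would be merged in on the next restart:

    # /etc/audit/rules.d/10-example.rules (hypothetical watch rule)
    -w /etc/kubernetes/ -p wa -k kube-config

    systemctl restart audit-rules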
May 15 00:28:38.451568 sshd[1750]: Accepted publickey for core from 172.24.4.1 port 53434 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:28:38.454325 sshd-session[1750]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:28:38.466367 systemd-logind[1458]: New session 11 of user core. May 15 00:28:38.480564 systemd[1]: Started session-11.scope - Session 11 of User core. May 15 00:28:38.910942 sudo[1754]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 15 00:28:38.911630 sudo[1754]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 15 00:28:39.662745 systemd[1]: Starting docker.service - Docker Application Container Engine... May 15 00:28:39.679336 (dockerd)[1773]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 15 00:28:40.228881 dockerd[1773]: time="2025-05-15T00:28:40.228608382Z" level=info msg="Starting up" May 15 00:28:40.229564 dockerd[1773]: time="2025-05-15T00:28:40.229531313Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 15 00:28:40.308101 dockerd[1773]: time="2025-05-15T00:28:40.307675893Z" level=info msg="Loading containers: start." May 15 00:28:40.559300 kernel: Initializing XFRM netlink socket May 15 00:28:40.561428 systemd-timesyncd[1401]: Network configuration changed, trying to establish connection. May 15 00:28:40.663506 systemd-networkd[1398]: docker0: Link UP May 15 00:28:40.724939 systemd-timesyncd[1401]: Contacted time server 104.131.155.175:123 (2.flatcar.pool.ntp.org). May 15 00:28:40.725040 systemd-timesyncd[1401]: Initial clock synchronization to Thu 2025-05-15 00:28:41.067591 UTC. May 15 00:28:40.729752 dockerd[1773]: time="2025-05-15T00:28:40.729687055Z" level=info msg="Loading containers: done." May 15 00:28:40.757214 dockerd[1773]: time="2025-05-15T00:28:40.757125085Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 15 00:28:40.757457 dockerd[1773]: time="2025-05-15T00:28:40.757336571Z" level=info msg="Docker daemon" commit=c710b88579fcb5e0d53f96dcae976d79323b9166 containerd-snapshotter=false storage-driver=overlay2 version=27.4.1 May 15 00:28:40.757627 dockerd[1773]: time="2025-05-15T00:28:40.757569558Z" level=info msg="Daemon has completed initialization" May 15 00:28:40.824073 dockerd[1773]: time="2025-05-15T00:28:40.823742836Z" level=info msg="API listen on /run/docker.sock" May 15 00:28:40.823871 systemd[1]: Started docker.service - Docker Application Container Engine. May 15 00:28:42.532236 containerd[1483]: time="2025-05-15T00:28:42.532123505Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\"" May 15 00:28:43.301490 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2655737301.mount: Deactivated successfully. 
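dockerd is started here by the sudo'd install.sh and comes up on overlay2; the warning about native diff only means the kernel's CONFIG_OVERLAY_FS_REDIRECT_DIR setting forces the slower diff path when building images, not that the daemon is unhealthy. Once "API listen on /run/docker.sock" appears, the daemon state can be confirmed out of band, for example (illustrative check only):

    docker info --format '{{.ServerVersion}} {{.Driver}} {{.CgroupDriver}}'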
May 15 00:28:45.164484 containerd[1483]: time="2025-05-15T00:28:45.164358256Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:45.166051 containerd[1483]: time="2025-05-15T00:28:45.165839738Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.4: active requests=0, bytes read=28682887" May 15 00:28:45.167234 containerd[1483]: time="2025-05-15T00:28:45.167175452Z" level=info msg="ImageCreate event name:\"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:45.170765 containerd[1483]: time="2025-05-15T00:28:45.170697310Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:45.171878 containerd[1483]: time="2025-05-15T00:28:45.171737468Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.4\" with image id \"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:631c6cc78b2862be4fed7df3384a643ef7297eebadae22e8ef9cbe2e19b6386f\", size \"28679679\" in 2.639539642s" May 15 00:28:45.171878 containerd[1483]: time="2025-05-15T00:28:45.171772419Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.4\" returns image reference \"sha256:1c20c8797e48698afa3380793df2f1fb260e3209df72d8e864e1bc73af8336e5\"" May 15 00:28:45.172612 containerd[1483]: time="2025-05-15T00:28:45.172409777Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\"" May 15 00:28:45.375238 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. May 15 00:28:45.380098 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:28:45.616032 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:28:45.623547 (kubelet)[2033]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 00:28:45.799105 kubelet[2033]: E0515 00:28:45.798918 2033 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 00:28:45.802723 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 00:28:45.803076 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 00:28:45.803767 systemd[1]: kubelet.service: Consumed 271ms CPU time, 105.2M memory peak. 
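The PullImage/ImageCreate pairs above and below are containerd's CRI image service pulling the Kubernetes control-plane images into the k8s.io namespace, using the overlayfs snapshotter path printed when the cri plugin started. The same content can be listed out of band if needed, shown here only as an illustrative check:

    ctr --namespace k8s.io images ls
    crictl --runtime-endpoint unix:///run/containerd/containerd.sock images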
May 15 00:28:47.624335 containerd[1483]: time="2025-05-15T00:28:47.624295530Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:47.627010 containerd[1483]: time="2025-05-15T00:28:47.626969532Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.4: active requests=0, bytes read=24779597" May 15 00:28:47.628582 containerd[1483]: time="2025-05-15T00:28:47.628495186Z" level=info msg="ImageCreate event name:\"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:47.633373 containerd[1483]: time="2025-05-15T00:28:47.633313180Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:47.635561 containerd[1483]: time="2025-05-15T00:28:47.635336747Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.4\" with image id \"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:25e29187ea66f0ff9b9a00114849c3a30b649005c900a8b2a69e3f3fa56448fb\", size \"26267962\" in 2.462890529s" May 15 00:28:47.635561 containerd[1483]: time="2025-05-15T00:28:47.635393911Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.4\" returns image reference \"sha256:4db5364cd5509e0fc8e9f821fbc4b31ed79d4c9ae21809d22030ad67d530a61a\"" May 15 00:28:47.636519 containerd[1483]: time="2025-05-15T00:28:47.636268774Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\"" May 15 00:28:49.676104 containerd[1483]: time="2025-05-15T00:28:49.676034776Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:49.677338 containerd[1483]: time="2025-05-15T00:28:49.677293117Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.4: active requests=0, bytes read=19169946" May 15 00:28:49.678940 containerd[1483]: time="2025-05-15T00:28:49.678877950Z" level=info msg="ImageCreate event name:\"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:49.682184 containerd[1483]: time="2025-05-15T00:28:49.682141543Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:49.683782 containerd[1483]: time="2025-05-15T00:28:49.683489379Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.4\" with image id \"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:09c55f8dac59a4b8e5e354140f5a4bdd6fa9bd95c42d6bcba6782ed37c31b5a2\", size \"20658329\" in 2.047131916s" May 15 00:28:49.683782 containerd[1483]: time="2025-05-15T00:28:49.683536489Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.4\" returns image reference \"sha256:70a252485ed1f2e8332b6f0a5f8f57443bfbc3c480228f8dcd82ad5ab5cc4000\"" May 15 00:28:49.684170 
containerd[1483]: time="2025-05-15T00:28:49.684100285Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\"" May 15 00:28:51.087965 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount820180810.mount: Deactivated successfully. May 15 00:28:51.655335 containerd[1483]: time="2025-05-15T00:28:51.655285539Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:51.656800 containerd[1483]: time="2025-05-15T00:28:51.656606002Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.4: active requests=0, bytes read=30917864" May 15 00:28:51.658041 containerd[1483]: time="2025-05-15T00:28:51.657984197Z" level=info msg="ImageCreate event name:\"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:51.660307 containerd[1483]: time="2025-05-15T00:28:51.660244423Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:51.660960 containerd[1483]: time="2025-05-15T00:28:51.660832764Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.4\" with image id \"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\", repo tag \"registry.k8s.io/kube-proxy:v1.32.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:152638222ecf265eb8e5352e3c50e8fc520994e8ffcff1ee1490c975f7fc2b36\", size \"30916875\" in 1.976579118s" May 15 00:28:51.660960 containerd[1483]: time="2025-05-15T00:28:51.660875997Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.4\" returns image reference \"sha256:608f0c8bf7f9651ca79f170235ea5eefb978a0c1da132e7477a88ad37d171ad3\"" May 15 00:28:51.661704 containerd[1483]: time="2025-05-15T00:28:51.661683368Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" May 15 00:28:52.301958 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4246683314.mount: Deactivated successfully. 
May 15 00:28:53.600559 containerd[1483]: time="2025-05-15T00:28:53.600466200Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:53.601878 containerd[1483]: time="2025-05-15T00:28:53.601567980Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249" May 15 00:28:53.603341 containerd[1483]: time="2025-05-15T00:28:53.603311971Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:53.606704 containerd[1483]: time="2025-05-15T00:28:53.606652012Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:53.607799 containerd[1483]: time="2025-05-15T00:28:53.607679498Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.945967623s" May 15 00:28:53.607799 containerd[1483]: time="2025-05-15T00:28:53.607710966Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\"" May 15 00:28:53.608165 containerd[1483]: time="2025-05-15T00:28:53.608140562Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" May 15 00:28:54.594027 update_engine[1461]: I20250515 00:28:54.592533 1461 update_attempter.cc:509] Updating boot flags... May 15 00:28:54.663417 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 41 scanned by (udev-worker) (2116) May 15 00:28:54.732188 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1157489142.mount: Deactivated successfully. 
May 15 00:28:54.750285 containerd[1483]: time="2025-05-15T00:28:54.749789762Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 00:28:54.751295 containerd[1483]: time="2025-05-15T00:28:54.751173594Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146" May 15 00:28:54.753647 containerd[1483]: time="2025-05-15T00:28:54.753376491Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 00:28:54.758782 containerd[1483]: time="2025-05-15T00:28:54.758740031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" May 15 00:28:54.759296 kernel: BTRFS warning: duplicate device /dev/vda3 devid 1 generation 41 scanned by (udev-worker) (2120) May 15 00:28:54.760317 containerd[1483]: time="2025-05-15T00:28:54.760285819Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 1.152102203s" May 15 00:28:54.760369 containerd[1483]: time="2025-05-15T00:28:54.760316647Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\"" May 15 00:28:54.762500 containerd[1483]: time="2025-05-15T00:28:54.761769949Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" May 15 00:28:55.413362 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1813418091.mount: Deactivated successfully. May 15 00:28:55.875001 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. May 15 00:28:55.877604 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:28:56.034633 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:28:56.041494 (kubelet)[2171]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 15 00:28:56.454115 kubelet[2171]: E0515 00:28:56.285185 2171 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 15 00:28:56.288526 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 15 00:28:56.288819 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 15 00:28:56.289757 systemd[1]: kubelet.service: Consumed 195ms CPU time, 103.9M memory peak. 
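Every one of these kubelet starts also logs "Referenced but unset environment variable ... KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS"; that comes from the unit's ExecStart expanding optional environment files that nothing has written yet, and it is harmless. As an assumption shown for illustration only (this is the stock kubeadm drop-in layout, which may differ on a Flatcar image), the relevant unit fragment looks like:

    # /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (typical kubeadm drop-in, shown as an assumption)
    [Service]
    Environment="KUBELET_CONFIG_ARGS=--config=/var/lib/kubelet/config.yaml"
    # kubeadm writes this file at init/join time; the leading '-' makes it optional until then
    EnvironmentFile=-/var/lib/kubelet/kubeadm-flags.env
    EnvironmentFile=-/etc/default/kubelet
    ExecStart=
    ExecStart=/usr/bin/kubelet $KUBELET_CONFIG_ARGS $KUBELET_KUBEADM_ARGS $KUBELET_EXTRA_ARGS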
May 15 00:28:58.861339 containerd[1483]: time="2025-05-15T00:28:58.860469999Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:58.864244 containerd[1483]: time="2025-05-15T00:28:58.864115941Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551368" May 15 00:28:58.866385 containerd[1483]: time="2025-05-15T00:28:58.866192091Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:58.874200 containerd[1483]: time="2025-05-15T00:28:58.874072500Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:28:58.877582 containerd[1483]: time="2025-05-15T00:28:58.877335136Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 4.115516013s" May 15 00:28:58.877582 containerd[1483]: time="2025-05-15T00:28:58.877402147Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\"" May 15 00:29:02.881591 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:29:02.882096 systemd[1]: kubelet.service: Consumed 195ms CPU time, 103.9M memory peak. May 15 00:29:02.887063 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:29:02.940181 systemd[1]: Reload requested from client PID 2225 ('systemctl') (unit session-11.scope)... May 15 00:29:02.940326 systemd[1]: Reloading... May 15 00:29:03.054348 zram_generator::config[2277]: No configuration found. May 15 00:29:03.261182 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 00:29:03.400549 systemd[1]: Reloading finished in 459 ms. May 15 00:29:03.918212 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM May 15 00:29:03.918981 systemd[1]: kubelet.service: Failed with result 'signal'. May 15 00:29:03.919738 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:29:03.919830 systemd[1]: kubelet.service: Consumed 129ms CPU time, 91.9M memory peak. May 15 00:29:03.927041 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:29:04.545600 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:29:04.558828 (kubelet)[2337]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 00:29:04.653359 kubelet[2337]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 00:29:04.653359 kubelet[2337]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. 
Image garbage collector will get sandbox image information from CRI. May 15 00:29:04.653359 kubelet[2337]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 00:29:04.654026 kubelet[2337]: I0515 00:29:04.653420 2337 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 00:29:05.218192 kubelet[2337]: I0515 00:29:05.218131 2337 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" May 15 00:29:05.218192 kubelet[2337]: I0515 00:29:05.218188 2337 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 00:29:05.218817 kubelet[2337]: I0515 00:29:05.218778 2337 server.go:954] "Client rotation is on, will bootstrap in background" May 15 00:29:05.262511 kubelet[2337]: E0515 00:29:05.262437 2337 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://172.24.4.125:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 172.24.4.125:6443: connect: connection refused" logger="UnhandledError" May 15 00:29:05.265777 kubelet[2337]: I0515 00:29:05.265611 2337 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 00:29:05.289049 kubelet[2337]: I0515 00:29:05.289006 2337 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 15 00:29:05.294857 kubelet[2337]: I0515 00:29:05.294796 2337 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 15 00:29:05.295340 kubelet[2337]: I0515 00:29:05.295246 2337 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 00:29:05.295747 kubelet[2337]: I0515 00:29:05.295343 2337 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-019843d4bb.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 15 00:29:05.295944 kubelet[2337]: I0515 00:29:05.295752 2337 topology_manager.go:138] "Creating topology manager with none policy" May 15 00:29:05.295944 kubelet[2337]: I0515 00:29:05.295775 2337 container_manager_linux.go:304] "Creating device plugin manager" May 15 00:29:05.296041 kubelet[2337]: I0515 00:29:05.296004 2337 state_mem.go:36] "Initialized new in-memory state store" May 15 00:29:05.306222 kubelet[2337]: I0515 00:29:05.306064 2337 kubelet.go:446] "Attempting to sync node with API server" May 15 00:29:05.306222 kubelet[2337]: I0515 00:29:05.306111 2337 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 00:29:05.306222 kubelet[2337]: I0515 00:29:05.306160 2337 kubelet.go:352] "Adding apiserver pod source" May 15 00:29:05.306222 kubelet[2337]: I0515 00:29:05.306182 2337 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 00:29:05.323701 kubelet[2337]: W0515 00:29:05.322613 2337 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.125:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-019843d4bb.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.125:6443: connect: connection refused May 15 00:29:05.323701 kubelet[2337]: E0515 00:29:05.322749 2337 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://172.24.4.125:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4284-0-0-n-019843d4bb.novalocal&limit=500&resourceVersion=0\": dial tcp 172.24.4.125:6443: connect: connection refused" logger="UnhandledError" 
May 15 00:29:05.323701 kubelet[2337]: W0515 00:29:05.323524 2337 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.125:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 172.24.4.125:6443: connect: connection refused May 15 00:29:05.323701 kubelet[2337]: E0515 00:29:05.323606 2337 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://172.24.4.125:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 172.24.4.125:6443: connect: connection refused" logger="UnhandledError" May 15 00:29:05.326335 kubelet[2337]: I0515 00:29:05.324959 2337 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 15 00:29:05.326335 kubelet[2337]: I0515 00:29:05.325951 2337 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 00:29:05.326335 kubelet[2337]: W0515 00:29:05.326056 2337 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. May 15 00:29:05.330996 kubelet[2337]: I0515 00:29:05.330965 2337 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 15 00:29:05.331200 kubelet[2337]: I0515 00:29:05.331178 2337 server.go:1287] "Started kubelet" May 15 00:29:05.334351 kubelet[2337]: I0515 00:29:05.333125 2337 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 15 00:29:05.336765 kubelet[2337]: I0515 00:29:05.335341 2337 server.go:490] "Adding debug handlers to kubelet server" May 15 00:29:05.339947 kubelet[2337]: I0515 00:29:05.339857 2337 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 00:29:05.340542 kubelet[2337]: I0515 00:29:05.340468 2337 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 00:29:05.340867 kubelet[2337]: I0515 00:29:05.340574 2337 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 00:29:05.347984 kubelet[2337]: E0515 00:29:05.343909 2337 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.125:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.125:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4284-0-0-n-019843d4bb.novalocal.183f8bd92d88a3cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4284-0-0-n-019843d4bb.novalocal,UID:ci-4284-0-0-n-019843d4bb.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4284-0-0-n-019843d4bb.novalocal,},FirstTimestamp:2025-05-15 00:29:05.331135437 +0000 UTC m=+0.765141368,LastTimestamp:2025-05-15 00:29:05.331135437 +0000 UTC m=+0.765141368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4284-0-0-n-019843d4bb.novalocal,}" May 15 00:29:05.349652 kubelet[2337]: I0515 00:29:05.349593 2337 volume_manager.go:297] "Starting Kubelet Volume Manager" May 15 00:29:05.350006 kubelet[2337]: I0515 00:29:05.340736 2337 dynamic_serving_content.go:135] "Starting controller" 
name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 15 00:29:05.350187 kubelet[2337]: I0515 00:29:05.350145 2337 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 15 00:29:05.350337 kubelet[2337]: I0515 00:29:05.350256 2337 reconciler.go:26] "Reconciler: start to sync state" May 15 00:29:05.352722 kubelet[2337]: W0515 00:29:05.352573 2337 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.125:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.125:6443: connect: connection refused May 15 00:29:05.352860 kubelet[2337]: E0515 00:29:05.352741 2337 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://172.24.4.125:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 172.24.4.125:6443: connect: connection refused" logger="UnhandledError" May 15 00:29:05.353868 kubelet[2337]: E0515 00:29:05.353799 2337 kubelet.go:1561] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 15 00:29:05.354971 kubelet[2337]: E0515 00:29:05.354382 2337 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" May 15 00:29:05.354971 kubelet[2337]: E0515 00:29:05.354603 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.125:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-019843d4bb.novalocal?timeout=10s\": dial tcp 172.24.4.125:6443: connect: connection refused" interval="200ms" May 15 00:29:05.356591 kubelet[2337]: I0515 00:29:05.356555 2337 factory.go:221] Registration of the containerd container factory successfully May 15 00:29:05.356591 kubelet[2337]: I0515 00:29:05.356572 2337 factory.go:221] Registration of the systemd container factory successfully May 15 00:29:05.356770 kubelet[2337]: I0515 00:29:05.356622 2337 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 00:29:05.381376 kubelet[2337]: I0515 00:29:05.381210 2337 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 15 00:29:05.386888 kubelet[2337]: I0515 00:29:05.386850 2337 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 15 00:29:05.386888 kubelet[2337]: I0515 00:29:05.386876 2337 status_manager.go:227] "Starting to sync pod status with apiserver" May 15 00:29:05.386888 kubelet[2337]: I0515 00:29:05.386891 2337 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 15 00:29:05.386888 kubelet[2337]: I0515 00:29:05.386898 2337 kubelet.go:2388] "Starting kubelet main sync loop" May 15 00:29:05.387138 kubelet[2337]: E0515 00:29:05.386932 2337 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 00:29:05.389892 kubelet[2337]: W0515 00:29:05.389709 2337 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.125:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.125:6443: connect: connection refused May 15 00:29:05.389892 kubelet[2337]: E0515 00:29:05.389753 2337 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://172.24.4.125:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 172.24.4.125:6443: connect: connection refused" logger="UnhandledError" May 15 00:29:05.390053 kubelet[2337]: I0515 00:29:05.390040 2337 cpu_manager.go:221] "Starting CPU manager" policy="none" May 15 00:29:05.390113 kubelet[2337]: I0515 00:29:05.390104 2337 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 15 00:29:05.390177 kubelet[2337]: I0515 00:29:05.390169 2337 state_mem.go:36] "Initialized new in-memory state store" May 15 00:29:05.396473 kubelet[2337]: I0515 00:29:05.396460 2337 policy_none.go:49] "None policy: Start" May 15 00:29:05.396552 kubelet[2337]: I0515 00:29:05.396543 2337 memory_manager.go:186] "Starting memorymanager" policy="None" May 15 00:29:05.396626 kubelet[2337]: I0515 00:29:05.396617 2337 state_mem.go:35] "Initializing new in-memory state store" May 15 00:29:05.406254 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. May 15 00:29:05.416510 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 15 00:29:05.420367 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 15 00:29:05.435945 kubelet[2337]: I0515 00:29:05.435900 2337 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 00:29:05.436292 kubelet[2337]: I0515 00:29:05.436062 2337 eviction_manager.go:189] "Eviction manager: starting control loop" May 15 00:29:05.436292 kubelet[2337]: I0515 00:29:05.436078 2337 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 00:29:05.436371 kubelet[2337]: I0515 00:29:05.436296 2337 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 00:29:05.438003 kubelet[2337]: E0515 00:29:05.437984 2337 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 15 00:29:05.438444 kubelet[2337]: E0515 00:29:05.438347 2337 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" May 15 00:29:05.510420 systemd[1]: Created slice kubepods-burstable-pod5d881305f7a19ccf8287a16c50f51a55.slice - libcontainer container kubepods-burstable-pod5d881305f7a19ccf8287a16c50f51a55.slice. 
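A few entries above, cAdvisor registers its containerd and systemd factories but skips crio because /var/run/crio/crio.sock does not exist. A small sketch of that sort of runtime-socket probe using only the standard library; the crio path is taken from the log, the containerd path is its conventional default and therefore an assumption:

    package main

    import (
        "fmt"
        "net"
        "os"
        "time"
    )

    // probe reports whether a runtime's unix socket exists and accepts connections.
    func probe(path string) {
        if _, err := os.Stat(path); err != nil {
            fmt.Printf("%s: %v\n", path, err) // e.g. "no such file or directory", as for crio above
            return
        }
        conn, err := net.DialTimeout("unix", path, time.Second)
        if err != nil {
            fmt.Printf("%s: exists but not accepting connections: %v\n", path, err)
            return
        }
        conn.Close()
        fmt.Printf("%s: reachable\n", path)
    }

    func main() {
        probe("/var/run/crio/crio.sock")         // missing on this node, per the log
        probe("/run/containerd/containerd.sock") // containerd's usual socket (assumed default path)
    }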
May 15 00:29:05.517094 kubelet[2337]: W0515 00:29:05.515863 2337 helpers.go:245] readString: Failed to read "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d881305f7a19ccf8287a16c50f51a55.slice/cpuset.cpus.effective": open /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d881305f7a19ccf8287a16c50f51a55.slice/cpuset.cpus.effective: no such device May 15 00:29:05.525847 kubelet[2337]: E0515 00:29:05.525359 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.533854 systemd[1]: Created slice kubepods-burstable-pod82d2d7f59537a33e9ae9e94d10f1b00c.slice - libcontainer container kubepods-burstable-pod82d2d7f59537a33e9ae9e94d10f1b00c.slice. May 15 00:29:05.540172 kubelet[2337]: I0515 00:29:05.540039 2337 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.541570 kubelet[2337]: E0515 00:29:05.541498 2337 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.24.4.125:6443/api/v1/nodes\": dial tcp 172.24.4.125:6443: connect: connection refused" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.543567 kubelet[2337]: E0515 00:29:05.543167 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.550372 systemd[1]: Created slice kubepods-burstable-pod1068d5435dcaf8c729743261ecb97618.slice - libcontainer container kubepods-burstable-pod1068d5435dcaf8c729743261ecb97618.slice. 
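The helpers.go warning above is cAdvisor failing to read cpuset.cpus.effective from the just-created burstable pod slice; the "no such device" appears to be transient here, since the slice had been created only milliseconds earlier and the node carries on normally. On cgroup v2 the same read is a plain file read; a sketch with the path copied from the warning:

    package main

    import (
        "fmt"
        "os"
        "strings"
    )

    func main() {
        // Path copied from the readString warning above; adjust the pod slice as needed.
        path := "/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/" +
            "kubepods-burstable-pod5d881305f7a19ccf8287a16c50f51a55.slice/cpuset.cpus.effective"

        data, err := os.ReadFile(path)
        if err != nil {
            // A freshly created or already-removed slice can fail here, which is
            // the condition the kubelet logged above.
            fmt.Println("read failed:", err)
            return
        }
        fmt.Println("effective cpuset:", strings.TrimSpace(string(data)))
    }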
May 15 00:29:05.552376 kubelet[2337]: I0515 00:29:05.551593 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.552376 kubelet[2337]: I0515 00:29:05.551675 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.552376 kubelet[2337]: I0515 00:29:05.551727 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1068d5435dcaf8c729743261ecb97618-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"1068d5435dcaf8c729743261ecb97618\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.552376 kubelet[2337]: I0515 00:29:05.551780 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5d881305f7a19ccf8287a16c50f51a55-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"5d881305f7a19ccf8287a16c50f51a55\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.552376 kubelet[2337]: I0515 00:29:05.551831 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5d881305f7a19ccf8287a16c50f51a55-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"5d881305f7a19ccf8287a16c50f51a55\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.552743 kubelet[2337]: I0515 00:29:05.551877 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.552743 kubelet[2337]: I0515 00:29:05.551942 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.552743 kubelet[2337]: I0515 00:29:05.552035 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5d881305f7a19ccf8287a16c50f51a55-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"5d881305f7a19ccf8287a16c50f51a55\") " 
pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.552743 kubelet[2337]: I0515 00:29:05.552081 2337 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.555566 kubelet[2337]: E0515 00:29:05.555136 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.555566 kubelet[2337]: E0515 00:29:05.555490 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.125:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-019843d4bb.novalocal?timeout=10s\": dial tcp 172.24.4.125:6443: connect: connection refused" interval="400ms" May 15 00:29:05.744672 kubelet[2337]: I0515 00:29:05.744344 2337 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.745702 kubelet[2337]: E0515 00:29:05.744914 2337 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.24.4.125:6443/api/v1/nodes\": dial tcp 172.24.4.125:6443: connect: connection refused" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:05.828908 containerd[1483]: time="2025-05-15T00:29:05.827980128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal,Uid:5d881305f7a19ccf8287a16c50f51a55,Namespace:kube-system,Attempt:0,}" May 15 00:29:05.846778 containerd[1483]: time="2025-05-15T00:29:05.846249738Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal,Uid:82d2d7f59537a33e9ae9e94d10f1b00c,Namespace:kube-system,Attempt:0,}" May 15 00:29:05.857417 containerd[1483]: time="2025-05-15T00:29:05.856920592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal,Uid:1068d5435dcaf8c729743261ecb97618,Namespace:kube-system,Attempt:0,}" May 15 00:29:05.907574 containerd[1483]: time="2025-05-15T00:29:05.907314494Z" level=info msg="connecting to shim 4278ff26405fb2398c648f7515e8ba8501d7632605002e481b9e93549f6a15d8" address="unix:///run/containerd/s/465d6f051ab8c95cf3c607606f6d50907c9cd0bc5a9a8e52e6c133dc18bf587b" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:05.939687 containerd[1483]: time="2025-05-15T00:29:05.935203404Z" level=info msg="connecting to shim 185ef8c4e3c8aaef3e6b5324797bd12274f780dd8f27f707189d2b8ebc4f4429" address="unix:///run/containerd/s/55210ce60eb00770aa5694366e3c5397a2ccab967968b1115430438064e01c26" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:05.959290 kubelet[2337]: E0515 00:29:05.956046 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.125:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4284-0-0-n-019843d4bb.novalocal?timeout=10s\": dial tcp 172.24.4.125:6443: connect: connection refused" interval="800ms" May 15 00:29:05.971834 containerd[1483]: time="2025-05-15T00:29:05.971701718Z" level=info msg="connecting to shim 
bdd42a5ed63e83d130e3358342eff594062a6ce949f3217dba53b3de4b4077fa" address="unix:///run/containerd/s/8c5652138c00f9e12591b1fd812927cb5a7aee6357acc07bd8e101bf8e696e77" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:05.987457 systemd[1]: Started cri-containerd-4278ff26405fb2398c648f7515e8ba8501d7632605002e481b9e93549f6a15d8.scope - libcontainer container 4278ff26405fb2398c648f7515e8ba8501d7632605002e481b9e93549f6a15d8. May 15 00:29:06.001528 systemd[1]: Started cri-containerd-bdd42a5ed63e83d130e3358342eff594062a6ce949f3217dba53b3de4b4077fa.scope - libcontainer container bdd42a5ed63e83d130e3358342eff594062a6ce949f3217dba53b3de4b4077fa. May 15 00:29:06.005719 systemd[1]: Started cri-containerd-185ef8c4e3c8aaef3e6b5324797bd12274f780dd8f27f707189d2b8ebc4f4429.scope - libcontainer container 185ef8c4e3c8aaef3e6b5324797bd12274f780dd8f27f707189d2b8ebc4f4429. May 15 00:29:06.078459 containerd[1483]: time="2025-05-15T00:29:06.078336503Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal,Uid:1068d5435dcaf8c729743261ecb97618,Namespace:kube-system,Attempt:0,} returns sandbox id \"bdd42a5ed63e83d130e3358342eff594062a6ce949f3217dba53b3de4b4077fa\"" May 15 00:29:06.082030 containerd[1483]: time="2025-05-15T00:29:06.081732543Z" level=info msg="CreateContainer within sandbox \"bdd42a5ed63e83d130e3358342eff594062a6ce949f3217dba53b3de4b4077fa\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 15 00:29:06.082590 containerd[1483]: time="2025-05-15T00:29:06.082488728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal,Uid:5d881305f7a19ccf8287a16c50f51a55,Namespace:kube-system,Attempt:0,} returns sandbox id \"4278ff26405fb2398c648f7515e8ba8501d7632605002e481b9e93549f6a15d8\"" May 15 00:29:06.085217 containerd[1483]: time="2025-05-15T00:29:06.085176214Z" level=info msg="CreateContainer within sandbox \"4278ff26405fb2398c648f7515e8ba8501d7632605002e481b9e93549f6a15d8\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 15 00:29:06.098303 containerd[1483]: time="2025-05-15T00:29:06.098238380Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal,Uid:82d2d7f59537a33e9ae9e94d10f1b00c,Namespace:kube-system,Attempt:0,} returns sandbox id \"185ef8c4e3c8aaef3e6b5324797bd12274f780dd8f27f707189d2b8ebc4f4429\"" May 15 00:29:06.100523 containerd[1483]: time="2025-05-15T00:29:06.100494252Z" level=info msg="CreateContainer within sandbox \"185ef8c4e3c8aaef3e6b5324797bd12274f780dd8f27f707189d2b8ebc4f4429\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 15 00:29:06.105717 containerd[1483]: time="2025-05-15T00:29:06.104417841Z" level=info msg="Container 0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:06.109396 containerd[1483]: time="2025-05-15T00:29:06.109345445Z" level=info msg="Container f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:06.128783 containerd[1483]: time="2025-05-15T00:29:06.128745630Z" level=info msg="Container 98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:06.134358 containerd[1483]: time="2025-05-15T00:29:06.134234276Z" level=info msg="CreateContainer within sandbox \"bdd42a5ed63e83d130e3358342eff594062a6ce949f3217dba53b3de4b4077fa\" 
for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f\"" May 15 00:29:06.140975 containerd[1483]: time="2025-05-15T00:29:06.140922875Z" level=info msg="StartContainer for \"0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f\"" May 15 00:29:06.142092 containerd[1483]: time="2025-05-15T00:29:06.141932930Z" level=info msg="connecting to shim 0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f" address="unix:///run/containerd/s/8c5652138c00f9e12591b1fd812927cb5a7aee6357acc07bd8e101bf8e696e77" protocol=ttrpc version=3 May 15 00:29:06.143429 containerd[1483]: time="2025-05-15T00:29:06.143392130Z" level=info msg="CreateContainer within sandbox \"185ef8c4e3c8aaef3e6b5324797bd12274f780dd8f27f707189d2b8ebc4f4429\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a\"" May 15 00:29:06.143770 containerd[1483]: time="2025-05-15T00:29:06.143711674Z" level=info msg="StartContainer for \"98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a\"" May 15 00:29:06.144764 containerd[1483]: time="2025-05-15T00:29:06.144723988Z" level=info msg="connecting to shim 98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a" address="unix:///run/containerd/s/55210ce60eb00770aa5694366e3c5397a2ccab967968b1115430438064e01c26" protocol=ttrpc version=3 May 15 00:29:06.146587 containerd[1483]: time="2025-05-15T00:29:06.146213811Z" level=info msg="CreateContainer within sandbox \"4278ff26405fb2398c648f7515e8ba8501d7632605002e481b9e93549f6a15d8\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed\"" May 15 00:29:06.147011 containerd[1483]: time="2025-05-15T00:29:06.146991910Z" level=info msg="StartContainer for \"f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed\"" May 15 00:29:06.147537 kubelet[2337]: I0515 00:29:06.147414 2337 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:06.147707 kubelet[2337]: E0515 00:29:06.147683 2337 kubelet_node_status.go:108] "Unable to register node with API server" err="Post \"https://172.24.4.125:6443/api/v1/nodes\": dial tcp 172.24.4.125:6443: connect: connection refused" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:06.149242 containerd[1483]: time="2025-05-15T00:29:06.149212222Z" level=info msg="connecting to shim f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed" address="unix:///run/containerd/s/465d6f051ab8c95cf3c607606f6d50907c9cd0bc5a9a8e52e6c133dc18bf587b" protocol=ttrpc version=3 May 15 00:29:06.180430 systemd[1]: Started cri-containerd-0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f.scope - libcontainer container 0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f. May 15 00:29:06.181369 systemd[1]: Started cri-containerd-98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a.scope - libcontainer container 98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a. May 15 00:29:06.183599 systemd[1]: Started cri-containerd-f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed.scope - libcontainer container f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed. 
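The "connecting to shim" lines and cri-containerd-*.scope units above are containerd v2.0.1 starting the three static-pod containers in its k8s.io namespace. The same containers can be listed with containerd's Go client; this sketch is written against the 1.x client API (the module path moved in 2.x), so treat the imports and socket path as assumptions:

    package main

    import (
        "context"
        "fmt"
        "log"

        "github.com/containerd/containerd"
        "github.com/containerd/containerd/namespaces"
    )

    func main() {
        // Connect to containerd's management socket (default path, not shown in this log).
        client, err := containerd.New("/run/containerd/containerd.sock")
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // The CRI plugin keeps Kubernetes containers in the "k8s.io" namespace,
        // matching the namespace=k8s.io fields in the log lines above.
        ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

        containers, err := client.Containers(ctx)
        if err != nil {
            log.Fatal(err)
        }
        for _, c := range containers {
            fmt.Println(c.ID())
        }
    }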
May 15 00:29:06.268374 containerd[1483]: time="2025-05-15T00:29:06.268235263Z" level=info msg="StartContainer for \"f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed\" returns successfully" May 15 00:29:06.290875 containerd[1483]: time="2025-05-15T00:29:06.290744574Z" level=info msg="StartContainer for \"0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f\" returns successfully" May 15 00:29:06.304241 containerd[1483]: time="2025-05-15T00:29:06.304050930Z" level=info msg="StartContainer for \"98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a\" returns successfully" May 15 00:29:06.397673 kubelet[2337]: E0515 00:29:06.397642 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:06.403851 kubelet[2337]: E0515 00:29:06.403710 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:06.407347 kubelet[2337]: E0515 00:29:06.407321 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:06.953316 kubelet[2337]: I0515 00:29:06.951463 2337 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:07.409588 kubelet[2337]: E0515 00:29:07.409563 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:07.409872 kubelet[2337]: E0515 00:29:07.409852 2337 kubelet.go:3196] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:08.296282 kubelet[2337]: I0515 00:29:08.295617 2337 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:08.296282 kubelet[2337]: E0515 00:29:08.295647 2337 kubelet_node_status.go:549] "Error updating node status, will retry" err="error getting node \"ci-4284-0-0-n-019843d4bb.novalocal\": node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" May 15 00:29:08.324313 kubelet[2337]: I0515 00:29:08.324139 2337 apiserver.go:52] "Watching apiserver" May 15 00:29:08.351065 kubelet[2337]: I0515 00:29:08.351028 2337 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 15 00:29:08.355529 kubelet[2337]: I0515 00:29:08.355299 2337 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:08.391521 kubelet[2337]: E0515 00:29:08.391503 2337 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:08.391680 kubelet[2337]: I0515 00:29:08.391613 2337 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:08.394583 kubelet[2337]: E0515 
00:29:08.394561 2337 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-node-lease\" not found" interval="1.6s" May 15 00:29:08.395917 kubelet[2337]: E0515 00:29:08.395809 2337 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:08.395917 kubelet[2337]: I0515 00:29:08.395847 2337 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:08.398699 kubelet[2337]: E0515 00:29:08.398664 2337 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:09.399202 kubelet[2337]: I0515 00:29:09.398901 2337 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:09.409042 kubelet[2337]: W0515 00:29:09.408978 2337 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 00:29:10.587540 systemd[1]: Reload requested from client PID 2604 ('systemctl') (unit session-11.scope)... May 15 00:29:10.588177 systemd[1]: Reloading... May 15 00:29:10.729294 zram_generator::config[2648]: No configuration found. May 15 00:29:10.893491 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 15 00:29:11.039927 systemd[1]: Reloading finished in 450 ms. May 15 00:29:11.066556 kubelet[2337]: I0515 00:29:11.066514 2337 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 00:29:11.068531 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:29:11.081583 systemd[1]: kubelet.service: Deactivated successfully. May 15 00:29:11.081782 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:29:11.081826 systemd[1]: kubelet.service: Consumed 1.276s CPU time, 124.1M memory peak. May 15 00:29:11.084178 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 15 00:29:11.496833 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 15 00:29:11.513882 (kubelet)[2714]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 15 00:29:11.582077 kubelet[2714]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 00:29:11.582417 kubelet[2714]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 15 00:29:11.582466 kubelet[2714]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 15 00:29:11.582602 kubelet[2714]: I0515 00:29:11.582576 2714 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 15 00:29:11.589508 kubelet[2714]: I0515 00:29:11.589484 2714 server.go:520] "Kubelet version" kubeletVersion="v1.32.0" May 15 00:29:11.589631 kubelet[2714]: I0515 00:29:11.589621 2714 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 15 00:29:11.590309 kubelet[2714]: I0515 00:29:11.589935 2714 server.go:954] "Client rotation is on, will bootstrap in background" May 15 00:29:11.591721 kubelet[2714]: I0515 00:29:11.591707 2714 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 15 00:29:11.594910 kubelet[2714]: I0515 00:29:11.594892 2714 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 15 00:29:11.599821 kubelet[2714]: I0515 00:29:11.599759 2714 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 15 00:29:11.607103 kubelet[2714]: I0515 00:29:11.606760 2714 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 15 00:29:11.607103 kubelet[2714]: I0515 00:29:11.606959 2714 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 15 00:29:11.607923 kubelet[2714]: I0515 00:29:11.606995 2714 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4284-0-0-n-019843d4bb.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 15 00:29:11.607923 kubelet[2714]: I0515 00:29:11.607874 2714 topology_manager.go:138] "Creating topology manager with none policy" May 15 00:29:11.607923 kubelet[2714]: I0515 00:29:11.607890 2714 container_manager_linux.go:304] "Creating device plugin manager" May 15 00:29:11.610855 kubelet[2714]: I0515 
00:29:11.607931 2714 state_mem.go:36] "Initialized new in-memory state store" May 15 00:29:11.610855 kubelet[2714]: I0515 00:29:11.608084 2714 kubelet.go:446] "Attempting to sync node with API server" May 15 00:29:11.610855 kubelet[2714]: I0515 00:29:11.608097 2714 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 15 00:29:11.610855 kubelet[2714]: I0515 00:29:11.608141 2714 kubelet.go:352] "Adding apiserver pod source" May 15 00:29:11.610855 kubelet[2714]: I0515 00:29:11.608152 2714 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 15 00:29:11.611457 kubelet[2714]: I0515 00:29:11.611440 2714 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.1" apiVersion="v1" May 15 00:29:11.614301 kubelet[2714]: I0515 00:29:11.611963 2714 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 15 00:29:11.614788 kubelet[2714]: I0515 00:29:11.614774 2714 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 15 00:29:11.614866 kubelet[2714]: I0515 00:29:11.614857 2714 server.go:1287] "Started kubelet" May 15 00:29:11.618726 kubelet[2714]: I0515 00:29:11.618689 2714 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 15 00:29:11.624253 kubelet[2714]: I0515 00:29:11.624225 2714 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 15 00:29:11.625823 kubelet[2714]: I0515 00:29:11.625809 2714 server.go:490] "Adding debug handlers to kubelet server" May 15 00:29:11.626837 kubelet[2714]: I0515 00:29:11.626797 2714 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 15 00:29:11.627037 kubelet[2714]: I0515 00:29:11.627024 2714 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 15 00:29:11.627297 kubelet[2714]: I0515 00:29:11.627283 2714 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 15 00:29:11.630824 kubelet[2714]: I0515 00:29:11.630811 2714 volume_manager.go:297] "Starting Kubelet Volume Manager" May 15 00:29:11.631056 kubelet[2714]: E0515 00:29:11.631039 2714 kubelet_node_status.go:467] "Error getting the current node from lister" err="node \"ci-4284-0-0-n-019843d4bb.novalocal\" not found" May 15 00:29:11.631854 kubelet[2714]: I0515 00:29:11.631731 2714 desired_state_of_world_populator.go:149] "Desired state populator starts to run" May 15 00:29:11.632056 kubelet[2714]: I0515 00:29:11.632043 2714 reconciler.go:26] "Reconciler: start to sync state" May 15 00:29:11.637258 kubelet[2714]: I0515 00:29:11.637238 2714 factory.go:221] Registration of the systemd container factory successfully May 15 00:29:11.637630 kubelet[2714]: I0515 00:29:11.637550 2714 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 15 00:29:11.643460 kubelet[2714]: I0515 00:29:11.643434 2714 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 15 00:29:11.646922 kubelet[2714]: I0515 00:29:11.646908 2714 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" May 15 00:29:11.647010 kubelet[2714]: I0515 00:29:11.647001 2714 status_manager.go:227] "Starting to sync pod status with apiserver" May 15 00:29:11.647088 kubelet[2714]: I0515 00:29:11.647077 2714 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." May 15 00:29:11.647143 kubelet[2714]: I0515 00:29:11.647135 2714 kubelet.go:2388] "Starting kubelet main sync loop" May 15 00:29:11.647279 kubelet[2714]: E0515 00:29:11.647241 2714 kubelet.go:2412] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 15 00:29:11.657407 kubelet[2714]: I0515 00:29:11.657388 2714 factory.go:221] Registration of the containerd container factory successfully May 15 00:29:11.722450 kubelet[2714]: I0515 00:29:11.722432 2714 cpu_manager.go:221] "Starting CPU manager" policy="none" May 15 00:29:11.722692 kubelet[2714]: I0515 00:29:11.722580 2714 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 15 00:29:11.722692 kubelet[2714]: I0515 00:29:11.722600 2714 state_mem.go:36] "Initialized new in-memory state store" May 15 00:29:11.723062 kubelet[2714]: I0515 00:29:11.722851 2714 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 15 00:29:11.723062 kubelet[2714]: I0515 00:29:11.722866 2714 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 15 00:29:11.723062 kubelet[2714]: I0515 00:29:11.722883 2714 policy_none.go:49] "None policy: Start" May 15 00:29:11.723062 kubelet[2714]: I0515 00:29:11.722891 2714 memory_manager.go:186] "Starting memorymanager" policy="None" May 15 00:29:11.723062 kubelet[2714]: I0515 00:29:11.722901 2714 state_mem.go:35] "Initializing new in-memory state store" May 15 00:29:11.723062 kubelet[2714]: I0515 00:29:11.723009 2714 state_mem.go:75] "Updated machine memory state" May 15 00:29:11.727925 kubelet[2714]: I0515 00:29:11.727900 2714 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 15 00:29:11.728782 kubelet[2714]: I0515 00:29:11.728170 2714 eviction_manager.go:189] "Eviction manager: starting control loop" May 15 00:29:11.728782 kubelet[2714]: I0515 00:29:11.728190 2714 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 15 00:29:11.728782 kubelet[2714]: I0515 00:29:11.728694 2714 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 15 00:29:11.732156 kubelet[2714]: E0515 00:29:11.732133 2714 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" May 15 00:29:11.748983 kubelet[2714]: I0515 00:29:11.748903 2714 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.750538 kubelet[2714]: I0515 00:29:11.750428 2714 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.750846 kubelet[2714]: I0515 00:29:11.750811 2714 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.759198 kubelet[2714]: W0515 00:29:11.759058 2714 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 00:29:11.761165 kubelet[2714]: W0515 00:29:11.761099 2714 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 00:29:11.762954 kubelet[2714]: W0515 00:29:11.762808 2714 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 00:29:11.762954 kubelet[2714]: E0515 00:29:11.762904 2714 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" already exists" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.832670 kubelet[2714]: I0515 00:29:11.831818 2714 kubelet_node_status.go:76] "Attempting to register node" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.833590 kubelet[2714]: I0515 00:29:11.833572 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-kubeconfig\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.833746 kubelet[2714]: I0515 00:29:11.833730 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/1068d5435dcaf8c729743261ecb97618-kubeconfig\") pod \"kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"1068d5435dcaf8c729743261ecb97618\") " pod="kube-system/kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.834055 kubelet[2714]: I0515 00:29:11.833854 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/5d881305f7a19ccf8287a16c50f51a55-ca-certs\") pod \"kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"5d881305f7a19ccf8287a16c50f51a55\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.834055 kubelet[2714]: I0515 00:29:11.833910 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/5d881305f7a19ccf8287a16c50f51a55-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"5d881305f7a19ccf8287a16c50f51a55\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.834055 kubelet[2714]: 
I0515 00:29:11.833932 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-flexvolume-dir\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.834479 kubelet[2714]: I0515 00:29:11.834310 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-k8s-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.834802 kubelet[2714]: I0515 00:29:11.834716 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/5d881305f7a19ccf8287a16c50f51a55-k8s-certs\") pod \"kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"5d881305f7a19ccf8287a16c50f51a55\") " pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.835137 kubelet[2714]: I0515 00:29:11.834986 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-ca-certs\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.835137 kubelet[2714]: I0515 00:29:11.835014 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/82d2d7f59537a33e9ae9e94d10f1b00c-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" (UID: \"82d2d7f59537a33e9ae9e94d10f1b00c\") " pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.845468 kubelet[2714]: I0515 00:29:11.845390 2714 kubelet_node_status.go:125] "Node was previously registered" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:11.845980 kubelet[2714]: I0515 00:29:11.845818 2714 kubelet_node_status.go:79] "Successfully registered node" node="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:12.609734 kubelet[2714]: I0515 00:29:12.609695 2714 apiserver.go:52] "Watching apiserver" May 15 00:29:12.632025 kubelet[2714]: I0515 00:29:12.631977 2714 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" May 15 00:29:12.704145 kubelet[2714]: I0515 00:29:12.702796 2714 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:12.704145 kubelet[2714]: I0515 00:29:12.703491 2714 kubelet.go:3200] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:12.720081 kubelet[2714]: W0515 00:29:12.719238 2714 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 00:29:12.720081 kubelet[2714]: E0515 00:29:12.719377 2714 kubelet.go:3202] "Failed creating a mirror pod" 
err="pods \"kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal\" already exists" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:12.723387 kubelet[2714]: W0515 00:29:12.722944 2714 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 15 00:29:12.723387 kubelet[2714]: E0515 00:29:12.723023 2714 kubelet.go:3202] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal\" already exists" pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:12.755226 kubelet[2714]: I0515 00:29:12.755078 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4284-0-0-n-019843d4bb.novalocal" podStartSLOduration=1.755042316 podStartE2EDuration="1.755042316s" podCreationTimestamp="2025-05-15 00:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 00:29:12.754688832 +0000 UTC m=+1.230279723" watchObservedRunningTime="2025-05-15 00:29:12.755042316 +0000 UTC m=+1.230633208" May 15 00:29:12.797196 kubelet[2714]: I0515 00:29:12.797133 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4284-0-0-n-019843d4bb.novalocal" podStartSLOduration=1.7971153530000001 podStartE2EDuration="1.797115353s" podCreationTimestamp="2025-05-15 00:29:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 00:29:12.781347369 +0000 UTC m=+1.256938270" watchObservedRunningTime="2025-05-15 00:29:12.797115353 +0000 UTC m=+1.272706194" May 15 00:29:12.818097 kubelet[2714]: I0515 00:29:12.818032 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4284-0-0-n-019843d4bb.novalocal" podStartSLOduration=3.818016118 podStartE2EDuration="3.818016118s" podCreationTimestamp="2025-05-15 00:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 00:29:12.798569801 +0000 UTC m=+1.274160642" watchObservedRunningTime="2025-05-15 00:29:12.818016118 +0000 UTC m=+1.293606959" May 15 00:29:15.520698 kubelet[2714]: I0515 00:29:15.520664 2714 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 15 00:29:15.521045 containerd[1483]: time="2025-05-15T00:29:15.520963621Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
May 15 00:29:15.521246 kubelet[2714]: I0515 00:29:15.521196 2714 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 15 00:29:16.347904 kubelet[2714]: I0515 00:29:16.347616 2714 status_manager.go:890] "Failed to get status for pod" podUID="6ceb1f51-4069-4918-9aa2-9b1f49374a85" pod="kube-system/kube-proxy-4nq7w" err="pods \"kube-proxy-4nq7w\" is forbidden: User \"system:node:ci-4284-0-0-n-019843d4bb.novalocal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object" May 15 00:29:16.347904 kubelet[2714]: W0515 00:29:16.347701 2714 reflector.go:569] object-"kube-system"/"kube-proxy": failed to list *v1.ConfigMap: configmaps "kube-proxy" is forbidden: User "system:node:ci-4284-0-0-n-019843d4bb.novalocal" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object May 15 00:29:16.347904 kubelet[2714]: E0515 00:29:16.347728 2714 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"kube-proxy\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-proxy\" is forbidden: User \"system:node:ci-4284-0-0-n-019843d4bb.novalocal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object" logger="UnhandledError" May 15 00:29:16.347904 kubelet[2714]: W0515 00:29:16.347768 2714 reflector.go:569] object-"kube-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-n-019843d4bb.novalocal" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object May 15 00:29:16.348104 kubelet[2714]: E0515 00:29:16.347782 2714 reflector.go:166] "Unhandled Error" err="object-\"kube-system\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4284-0-0-n-019843d4bb.novalocal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object" logger="UnhandledError" May 15 00:29:16.352898 systemd[1]: Created slice kubepods-besteffort-pod6ceb1f51_4069_4918_9aa2_9b1f49374a85.slice - libcontainer container kubepods-besteffort-pod6ceb1f51_4069_4918_9aa2_9b1f49374a85.slice. 
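The systemd unit created above, kubepods-besteffort-pod6ceb1f51_4069_4918_9aa2_9b1f49374a85.slice, is derived from the kube-proxy pod's UID 6ceb1f51-4069-4918-9aa2-9b1f49374a85: with the systemd cgroup driver the per-pod slice name is the QoS class prefix plus the UID with dashes replaced by underscores, the same pattern as the burstable slices earlier in the log. A tiny sketch of that mapping; the helper name is hypothetical, and the shape is inferred from these log lines:

    package main

    import (
        "fmt"
        "strings"
    )

    // podSliceName builds the per-pod systemd slice name as seen in the log:
    // dashes in the UID become underscores and the QoS class is part of the prefix.
    func podSliceName(qosClass, podUID string) string {
        uid := strings.ReplaceAll(podUID, "-", "_")
        return fmt.Sprintf("kubepods-%s-pod%s.slice", qosClass, uid)
    }

    func main() {
        // Reproduces kubepods-besteffort-pod6ceb1f51_4069_4918_9aa2_9b1f49374a85.slice
        fmt.Println(podSliceName("besteffort", "6ceb1f51-4069-4918-9aa2-9b1f49374a85"))
        // Reproduces kubepods-burstable-pod5d881305f7a19ccf8287a16c50f51a55.slice
        fmt.Println(podSliceName("burstable", "5d881305f7a19ccf8287a16c50f51a55"))
    }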
May 15 00:29:16.364505 kubelet[2714]: I0515 00:29:16.363479 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6ceb1f51-4069-4918-9aa2-9b1f49374a85-xtables-lock\") pod \"kube-proxy-4nq7w\" (UID: \"6ceb1f51-4069-4918-9aa2-9b1f49374a85\") " pod="kube-system/kube-proxy-4nq7w" May 15 00:29:16.364505 kubelet[2714]: I0515 00:29:16.363514 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ceb1f51-4069-4918-9aa2-9b1f49374a85-lib-modules\") pod \"kube-proxy-4nq7w\" (UID: \"6ceb1f51-4069-4918-9aa2-9b1f49374a85\") " pod="kube-system/kube-proxy-4nq7w" May 15 00:29:16.364505 kubelet[2714]: I0515 00:29:16.363537 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-whm2v\" (UniqueName: \"kubernetes.io/projected/6ceb1f51-4069-4918-9aa2-9b1f49374a85-kube-api-access-whm2v\") pod \"kube-proxy-4nq7w\" (UID: \"6ceb1f51-4069-4918-9aa2-9b1f49374a85\") " pod="kube-system/kube-proxy-4nq7w" May 15 00:29:16.364505 kubelet[2714]: I0515 00:29:16.363557 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6ceb1f51-4069-4918-9aa2-9b1f49374a85-kube-proxy\") pod \"kube-proxy-4nq7w\" (UID: \"6ceb1f51-4069-4918-9aa2-9b1f49374a85\") " pod="kube-system/kube-proxy-4nq7w" May 15 00:29:16.616952 kubelet[2714]: W0515 00:29:16.616314 2714 reflector.go:569] object-"tigera-operator"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4284-0-0-n-019843d4bb.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object May 15 00:29:16.616952 kubelet[2714]: E0515 00:29:16.616350 2714 reflector.go:166] "Unhandled Error" err="object-\"tigera-operator\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4284-0-0-n-019843d4bb.novalocal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object" logger="UnhandledError" May 15 00:29:16.616952 kubelet[2714]: W0515 00:29:16.616837 2714 reflector.go:569] object-"tigera-operator"/"kubernetes-services-endpoint": failed to list *v1.ConfigMap: configmaps "kubernetes-services-endpoint" is forbidden: User "system:node:ci-4284-0-0-n-019843d4bb.novalocal" cannot list resource "configmaps" in API group "" in the namespace "tigera-operator": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object May 15 00:29:16.616952 kubelet[2714]: E0515 00:29:16.616862 2714 reflector.go:166] "Unhandled Error" err="object-\"tigera-operator\"/\"kubernetes-services-endpoint\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kubernetes-services-endpoint\" is forbidden: User \"system:node:ci-4284-0-0-n-019843d4bb.novalocal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"tigera-operator\": no relationship found between node 'ci-4284-0-0-n-019843d4bb.novalocal' and this object" logger="UnhandledError" May 15 00:29:16.619136 systemd[1]: Created slice 
kubepods-besteffort-poda252cac1_3ace_4818_a930_a49d7f92a9b3.slice - libcontainer container kubepods-besteffort-poda252cac1_3ace_4818_a930_a49d7f92a9b3.slice. May 15 00:29:16.665648 kubelet[2714]: I0515 00:29:16.665583 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7kff\" (UniqueName: \"kubernetes.io/projected/a252cac1-3ace-4818-a930-a49d7f92a9b3-kube-api-access-x7kff\") pod \"tigera-operator-789496d6f5-bzrk9\" (UID: \"a252cac1-3ace-4818-a930-a49d7f92a9b3\") " pod="tigera-operator/tigera-operator-789496d6f5-bzrk9" May 15 00:29:16.666026 kubelet[2714]: I0515 00:29:16.665941 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a252cac1-3ace-4818-a930-a49d7f92a9b3-var-lib-calico\") pod \"tigera-operator-789496d6f5-bzrk9\" (UID: \"a252cac1-3ace-4818-a930-a49d7f92a9b3\") " pod="tigera-operator/tigera-operator-789496d6f5-bzrk9" May 15 00:29:17.311002 sudo[1754]: pam_unix(sudo:session): session closed for user root May 15 00:29:17.465217 kubelet[2714]: E0515 00:29:17.465170 2714 configmap.go:193] Couldn't get configMap kube-system/kube-proxy: failed to sync configmap cache: timed out waiting for the condition May 15 00:29:17.465461 kubelet[2714]: E0515 00:29:17.465241 2714 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6ceb1f51-4069-4918-9aa2-9b1f49374a85-kube-proxy podName:6ceb1f51-4069-4918-9aa2-9b1f49374a85 nodeName:}" failed. No retries permitted until 2025-05-15 00:29:17.965223144 +0000 UTC m=+6.440813985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-proxy" (UniqueName: "kubernetes.io/configmap/6ceb1f51-4069-4918-9aa2-9b1f49374a85-kube-proxy") pod "kube-proxy-4nq7w" (UID: "6ceb1f51-4069-4918-9aa2-9b1f49374a85") : failed to sync configmap cache: timed out waiting for the condition May 15 00:29:17.486461 kubelet[2714]: E0515 00:29:17.486249 2714 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition May 15 00:29:17.486461 kubelet[2714]: E0515 00:29:17.486351 2714 projected.go:194] Error preparing data for projected volume kube-api-access-whm2v for pod kube-system/kube-proxy-4nq7w: failed to sync configmap cache: timed out waiting for the condition May 15 00:29:17.486632 kubelet[2714]: E0515 00:29:17.486473 2714 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6ceb1f51-4069-4918-9aa2-9b1f49374a85-kube-api-access-whm2v podName:6ceb1f51-4069-4918-9aa2-9b1f49374a85 nodeName:}" failed. No retries permitted until 2025-05-15 00:29:17.986438993 +0000 UTC m=+6.462029885 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-whm2v" (UniqueName: "kubernetes.io/projected/6ceb1f51-4069-4918-9aa2-9b1f49374a85-kube-api-access-whm2v") pod "kube-proxy-4nq7w" (UID: "6ceb1f51-4069-4918-9aa2-9b1f49374a85") : failed to sync configmap cache: timed out waiting for the condition May 15 00:29:17.555311 sshd[1753]: Connection closed by 172.24.4.1 port 53434 May 15 00:29:17.555814 sshd-session[1750]: pam_unix(sshd:session): session closed for user core May 15 00:29:17.563023 systemd[1]: sshd@8-172.24.4.125:22-172.24.4.1:53434.service: Deactivated successfully. May 15 00:29:17.569133 systemd[1]: session-11.scope: Deactivated successfully. 
May 15 00:29:17.569870 systemd[1]: session-11.scope: Consumed 7.020s CPU time, 226.6M memory peak. May 15 00:29:17.574104 systemd-logind[1458]: Session 11 logged out. Waiting for processes to exit. May 15 00:29:17.592372 systemd-logind[1458]: Removed session 11. May 15 00:29:17.826081 containerd[1483]: time="2025-05-15T00:29:17.825664541Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-bzrk9,Uid:a252cac1-3ace-4818-a930-a49d7f92a9b3,Namespace:tigera-operator,Attempt:0,}" May 15 00:29:17.868114 containerd[1483]: time="2025-05-15T00:29:17.867171436Z" level=info msg="connecting to shim 3d22a60c3c019224a567ca87482dd326bd59c4c18d6c43e1d14135a008581c9f" address="unix:///run/containerd/s/728f562e834c73451fbc41d051ab1ceaab3c94dfeabc82a5f20e108088de37dc" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:17.926614 systemd[1]: Started cri-containerd-3d22a60c3c019224a567ca87482dd326bd59c4c18d6c43e1d14135a008581c9f.scope - libcontainer container 3d22a60c3c019224a567ca87482dd326bd59c4c18d6c43e1d14135a008581c9f. May 15 00:29:17.986425 containerd[1483]: time="2025-05-15T00:29:17.986376725Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-789496d6f5-bzrk9,Uid:a252cac1-3ace-4818-a930-a49d7f92a9b3,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3d22a60c3c019224a567ca87482dd326bd59c4c18d6c43e1d14135a008581c9f\"" May 15 00:29:17.989710 containerd[1483]: time="2025-05-15T00:29:17.989683913Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\"" May 15 00:29:18.162542 containerd[1483]: time="2025-05-15T00:29:18.162439075Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4nq7w,Uid:6ceb1f51-4069-4918-9aa2-9b1f49374a85,Namespace:kube-system,Attempt:0,}" May 15 00:29:18.215923 containerd[1483]: time="2025-05-15T00:29:18.215381318Z" level=info msg="connecting to shim af83dd7e1282ea1b2cbe91faf90fcb75c9bdcceb8bc841df0eaf68645bf57fc6" address="unix:///run/containerd/s/46777a83aa1026a0685a11b1762e060d4b25a91374b0de23339fc9dd94269063" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:18.265601 systemd[1]: Started cri-containerd-af83dd7e1282ea1b2cbe91faf90fcb75c9bdcceb8bc841df0eaf68645bf57fc6.scope - libcontainer container af83dd7e1282ea1b2cbe91faf90fcb75c9bdcceb8bc841df0eaf68645bf57fc6. 
May 15 00:29:18.312168 containerd[1483]: time="2025-05-15T00:29:18.312103504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-4nq7w,Uid:6ceb1f51-4069-4918-9aa2-9b1f49374a85,Namespace:kube-system,Attempt:0,} returns sandbox id \"af83dd7e1282ea1b2cbe91faf90fcb75c9bdcceb8bc841df0eaf68645bf57fc6\"" May 15 00:29:18.315911 containerd[1483]: time="2025-05-15T00:29:18.315860457Z" level=info msg="CreateContainer within sandbox \"af83dd7e1282ea1b2cbe91faf90fcb75c9bdcceb8bc841df0eaf68645bf57fc6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 15 00:29:18.332145 containerd[1483]: time="2025-05-15T00:29:18.332097641Z" level=info msg="Container cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:18.347461 containerd[1483]: time="2025-05-15T00:29:18.347418722Z" level=info msg="CreateContainer within sandbox \"af83dd7e1282ea1b2cbe91faf90fcb75c9bdcceb8bc841df0eaf68645bf57fc6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b\"" May 15 00:29:18.349368 containerd[1483]: time="2025-05-15T00:29:18.348658087Z" level=info msg="StartContainer for \"cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b\"" May 15 00:29:18.352061 containerd[1483]: time="2025-05-15T00:29:18.352009075Z" level=info msg="connecting to shim cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b" address="unix:///run/containerd/s/46777a83aa1026a0685a11b1762e060d4b25a91374b0de23339fc9dd94269063" protocol=ttrpc version=3 May 15 00:29:18.388729 systemd[1]: Started cri-containerd-cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b.scope - libcontainer container cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b. May 15 00:29:18.460029 containerd[1483]: time="2025-05-15T00:29:18.459906984Z" level=info msg="StartContainer for \"cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b\" returns successfully" May 15 00:29:18.736353 kubelet[2714]: I0515 00:29:18.736118 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-4nq7w" podStartSLOduration=2.736100356 podStartE2EDuration="2.736100356s" podCreationTimestamp="2025-05-15 00:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 00:29:18.734169727 +0000 UTC m=+7.209760628" watchObservedRunningTime="2025-05-15 00:29:18.736100356 +0000 UTC m=+7.211691197" May 15 00:29:20.003079 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount254736005.mount: Deactivated successfully. 
May 15 00:29:20.626986 containerd[1483]: time="2025-05-15T00:29:20.626916044Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.36.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:20.628602 containerd[1483]: time="2025-05-15T00:29:20.628419298Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.36.7: active requests=0, bytes read=22002662" May 15 00:29:20.630077 containerd[1483]: time="2025-05-15T00:29:20.630016365Z" level=info msg="ImageCreate event name:\"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:20.632754 containerd[1483]: time="2025-05-15T00:29:20.632710552Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:20.633465 containerd[1483]: time="2025-05-15T00:29:20.633425016Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.36.7\" with image id \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\", repo tag \"quay.io/tigera/operator:v1.36.7\", repo digest \"quay.io/tigera/operator@sha256:a4a44422d8f2a14e0aaea2031ccb5580f2bf68218c9db444450c1888743305e9\", size \"21998657\" in 2.643650307s" May 15 00:29:20.633520 containerd[1483]: time="2025-05-15T00:29:20.633465487Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.7\" returns image reference \"sha256:e9b19fa62f476f04e5840eb65a0f71b49c7b9f4ceede31675409ddc218bb5578\"" May 15 00:29:20.635905 containerd[1483]: time="2025-05-15T00:29:20.635862181Z" level=info msg="CreateContainer within sandbox \"3d22a60c3c019224a567ca87482dd326bd59c4c18d6c43e1d14135a008581c9f\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 15 00:29:20.649293 containerd[1483]: time="2025-05-15T00:29:20.647646113Z" level=info msg="Container df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:20.662156 containerd[1483]: time="2025-05-15T00:29:20.662128042Z" level=info msg="CreateContainer within sandbox \"3d22a60c3c019224a567ca87482dd326bd59c4c18d6c43e1d14135a008581c9f\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64\"" May 15 00:29:20.662905 containerd[1483]: time="2025-05-15T00:29:20.662884701Z" level=info msg="StartContainer for \"df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64\"" May 15 00:29:20.664096 containerd[1483]: time="2025-05-15T00:29:20.663740999Z" level=info msg="connecting to shim df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64" address="unix:///run/containerd/s/728f562e834c73451fbc41d051ab1ceaab3c94dfeabc82a5f20e108088de37dc" protocol=ttrpc version=3 May 15 00:29:20.686394 systemd[1]: Started cri-containerd-df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64.scope - libcontainer container df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64. 
May 15 00:29:20.715137 containerd[1483]: time="2025-05-15T00:29:20.715101378Z" level=info msg="StartContainer for \"df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64\" returns successfully" May 15 00:29:22.225891 kubelet[2714]: I0515 00:29:22.225016 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-789496d6f5-bzrk9" podStartSLOduration=3.578153794 podStartE2EDuration="6.224982393s" podCreationTimestamp="2025-05-15 00:29:16 +0000 UTC" firstStartedPulling="2025-05-15 00:29:17.987748303 +0000 UTC m=+6.463339154" lastFinishedPulling="2025-05-15 00:29:20.634576912 +0000 UTC m=+9.110167753" observedRunningTime="2025-05-15 00:29:20.746627253 +0000 UTC m=+9.222218104" watchObservedRunningTime="2025-05-15 00:29:22.224982393 +0000 UTC m=+10.700573285" May 15 00:29:24.012412 systemd[1]: Created slice kubepods-besteffort-podd6634a9b_6999_4171_9b0a_a81fffd60648.slice - libcontainer container kubepods-besteffort-podd6634a9b_6999_4171_9b0a_a81fffd60648.slice. May 15 00:29:24.019639 kubelet[2714]: I0515 00:29:24.019599 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/d6634a9b-6999-4171-9b0a-a81fffd60648-typha-certs\") pod \"calico-typha-5bf9bc7d8-p5g8z\" (UID: \"d6634a9b-6999-4171-9b0a-a81fffd60648\") " pod="calico-system/calico-typha-5bf9bc7d8-p5g8z" May 15 00:29:24.019639 kubelet[2714]: I0515 00:29:24.019642 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6634a9b-6999-4171-9b0a-a81fffd60648-tigera-ca-bundle\") pod \"calico-typha-5bf9bc7d8-p5g8z\" (UID: \"d6634a9b-6999-4171-9b0a-a81fffd60648\") " pod="calico-system/calico-typha-5bf9bc7d8-p5g8z" May 15 00:29:24.019999 kubelet[2714]: I0515 00:29:24.019664 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5hmw\" (UniqueName: \"kubernetes.io/projected/d6634a9b-6999-4171-9b0a-a81fffd60648-kube-api-access-h5hmw\") pod \"calico-typha-5bf9bc7d8-p5g8z\" (UID: \"d6634a9b-6999-4171-9b0a-a81fffd60648\") " pod="calico-system/calico-typha-5bf9bc7d8-p5g8z" May 15 00:29:24.099072 systemd[1]: Created slice kubepods-besteffort-pod2e48db25_7234_4284_adfb_6b38d52c1c68.slice - libcontainer container kubepods-besteffort-pod2e48db25_7234_4284_adfb_6b38d52c1c68.slice. 
May 15 00:29:24.119844 kubelet[2714]: I0515 00:29:24.119793 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-cni-net-dir\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.119844 kubelet[2714]: I0515 00:29:24.119835 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-cni-log-dir\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120000 kubelet[2714]: I0515 00:29:24.119869 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/2e48db25-7234-4284-adfb-6b38d52c1c68-node-certs\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120000 kubelet[2714]: I0515 00:29:24.119910 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-var-lib-calico\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120000 kubelet[2714]: I0515 00:29:24.119939 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-xtables-lock\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120000 kubelet[2714]: I0515 00:29:24.119958 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-var-run-calico\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120000 kubelet[2714]: I0515 00:29:24.119975 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-cni-bin-dir\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120140 kubelet[2714]: I0515 00:29:24.120004 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-lib-modules\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120140 kubelet[2714]: I0515 00:29:24.120023 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-policysync\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120140 kubelet[2714]: I0515 00:29:24.120040 2714 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e48db25-7234-4284-adfb-6b38d52c1c68-tigera-ca-bundle\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120140 kubelet[2714]: I0515 00:29:24.120063 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/2e48db25-7234-4284-adfb-6b38d52c1c68-flexvol-driver-host\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.120140 kubelet[2714]: I0515 00:29:24.120085 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87956\" (UniqueName: \"kubernetes.io/projected/2e48db25-7234-4284-adfb-6b38d52c1c68-kube-api-access-87956\") pod \"calico-node-z997f\" (UID: \"2e48db25-7234-4284-adfb-6b38d52c1c68\") " pod="calico-system/calico-node-z997f" May 15 00:29:24.213243 kubelet[2714]: E0515 00:29:24.212959 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:24.227642 kubelet[2714]: E0515 00:29:24.227512 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.227642 kubelet[2714]: W0515 00:29:24.227538 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.227642 kubelet[2714]: E0515 00:29:24.227557 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.228466 kubelet[2714]: E0515 00:29:24.228305 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.228466 kubelet[2714]: W0515 00:29:24.228318 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.228466 kubelet[2714]: E0515 00:29:24.228332 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.228774 kubelet[2714]: E0515 00:29:24.228619 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.228774 kubelet[2714]: W0515 00:29:24.228658 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.228774 kubelet[2714]: E0515 00:29:24.228670 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.229377 kubelet[2714]: E0515 00:29:24.229366 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.229448 kubelet[2714]: W0515 00:29:24.229437 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.229509 kubelet[2714]: E0515 00:29:24.229498 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.230406 kubelet[2714]: E0515 00:29:24.230393 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.230479 kubelet[2714]: W0515 00:29:24.230468 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.230539 kubelet[2714]: E0515 00:29:24.230528 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.230738 kubelet[2714]: E0515 00:29:24.230726 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.230864 kubelet[2714]: W0515 00:29:24.230792 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.230864 kubelet[2714]: E0515 00:29:24.230806 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.231410 kubelet[2714]: E0515 00:29:24.231326 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.231410 kubelet[2714]: W0515 00:29:24.231338 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.231410 kubelet[2714]: E0515 00:29:24.231349 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.232472 kubelet[2714]: E0515 00:29:24.232362 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.232472 kubelet[2714]: W0515 00:29:24.232373 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.232472 kubelet[2714]: E0515 00:29:24.232383 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.236574 kubelet[2714]: E0515 00:29:24.236501 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.236574 kubelet[2714]: W0515 00:29:24.236521 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.236574 kubelet[2714]: E0515 00:29:24.236539 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.259362 kubelet[2714]: E0515 00:29:24.258513 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.259362 kubelet[2714]: W0515 00:29:24.258535 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.259362 kubelet[2714]: E0515 00:29:24.259324 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.307603 kubelet[2714]: E0515 00:29:24.307492 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.307603 kubelet[2714]: W0515 00:29:24.307513 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.307603 kubelet[2714]: E0515 00:29:24.307531 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.308762 kubelet[2714]: E0515 00:29:24.308644 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.308762 kubelet[2714]: W0515 00:29:24.308661 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.308762 kubelet[2714]: E0515 00:29:24.308687 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.308971 kubelet[2714]: E0515 00:29:24.308880 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.308971 kubelet[2714]: W0515 00:29:24.308890 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.308971 kubelet[2714]: E0515 00:29:24.308899 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.309513 kubelet[2714]: E0515 00:29:24.309087 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.309513 kubelet[2714]: W0515 00:29:24.309096 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.309513 kubelet[2714]: E0515 00:29:24.309105 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.309513 kubelet[2714]: E0515 00:29:24.309296 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.309513 kubelet[2714]: W0515 00:29:24.309305 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.309513 kubelet[2714]: E0515 00:29:24.309314 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.309513 kubelet[2714]: E0515 00:29:24.309442 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.309513 kubelet[2714]: W0515 00:29:24.309451 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.309513 kubelet[2714]: E0515 00:29:24.309459 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.309844 kubelet[2714]: E0515 00:29:24.309591 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.309844 kubelet[2714]: W0515 00:29:24.309599 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.309844 kubelet[2714]: E0515 00:29:24.309607 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.309844 kubelet[2714]: E0515 00:29:24.309755 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.309844 kubelet[2714]: W0515 00:29:24.309763 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.309844 kubelet[2714]: E0515 00:29:24.309772 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.310030 kubelet[2714]: E0515 00:29:24.309915 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.310030 kubelet[2714]: W0515 00:29:24.309924 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.310030 kubelet[2714]: E0515 00:29:24.309932 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.310110 kubelet[2714]: E0515 00:29:24.310066 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.310110 kubelet[2714]: W0515 00:29:24.310075 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.310110 kubelet[2714]: E0515 00:29:24.310083 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.310281 kubelet[2714]: E0515 00:29:24.310218 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.310281 kubelet[2714]: W0515 00:29:24.310231 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.311223 kubelet[2714]: E0515 00:29:24.310241 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.311223 kubelet[2714]: E0515 00:29:24.310486 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.311223 kubelet[2714]: W0515 00:29:24.310494 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.311223 kubelet[2714]: E0515 00:29:24.310503 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.311223 kubelet[2714]: E0515 00:29:24.310639 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.311223 kubelet[2714]: W0515 00:29:24.310648 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.311223 kubelet[2714]: E0515 00:29:24.310656 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.311223 kubelet[2714]: E0515 00:29:24.310830 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.311223 kubelet[2714]: W0515 00:29:24.310839 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.311223 kubelet[2714]: E0515 00:29:24.310849 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.311649 kubelet[2714]: E0515 00:29:24.310982 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.311649 kubelet[2714]: W0515 00:29:24.310991 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.311649 kubelet[2714]: E0515 00:29:24.310999 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.311649 kubelet[2714]: E0515 00:29:24.311125 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.311649 kubelet[2714]: W0515 00:29:24.311133 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.311649 kubelet[2714]: E0515 00:29:24.311141 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.311649 kubelet[2714]: E0515 00:29:24.311319 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.311649 kubelet[2714]: W0515 00:29:24.311328 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.311649 kubelet[2714]: E0515 00:29:24.311337 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.311649 kubelet[2714]: E0515 00:29:24.311470 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.312032 kubelet[2714]: W0515 00:29:24.311478 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.312032 kubelet[2714]: E0515 00:29:24.311486 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.312032 kubelet[2714]: E0515 00:29:24.311619 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.312032 kubelet[2714]: W0515 00:29:24.311627 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.312032 kubelet[2714]: E0515 00:29:24.311635 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.312032 kubelet[2714]: E0515 00:29:24.311775 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.312032 kubelet[2714]: W0515 00:29:24.311784 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.312032 kubelet[2714]: E0515 00:29:24.311792 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.315813 containerd[1483]: time="2025-05-15T00:29:24.315779391Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bf9bc7d8-p5g8z,Uid:d6634a9b-6999-4171-9b0a-a81fffd60648,Namespace:calico-system,Attempt:0,}" May 15 00:29:24.321534 kubelet[2714]: E0515 00:29:24.321489 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.322076 kubelet[2714]: W0515 00:29:24.321613 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.322076 kubelet[2714]: E0515 00:29:24.321643 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.322076 kubelet[2714]: I0515 00:29:24.321678 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/01471fe4-93c1-4711-8898-dce9d2c2ee23-kubelet-dir\") pod \"csi-node-driver-xcq6h\" (UID: \"01471fe4-93c1-4711-8898-dce9d2c2ee23\") " pod="calico-system/csi-node-driver-xcq6h" May 15 00:29:24.322076 kubelet[2714]: E0515 00:29:24.321937 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.322076 kubelet[2714]: W0515 00:29:24.321952 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.322076 kubelet[2714]: E0515 00:29:24.321988 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.322076 kubelet[2714]: I0515 00:29:24.322009 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/01471fe4-93c1-4711-8898-dce9d2c2ee23-socket-dir\") pod \"csi-node-driver-xcq6h\" (UID: \"01471fe4-93c1-4711-8898-dce9d2c2ee23\") " pod="calico-system/csi-node-driver-xcq6h" May 15 00:29:24.322328 kubelet[2714]: E0515 00:29:24.322238 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.322328 kubelet[2714]: W0515 00:29:24.322250 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.322328 kubelet[2714]: E0515 00:29:24.322304 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.322328 kubelet[2714]: I0515 00:29:24.322324 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/01471fe4-93c1-4711-8898-dce9d2c2ee23-varrun\") pod \"csi-node-driver-xcq6h\" (UID: \"01471fe4-93c1-4711-8898-dce9d2c2ee23\") " pod="calico-system/csi-node-driver-xcq6h" May 15 00:29:24.323396 kubelet[2714]: E0515 00:29:24.322540 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.323396 kubelet[2714]: W0515 00:29:24.322553 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.323396 kubelet[2714]: E0515 00:29:24.322597 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.323396 kubelet[2714]: I0515 00:29:24.322616 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvkr8\" (UniqueName: \"kubernetes.io/projected/01471fe4-93c1-4711-8898-dce9d2c2ee23-kube-api-access-tvkr8\") pod \"csi-node-driver-xcq6h\" (UID: \"01471fe4-93c1-4711-8898-dce9d2c2ee23\") " pod="calico-system/csi-node-driver-xcq6h" May 15 00:29:24.323396 kubelet[2714]: E0515 00:29:24.322814 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.323396 kubelet[2714]: W0515 00:29:24.322829 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.323396 kubelet[2714]: E0515 00:29:24.322919 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.323396 kubelet[2714]: I0515 00:29:24.322963 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/01471fe4-93c1-4711-8898-dce9d2c2ee23-registration-dir\") pod \"csi-node-driver-xcq6h\" (UID: \"01471fe4-93c1-4711-8898-dce9d2c2ee23\") " pod="calico-system/csi-node-driver-xcq6h" May 15 00:29:24.323396 kubelet[2714]: E0515 00:29:24.323082 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.325706 kubelet[2714]: W0515 00:29:24.323091 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.325706 kubelet[2714]: E0515 00:29:24.323131 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.325706 kubelet[2714]: E0515 00:29:24.323349 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.325706 kubelet[2714]: W0515 00:29:24.323358 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.325706 kubelet[2714]: E0515 00:29:24.323514 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.325706 kubelet[2714]: E0515 00:29:24.323605 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.325706 kubelet[2714]: W0515 00:29:24.323614 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.325706 kubelet[2714]: E0515 00:29:24.323679 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.325706 kubelet[2714]: E0515 00:29:24.323887 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.325706 kubelet[2714]: W0515 00:29:24.323898 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.326026 kubelet[2714]: E0515 00:29:24.323978 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.326026 kubelet[2714]: E0515 00:29:24.324320 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.326026 kubelet[2714]: W0515 00:29:24.324330 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.326026 kubelet[2714]: E0515 00:29:24.324547 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.326026 kubelet[2714]: W0515 00:29:24.324556 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.326026 kubelet[2714]: E0515 00:29:24.324713 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.326026 kubelet[2714]: W0515 00:29:24.324723 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.326026 kubelet[2714]: E0515 00:29:24.324732 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.326026 kubelet[2714]: E0515 00:29:24.324768 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.326026 kubelet[2714]: E0515 00:29:24.324790 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.326652 kubelet[2714]: E0515 00:29:24.324932 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.326652 kubelet[2714]: W0515 00:29:24.324942 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.326652 kubelet[2714]: E0515 00:29:24.324960 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.326652 kubelet[2714]: E0515 00:29:24.325217 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.326652 kubelet[2714]: W0515 00:29:24.325226 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.326652 kubelet[2714]: E0515 00:29:24.325235 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.326652 kubelet[2714]: E0515 00:29:24.325432 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.326652 kubelet[2714]: W0515 00:29:24.325441 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.326652 kubelet[2714]: E0515 00:29:24.325449 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.353908 containerd[1483]: time="2025-05-15T00:29:24.353863828Z" level=info msg="connecting to shim ff9ba55fc08661780708d8a83dcb383ef6aaf803187efada0ba4439565291d27" address="unix:///run/containerd/s/dfa5362baf5a2892ea59dfb0b91cb6e56fed484afe8dc174119b2a8a781a52e0" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:24.390679 systemd[1]: Started cri-containerd-ff9ba55fc08661780708d8a83dcb383ef6aaf803187efada0ba4439565291d27.scope - libcontainer container ff9ba55fc08661780708d8a83dcb383ef6aaf803187efada0ba4439565291d27. May 15 00:29:24.407281 containerd[1483]: time="2025-05-15T00:29:24.406972574Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z997f,Uid:2e48db25-7234-4284-adfb-6b38d52c1c68,Namespace:calico-system,Attempt:0,}" May 15 00:29:24.423761 kubelet[2714]: E0515 00:29:24.423663 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.423761 kubelet[2714]: W0515 00:29:24.423686 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.423761 kubelet[2714]: E0515 00:29:24.423707 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.424679 kubelet[2714]: E0515 00:29:24.424505 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.424679 kubelet[2714]: W0515 00:29:24.424526 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.424679 kubelet[2714]: E0515 00:29:24.424549 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.425164 kubelet[2714]: E0515 00:29:24.424989 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.425164 kubelet[2714]: W0515 00:29:24.425001 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.425164 kubelet[2714]: E0515 00:29:24.425023 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.425547 kubelet[2714]: E0515 00:29:24.425387 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.425547 kubelet[2714]: W0515 00:29:24.425398 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.425547 kubelet[2714]: E0515 00:29:24.425420 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.426026 kubelet[2714]: E0515 00:29:24.425951 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.426026 kubelet[2714]: W0515 00:29:24.425963 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.426326 kubelet[2714]: E0515 00:29:24.426152 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.426536 kubelet[2714]: E0515 00:29:24.426512 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.426536 kubelet[2714]: W0515 00:29:24.426524 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.426948 kubelet[2714]: E0515 00:29:24.426788 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.426948 kubelet[2714]: E0515 00:29:24.426883 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.426948 kubelet[2714]: W0515 00:29:24.426891 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.427150 kubelet[2714]: E0515 00:29:24.427070 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.427287 kubelet[2714]: E0515 00:29:24.427221 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.427287 kubelet[2714]: W0515 00:29:24.427231 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.427411 kubelet[2714]: E0515 00:29:24.427322 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.427791 kubelet[2714]: E0515 00:29:24.427637 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.427791 kubelet[2714]: W0515 00:29:24.427647 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.427791 kubelet[2714]: E0515 00:29:24.427729 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.428151 kubelet[2714]: E0515 00:29:24.428043 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.428151 kubelet[2714]: W0515 00:29:24.428055 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.428151 kubelet[2714]: E0515 00:29:24.428125 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.428522 kubelet[2714]: E0515 00:29:24.428468 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.428522 kubelet[2714]: W0515 00:29:24.428479 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.428700 kubelet[2714]: E0515 00:29:24.428614 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.428993 kubelet[2714]: E0515 00:29:24.428913 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.428993 kubelet[2714]: W0515 00:29:24.428938 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.429282 kubelet[2714]: E0515 00:29:24.429114 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.429445 kubelet[2714]: E0515 00:29:24.429378 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.429445 kubelet[2714]: W0515 00:29:24.429389 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.430732 kubelet[2714]: E0515 00:29:24.429569 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.430916 kubelet[2714]: E0515 00:29:24.430835 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.430916 kubelet[2714]: W0515 00:29:24.430848 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.431087 kubelet[2714]: E0515 00:29:24.431005 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.431258 kubelet[2714]: E0515 00:29:24.431163 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.431258 kubelet[2714]: W0515 00:29:24.431173 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.431500 kubelet[2714]: E0515 00:29:24.431399 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.431670 kubelet[2714]: E0515 00:29:24.431579 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.431670 kubelet[2714]: W0515 00:29:24.431611 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.431988 kubelet[2714]: E0515 00:29:24.431794 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.432142 kubelet[2714]: E0515 00:29:24.432131 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.432253 kubelet[2714]: W0515 00:29:24.432190 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.432419 kubelet[2714]: E0515 00:29:24.432345 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.432613 kubelet[2714]: E0515 00:29:24.432542 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.432613 kubelet[2714]: W0515 00:29:24.432552 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.432846 kubelet[2714]: E0515 00:29:24.432767 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.432846 kubelet[2714]: E0515 00:29:24.432828 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.432846 kubelet[2714]: W0515 00:29:24.432835 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.433196 kubelet[2714]: E0515 00:29:24.433018 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.433196 kubelet[2714]: E0515 00:29:24.433108 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.433196 kubelet[2714]: W0515 00:29:24.433116 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.433810 kubelet[2714]: E0515 00:29:24.433698 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.434101 kubelet[2714]: E0515 00:29:24.433984 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.434101 kubelet[2714]: W0515 00:29:24.433996 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.434101 kubelet[2714]: E0515 00:29:24.434009 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.434333 kubelet[2714]: E0515 00:29:24.434321 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.436071 kubelet[2714]: W0515 00:29:24.435871 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.436071 kubelet[2714]: E0515 00:29:24.435897 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.437798 kubelet[2714]: E0515 00:29:24.436232 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.437798 kubelet[2714]: W0515 00:29:24.436242 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.437798 kubelet[2714]: E0515 00:29:24.436395 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:24.437798 kubelet[2714]: E0515 00:29:24.437243 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.437798 kubelet[2714]: W0515 00:29:24.437253 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.437798 kubelet[2714]: E0515 00:29:24.437282 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.437798 kubelet[2714]: E0515 00:29:24.437438 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.437798 kubelet[2714]: W0515 00:29:24.437446 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.437798 kubelet[2714]: E0515 00:29:24.437457 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.447908 kubelet[2714]: E0515 00:29:24.447886 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:24.448257 kubelet[2714]: W0515 00:29:24.448242 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:24.448407 kubelet[2714]: E0515 00:29:24.448392 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:24.451445 containerd[1483]: time="2025-05-15T00:29:24.451400782Z" level=info msg="connecting to shim 4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b" address="unix:///run/containerd/s/e6cccba9a92022f5e6ddb84add2330c99bc4f2d88ea7296720a2061290eecfd8" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:24.482716 systemd[1]: Started cri-containerd-4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b.scope - libcontainer container 4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b. 
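The kubelet errors repeated above come from its FlexVolume dynamic plugin prober: each directory under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ (here nodeagent~uds) is expected to hold a driver binary that kubelet execs with the single argument init, and whose stdout must be a JSON status object. Until Calico's flexvol-driver container (started further down in this log) has installed the uds binary, the exec fails ("executable file not found in $PATH"), the captured output is empty, and unmarshalling the empty string produces "unexpected end of JSON input". A minimal sketch of that call convention in Go, with the path and argument taken from the log (this is an editorial illustration, not kubelet's actual driver-call.go):

package main

import (
    "encoding/json"
    "fmt"
    "os/exec"
)

// DriverStatus mirrors the JSON object a FlexVolume driver is expected to print.
type DriverStatus struct {
    Status       string          `json:"status"`
    Message      string          `json:"message,omitempty"`
    Capabilities map[string]bool `json:"capabilities,omitempty"`
}

// probeInit runs "<driver> init" and decodes its stdout, the two steps that
// fail in the log: the exec itself, then the JSON unmarshal of empty output.
func probeInit(driver string) (*DriverStatus, error) {
    out, err := exec.Command(driver, "init").CombinedOutput()
    if err != nil {
        return nil, fmt.Errorf("driver call failed: %w, output: %q", err, string(out))
    }
    var st DriverStatus
    if err := json.Unmarshal(out, &st); err != nil {
        // An empty stdout lands here with "unexpected end of JSON input".
        return nil, fmt.Errorf("failed to unmarshal output for command init: %w", err)
    }
    return &st, nil
}

func main() {
    st, err := probeInit("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
    fmt.Println(st, err)
}

The repeated messages are noisy but not fatal, which is consistent with them no longer appearing in this log after the flexvol-driver container has run.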
May 15 00:29:24.497396 containerd[1483]: time="2025-05-15T00:29:24.496861264Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5bf9bc7d8-p5g8z,Uid:d6634a9b-6999-4171-9b0a-a81fffd60648,Namespace:calico-system,Attempt:0,} returns sandbox id \"ff9ba55fc08661780708d8a83dcb383ef6aaf803187efada0ba4439565291d27\"" May 15 00:29:24.500297 containerd[1483]: time="2025-05-15T00:29:24.500190048Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\"" May 15 00:29:24.531243 containerd[1483]: time="2025-05-15T00:29:24.531169824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-z997f,Uid:2e48db25-7234-4284-adfb-6b38d52c1c68,Namespace:calico-system,Attempt:0,} returns sandbox id \"4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b\"" May 15 00:29:25.127324 kubelet[2714]: E0515 00:29:25.124779 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.127324 kubelet[2714]: W0515 00:29:25.124822 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.127324 kubelet[2714]: E0515 00:29:25.124858 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.127324 kubelet[2714]: E0515 00:29:25.125187 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.127324 kubelet[2714]: W0515 00:29:25.125207 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.127324 kubelet[2714]: E0515 00:29:25.125228 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.127324 kubelet[2714]: E0515 00:29:25.125601 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.127324 kubelet[2714]: W0515 00:29:25.125622 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.127324 kubelet[2714]: E0515 00:29:25.125642 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.127324 kubelet[2714]: E0515 00:29:25.126014 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.128521 kubelet[2714]: W0515 00:29:25.126037 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.128521 kubelet[2714]: E0515 00:29:25.126058 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:25.128521 kubelet[2714]: E0515 00:29:25.126459 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.128521 kubelet[2714]: W0515 00:29:25.126480 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.128521 kubelet[2714]: E0515 00:29:25.126501 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.128521 kubelet[2714]: E0515 00:29:25.126802 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.128521 kubelet[2714]: W0515 00:29:25.126822 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.128521 kubelet[2714]: E0515 00:29:25.126842 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.128521 kubelet[2714]: E0515 00:29:25.127129 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.128521 kubelet[2714]: W0515 00:29:25.127152 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.129116 kubelet[2714]: E0515 00:29:25.127172 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.129116 kubelet[2714]: E0515 00:29:25.127534 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.129116 kubelet[2714]: W0515 00:29:25.127555 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.129116 kubelet[2714]: E0515 00:29:25.127575 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.129116 kubelet[2714]: E0515 00:29:25.127895 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.129116 kubelet[2714]: W0515 00:29:25.127915 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.129116 kubelet[2714]: E0515 00:29:25.127938 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:25.129116 kubelet[2714]: E0515 00:29:25.128225 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.129116 kubelet[2714]: W0515 00:29:25.128245 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.129116 kubelet[2714]: E0515 00:29:25.128319 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.131819 kubelet[2714]: E0515 00:29:25.128618 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.131819 kubelet[2714]: W0515 00:29:25.128638 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.131819 kubelet[2714]: E0515 00:29:25.128658 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.131819 kubelet[2714]: E0515 00:29:25.128949 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.131819 kubelet[2714]: W0515 00:29:25.128968 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.131819 kubelet[2714]: E0515 00:29:25.128988 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.131819 kubelet[2714]: E0515 00:29:25.129382 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.131819 kubelet[2714]: W0515 00:29:25.129409 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.131819 kubelet[2714]: E0515 00:29:25.129431 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.131819 kubelet[2714]: E0515 00:29:25.129736 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.132428 kubelet[2714]: W0515 00:29:25.129756 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.132428 kubelet[2714]: E0515 00:29:25.129775 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:25.132428 kubelet[2714]: E0515 00:29:25.130057 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:25.132428 kubelet[2714]: W0515 00:29:25.130077 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:25.132428 kubelet[2714]: E0515 00:29:25.130096 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:25.649788 kubelet[2714]: E0515 00:29:25.648712 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:27.648686 kubelet[2714]: E0515 00:29:27.648119 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:27.655084 containerd[1483]: time="2025-05-15T00:29:27.654438046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:27.656278 containerd[1483]: time="2025-05-15T00:29:27.656222201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.29.3: active requests=0, bytes read=30426870" May 15 00:29:27.657793 containerd[1483]: time="2025-05-15T00:29:27.657730368Z" level=info msg="ImageCreate event name:\"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:27.660474 containerd[1483]: time="2025-05-15T00:29:27.660430695Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:27.661187 containerd[1483]: time="2025-05-15T00:29:27.661055166Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.29.3\" with image id \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\", repo tag \"ghcr.io/flatcar/calico/typha:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:f5516aa6a78f00931d2625f3012dcf2c69d141ce41483b8d59c6ec6330a18620\", size \"31919484\" in 3.160826896s" May 15 00:29:27.661187 containerd[1483]: time="2025-05-15T00:29:27.661093253Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.3\" returns image reference \"sha256:bde24a3cb8851b59372b76b3ad78f8028d1a915ffed82c6cc6256f34e500bd3d\"" May 15 00:29:27.662770 containerd[1483]: time="2025-05-15T00:29:27.662715995Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\"" May 15 00:29:27.674310 containerd[1483]: time="2025-05-15T00:29:27.674226571Z" level=info msg="CreateContainer within sandbox \"ff9ba55fc08661780708d8a83dcb383ef6aaf803187efada0ba4439565291d27\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 15 
00:29:27.687736 containerd[1483]: time="2025-05-15T00:29:27.685887103Z" level=info msg="Container c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:27.698308 containerd[1483]: time="2025-05-15T00:29:27.697547213Z" level=info msg="CreateContainer within sandbox \"ff9ba55fc08661780708d8a83dcb383ef6aaf803187efada0ba4439565291d27\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f\"" May 15 00:29:27.698308 containerd[1483]: time="2025-05-15T00:29:27.698017166Z" level=info msg="StartContainer for \"c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f\"" May 15 00:29:27.699643 containerd[1483]: time="2025-05-15T00:29:27.699621066Z" level=info msg="connecting to shim c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f" address="unix:///run/containerd/s/dfa5362baf5a2892ea59dfb0b91cb6e56fed484afe8dc174119b2a8a781a52e0" protocol=ttrpc version=3 May 15 00:29:27.725399 systemd[1]: Started cri-containerd-c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f.scope - libcontainer container c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f. May 15 00:29:27.778795 containerd[1483]: time="2025-05-15T00:29:27.778715872Z" level=info msg="StartContainer for \"c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f\" returns successfully" May 15 00:29:28.804063 kubelet[2714]: I0515 00:29:28.803436 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5bf9bc7d8-p5g8z" podStartSLOduration=2.640609689 podStartE2EDuration="5.803405319s" podCreationTimestamp="2025-05-15 00:29:23 +0000 UTC" firstStartedPulling="2025-05-15 00:29:24.499212153 +0000 UTC m=+12.974802994" lastFinishedPulling="2025-05-15 00:29:27.662007783 +0000 UTC m=+16.137598624" observedRunningTime="2025-05-15 00:29:28.801544236 +0000 UTC m=+17.277135127" watchObservedRunningTime="2025-05-15 00:29:28.803405319 +0000 UTC m=+17.278996210" May 15 00:29:28.855777 kubelet[2714]: E0515 00:29:28.855725 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.855992 kubelet[2714]: W0515 00:29:28.855807 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.855992 kubelet[2714]: E0515 00:29:28.855845 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.856770 kubelet[2714]: E0515 00:29:28.856715 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.856770 kubelet[2714]: W0515 00:29:28.856748 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.857185 kubelet[2714]: E0515 00:29:28.856795 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:28.857668 kubelet[2714]: E0515 00:29:28.857234 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.857668 kubelet[2714]: W0515 00:29:28.857423 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.857668 kubelet[2714]: E0515 00:29:28.857445 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.858225 kubelet[2714]: E0515 00:29:28.858034 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.858225 kubelet[2714]: W0515 00:29:28.858064 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.858225 kubelet[2714]: E0515 00:29:28.858088 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.859237 kubelet[2714]: E0515 00:29:28.859185 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.859555 kubelet[2714]: W0515 00:29:28.859240 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.859555 kubelet[2714]: E0515 00:29:28.859299 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.859932 kubelet[2714]: E0515 00:29:28.859739 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.859932 kubelet[2714]: W0515 00:29:28.859807 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.859932 kubelet[2714]: E0515 00:29:28.859830 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.861374 kubelet[2714]: E0515 00:29:28.860339 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.861374 kubelet[2714]: W0515 00:29:28.860360 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.861374 kubelet[2714]: E0515 00:29:28.860383 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:28.861374 kubelet[2714]: E0515 00:29:28.860898 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.861374 kubelet[2714]: W0515 00:29:28.860920 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.861374 kubelet[2714]: E0515 00:29:28.860986 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.861781 kubelet[2714]: E0515 00:29:28.861493 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.861781 kubelet[2714]: W0515 00:29:28.861556 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.861781 kubelet[2714]: E0515 00:29:28.861577 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.862250 kubelet[2714]: E0515 00:29:28.861982 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.862250 kubelet[2714]: W0515 00:29:28.862012 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.862250 kubelet[2714]: E0515 00:29:28.862036 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.863550 kubelet[2714]: E0515 00:29:28.862613 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.863550 kubelet[2714]: W0515 00:29:28.862636 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.863550 kubelet[2714]: E0515 00:29:28.862658 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.863550 kubelet[2714]: E0515 00:29:28.863091 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.863550 kubelet[2714]: W0515 00:29:28.863113 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.863550 kubelet[2714]: E0515 00:29:28.863134 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:28.863550 kubelet[2714]: E0515 00:29:28.863534 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.864671 kubelet[2714]: W0515 00:29:28.863596 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.864671 kubelet[2714]: E0515 00:29:28.864607 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.865039 kubelet[2714]: E0515 00:29:28.864998 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.865039 kubelet[2714]: W0515 00:29:28.865027 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.865198 kubelet[2714]: E0515 00:29:28.865048 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.865699 kubelet[2714]: E0515 00:29:28.865392 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.865699 kubelet[2714]: W0515 00:29:28.865412 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.865699 kubelet[2714]: E0515 00:29:28.865433 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.866244 kubelet[2714]: E0515 00:29:28.865903 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.866244 kubelet[2714]: W0515 00:29:28.865927 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.866244 kubelet[2714]: E0515 00:29:28.865947 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.866926 kubelet[2714]: E0515 00:29:28.866405 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.866926 kubelet[2714]: W0515 00:29:28.866426 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.866926 kubelet[2714]: E0515 00:29:28.866458 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:28.866926 kubelet[2714]: E0515 00:29:28.866829 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.866926 kubelet[2714]: W0515 00:29:28.866850 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.866926 kubelet[2714]: E0515 00:29:28.866872 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.868542 kubelet[2714]: E0515 00:29:28.868234 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.868542 kubelet[2714]: W0515 00:29:28.868318 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.868542 kubelet[2714]: E0515 00:29:28.868365 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.868948 kubelet[2714]: E0515 00:29:28.868915 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.868948 kubelet[2714]: W0515 00:29:28.868940 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.869183 kubelet[2714]: E0515 00:29:28.869130 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.869420 kubelet[2714]: E0515 00:29:28.869387 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.869420 kubelet[2714]: W0515 00:29:28.869416 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.869646 kubelet[2714]: E0515 00:29:28.869605 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.870187 kubelet[2714]: E0515 00:29:28.870154 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.870187 kubelet[2714]: W0515 00:29:28.870184 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.870492 kubelet[2714]: E0515 00:29:28.870325 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:28.870785 kubelet[2714]: E0515 00:29:28.870753 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.870981 kubelet[2714]: W0515 00:29:28.870783 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.871256 kubelet[2714]: E0515 00:29:28.871028 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.871256 kubelet[2714]: E0515 00:29:28.871241 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.871468 kubelet[2714]: W0515 00:29:28.871260 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.871664 kubelet[2714]: E0515 00:29:28.871587 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.871848 kubelet[2714]: E0515 00:29:28.871812 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.871848 kubelet[2714]: W0515 00:29:28.871845 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.872075 kubelet[2714]: E0515 00:29:28.872018 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.872720 kubelet[2714]: E0515 00:29:28.872616 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.872720 kubelet[2714]: W0515 00:29:28.872651 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.872720 kubelet[2714]: E0515 00:29:28.872680 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.874007 kubelet[2714]: E0515 00:29:28.873799 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.874007 kubelet[2714]: W0515 00:29:28.873831 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.874007 kubelet[2714]: E0515 00:29:28.873869 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:28.875094 kubelet[2714]: E0515 00:29:28.874489 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.875094 kubelet[2714]: W0515 00:29:28.874514 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.875094 kubelet[2714]: E0515 00:29:28.874575 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.875094 kubelet[2714]: E0515 00:29:28.874877 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.875094 kubelet[2714]: W0515 00:29:28.874899 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.875094 kubelet[2714]: E0515 00:29:28.875003 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.875682 kubelet[2714]: E0515 00:29:28.875199 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.875682 kubelet[2714]: W0515 00:29:28.875219 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.875682 kubelet[2714]: E0515 00:29:28.875257 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.875682 kubelet[2714]: E0515 00:29:28.875626 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.875682 kubelet[2714]: W0515 00:29:28.875647 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.875682 kubelet[2714]: E0515 00:29:28.875668 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:28.876037 kubelet[2714]: E0515 00:29:28.876007 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.876037 kubelet[2714]: W0515 00:29:28.876028 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.876159 kubelet[2714]: E0515 00:29:28.876049 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 15 00:29:28.876777 kubelet[2714]: E0515 00:29:28.876744 2714 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 15 00:29:28.876777 kubelet[2714]: W0515 00:29:28.876773 2714 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 15 00:29:28.876951 kubelet[2714]: E0515 00:29:28.876794 2714 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 15 00:29:29.648056 kubelet[2714]: E0515 00:29:29.647988 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:29.746420 containerd[1483]: time="2025-05-15T00:29:29.746348587Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:29.749724 containerd[1483]: time="2025-05-15T00:29:29.749587386Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3: active requests=0, bytes read=5366937" May 15 00:29:29.751351 containerd[1483]: time="2025-05-15T00:29:29.750921929Z" level=info msg="ImageCreate event name:\"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:29.753988 containerd[1483]: time="2025-05-15T00:29:29.753965315Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:29.754640 containerd[1483]: time="2025-05-15T00:29:29.754513247Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" with image id \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:eeaa2bb4f9b1aa61adde43ce6dea95eee89291f96963548e108d9a2dfbc5edd1\", size \"6859519\" in 2.091741783s" May 15 00:29:29.754640 containerd[1483]: time="2025-05-15T00:29:29.754552857Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.3\" returns image reference \"sha256:0ceddb3add2e9955cbb604f666245e259f30b1d6683c428f8748359e83d238a5\"" May 15 00:29:29.757647 containerd[1483]: time="2025-05-15T00:29:29.757609582Z" level=info msg="CreateContainer within sandbox \"4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 15 00:29:29.769692 containerd[1483]: time="2025-05-15T00:29:29.769646434Z" level=info msg="Container 360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:29.774461 kubelet[2714]: I0515 00:29:29.774437 2714 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 00:29:29.783978 containerd[1483]: time="2025-05-15T00:29:29.783934217Z" level=info msg="CreateContainer 
within sandbox \"4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f\"" May 15 00:29:29.784405 containerd[1483]: time="2025-05-15T00:29:29.784378924Z" level=info msg="StartContainer for \"360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f\"" May 15 00:29:29.786149 containerd[1483]: time="2025-05-15T00:29:29.786103603Z" level=info msg="connecting to shim 360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f" address="unix:///run/containerd/s/e6cccba9a92022f5e6ddb84add2330c99bc4f2d88ea7296720a2061290eecfd8" protocol=ttrpc version=3 May 15 00:29:29.816465 systemd[1]: Started cri-containerd-360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f.scope - libcontainer container 360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f. May 15 00:29:29.858130 containerd[1483]: time="2025-05-15T00:29:29.858081986Z" level=info msg="StartContainer for \"360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f\" returns successfully" May 15 00:29:29.865832 systemd[1]: cri-containerd-360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f.scope: Deactivated successfully. May 15 00:29:29.870598 containerd[1483]: time="2025-05-15T00:29:29.870556069Z" level=info msg="TaskExit event in podsandbox handler container_id:\"360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f\" id:\"360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f\" pid:3373 exited_at:{seconds:1747268969 nanos:870044540}" May 15 00:29:29.870734 containerd[1483]: time="2025-05-15T00:29:29.870710460Z" level=info msg="received exit event container_id:\"360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f\" id:\"360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f\" pid:3373 exited_at:{seconds:1747268969 nanos:870044540}" May 15 00:29:29.893138 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f-rootfs.mount: Deactivated successfully. 
May 15 00:29:30.786409 containerd[1483]: time="2025-05-15T00:29:30.786326218Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\"" May 15 00:29:31.652654 kubelet[2714]: E0515 00:29:31.652576 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:33.647567 kubelet[2714]: E0515 00:29:33.647490 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:35.652908 kubelet[2714]: E0515 00:29:35.652768 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:36.672426 containerd[1483]: time="2025-05-15T00:29:36.672317318Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:36.673660 containerd[1483]: time="2025-05-15T00:29:36.673561995Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.29.3: active requests=0, bytes read=97793683" May 15 00:29:36.674813 containerd[1483]: time="2025-05-15T00:29:36.674758728Z" level=info msg="ImageCreate event name:\"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:36.677474 containerd[1483]: time="2025-05-15T00:29:36.677432682Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:36.678215 containerd[1483]: time="2025-05-15T00:29:36.678050415Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.29.3\" with image id \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\", repo tag \"ghcr.io/flatcar/calico/cni:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:4505ec8f976470994b6a94295a4dabac0cb98375db050e959a22603e00ada90b\", size \"99286305\" in 5.89042594s" May 15 00:29:36.678215 containerd[1483]: time="2025-05-15T00:29:36.678086753Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.3\" returns image reference \"sha256:a140d04be1bc987bae0a1b9159e1dcb85751c448830efbdb3494207cf602b2d9\"" May 15 00:29:36.681150 containerd[1483]: time="2025-05-15T00:29:36.680714307Z" level=info msg="CreateContainer within sandbox \"4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 15 00:29:36.690438 containerd[1483]: time="2025-05-15T00:29:36.690402053Z" level=info msg="Container 3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:36.716741 containerd[1483]: time="2025-05-15T00:29:36.716619368Z" level=info msg="CreateContainer within sandbox 
\"4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6\"" May 15 00:29:36.717378 containerd[1483]: time="2025-05-15T00:29:36.717296059Z" level=info msg="StartContainer for \"3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6\"" May 15 00:29:36.718883 containerd[1483]: time="2025-05-15T00:29:36.718851387Z" level=info msg="connecting to shim 3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6" address="unix:///run/containerd/s/e6cccba9a92022f5e6ddb84add2330c99bc4f2d88ea7296720a2061290eecfd8" protocol=ttrpc version=3 May 15 00:29:36.748445 systemd[1]: Started cri-containerd-3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6.scope - libcontainer container 3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6. May 15 00:29:36.796202 containerd[1483]: time="2025-05-15T00:29:36.796165181Z" level=info msg="StartContainer for \"3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6\" returns successfully" May 15 00:29:37.649662 kubelet[2714]: E0515 00:29:37.648007 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:37.982943 containerd[1483]: time="2025-05-15T00:29:37.982774063Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 15 00:29:37.985993 systemd[1]: cri-containerd-3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6.scope: Deactivated successfully. May 15 00:29:37.986501 systemd[1]: cri-containerd-3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6.scope: Consumed 629ms CPU time, 172.8M memory peak, 154M written to disk. May 15 00:29:37.988861 containerd[1483]: time="2025-05-15T00:29:37.988680416Z" level=info msg="received exit event container_id:\"3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6\" id:\"3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6\" pid:3434 exited_at:{seconds:1747268977 nanos:987828787}" May 15 00:29:37.989047 containerd[1483]: time="2025-05-15T00:29:37.988975239Z" level=info msg="TaskExit event in podsandbox handler container_id:\"3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6\" id:\"3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6\" pid:3434 exited_at:{seconds:1747268977 nanos:987828787}" May 15 00:29:38.018207 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6-rootfs.mount: Deactivated successfully. May 15 00:29:38.027088 kubelet[2714]: I0515 00:29:38.025525 2714 kubelet_node_status.go:502] "Fast updating node status as it just became ready" May 15 00:29:38.284321 systemd[1]: Created slice kubepods-burstable-pod9b91a9a5_905b_4114_8ed1_90b707d6b8c4.slice - libcontainer container kubepods-burstable-pod9b91a9a5_905b_4114_8ed1_90b707d6b8c4.slice. 
May 15 00:29:38.334497 systemd[1]: Created slice kubepods-burstable-podcfa3e778_0ec3_4199_9357_de8a88cbcc1d.slice - libcontainer container kubepods-burstable-podcfa3e778_0ec3_4199_9357_de8a88cbcc1d.slice. May 15 00:29:38.349717 systemd[1]: Created slice kubepods-besteffort-podfa26f67a_8f4a_4892_8475_fff6e90bd869.slice - libcontainer container kubepods-besteffort-podfa26f67a_8f4a_4892_8475_fff6e90bd869.slice. May 15 00:29:38.362284 systemd[1]: Created slice kubepods-besteffort-pod469e854e_b480_4362_b622_18e343a10570.slice - libcontainer container kubepods-besteffort-pod469e854e_b480_4362_b622_18e343a10570.slice. May 15 00:29:38.372540 systemd[1]: Created slice kubepods-besteffort-pod3efc63e2_426f_4a06_9892_124ca82668e0.slice - libcontainer container kubepods-besteffort-pod3efc63e2_426f_4a06_9892_124ca82668e0.slice. May 15 00:29:38.437493 kubelet[2714]: I0515 00:29:38.437422 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b91a9a5-905b-4114-8ed1-90b707d6b8c4-config-volume\") pod \"coredns-668d6bf9bc-psd8h\" (UID: \"9b91a9a5-905b-4114-8ed1-90b707d6b8c4\") " pod="kube-system/coredns-668d6bf9bc-psd8h" May 15 00:29:38.437493 kubelet[2714]: I0515 00:29:38.437515 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cfa3e778-0ec3-4199-9357-de8a88cbcc1d-config-volume\") pod \"coredns-668d6bf9bc-9tm6q\" (UID: \"cfa3e778-0ec3-4199-9357-de8a88cbcc1d\") " pod="kube-system/coredns-668d6bf9bc-9tm6q" May 15 00:29:38.438430 kubelet[2714]: I0515 00:29:38.437572 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/469e854e-b480-4362-b622-18e343a10570-calico-apiserver-certs\") pod \"calico-apiserver-b49976888-mrqtc\" (UID: \"469e854e-b480-4362-b622-18e343a10570\") " pod="calico-apiserver/calico-apiserver-b49976888-mrqtc" May 15 00:29:38.438430 kubelet[2714]: I0515 00:29:38.437618 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmfhl\" (UniqueName: \"kubernetes.io/projected/cfa3e778-0ec3-4199-9357-de8a88cbcc1d-kube-api-access-pmfhl\") pod \"coredns-668d6bf9bc-9tm6q\" (UID: \"cfa3e778-0ec3-4199-9357-de8a88cbcc1d\") " pod="kube-system/coredns-668d6bf9bc-9tm6q" May 15 00:29:38.438430 kubelet[2714]: I0515 00:29:38.437756 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw6lg\" (UniqueName: \"kubernetes.io/projected/fa26f67a-8f4a-4892-8475-fff6e90bd869-kube-api-access-dw6lg\") pod \"calico-kube-controllers-5ff58c486d-z25zt\" (UID: \"fa26f67a-8f4a-4892-8475-fff6e90bd869\") " pod="calico-system/calico-kube-controllers-5ff58c486d-z25zt" May 15 00:29:38.438430 kubelet[2714]: I0515 00:29:38.437897 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fa26f67a-8f4a-4892-8475-fff6e90bd869-tigera-ca-bundle\") pod \"calico-kube-controllers-5ff58c486d-z25zt\" (UID: \"fa26f67a-8f4a-4892-8475-fff6e90bd869\") " pod="calico-system/calico-kube-controllers-5ff58c486d-z25zt" May 15 00:29:38.438430 kubelet[2714]: I0515 00:29:38.437954 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj5zm\" (UniqueName: 
\"kubernetes.io/projected/469e854e-b480-4362-b622-18e343a10570-kube-api-access-xj5zm\") pod \"calico-apiserver-b49976888-mrqtc\" (UID: \"469e854e-b480-4362-b622-18e343a10570\") " pod="calico-apiserver/calico-apiserver-b49976888-mrqtc" May 15 00:29:38.438756 kubelet[2714]: I0515 00:29:38.438047 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdf84\" (UniqueName: \"kubernetes.io/projected/9b91a9a5-905b-4114-8ed1-90b707d6b8c4-kube-api-access-vdf84\") pod \"coredns-668d6bf9bc-psd8h\" (UID: \"9b91a9a5-905b-4114-8ed1-90b707d6b8c4\") " pod="kube-system/coredns-668d6bf9bc-psd8h" May 15 00:29:38.538684 kubelet[2714]: I0515 00:29:38.538631 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3efc63e2-426f-4a06-9892-124ca82668e0-calico-apiserver-certs\") pod \"calico-apiserver-b49976888-shjsb\" (UID: \"3efc63e2-426f-4a06-9892-124ca82668e0\") " pod="calico-apiserver/calico-apiserver-b49976888-shjsb" May 15 00:29:38.539939 kubelet[2714]: I0515 00:29:38.539666 2714 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttpp5\" (UniqueName: \"kubernetes.io/projected/3efc63e2-426f-4a06-9892-124ca82668e0-kube-api-access-ttpp5\") pod \"calico-apiserver-b49976888-shjsb\" (UID: \"3efc63e2-426f-4a06-9892-124ca82668e0\") " pod="calico-apiserver/calico-apiserver-b49976888-shjsb" May 15 00:29:38.659831 containerd[1483]: time="2025-05-15T00:29:38.659701204Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-psd8h,Uid:9b91a9a5-905b-4114-8ed1-90b707d6b8c4,Namespace:kube-system,Attempt:0,}" May 15 00:29:38.959984 containerd[1483]: time="2025-05-15T00:29:38.959532850Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ff58c486d-z25zt,Uid:fa26f67a-8f4a-4892-8475-fff6e90bd869,Namespace:calico-system,Attempt:0,}" May 15 00:29:38.960546 containerd[1483]: time="2025-05-15T00:29:38.960490756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-9tm6q,Uid:cfa3e778-0ec3-4199-9357-de8a88cbcc1d,Namespace:kube-system,Attempt:0,}" May 15 00:29:38.968631 containerd[1483]: time="2025-05-15T00:29:38.968570798Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b49976888-mrqtc,Uid:469e854e-b480-4362-b622-18e343a10570,Namespace:calico-apiserver,Attempt:0,}" May 15 00:29:38.976475 containerd[1483]: time="2025-05-15T00:29:38.976181138Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b49976888-shjsb,Uid:3efc63e2-426f-4a06-9892-124ca82668e0,Namespace:calico-apiserver,Attempt:0,}" May 15 00:29:39.208893 containerd[1483]: time="2025-05-15T00:29:39.207441940Z" level=error msg="Failed to destroy network for sandbox \"7a896dbdf6207ca087392e6086a25812f9cc08d305dfe6c91a55948d938703bb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.212768 containerd[1483]: time="2025-05-15T00:29:39.211173243Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ff58c486d-z25zt,Uid:fa26f67a-8f4a-4892-8475-fff6e90bd869,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"7a896dbdf6207ca087392e6086a25812f9cc08d305dfe6c91a55948d938703bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.212896 kubelet[2714]: E0515 00:29:39.212013 2714 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a896dbdf6207ca087392e6086a25812f9cc08d305dfe6c91a55948d938703bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.212896 kubelet[2714]: E0515 00:29:39.212095 2714 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a896dbdf6207ca087392e6086a25812f9cc08d305dfe6c91a55948d938703bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ff58c486d-z25zt" May 15 00:29:39.212896 kubelet[2714]: E0515 00:29:39.212118 2714 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7a896dbdf6207ca087392e6086a25812f9cc08d305dfe6c91a55948d938703bb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5ff58c486d-z25zt" May 15 00:29:39.210696 systemd[1]: run-netns-cni\x2df3bb72fc\x2d9b89\x2dfaef\x2dd87e\x2dc3bd9bba4af8.mount: Deactivated successfully. 
May 15 00:29:39.213969 kubelet[2714]: E0515 00:29:39.212163 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5ff58c486d-z25zt_calico-system(fa26f67a-8f4a-4892-8475-fff6e90bd869)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5ff58c486d-z25zt_calico-system(fa26f67a-8f4a-4892-8475-fff6e90bd869)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7a896dbdf6207ca087392e6086a25812f9cc08d305dfe6c91a55948d938703bb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5ff58c486d-z25zt" podUID="fa26f67a-8f4a-4892-8475-fff6e90bd869" May 15 00:29:39.216453 containerd[1483]: time="2025-05-15T00:29:39.215869459Z" level=error msg="Failed to destroy network for sandbox \"21a7a076be2b1966fcbf2d40b582a4f15cb049b08a766a628a0f3d2a94696cb3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.219291 containerd[1483]: time="2025-05-15T00:29:39.219209560Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b49976888-mrqtc,Uid:469e854e-b480-4362-b622-18e343a10570,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"21a7a076be2b1966fcbf2d40b582a4f15cb049b08a766a628a0f3d2a94696cb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.220571 kubelet[2714]: E0515 00:29:39.219837 2714 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21a7a076be2b1966fcbf2d40b582a4f15cb049b08a766a628a0f3d2a94696cb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.220571 kubelet[2714]: E0515 00:29:39.219923 2714 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21a7a076be2b1966fcbf2d40b582a4f15cb049b08a766a628a0f3d2a94696cb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b49976888-mrqtc" May 15 00:29:39.220571 kubelet[2714]: E0515 00:29:39.219965 2714 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"21a7a076be2b1966fcbf2d40b582a4f15cb049b08a766a628a0f3d2a94696cb3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b49976888-mrqtc" May 15 00:29:39.220752 kubelet[2714]: E0515 00:29:39.220018 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b49976888-mrqtc_calico-apiserver(469e854e-b480-4362-b622-18e343a10570)\" with 
CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b49976888-mrqtc_calico-apiserver(469e854e-b480-4362-b622-18e343a10570)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"21a7a076be2b1966fcbf2d40b582a4f15cb049b08a766a628a0f3d2a94696cb3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b49976888-mrqtc" podUID="469e854e-b480-4362-b622-18e343a10570" May 15 00:29:39.234547 containerd[1483]: time="2025-05-15T00:29:39.234484306Z" level=error msg="Failed to destroy network for sandbox \"aea411e133f6861c4362621d941244000c18dd244adeba7be2f42529ae69f161\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.236089 containerd[1483]: time="2025-05-15T00:29:39.236054335Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b49976888-shjsb,Uid:3efc63e2-426f-4a06-9892-124ca82668e0,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea411e133f6861c4362621d941244000c18dd244adeba7be2f42529ae69f161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.236590 kubelet[2714]: E0515 00:29:39.236245 2714 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea411e133f6861c4362621d941244000c18dd244adeba7be2f42529ae69f161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.236590 kubelet[2714]: E0515 00:29:39.236319 2714 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea411e133f6861c4362621d941244000c18dd244adeba7be2f42529ae69f161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b49976888-shjsb" May 15 00:29:39.236590 kubelet[2714]: E0515 00:29:39.236344 2714 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"aea411e133f6861c4362621d941244000c18dd244adeba7be2f42529ae69f161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-b49976888-shjsb" May 15 00:29:39.236878 kubelet[2714]: E0515 00:29:39.236381 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-b49976888-shjsb_calico-apiserver(3efc63e2-426f-4a06-9892-124ca82668e0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-b49976888-shjsb_calico-apiserver(3efc63e2-426f-4a06-9892-124ca82668e0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"aea411e133f6861c4362621d941244000c18dd244adeba7be2f42529ae69f161\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-b49976888-shjsb" podUID="3efc63e2-426f-4a06-9892-124ca82668e0" May 15 00:29:39.244558 containerd[1483]: time="2025-05-15T00:29:39.244477143Z" level=error msg="Failed to destroy network for sandbox \"9f814db99960bc03874140ef3bf1c377b9998be5c12270f4c41c6e24ddbc81bd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.246397 containerd[1483]: time="2025-05-15T00:29:39.246302734Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-psd8h,Uid:9b91a9a5-905b-4114-8ed1-90b707d6b8c4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f814db99960bc03874140ef3bf1c377b9998be5c12270f4c41c6e24ddbc81bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.246695 kubelet[2714]: E0515 00:29:39.246650 2714 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f814db99960bc03874140ef3bf1c377b9998be5c12270f4c41c6e24ddbc81bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.246904 kubelet[2714]: E0515 00:29:39.246881 2714 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f814db99960bc03874140ef3bf1c377b9998be5c12270f4c41c6e24ddbc81bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-psd8h" May 15 00:29:39.246967 kubelet[2714]: E0515 00:29:39.246926 2714 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9f814db99960bc03874140ef3bf1c377b9998be5c12270f4c41c6e24ddbc81bd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-psd8h" May 15 00:29:39.247598 kubelet[2714]: E0515 00:29:39.246979 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-psd8h_kube-system(9b91a9a5-905b-4114-8ed1-90b707d6b8c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-psd8h_kube-system(9b91a9a5-905b-4114-8ed1-90b707d6b8c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9f814db99960bc03874140ef3bf1c377b9998be5c12270f4c41c6e24ddbc81bd\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-psd8h" podUID="9b91a9a5-905b-4114-8ed1-90b707d6b8c4" May 15 
00:29:39.253482 containerd[1483]: time="2025-05-15T00:29:39.253442875Z" level=error msg="Failed to destroy network for sandbox \"5693f9ad4f3c00a947e589fb717c7b6e3098e2373ff12530d6a161dfe555fa13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.254993 containerd[1483]: time="2025-05-15T00:29:39.254938004Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-9tm6q,Uid:cfa3e778-0ec3-4199-9357-de8a88cbcc1d,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5693f9ad4f3c00a947e589fb717c7b6e3098e2373ff12530d6a161dfe555fa13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.255205 kubelet[2714]: E0515 00:29:39.255171 2714 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5693f9ad4f3c00a947e589fb717c7b6e3098e2373ff12530d6a161dfe555fa13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.255296 kubelet[2714]: E0515 00:29:39.255223 2714 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5693f9ad4f3c00a947e589fb717c7b6e3098e2373ff12530d6a161dfe555fa13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-9tm6q" May 15 00:29:39.255296 kubelet[2714]: E0515 00:29:39.255244 2714 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5693f9ad4f3c00a947e589fb717c7b6e3098e2373ff12530d6a161dfe555fa13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-9tm6q" May 15 00:29:39.255371 kubelet[2714]: E0515 00:29:39.255337 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-9tm6q_kube-system(cfa3e778-0ec3-4199-9357-de8a88cbcc1d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-9tm6q_kube-system(cfa3e778-0ec3-4199-9357-de8a88cbcc1d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5693f9ad4f3c00a947e589fb717c7b6e3098e2373ff12530d6a161dfe555fa13\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-9tm6q" podUID="cfa3e778-0ec3-4199-9357-de8a88cbcc1d" May 15 00:29:39.661909 systemd[1]: Created slice kubepods-besteffort-pod01471fe4_93c1_4711_8898_dce9d2c2ee23.slice - libcontainer container kubepods-besteffort-pod01471fe4_93c1_4711_8898_dce9d2c2ee23.slice. 
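The slices kubelet creates for these pods embed the pod UID with its dashes replaced by underscores, since "-" acts as a separator in systemd unit names: compare kubepods-besteffort-pod01471fe4_93c1_4711_8898_dce9d2c2ee23.slice above with podUID 01471fe4-93c1-4711-8898-dce9d2c2ee23. A small sketch of that naming pattern as it appears in this journal (not kubelet's actual cgroup-manager code):

package main

import (
	"fmt"
	"strings"
)

// sliceNameFor mirrors the pattern visible in the journal:
// kubepods-<qos>-pod<uid with dashes replaced by underscores>.slice
func sliceNameFor(qos, podUID string) string {
	return "kubepods-" + qos + "-pod" + strings.ReplaceAll(podUID, "-", "_") + ".slice"
}

func main() {
	fmt.Println(sliceNameFor("besteffort", "01471fe4-93c1-4711-8898-dce9d2c2ee23"))
	// kubepods-besteffort-pod01471fe4_93c1_4711_8898_dce9d2c2ee23.slice
}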
May 15 00:29:39.668106 containerd[1483]: time="2025-05-15T00:29:39.667999086Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xcq6h,Uid:01471fe4-93c1-4711-8898-dce9d2c2ee23,Namespace:calico-system,Attempt:0,}" May 15 00:29:39.770252 containerd[1483]: time="2025-05-15T00:29:39.769883879Z" level=error msg="Failed to destroy network for sandbox \"18794e2d4123a14161cd4be6331e76f3789b35a86af9884d33ca133f4184eb10\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.773058 containerd[1483]: time="2025-05-15T00:29:39.772825543Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xcq6h,Uid:01471fe4-93c1-4711-8898-dce9d2c2ee23,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"18794e2d4123a14161cd4be6331e76f3789b35a86af9884d33ca133f4184eb10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.773562 kubelet[2714]: E0515 00:29:39.773404 2714 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18794e2d4123a14161cd4be6331e76f3789b35a86af9884d33ca133f4184eb10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 15 00:29:39.773562 kubelet[2714]: E0515 00:29:39.773464 2714 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18794e2d4123a14161cd4be6331e76f3789b35a86af9884d33ca133f4184eb10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xcq6h" May 15 00:29:39.773562 kubelet[2714]: E0515 00:29:39.773490 2714 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18794e2d4123a14161cd4be6331e76f3789b35a86af9884d33ca133f4184eb10\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-xcq6h" May 15 00:29:39.773851 kubelet[2714]: E0515 00:29:39.773556 2714 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-xcq6h_calico-system(01471fe4-93c1-4711-8898-dce9d2c2ee23)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-xcq6h_calico-system(01471fe4-93c1-4711-8898-dce9d2c2ee23)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"18794e2d4123a14161cd4be6331e76f3789b35a86af9884d33ca133f4184eb10\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-xcq6h" podUID="01471fe4-93c1-4711-8898-dce9d2c2ee23" May 15 00:29:39.821991 containerd[1483]: time="2025-05-15T00:29:39.821853073Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node:v3.29.3\"" May 15 00:29:40.018562 systemd[1]: run-netns-cni\x2d0dbf1d48\x2dfe5c\x2deb76\x2d4b44\x2d28d41c87f21b.mount: Deactivated successfully. May 15 00:29:40.018795 systemd[1]: run-netns-cni\x2ddf787c51\x2dc756\x2d5557\x2d0c26\x2d221c5942ed5c.mount: Deactivated successfully. May 15 00:29:40.018973 systemd[1]: run-netns-cni\x2d6e3c783e\x2d7d07\x2d1d5b\x2dc2c2\x2d6745f1653e18.mount: Deactivated successfully. May 15 00:29:40.019169 systemd[1]: run-netns-cni\x2d3429a715\x2d2e78\x2d00db\x2d0849\x2d56359be91dec.mount: Deactivated successfully. May 15 00:29:43.836438 kubelet[2714]: I0515 00:29:43.835986 2714 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 00:29:48.321427 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3158644570.mount: Deactivated successfully. May 15 00:29:48.490233 containerd[1483]: time="2025-05-15T00:29:48.490139028Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:48.492245 containerd[1483]: time="2025-05-15T00:29:48.491760595Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.29.3: active requests=0, bytes read=144068748" May 15 00:29:48.494027 containerd[1483]: time="2025-05-15T00:29:48.493899298Z" level=info msg="ImageCreate event name:\"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:48.498477 containerd[1483]: time="2025-05-15T00:29:48.498332492Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:48.500144 containerd[1483]: time="2025-05-15T00:29:48.499836238Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.29.3\" with image id \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\", repo tag \"ghcr.io/flatcar/calico/node:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node@sha256:750e267b4f8217e0ca9e4107228370190d1a2499b72112ad04370ab9b4553916\", size \"144068610\" in 8.67790972s" May 15 00:29:48.500144 containerd[1483]: time="2025-05-15T00:29:48.499910871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.3\" returns image reference \"sha256:042163432abcec06b8077b24973b223a5f4cfdb35d85c3816f5d07a13d51afae\"" May 15 00:29:48.532752 containerd[1483]: time="2025-05-15T00:29:48.531864782Z" level=info msg="CreateContainer within sandbox \"4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 15 00:29:48.558350 containerd[1483]: time="2025-05-15T00:29:48.556725544Z" level=info msg="Container b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:48.577857 containerd[1483]: time="2025-05-15T00:29:48.577679543Z" level=info msg="CreateContainer within sandbox \"4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\"" May 15 00:29:48.578501 containerd[1483]: time="2025-05-15T00:29:48.578463497Z" level=info msg="StartContainer for \"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\"" May 15 00:29:48.583734 containerd[1483]: time="2025-05-15T00:29:48.583650483Z" 
level=info msg="connecting to shim b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49" address="unix:///run/containerd/s/e6cccba9a92022f5e6ddb84add2330c99bc4f2d88ea7296720a2061290eecfd8" protocol=ttrpc version=3 May 15 00:29:48.613640 systemd[1]: Started cri-containerd-b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49.scope - libcontainer container b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49. May 15 00:29:48.672241 containerd[1483]: time="2025-05-15T00:29:48.672159212Z" level=info msg="StartContainer for \"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" returns successfully" May 15 00:29:48.735442 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 15 00:29:48.735564 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. May 15 00:29:48.876280 kubelet[2714]: I0515 00:29:48.875051 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-z997f" podStartSLOduration=0.90557503 podStartE2EDuration="24.875030575s" podCreationTimestamp="2025-05-15 00:29:24 +0000 UTC" firstStartedPulling="2025-05-15 00:29:24.5323082 +0000 UTC m=+13.007899041" lastFinishedPulling="2025-05-15 00:29:48.501763695 +0000 UTC m=+36.977354586" observedRunningTime="2025-05-15 00:29:48.874603456 +0000 UTC m=+37.350194317" watchObservedRunningTime="2025-05-15 00:29:48.875030575 +0000 UTC m=+37.350621426" May 15 00:29:48.951064 containerd[1483]: time="2025-05-15T00:29:48.950834873Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"981ee75c85ec1bcbb74b7330d5d30734c549486697e2a568fa77f1336eb9ca3b\" pid:3722 exit_status:1 exited_at:{seconds:1747268988 nanos:950496036}" May 15 00:29:49.650426 containerd[1483]: time="2025-05-15T00:29:49.649861939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-9tm6q,Uid:cfa3e778-0ec3-4199-9357-de8a88cbcc1d,Namespace:kube-system,Attempt:0,}" May 15 00:29:49.863247 systemd-networkd[1398]: calif6cdc80ac8a: Link UP May 15 00:29:49.865332 systemd-networkd[1398]: calif6cdc80ac8a: Gained carrier May 15 00:29:49.890519 containerd[1483]: 2025-05-15 00:29:49.715 [INFO][3745] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 15 00:29:49.890519 containerd[1483]: 2025-05-15 00:29:49.744 [INFO][3745] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0 coredns-668d6bf9bc- kube-system cfa3e778-0ec3-4199-9357-de8a88cbcc1d 672 0 2025-05-15 00:29:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-019843d4bb.novalocal coredns-668d6bf9bc-9tm6q eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif6cdc80ac8a [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Namespace="kube-system" Pod="coredns-668d6bf9bc-9tm6q" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-" May 15 00:29:49.890519 containerd[1483]: 2025-05-15 00:29:49.744 [INFO][3745] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" 
Namespace="kube-system" Pod="coredns-668d6bf9bc-9tm6q" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" May 15 00:29:49.890519 containerd[1483]: 2025-05-15 00:29:49.777 [INFO][3758] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" HandleID="k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.797 [INFO][3758] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" HandleID="k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000305e90), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-019843d4bb.novalocal", "pod":"coredns-668d6bf9bc-9tm6q", "timestamp":"2025-05-15 00:29:49.777959424 +0000 UTC"}, Hostname:"ci-4284-0-0-n-019843d4bb.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.797 [INFO][3758] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.797 [INFO][3758] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.797 [INFO][3758] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-019843d4bb.novalocal' May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.801 [INFO][3758] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.807 [INFO][3758] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.813 [INFO][3758] ipam/ipam.go 489: Trying affinity for 192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.815 [INFO][3758] ipam/ipam.go 155: Attempting to load block cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.890839 containerd[1483]: 2025-05-15 00:29:49.818 [INFO][3758] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.891181 containerd[1483]: 2025-05-15 00:29:49.818 [INFO][3758] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.127.64/26 handle="k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.891181 containerd[1483]: 2025-05-15 00:29:49.821 [INFO][3758] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a May 15 00:29:49.891181 containerd[1483]: 2025-05-15 00:29:49.829 [INFO][3758] ipam/ipam.go 1203: Writing block in order to claim IPs 
block=192.168.127.64/26 handle="k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.891181 containerd[1483]: 2025-05-15 00:29:49.838 [INFO][3758] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.127.65/26] block=192.168.127.64/26 handle="k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.891181 containerd[1483]: 2025-05-15 00:29:49.838 [INFO][3758] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.127.65/26] handle="k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:49.891181 containerd[1483]: 2025-05-15 00:29:49.838 [INFO][3758] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 00:29:49.891181 containerd[1483]: 2025-05-15 00:29:49.838 [INFO][3758] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.127.65/26] IPv6=[] ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" HandleID="k8s-pod-network.061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" May 15 00:29:49.891522 containerd[1483]: 2025-05-15 00:29:49.843 [INFO][3745] cni-plugin/k8s.go 386: Populated endpoint ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Namespace="kube-system" Pod="coredns-668d6bf9bc-9tm6q" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cfa3e778-0ec3-4199-9357-de8a88cbcc1d", ResourceVersion:"672", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"", Pod:"coredns-668d6bf9bc-9tm6q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.127.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif6cdc80ac8a", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:49.891522 containerd[1483]: 2025-05-15 00:29:49.843 [INFO][3745] cni-plugin/k8s.go 387: Calico CNI using IPs: 
[192.168.127.65/32] ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Namespace="kube-system" Pod="coredns-668d6bf9bc-9tm6q" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" May 15 00:29:49.891522 containerd[1483]: 2025-05-15 00:29:49.843 [INFO][3745] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif6cdc80ac8a ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Namespace="kube-system" Pod="coredns-668d6bf9bc-9tm6q" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" May 15 00:29:49.891522 containerd[1483]: 2025-05-15 00:29:49.865 [INFO][3745] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Namespace="kube-system" Pod="coredns-668d6bf9bc-9tm6q" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" May 15 00:29:49.891522 containerd[1483]: 2025-05-15 00:29:49.866 [INFO][3745] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Namespace="kube-system" Pod="coredns-668d6bf9bc-9tm6q" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"cfa3e778-0ec3-4199-9357-de8a88cbcc1d", ResourceVersion:"672", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a", Pod:"coredns-668d6bf9bc-9tm6q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.127.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif6cdc80ac8a", MAC:"fe:f7:d4:26:29:ec", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:49.891522 containerd[1483]: 2025-05-15 00:29:49.887 [INFO][3745] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" Namespace="kube-system" Pod="coredns-668d6bf9bc-9tm6q" 
WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--9tm6q-eth0" May 15 00:29:49.933887 containerd[1483]: time="2025-05-15T00:29:49.933670237Z" level=info msg="connecting to shim 061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a" address="unix:///run/containerd/s/0fc262d5012e751c44989e5dfe126e541adb97b2009d585fa0bccb60d2c3cc7a" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:49.971566 systemd[1]: Started cri-containerd-061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a.scope - libcontainer container 061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a. May 15 00:29:49.972048 containerd[1483]: time="2025-05-15T00:29:49.971695450Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"62887aee7c5adfa7cd52b1cc98a15a69dfbc478ffbf06c4069126bad520c047f\" pid:3778 exit_status:1 exited_at:{seconds:1747268989 nanos:970225502}" May 15 00:29:50.017704 containerd[1483]: time="2025-05-15T00:29:50.017657329Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-9tm6q,Uid:cfa3e778-0ec3-4199-9357-de8a88cbcc1d,Namespace:kube-system,Attempt:0,} returns sandbox id \"061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a\"" May 15 00:29:50.020939 containerd[1483]: time="2025-05-15T00:29:50.020760283Z" level=info msg="CreateContainer within sandbox \"061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 00:29:50.038723 containerd[1483]: time="2025-05-15T00:29:50.038589415Z" level=info msg="Container 882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:50.043073 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1057979824.mount: Deactivated successfully. May 15 00:29:50.050561 containerd[1483]: time="2025-05-15T00:29:50.050515368Z" level=info msg="CreateContainer within sandbox \"061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3\"" May 15 00:29:50.051625 containerd[1483]: time="2025-05-15T00:29:50.051596953Z" level=info msg="StartContainer for \"882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3\"" May 15 00:29:50.052837 containerd[1483]: time="2025-05-15T00:29:50.052793253Z" level=info msg="connecting to shim 882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3" address="unix:///run/containerd/s/0fc262d5012e751c44989e5dfe126e541adb97b2009d585fa0bccb60d2c3cc7a" protocol=ttrpc version=3 May 15 00:29:50.086439 systemd[1]: Started cri-containerd-882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3.scope - libcontainer container 882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3. 
May 15 00:29:50.133146 containerd[1483]: time="2025-05-15T00:29:50.133105964Z" level=info msg="StartContainer for \"882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3\" returns successfully" May 15 00:29:50.424297 kernel: bpftool[3995]: memfd_create() called without MFD_EXEC or MFD_NOEXEC_SEAL set May 15 00:29:50.734216 systemd-networkd[1398]: vxlan.calico: Link UP May 15 00:29:50.734224 systemd-networkd[1398]: vxlan.calico: Gained carrier May 15 00:29:50.922441 systemd-networkd[1398]: calif6cdc80ac8a: Gained IPv6LL May 15 00:29:51.185566 kubelet[2714]: I0515 00:29:51.183970 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-9tm6q" podStartSLOduration=35.183942754 podStartE2EDuration="35.183942754s" podCreationTimestamp="2025-05-15 00:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 00:29:51.148403943 +0000 UTC m=+39.623994844" watchObservedRunningTime="2025-05-15 00:29:51.183942754 +0000 UTC m=+39.659533625" May 15 00:29:51.651215 containerd[1483]: time="2025-05-15T00:29:51.650708025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-psd8h,Uid:9b91a9a5-905b-4114-8ed1-90b707d6b8c4,Namespace:kube-system,Attempt:0,}" May 15 00:29:51.857187 systemd-networkd[1398]: cali228736e79e0: Link UP May 15 00:29:51.857418 systemd-networkd[1398]: cali228736e79e0: Gained carrier May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.745 [INFO][4070] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0 coredns-668d6bf9bc- kube-system 9b91a9a5-905b-4114-8ed1-90b707d6b8c4 669 0 2025-05-15 00:29:16 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4284-0-0-n-019843d4bb.novalocal coredns-668d6bf9bc-psd8h eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali228736e79e0 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Namespace="kube-system" Pod="coredns-668d6bf9bc-psd8h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.745 [INFO][4070] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Namespace="kube-system" Pod="coredns-668d6bf9bc-psd8h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.804 [INFO][4082] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" HandleID="k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.818 [INFO][4082] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" HandleID="k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" 
Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000290a90), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4284-0-0-n-019843d4bb.novalocal", "pod":"coredns-668d6bf9bc-psd8h", "timestamp":"2025-05-15 00:29:51.804319866 +0000 UTC"}, Hostname:"ci-4284-0-0-n-019843d4bb.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.818 [INFO][4082] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.818 [INFO][4082] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.818 [INFO][4082] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-019843d4bb.novalocal' May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.821 [INFO][4082] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.826 [INFO][4082] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.831 [INFO][4082] ipam/ipam.go 489: Trying affinity for 192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.834 [INFO][4082] ipam/ipam.go 155: Attempting to load block cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.836 [INFO][4082] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.836 [INFO][4082] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.127.64/26 handle="k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.838 [INFO][4082] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.843 [INFO][4082] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.127.64/26 handle="k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.852 [INFO][4082] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.127.66/26] block=192.168.127.64/26 handle="k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.852 [INFO][4082] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.127.66/26] handle="k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.853 [INFO][4082] ipam/ipam_plugin.go 374: Released host-wide 
IPAM lock. May 15 00:29:51.874700 containerd[1483]: 2025-05-15 00:29:51.853 [INFO][4082] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.127.66/26] IPv6=[] ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" HandleID="k8s-pod-network.762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" May 15 00:29:51.876755 containerd[1483]: 2025-05-15 00:29:51.854 [INFO][4070] cni-plugin/k8s.go 386: Populated endpoint ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Namespace="kube-system" Pod="coredns-668d6bf9bc-psd8h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9b91a9a5-905b-4114-8ed1-90b707d6b8c4", ResourceVersion:"669", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"", Pod:"coredns-668d6bf9bc-psd8h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.127.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali228736e79e0", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:51.876755 containerd[1483]: 2025-05-15 00:29:51.855 [INFO][4070] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.127.66/32] ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Namespace="kube-system" Pod="coredns-668d6bf9bc-psd8h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" May 15 00:29:51.876755 containerd[1483]: 2025-05-15 00:29:51.855 [INFO][4070] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali228736e79e0 ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Namespace="kube-system" Pod="coredns-668d6bf9bc-psd8h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" May 15 00:29:51.876755 containerd[1483]: 2025-05-15 00:29:51.856 [INFO][4070] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" 
Namespace="kube-system" Pod="coredns-668d6bf9bc-psd8h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" May 15 00:29:51.876755 containerd[1483]: 2025-05-15 00:29:51.856 [INFO][4070] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Namespace="kube-system" Pod="coredns-668d6bf9bc-psd8h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"9b91a9a5-905b-4114-8ed1-90b707d6b8c4", ResourceVersion:"669", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 16, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e", Pod:"coredns-668d6bf9bc-psd8h", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.127.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali228736e79e0", MAC:"4e:fe:d7:80:53:84", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:51.876755 containerd[1483]: 2025-05-15 00:29:51.871 [INFO][4070] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" Namespace="kube-system" Pod="coredns-668d6bf9bc-psd8h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-coredns--668d6bf9bc--psd8h-eth0" May 15 00:29:51.948296 containerd[1483]: time="2025-05-15T00:29:51.948150404Z" level=info msg="connecting to shim 762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e" address="unix:///run/containerd/s/8e7d3de788241abe68f3861ed45be8c05e374e45656f5dc576f81485f23f6cc6" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:52.004408 systemd[1]: Started cri-containerd-762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e.scope - libcontainer container 762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e. 
May 15 00:29:52.048423 containerd[1483]: time="2025-05-15T00:29:52.048366301Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-psd8h,Uid:9b91a9a5-905b-4114-8ed1-90b707d6b8c4,Namespace:kube-system,Attempt:0,} returns sandbox id \"762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e\"" May 15 00:29:52.051558 containerd[1483]: time="2025-05-15T00:29:52.051502114Z" level=info msg="CreateContainer within sandbox \"762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 15 00:29:52.065256 containerd[1483]: time="2025-05-15T00:29:52.064552370Z" level=info msg="Container c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:52.076615 containerd[1483]: time="2025-05-15T00:29:52.076562224Z" level=info msg="CreateContainer within sandbox \"762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e\"" May 15 00:29:52.077248 containerd[1483]: time="2025-05-15T00:29:52.077222139Z" level=info msg="StartContainer for \"c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e\"" May 15 00:29:52.078160 containerd[1483]: time="2025-05-15T00:29:52.078100210Z" level=info msg="connecting to shim c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e" address="unix:///run/containerd/s/8e7d3de788241abe68f3861ed45be8c05e374e45656f5dc576f81485f23f6cc6" protocol=ttrpc version=3 May 15 00:29:52.101426 systemd[1]: Started cri-containerd-c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e.scope - libcontainer container c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e. 
May 15 00:29:52.138462 systemd-networkd[1398]: vxlan.calico: Gained IPv6LL May 15 00:29:52.144228 containerd[1483]: time="2025-05-15T00:29:52.144187091Z" level=info msg="StartContainer for \"c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e\" returns successfully" May 15 00:29:53.159987 kubelet[2714]: I0515 00:29:53.159842 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-psd8h" podStartSLOduration=37.15954361 podStartE2EDuration="37.15954361s" podCreationTimestamp="2025-05-15 00:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-15 00:29:53.156358529 +0000 UTC m=+41.631949420" watchObservedRunningTime="2025-05-15 00:29:53.15954361 +0000 UTC m=+41.635134501" May 15 00:29:53.651183 containerd[1483]: time="2025-05-15T00:29:53.650496159Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b49976888-shjsb,Uid:3efc63e2-426f-4a06-9892-124ca82668e0,Namespace:calico-apiserver,Attempt:0,}" May 15 00:29:53.651183 containerd[1483]: time="2025-05-15T00:29:53.651129437Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ff58c486d-z25zt,Uid:fa26f67a-8f4a-4892-8475-fff6e90bd869,Namespace:calico-system,Attempt:0,}" May 15 00:29:53.676054 systemd-networkd[1398]: cali228736e79e0: Gained IPv6LL May 15 00:29:53.869018 systemd-networkd[1398]: cali7eb930aa8e9: Link UP May 15 00:29:53.869221 systemd-networkd[1398]: cali7eb930aa8e9: Gained carrier May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.767 [INFO][4183] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0 calico-apiserver-b49976888- calico-apiserver 3efc63e2-426f-4a06-9892-124ca82668e0 674 0 2025-05-15 00:29:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b49976888 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-019843d4bb.novalocal calico-apiserver-b49976888-shjsb eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali7eb930aa8e9 [] []}} ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-shjsb" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.767 [INFO][4183] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-shjsb" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.816 [INFO][4208] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" HandleID="k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.828 [INFO][4208] ipam/ipam_plugin.go 
265: Auto assigning IP ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" HandleID="k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031a1b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-019843d4bb.novalocal", "pod":"calico-apiserver-b49976888-shjsb", "timestamp":"2025-05-15 00:29:53.816741634 +0000 UTC"}, Hostname:"ci-4284-0-0-n-019843d4bb.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.828 [INFO][4208] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.828 [INFO][4208] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.828 [INFO][4208] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-019843d4bb.novalocal' May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.831 [INFO][4208] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.837 [INFO][4208] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.845 [INFO][4208] ipam/ipam.go 489: Trying affinity for 192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.847 [INFO][4208] ipam/ipam.go 155: Attempting to load block cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.849 [INFO][4208] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.849 [INFO][4208] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.127.64/26 handle="k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.851 [INFO][4208] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.856 [INFO][4208] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.127.64/26 handle="k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.863 [INFO][4208] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.127.67/26] block=192.168.127.64/26 handle="k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.863 [INFO][4208] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.127.67/26] 
handle="k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.863 [INFO][4208] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 15 00:29:53.883044 containerd[1483]: 2025-05-15 00:29:53.863 [INFO][4208] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.127.67/26] IPv6=[] ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" HandleID="k8s-pod-network.8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" May 15 00:29:53.883753 containerd[1483]: 2025-05-15 00:29:53.865 [INFO][4183] cni-plugin/k8s.go 386: Populated endpoint ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-shjsb" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0", GenerateName:"calico-apiserver-b49976888-", Namespace:"calico-apiserver", SelfLink:"", UID:"3efc63e2-426f-4a06-9892-124ca82668e0", ResourceVersion:"674", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b49976888", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"", Pod:"calico-apiserver-b49976888-shjsb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.127.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7eb930aa8e9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:53.883753 containerd[1483]: 2025-05-15 00:29:53.865 [INFO][4183] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.127.67/32] ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-shjsb" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" May 15 00:29:53.883753 containerd[1483]: 2025-05-15 00:29:53.865 [INFO][4183] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7eb930aa8e9 ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-shjsb" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" May 15 00:29:53.883753 containerd[1483]: 2025-05-15 00:29:53.868 [INFO][4183] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-shjsb" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" May 15 00:29:53.883753 containerd[1483]: 2025-05-15 00:29:53.868 [INFO][4183] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-shjsb" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0", GenerateName:"calico-apiserver-b49976888-", Namespace:"calico-apiserver", SelfLink:"", UID:"3efc63e2-426f-4a06-9892-124ca82668e0", ResourceVersion:"674", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b49976888", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d", Pod:"calico-apiserver-b49976888-shjsb", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.127.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali7eb930aa8e9", MAC:"4a:90:18:8f:0b:0d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:53.883753 containerd[1483]: 2025-05-15 00:29:53.880 [INFO][4183] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-shjsb" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--shjsb-eth0" May 15 00:29:53.924456 containerd[1483]: time="2025-05-15T00:29:53.923827045Z" level=info msg="connecting to shim 8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d" address="unix:///run/containerd/s/efadbb49bdf21660141a03b5ee0fac9d7f159de6d95f24d3b603ca1caf2b2304" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:53.959472 systemd[1]: Started cri-containerd-8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d.scope - libcontainer container 8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d. 
May 15 00:29:53.992616 systemd-networkd[1398]: cali41f785f9cd1: Link UP May 15 00:29:53.993408 systemd-networkd[1398]: cali41f785f9cd1: Gained carrier May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.798 [INFO][4191] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0 calico-kube-controllers-5ff58c486d- calico-system fa26f67a-8f4a-4892-8475-fff6e90bd869 675 0 2025-05-15 00:29:24 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5ff58c486d projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4284-0-0-n-019843d4bb.novalocal calico-kube-controllers-5ff58c486d-z25zt eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali41f785f9cd1 [] []}} ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Namespace="calico-system" Pod="calico-kube-controllers-5ff58c486d-z25zt" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.798 [INFO][4191] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Namespace="calico-system" Pod="calico-kube-controllers-5ff58c486d-z25zt" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.836 [INFO][4214] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" HandleID="k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.847 [INFO][4214] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" HandleID="k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051ee0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-019843d4bb.novalocal", "pod":"calico-kube-controllers-5ff58c486d-z25zt", "timestamp":"2025-05-15 00:29:53.836975411 +0000 UTC"}, Hostname:"ci-4284-0-0-n-019843d4bb.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.847 [INFO][4214] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.863 [INFO][4214] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.863 [INFO][4214] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-019843d4bb.novalocal' May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.934 [INFO][4214] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.952 [INFO][4214] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.960 [INFO][4214] ipam/ipam.go 489: Trying affinity for 192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.964 [INFO][4214] ipam/ipam.go 155: Attempting to load block cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.968 [INFO][4214] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.969 [INFO][4214] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.127.64/26 handle="k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.972 [INFO][4214] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3 May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.977 [INFO][4214] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.127.64/26 handle="k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.985 [INFO][4214] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.127.68/26] block=192.168.127.64/26 handle="k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.985 [INFO][4214] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.127.68/26] handle="k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.986 [INFO][4214] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
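Note: Calico serializes address assignment behind a single host-wide IPAM lock, and that shows up directly in the timestamps above: request [4214] (calico-kube-controllers) logged "About to acquire host-wide IPAM lock." at 00:29:53.847 but only acquired it at 00:29:53.863, once request [4208] (the apiserver pod) had released it. A toy calculation of that wait, with both timestamps copied from the entries above (Go standard library):

// lockwait.go - toy arithmetic, not Calico code: how long the second CNI ADD
// waited on the host-wide IPAM lock while the first one still held it.
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout for the bracketed CNI timestamps; Go accepts the fractional
	// seconds in the input even though the layout omits them.
	const layout = "2006-01-02 15:04:05"
	aboutTo, err := time.Parse(layout, "2025-05-15 00:29:53.847")
	if err != nil {
		panic(err)
	}
	acquired, err := time.Parse(layout, "2025-05-15 00:29:53.863")
	if err != nil {
		panic(err)
	}
	fmt.Println("waited for host-wide IPAM lock:", acquired.Sub(aboutTo)) // 16ms
}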
May 15 00:29:54.014693 containerd[1483]: 2025-05-15 00:29:53.986 [INFO][4214] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.127.68/26] IPv6=[] ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" HandleID="k8s-pod-network.04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" May 15 00:29:54.015880 containerd[1483]: 2025-05-15 00:29:53.988 [INFO][4191] cni-plugin/k8s.go 386: Populated endpoint ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Namespace="calico-system" Pod="calico-kube-controllers-5ff58c486d-z25zt" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0", GenerateName:"calico-kube-controllers-5ff58c486d-", Namespace:"calico-system", SelfLink:"", UID:"fa26f67a-8f4a-4892-8475-fff6e90bd869", ResourceVersion:"675", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5ff58c486d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"", Pod:"calico-kube-controllers-5ff58c486d-z25zt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.127.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali41f785f9cd1", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:54.015880 containerd[1483]: 2025-05-15 00:29:53.988 [INFO][4191] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.127.68/32] ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Namespace="calico-system" Pod="calico-kube-controllers-5ff58c486d-z25zt" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" May 15 00:29:54.015880 containerd[1483]: 2025-05-15 00:29:53.988 [INFO][4191] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali41f785f9cd1 ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Namespace="calico-system" Pod="calico-kube-controllers-5ff58c486d-z25zt" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" May 15 00:29:54.015880 containerd[1483]: 2025-05-15 00:29:53.994 [INFO][4191] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Namespace="calico-system" Pod="calico-kube-controllers-5ff58c486d-z25zt" 
WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" May 15 00:29:54.015880 containerd[1483]: 2025-05-15 00:29:53.996 [INFO][4191] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Namespace="calico-system" Pod="calico-kube-controllers-5ff58c486d-z25zt" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0", GenerateName:"calico-kube-controllers-5ff58c486d-", Namespace:"calico-system", SelfLink:"", UID:"fa26f67a-8f4a-4892-8475-fff6e90bd869", ResourceVersion:"675", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5ff58c486d", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3", Pod:"calico-kube-controllers-5ff58c486d-z25zt", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.127.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali41f785f9cd1", MAC:"82:e3:a4:5c:fa:bb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:54.015880 containerd[1483]: 2025-05-15 00:29:54.012 [INFO][4191] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" Namespace="calico-system" Pod="calico-kube-controllers-5ff58c486d-z25zt" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--kube--controllers--5ff58c486d--z25zt-eth0" May 15 00:29:54.053258 containerd[1483]: time="2025-05-15T00:29:54.049352566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b49976888-shjsb,Uid:3efc63e2-426f-4a06-9892-124ca82668e0,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d\"" May 15 00:29:54.055768 containerd[1483]: time="2025-05-15T00:29:54.055475560Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 00:29:54.062237 containerd[1483]: time="2025-05-15T00:29:54.062197680Z" level=info msg="connecting to shim 04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3" address="unix:///run/containerd/s/0537ffd9e2e7f623f37936a84f7d4d6c2a03449ba1735d31c68883a654f337ea" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:54.089418 systemd[1]: Started cri-containerd-04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3.scope - libcontainer container 
04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3. May 15 00:29:54.144396 containerd[1483]: time="2025-05-15T00:29:54.144288808Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5ff58c486d-z25zt,Uid:fa26f67a-8f4a-4892-8475-fff6e90bd869,Namespace:calico-system,Attempt:0,} returns sandbox id \"04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3\"" May 15 00:29:54.650591 containerd[1483]: time="2025-05-15T00:29:54.650510078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b49976888-mrqtc,Uid:469e854e-b480-4362-b622-18e343a10570,Namespace:calico-apiserver,Attempt:0,}" May 15 00:29:54.654222 containerd[1483]: time="2025-05-15T00:29:54.653961668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xcq6h,Uid:01471fe4-93c1-4711-8898-dce9d2c2ee23,Namespace:calico-system,Attempt:0,}" May 15 00:29:54.851912 systemd-networkd[1398]: cali0052462176a: Link UP May 15 00:29:54.852794 systemd-networkd[1398]: cali0052462176a: Gained carrier May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.755 [INFO][4350] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0 csi-node-driver- calico-system 01471fe4-93c1-4711-8898-dce9d2c2ee23 583 0 2025-05-15 00:29:24 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:5b5cc68cd5 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4284-0-0-n-019843d4bb.novalocal csi-node-driver-xcq6h eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0052462176a [] []}} ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Namespace="calico-system" Pod="csi-node-driver-xcq6h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.756 [INFO][4350] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Namespace="calico-system" Pod="csi-node-driver-xcq6h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.797 [INFO][4367] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" HandleID="k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.808 [INFO][4367] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" HandleID="k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00011c150), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4284-0-0-n-019843d4bb.novalocal", "pod":"csi-node-driver-xcq6h", "timestamp":"2025-05-15 00:29:54.796983906 +0000 UTC"}, Hostname:"ci-4284-0-0-n-019843d4bb.novalocal", 
IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.808 [INFO][4367] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.809 [INFO][4367] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.809 [INFO][4367] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-019843d4bb.novalocal' May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.812 [INFO][4367] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.819 [INFO][4367] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.825 [INFO][4367] ipam/ipam.go 489: Trying affinity for 192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.827 [INFO][4367] ipam/ipam.go 155: Attempting to load block cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.830 [INFO][4367] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.830 [INFO][4367] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.127.64/26 handle="k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.832 [INFO][4367] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.838 [INFO][4367] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.127.64/26 handle="k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.846 [INFO][4367] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.127.69/26] block=192.168.127.64/26 handle="k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.846 [INFO][4367] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.127.69/26] handle="k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.846 [INFO][4367] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 00:29:54.869695 containerd[1483]: 2025-05-15 00:29:54.846 [INFO][4367] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.127.69/26] IPv6=[] ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" HandleID="k8s-pod-network.b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" May 15 00:29:54.871133 containerd[1483]: 2025-05-15 00:29:54.849 [INFO][4350] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Namespace="calico-system" Pod="csi-node-driver-xcq6h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"01471fe4-93c1-4711-8898-dce9d2c2ee23", ResourceVersion:"583", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"", Pod:"csi-node-driver-xcq6h", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.127.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0052462176a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:54.871133 containerd[1483]: 2025-05-15 00:29:54.849 [INFO][4350] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.127.69/32] ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Namespace="calico-system" Pod="csi-node-driver-xcq6h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" May 15 00:29:54.871133 containerd[1483]: 2025-05-15 00:29:54.849 [INFO][4350] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0052462176a ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Namespace="calico-system" Pod="csi-node-driver-xcq6h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" May 15 00:29:54.871133 containerd[1483]: 2025-05-15 00:29:54.853 [INFO][4350] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Namespace="calico-system" Pod="csi-node-driver-xcq6h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" May 15 00:29:54.871133 containerd[1483]: 2025-05-15 00:29:54.854 [INFO][4350] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to 
endpoint ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Namespace="calico-system" Pod="csi-node-driver-xcq6h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"01471fe4-93c1-4711-8898-dce9d2c2ee23", ResourceVersion:"583", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"5b5cc68cd5", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd", Pod:"csi-node-driver-xcq6h", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.127.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0052462176a", MAC:"66:fd:eb:57:ae:03", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:54.871133 containerd[1483]: 2025-05-15 00:29:54.867 [INFO][4350] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" Namespace="calico-system" Pod="csi-node-driver-xcq6h" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-csi--node--driver--xcq6h-eth0" May 15 00:29:54.908222 containerd[1483]: time="2025-05-15T00:29:54.907382468Z" level=info msg="connecting to shim b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd" address="unix:///run/containerd/s/94cc3593aa191afafeff183ae2d4f377a560d4aea5899de3bad2d1083dc3e368" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:54.945627 systemd[1]: Started cri-containerd-b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd.scope - libcontainer container b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd. 
May 15 00:29:54.993057 systemd-networkd[1398]: cali24098cbb7dc: Link UP May 15 00:29:54.993308 systemd-networkd[1398]: cali24098cbb7dc: Gained carrier May 15 00:29:55.007070 containerd[1483]: time="2025-05-15T00:29:55.006966608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-xcq6h,Uid:01471fe4-93c1-4711-8898-dce9d2c2ee23,Namespace:calico-system,Attempt:0,} returns sandbox id \"b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd\"" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.758 [INFO][4341] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0 calico-apiserver-b49976888- calico-apiserver 469e854e-b480-4362-b622-18e343a10570 676 0 2025-05-15 00:29:23 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:b49976888 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4284-0-0-n-019843d4bb.novalocal calico-apiserver-b49976888-mrqtc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali24098cbb7dc [] []}} ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-mrqtc" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.758 [INFO][4341] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-mrqtc" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.806 [INFO][4369] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" HandleID="k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.817 [INFO][4369] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" HandleID="k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00031bb10), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4284-0-0-n-019843d4bb.novalocal", "pod":"calico-apiserver-b49976888-mrqtc", "timestamp":"2025-05-15 00:29:54.806016218 +0000 UTC"}, Hostname:"ci-4284-0-0-n-019843d4bb.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.817 [INFO][4369] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.846 [INFO][4369] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.847 [INFO][4369] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4284-0-0-n-019843d4bb.novalocal' May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.918 [INFO][4369] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.929 [INFO][4369] ipam/ipam.go 372: Looking up existing affinities for host host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.939 [INFO][4369] ipam/ipam.go 489: Trying affinity for 192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.945 [INFO][4369] ipam/ipam.go 155: Attempting to load block cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.950 [INFO][4369] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.127.64/26 host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.950 [INFO][4369] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.127.64/26 handle="k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.952 [INFO][4369] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.966 [INFO][4369] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.127.64/26 handle="k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.983 [INFO][4369] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.127.70/26] block=192.168.127.64/26 handle="k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.983 [INFO][4369] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.127.70/26] handle="k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" host="ci-4284-0-0-n-019843d4bb.novalocal" May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.983 [INFO][4369] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 15 00:29:55.014569 containerd[1483]: 2025-05-15 00:29:54.983 [INFO][4369] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.127.70/26] IPv6=[] ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" HandleID="k8s-pod-network.00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Workload="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" May 15 00:29:55.015574 containerd[1483]: 2025-05-15 00:29:54.986 [INFO][4341] cni-plugin/k8s.go 386: Populated endpoint ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-mrqtc" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0", GenerateName:"calico-apiserver-b49976888-", Namespace:"calico-apiserver", SelfLink:"", UID:"469e854e-b480-4362-b622-18e343a10570", ResourceVersion:"676", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b49976888", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"", Pod:"calico-apiserver-b49976888-mrqtc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.127.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali24098cbb7dc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:55.015574 containerd[1483]: 2025-05-15 00:29:54.987 [INFO][4341] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.127.70/32] ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-mrqtc" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" May 15 00:29:55.015574 containerd[1483]: 2025-05-15 00:29:54.987 [INFO][4341] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali24098cbb7dc ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-mrqtc" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" May 15 00:29:55.015574 containerd[1483]: 2025-05-15 00:29:54.990 [INFO][4341] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-mrqtc" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" May 15 00:29:55.015574 containerd[1483]: 
2025-05-15 00:29:54.992 [INFO][4341] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-mrqtc" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0", GenerateName:"calico-apiserver-b49976888-", Namespace:"calico-apiserver", SelfLink:"", UID:"469e854e-b480-4362-b622-18e343a10570", ResourceVersion:"676", Generation:0, CreationTimestamp:time.Date(2025, time.May, 15, 0, 29, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"b49976888", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4284-0-0-n-019843d4bb.novalocal", ContainerID:"00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb", Pod:"calico-apiserver-b49976888-mrqtc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.127.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali24098cbb7dc", MAC:"4e:36:f1:1b:23:fd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} May 15 00:29:55.015574 containerd[1483]: 2025-05-15 00:29:55.011 [INFO][4341] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" Namespace="calico-apiserver" Pod="calico-apiserver-b49976888-mrqtc" WorkloadEndpoint="ci--4284--0--0--n--019843d4bb.novalocal-k8s-calico--apiserver--b49976888--mrqtc-eth0" May 15 00:29:55.063527 containerd[1483]: time="2025-05-15T00:29:55.063467535Z" level=info msg="connecting to shim 00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb" address="unix:///run/containerd/s/75fbc57cd4203bb680d70c76bcaf8e9768c61c88974808c934b57f066d7c266d" namespace=k8s.io protocol=ttrpc version=3 May 15 00:29:55.099753 systemd[1]: Started cri-containerd-00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb.scope - libcontainer container 00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb. 
May 15 00:29:55.146389 systemd-networkd[1398]: cali7eb930aa8e9: Gained IPv6LL May 15 00:29:55.169517 containerd[1483]: time="2025-05-15T00:29:55.168780504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-b49976888-mrqtc,Uid:469e854e-b480-4362-b622-18e343a10570,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb\"" May 15 00:29:55.466614 systemd-networkd[1398]: cali41f785f9cd1: Gained IPv6LL May 15 00:29:56.042523 systemd-networkd[1398]: cali0052462176a: Gained IPv6LL May 15 00:29:56.618562 systemd-networkd[1398]: cali24098cbb7dc: Gained IPv6LL May 15 00:29:58.706178 containerd[1483]: time="2025-05-15T00:29:58.706138306Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:58.707340 containerd[1483]: time="2025-05-15T00:29:58.707283786Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=43021437" May 15 00:29:58.708475 containerd[1483]: time="2025-05-15T00:29:58.708420560Z" level=info msg="ImageCreate event name:\"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:58.711161 containerd[1483]: time="2025-05-15T00:29:58.711139003Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:29:58.711944 containerd[1483]: time="2025-05-15T00:29:58.711826901Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 4.656281831s" May 15 00:29:58.711944 containerd[1483]: time="2025-05-15T00:29:58.711862804Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 00:29:58.717479 containerd[1483]: time="2025-05-15T00:29:58.717443472Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\"" May 15 00:29:58.721225 containerd[1483]: time="2025-05-15T00:29:58.721184319Z" level=info msg="CreateContainer within sandbox \"8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 00:29:58.735790 containerd[1483]: time="2025-05-15T00:29:58.734196288Z" level=info msg="Container 1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2: CDI devices from CRI Config.CDIDevices: []" May 15 00:29:58.743672 containerd[1483]: time="2025-05-15T00:29:58.743642955Z" level=info msg="CreateContainer within sandbox \"8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2\"" May 15 00:29:58.744406 containerd[1483]: time="2025-05-15T00:29:58.744223677Z" level=info msg="StartContainer for \"1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2\"" May 15 00:29:58.746628 containerd[1483]: 
time="2025-05-15T00:29:58.746594279Z" level=info msg="connecting to shim 1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2" address="unix:///run/containerd/s/efadbb49bdf21660141a03b5ee0fac9d7f159de6d95f24d3b603ca1caf2b2304" protocol=ttrpc version=3 May 15 00:29:58.775437 systemd[1]: Started cri-containerd-1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2.scope - libcontainer container 1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2. May 15 00:29:58.830018 containerd[1483]: time="2025-05-15T00:29:58.829988814Z" level=info msg="StartContainer for \"1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2\" returns successfully" May 15 00:29:59.823796 kubelet[2714]: I0515 00:29:59.823713 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b49976888-shjsb" podStartSLOduration=32.161418567 podStartE2EDuration="36.823681073s" podCreationTimestamp="2025-05-15 00:29:23 +0000 UTC" firstStartedPulling="2025-05-15 00:29:54.054825239 +0000 UTC m=+42.530416080" lastFinishedPulling="2025-05-15 00:29:58.717087735 +0000 UTC m=+47.192678586" observedRunningTime="2025-05-15 00:29:59.218194908 +0000 UTC m=+47.693785869" watchObservedRunningTime="2025-05-15 00:29:59.823681073 +0000 UTC m=+48.299271914" May 15 00:30:04.235457 containerd[1483]: time="2025-05-15T00:30:04.233689801Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:04.236427 containerd[1483]: time="2025-05-15T00:30:04.235801920Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.29.3: active requests=0, bytes read=34789138" May 15 00:30:04.240029 containerd[1483]: time="2025-05-15T00:30:04.239958585Z" level=info msg="ImageCreate event name:\"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:04.257890 containerd[1483]: time="2025-05-15T00:30:04.257778456Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:04.265172 containerd[1483]: time="2025-05-15T00:30:04.265100080Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" with image id \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:feaab0197035d474845e0f8137a99a78cab274f0a3cac4d5485cf9b1bdf9ffa9\", size \"36281728\" in 5.547461066s" May 15 00:30:04.265687 containerd[1483]: time="2025-05-15T00:30:04.265589911Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.3\" returns image reference \"sha256:4e982138231b3653a012db4f21ed5e7be69afd5f553dba38cf7e88f0ed740b94\"" May 15 00:30:04.269679 containerd[1483]: time="2025-05-15T00:30:04.268963037Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\"" May 15 00:30:04.305514 containerd[1483]: time="2025-05-15T00:30:04.305454794Z" level=info msg="CreateContainer within sandbox \"04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 15 00:30:04.318673 containerd[1483]: time="2025-05-15T00:30:04.318646465Z" level=info msg="Container 
9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a: CDI devices from CRI Config.CDIDevices: []" May 15 00:30:04.325738 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3716260618.mount: Deactivated successfully. May 15 00:30:04.333843 containerd[1483]: time="2025-05-15T00:30:04.333814994Z" level=info msg="CreateContainer within sandbox \"04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\"" May 15 00:30:04.334442 containerd[1483]: time="2025-05-15T00:30:04.334424035Z" level=info msg="StartContainer for \"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\"" May 15 00:30:04.336072 containerd[1483]: time="2025-05-15T00:30:04.336020382Z" level=info msg="connecting to shim 9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a" address="unix:///run/containerd/s/0537ffd9e2e7f623f37936a84f7d4d6c2a03449ba1735d31c68883a654f337ea" protocol=ttrpc version=3 May 15 00:30:04.363413 systemd[1]: Started cri-containerd-9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a.scope - libcontainer container 9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a. May 15 00:30:04.425046 containerd[1483]: time="2025-05-15T00:30:04.424996647Z" level=info msg="StartContainer for \"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" returns successfully" May 15 00:30:05.273677 kubelet[2714]: I0515 00:30:05.272838 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-5ff58c486d-z25zt" podStartSLOduration=31.150597812 podStartE2EDuration="41.272806533s" podCreationTimestamp="2025-05-15 00:29:24 +0000 UTC" firstStartedPulling="2025-05-15 00:29:54.14572295 +0000 UTC m=+42.621313791" lastFinishedPulling="2025-05-15 00:30:04.267931611 +0000 UTC m=+52.743522512" observedRunningTime="2025-05-15 00:30:05.272666383 +0000 UTC m=+53.748257305" watchObservedRunningTime="2025-05-15 00:30:05.272806533 +0000 UTC m=+53.748397424" May 15 00:30:06.236001 kubelet[2714]: I0515 00:30:06.235605 2714 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 00:30:06.611067 containerd[1483]: time="2025-05-15T00:30:06.610910958Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:06.612856 containerd[1483]: time="2025-05-15T00:30:06.612772570Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.29.3: active requests=0, bytes read=7912898" May 15 00:30:06.614408 containerd[1483]: time="2025-05-15T00:30:06.614357448Z" level=info msg="ImageCreate event name:\"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:06.617365 containerd[1483]: time="2025-05-15T00:30:06.617316975Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:06.618659 containerd[1483]: time="2025-05-15T00:30:06.618034730Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.29.3\" with image id \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\", repo tag \"ghcr.io/flatcar/calico/csi:v3.29.3\", repo digest 
\"ghcr.io/flatcar/calico/csi@sha256:72455a36febc7c56ec8881007f4805caed5764026a0694e4f86a2503209b2d31\", size \"9405520\" in 2.34900536s" May 15 00:30:06.618659 containerd[1483]: time="2025-05-15T00:30:06.618080902Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.3\" returns image reference \"sha256:4c37db5645f4075f8b8170eea8f14e340cb13550e0a392962f1f211ded741505\"" May 15 00:30:06.619476 containerd[1483]: time="2025-05-15T00:30:06.619442434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\"" May 15 00:30:06.623104 containerd[1483]: time="2025-05-15T00:30:06.623058864Z" level=info msg="CreateContainer within sandbox \"b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 15 00:30:06.636461 containerd[1483]: time="2025-05-15T00:30:06.636428241Z" level=info msg="Container 5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879: CDI devices from CRI Config.CDIDevices: []" May 15 00:30:06.653646 containerd[1483]: time="2025-05-15T00:30:06.653609148Z" level=info msg="CreateContainer within sandbox \"b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879\"" May 15 00:30:06.656626 containerd[1483]: time="2025-05-15T00:30:06.656581049Z" level=info msg="StartContainer for \"5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879\"" May 15 00:30:06.660369 containerd[1483]: time="2025-05-15T00:30:06.660253892Z" level=info msg="connecting to shim 5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879" address="unix:///run/containerd/s/94cc3593aa191afafeff183ae2d4f377a560d4aea5899de3bad2d1083dc3e368" protocol=ttrpc version=3 May 15 00:30:06.689449 systemd[1]: Started cri-containerd-5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879.scope - libcontainer container 5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879. 
May 15 00:30:06.754556 containerd[1483]: time="2025-05-15T00:30:06.754431744Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"b5dad5e248c990fe7f634d03ae33ee7c83e11eb5e57a6453f747f92f3d425635\" pid:4632 exited_at:{seconds:1747269006 nanos:753046996}" May 15 00:30:06.795174 containerd[1483]: time="2025-05-15T00:30:06.795128372Z" level=info msg="StartContainer for \"5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879\" returns successfully" May 15 00:30:06.834939 containerd[1483]: time="2025-05-15T00:30:06.834799740Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"9341041deda13f3db8a4c6282e89415be4f27d94232e8895732d8526170eeb8c\" pid:4663 exited_at:{seconds:1747269006 nanos:834441644}" May 15 00:30:07.313901 containerd[1483]: time="2025-05-15T00:30:07.313739684Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:07.315892 containerd[1483]: time="2025-05-15T00:30:07.315680220Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.29.3: active requests=0, bytes read=77" May 15 00:30:07.321532 containerd[1483]: time="2025-05-15T00:30:07.321469494Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" with image id \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:bcb659f25f9aebaa389ed1dbb65edb39478ddf82c57d07d8da474e8cab38d77b\", size \"44514075\" in 701.977771ms" May 15 00:30:07.321700 containerd[1483]: time="2025-05-15T00:30:07.321541027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.3\" returns image reference \"sha256:b1960e792987d99ee8f3583d7354dcd25a683cf854e8f10322ca7eeb83128532\"" May 15 00:30:07.324899 containerd[1483]: time="2025-05-15T00:30:07.324718343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\"" May 15 00:30:07.330022 containerd[1483]: time="2025-05-15T00:30:07.329939773Z" level=info msg="CreateContainer within sandbox \"00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 15 00:30:07.352305 containerd[1483]: time="2025-05-15T00:30:07.348659554Z" level=info msg="Container a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b: CDI devices from CRI Config.CDIDevices: []" May 15 00:30:07.370992 containerd[1483]: time="2025-05-15T00:30:07.370887322Z" level=info msg="CreateContainer within sandbox \"00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b\"" May 15 00:30:07.373244 containerd[1483]: time="2025-05-15T00:30:07.373167638Z" level=info msg="StartContainer for \"a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b\"" May 15 00:30:07.379430 containerd[1483]: time="2025-05-15T00:30:07.379328995Z" level=info msg="connecting to shim a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b" address="unix:///run/containerd/s/75fbc57cd4203bb680d70c76bcaf8e9768c61c88974808c934b57f066d7c266d" protocol=ttrpc version=3 May 15 00:30:07.438417 systemd[1]: Started 
cri-containerd-a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b.scope - libcontainer container a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b. May 15 00:30:07.495105 containerd[1483]: time="2025-05-15T00:30:07.492591610Z" level=info msg="StartContainer for \"a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b\" returns successfully" May 15 00:30:08.282513 kubelet[2714]: I0515 00:30:08.282447 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-b49976888-mrqtc" podStartSLOduration=33.13069705 podStartE2EDuration="45.282429989s" podCreationTimestamp="2025-05-15 00:29:23 +0000 UTC" firstStartedPulling="2025-05-15 00:29:55.171863523 +0000 UTC m=+43.647454364" lastFinishedPulling="2025-05-15 00:30:07.323596412 +0000 UTC m=+55.799187303" observedRunningTime="2025-05-15 00:30:08.280458263 +0000 UTC m=+56.756049164" watchObservedRunningTime="2025-05-15 00:30:08.282429989 +0000 UTC m=+56.758020840" May 15 00:30:09.258639 kubelet[2714]: I0515 00:30:09.258559 2714 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 00:30:09.988459 containerd[1483]: time="2025-05-15T00:30:09.988407163Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:09.989946 containerd[1483]: time="2025-05-15T00:30:09.989870683Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3: active requests=0, bytes read=13991773" May 15 00:30:09.991394 containerd[1483]: time="2025-05-15T00:30:09.991345353Z" level=info msg="ImageCreate event name:\"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:09.994902 containerd[1483]: time="2025-05-15T00:30:09.994856987Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 15 00:30:09.995706 containerd[1483]: time="2025-05-15T00:30:09.995565811Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" with image id \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:3f15090a9bb45773d1fd019455ec3d3f3746f3287c35d8013e497b38d8237324\", size \"15484347\" in 2.670789702s" May 15 00:30:09.995706 containerd[1483]: time="2025-05-15T00:30:09.995604428Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.3\" returns image reference \"sha256:e909e2ccf54404290b577fbddd190d036984deed184001767f820b0dddf77fd9\"" May 15 00:30:09.999004 containerd[1483]: time="2025-05-15T00:30:09.998913898Z" level=info msg="CreateContainer within sandbox \"b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 15 00:30:10.007797 containerd[1483]: time="2025-05-15T00:30:10.007673128Z" level=info msg="Container 5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87: CDI devices from CRI Config.CDIDevices: []" May 15 00:30:10.026166 containerd[1483]: time="2025-05-15T00:30:10.026122298Z" level=info msg="CreateContainer within sandbox 
\"b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87\"" May 15 00:30:10.027595 containerd[1483]: time="2025-05-15T00:30:10.027527357Z" level=info msg="StartContainer for \"5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87\"" May 15 00:30:10.029983 containerd[1483]: time="2025-05-15T00:30:10.029827973Z" level=info msg="connecting to shim 5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87" address="unix:///run/containerd/s/94cc3593aa191afafeff183ae2d4f377a560d4aea5899de3bad2d1083dc3e368" protocol=ttrpc version=3 May 15 00:30:10.064424 systemd[1]: Started cri-containerd-5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87.scope - libcontainer container 5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87. May 15 00:30:10.195468 containerd[1483]: time="2025-05-15T00:30:10.195433092Z" level=info msg="StartContainer for \"5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87\" returns successfully" May 15 00:30:10.308498 kubelet[2714]: I0515 00:30:10.306796 2714 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-xcq6h" podStartSLOduration=31.319167509 podStartE2EDuration="46.306703483s" podCreationTimestamp="2025-05-15 00:29:24 +0000 UTC" firstStartedPulling="2025-05-15 00:29:55.009117036 +0000 UTC m=+43.484707887" lastFinishedPulling="2025-05-15 00:30:09.996653019 +0000 UTC m=+58.472243861" observedRunningTime="2025-05-15 00:30:10.30430395 +0000 UTC m=+58.779894841" watchObservedRunningTime="2025-05-15 00:30:10.306703483 +0000 UTC m=+58.782299243" May 15 00:30:10.772098 kubelet[2714]: I0515 00:30:10.771962 2714 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 15 00:30:10.772098 kubelet[2714]: I0515 00:30:10.772084 2714 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 15 00:30:20.033566 containerd[1483]: time="2025-05-15T00:30:20.033385546Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"6edbe96697811c9f8b46f427417c454ca54438d3181cc38d9ae05086b8d722ef\" pid:4768 exited_at:{seconds:1747269020 nanos:31919707}" May 15 00:30:33.902546 kubelet[2714]: I0515 00:30:33.901603 2714 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 15 00:30:36.858613 containerd[1483]: time="2025-05-15T00:30:36.858441891Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"03867e9cc7a77024bee7a4bd03b2d21216c265a840552f91a13559e35bfd32c2\" pid:4807 exited_at:{seconds:1747269036 nanos:856961185}" May 15 00:30:50.026158 containerd[1483]: time="2025-05-15T00:30:50.026110175Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"4db109c19c29033eb390f21780ca0118791a2ed11597eaa22071ea109f9b0644\" pid:4830 exited_at:{seconds:1747269050 nanos:25738559}" May 15 00:30:54.922762 containerd[1483]: time="2025-05-15T00:30:54.922364091Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"298050881ff1d169807648f40b0f33ebe34cce9c2d932cbd39665193f364d97e\" pid:4855 exited_at:{seconds:1747269054 nanos:921410065}" May 15 00:31:06.870792 containerd[1483]: time="2025-05-15T00:31:06.870582936Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"f340b017b802bb1ed20c0dd9c40531292683f683b9148e0298d3d42e6094b08a\" pid:4875 exited_at:{seconds:1747269066 nanos:867451747}" May 15 00:31:20.001823 containerd[1483]: time="2025-05-15T00:31:20.001754343Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"2921e7a0dec1084ed93fc0860ef6901a60543e91392701dc99c8f48d72c65703\" pid:4909 exited_at:{seconds:1747269080 nanos:1320791}" May 15 00:31:36.887527 containerd[1483]: time="2025-05-15T00:31:36.887386845Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"507c458554ed714a210fb705c566a71858605b6bde9227d0710ecc0ebc1675fb\" pid:4952 exited_at:{seconds:1747269096 nanos:886459765}" May 15 00:31:49.992866 containerd[1483]: time="2025-05-15T00:31:49.992609938Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"660fa162c790f2dcccb53092b626564d0d295ec27ac27eae5830ffabeec17f61\" pid:4977 exited_at:{seconds:1747269109 nanos:992029695}" May 15 00:31:54.842676 containerd[1483]: time="2025-05-15T00:31:54.842620725Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"bfc83a743e829b869c1686c4d4e9c6d90e32416305413d9e98fc6446cc122723\" pid:5002 exited_at:{seconds:1747269114 nanos:841599496}" May 15 00:32:06.867867 containerd[1483]: time="2025-05-15T00:32:06.867768178Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"6394be9cd90faf752b69a7bfaef58fa36848c397e122a242da66d3ca0fbf7c36\" pid:5026 exited_at:{seconds:1747269126 nanos:867046207}" May 15 00:32:20.020366 containerd[1483]: time="2025-05-15T00:32:20.020051510Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"060da373f9e4936b8b679f530a5533161868d6dc7c55bb15ac55a631addba48e\" pid:5052 exited_at:{seconds:1747269140 nanos:19398161}" May 15 00:32:36.869778 containerd[1483]: time="2025-05-15T00:32:36.869550633Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"ad756df8e7d5beea8def1f681143166885e9f2fa177e6b88043a23b7f2866c71\" pid:5084 exited_at:{seconds:1747269156 nanos:868779905}" May 15 00:32:49.992165 containerd[1483]: time="2025-05-15T00:32:49.991866556Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"8c4e1702f6c335047ed671cfbd4b443f770aeeac715637c4e62a1ed76c97eacb\" pid:5107 exited_at:{seconds:1747269169 nanos:991137779}" May 15 00:32:54.816583 containerd[1483]: time="2025-05-15T00:32:54.816507689Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" 
id:\"8e88cbb9ff4f816fa5f93b9a5483c890892db6398e5a98404ae3fde707ad0c65\" pid:5132 exited_at:{seconds:1747269174 nanos:815804904}" May 15 00:33:06.874114 containerd[1483]: time="2025-05-15T00:33:06.873784354Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"4e97c48eac6b2b0e6c4483a405540a0c5df5db915861739fa09b8e75cfa641a9\" pid:5169 exited_at:{seconds:1747269186 nanos:872968738}" May 15 00:33:20.010115 containerd[1483]: time="2025-05-15T00:33:20.009920267Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"7fc801fd29a25dfceab2624d4fc55264c73cbbb29e0a9b11c1b07b2350ec11fc\" pid:5199 exited_at:{seconds:1747269200 nanos:9501353}" May 15 00:33:36.867575 containerd[1483]: time="2025-05-15T00:33:36.867477955Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"029091a529dda74c1fa36ac57951d67b306032d1169c68a59dac91613b4e1506\" pid:5226 exited_at:{seconds:1747269216 nanos:865506018}" May 15 00:33:40.682234 systemd[1]: Started sshd@9-172.24.4.125:22-172.24.4.1:35546.service - OpenSSH per-connection server daemon (172.24.4.1:35546). May 15 00:33:41.993061 sshd[5239]: Accepted publickey for core from 172.24.4.1 port 35546 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:33:41.998759 sshd-session[5239]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:33:42.023727 systemd-logind[1458]: New session 12 of user core. May 15 00:33:42.033526 systemd[1]: Started session-12.scope - Session 12 of User core. May 15 00:33:42.892421 sshd[5241]: Connection closed by 172.24.4.1 port 35546 May 15 00:33:42.891616 sshd-session[5239]: pam_unix(sshd:session): session closed for user core May 15 00:33:42.899827 systemd-logind[1458]: Session 12 logged out. Waiting for processes to exit. May 15 00:33:42.900889 systemd[1]: sshd@9-172.24.4.125:22-172.24.4.1:35546.service: Deactivated successfully. May 15 00:33:42.909517 systemd[1]: session-12.scope: Deactivated successfully. May 15 00:33:42.915755 systemd-logind[1458]: Removed session 12. May 15 00:33:47.919457 systemd[1]: Started sshd@10-172.24.4.125:22-172.24.4.1:34936.service - OpenSSH per-connection server daemon (172.24.4.1:34936). May 15 00:33:49.175331 sshd[5254]: Accepted publickey for core from 172.24.4.1 port 34936 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:33:49.178828 sshd-session[5254]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:33:49.192808 systemd-logind[1458]: New session 13 of user core. May 15 00:33:49.201654 systemd[1]: Started session-13.scope - Session 13 of User core. May 15 00:33:49.988982 sshd[5258]: Connection closed by 172.24.4.1 port 34936 May 15 00:33:49.988831 sshd-session[5254]: pam_unix(sshd:session): session closed for user core May 15 00:33:49.994424 systemd-logind[1458]: Session 13 logged out. Waiting for processes to exit. May 15 00:33:49.994805 systemd[1]: sshd@10-172.24.4.125:22-172.24.4.1:34936.service: Deactivated successfully. May 15 00:33:49.998868 systemd[1]: session-13.scope: Deactivated successfully. May 15 00:33:50.002198 systemd-logind[1458]: Removed session 13. 
May 15 00:33:50.013291 containerd[1483]: time="2025-05-15T00:33:50.012113884Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"90a385182ad7f9260ec3de99c9448f5a035aa1ba9ffb5424a79c5a771fff418a\" pid:5278 exited_at:{seconds:1747269230 nanos:11686135}" May 15 00:33:54.836812 containerd[1483]: time="2025-05-15T00:33:54.836285904Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"13e31a84ca1bcd138e25a467549a1101bad7185c1a03cce024907d0d6b94bd91\" pid:5307 exited_at:{seconds:1747269234 nanos:835483072}" May 15 00:33:55.018859 systemd[1]: Started sshd@11-172.24.4.125:22-172.24.4.1:50764.service - OpenSSH per-connection server daemon (172.24.4.1:50764). May 15 00:33:56.160634 sshd[5317]: Accepted publickey for core from 172.24.4.1 port 50764 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:33:56.164921 sshd-session[5317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:33:56.186225 systemd-logind[1458]: New session 14 of user core. May 15 00:33:56.196682 systemd[1]: Started session-14.scope - Session 14 of User core. May 15 00:33:56.982147 sshd[5319]: Connection closed by 172.24.4.1 port 50764 May 15 00:33:56.981853 sshd-session[5317]: pam_unix(sshd:session): session closed for user core May 15 00:33:56.989833 systemd[1]: sshd@11-172.24.4.125:22-172.24.4.1:50764.service: Deactivated successfully. May 15 00:33:56.996881 systemd[1]: session-14.scope: Deactivated successfully. May 15 00:33:57.001473 systemd-logind[1458]: Session 14 logged out. Waiting for processes to exit. May 15 00:33:57.004049 systemd-logind[1458]: Removed session 14. May 15 00:34:02.015619 systemd[1]: Started sshd@12-172.24.4.125:22-172.24.4.1:50774.service - OpenSSH per-connection server daemon (172.24.4.1:50774). May 15 00:34:02.990945 sshd[5332]: Accepted publickey for core from 172.24.4.1 port 50774 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:02.995375 sshd-session[5332]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:03.010761 systemd-logind[1458]: New session 15 of user core. May 15 00:34:03.019674 systemd[1]: Started session-15.scope - Session 15 of User core. May 15 00:34:03.781328 sshd[5334]: Connection closed by 172.24.4.1 port 50774 May 15 00:34:03.782899 sshd-session[5332]: pam_unix(sshd:session): session closed for user core May 15 00:34:03.793930 systemd[1]: sshd@12-172.24.4.125:22-172.24.4.1:50774.service: Deactivated successfully. May 15 00:34:03.804612 systemd[1]: session-15.scope: Deactivated successfully. May 15 00:34:03.814176 systemd-logind[1458]: Session 15 logged out. Waiting for processes to exit. May 15 00:34:03.818102 systemd-logind[1458]: Removed session 15. 
May 15 00:34:06.079734 containerd[1483]: time="2025-05-15T00:34:06.079133176Z" level=warning msg="container event discarded" container=bdd42a5ed63e83d130e3358342eff594062a6ce949f3217dba53b3de4b4077fa type=CONTAINER_CREATED_EVENT May 15 00:34:06.079734 containerd[1483]: time="2025-05-15T00:34:06.079642906Z" level=warning msg="container event discarded" container=bdd42a5ed63e83d130e3358342eff594062a6ce949f3217dba53b3de4b4077fa type=CONTAINER_STARTED_EVENT May 15 00:34:06.093284 containerd[1483]: time="2025-05-15T00:34:06.093110448Z" level=warning msg="container event discarded" container=4278ff26405fb2398c648f7515e8ba8501d7632605002e481b9e93549f6a15d8 type=CONTAINER_CREATED_EVENT May 15 00:34:06.093549 containerd[1483]: time="2025-05-15T00:34:06.093235235Z" level=warning msg="container event discarded" container=4278ff26405fb2398c648f7515e8ba8501d7632605002e481b9e93549f6a15d8 type=CONTAINER_STARTED_EVENT May 15 00:34:06.109005 containerd[1483]: time="2025-05-15T00:34:06.108823421Z" level=warning msg="container event discarded" container=185ef8c4e3c8aaef3e6b5324797bd12274f780dd8f27f707189d2b8ebc4f4429 type=CONTAINER_CREATED_EVENT May 15 00:34:06.109005 containerd[1483]: time="2025-05-15T00:34:06.108911949Z" level=warning msg="container event discarded" container=185ef8c4e3c8aaef3e6b5324797bd12274f780dd8f27f707189d2b8ebc4f4429 type=CONTAINER_STARTED_EVENT May 15 00:34:06.144446 containerd[1483]: time="2025-05-15T00:34:06.144303196Z" level=warning msg="container event discarded" container=0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f type=CONTAINER_CREATED_EVENT May 15 00:34:06.144446 containerd[1483]: time="2025-05-15T00:34:06.144414588Z" level=warning msg="container event discarded" container=98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a type=CONTAINER_CREATED_EVENT May 15 00:34:06.144784 containerd[1483]: time="2025-05-15T00:34:06.144459232Z" level=warning msg="container event discarded" container=f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed type=CONTAINER_CREATED_EVENT May 15 00:34:06.275039 containerd[1483]: time="2025-05-15T00:34:06.274929180Z" level=warning msg="container event discarded" container=f3e066b542951fe1e9b13c353be595709f8d47523890512e67c40d4af8eed9ed type=CONTAINER_STARTED_EVENT May 15 00:34:06.299426 containerd[1483]: time="2025-05-15T00:34:06.299310558Z" level=warning msg="container event discarded" container=0ba26f0127c106ee8d0aa370bdaa76dc7fdb9b2c6a6f2f68f80341589f7f5c9f type=CONTAINER_STARTED_EVENT May 15 00:34:06.314154 containerd[1483]: time="2025-05-15T00:34:06.313961922Z" level=warning msg="container event discarded" container=98a83c8962ed0f31c3316c2f191b2602695aacabcebdfdf660edf8358ec62a6a type=CONTAINER_STARTED_EVENT May 15 00:34:06.877501 containerd[1483]: time="2025-05-15T00:34:06.877219021Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"2625f75c4c3fc5dbb123c13ebf43f6d66f0e37b3b31009f8a628a0874b69f667\" pid:5358 exited_at:{seconds:1747269246 nanos:876142153}" May 15 00:34:08.811248 systemd[1]: Started sshd@13-172.24.4.125:22-172.24.4.1:33034.service - OpenSSH per-connection server daemon (172.24.4.1:33034). 
May 15 00:34:10.088455 sshd[5368]: Accepted publickey for core from 172.24.4.1 port 33034 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:10.092187 sshd-session[5368]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:10.106395 systemd-logind[1458]: New session 16 of user core. May 15 00:34:10.117637 systemd[1]: Started session-16.scope - Session 16 of User core. May 15 00:34:10.864381 sshd[5370]: Connection closed by 172.24.4.1 port 33034 May 15 00:34:10.865609 sshd-session[5368]: pam_unix(sshd:session): session closed for user core May 15 00:34:10.868717 systemd[1]: sshd@13-172.24.4.125:22-172.24.4.1:33034.service: Deactivated successfully. May 15 00:34:10.870975 systemd[1]: session-16.scope: Deactivated successfully. May 15 00:34:10.874167 systemd-logind[1458]: Session 16 logged out. Waiting for processes to exit. May 15 00:34:10.875950 systemd-logind[1458]: Removed session 16. May 15 00:34:15.887407 systemd[1]: Started sshd@14-172.24.4.125:22-172.24.4.1:54244.service - OpenSSH per-connection server daemon (172.24.4.1:54244). May 15 00:34:16.868081 sshd[5385]: Accepted publickey for core from 172.24.4.1 port 54244 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:16.871355 sshd-session[5385]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:16.890475 systemd-logind[1458]: New session 17 of user core. May 15 00:34:16.896602 systemd[1]: Started session-17.scope - Session 17 of User core. May 15 00:34:17.641534 sshd[5387]: Connection closed by 172.24.4.1 port 54244 May 15 00:34:17.641167 sshd-session[5385]: pam_unix(sshd:session): session closed for user core May 15 00:34:17.651207 systemd[1]: sshd@14-172.24.4.125:22-172.24.4.1:54244.service: Deactivated successfully. May 15 00:34:17.660377 systemd[1]: session-17.scope: Deactivated successfully. May 15 00:34:17.663403 systemd-logind[1458]: Session 17 logged out. Waiting for processes to exit. May 15 00:34:17.666795 systemd-logind[1458]: Removed session 17. 
May 15 00:34:17.997777 containerd[1483]: time="2025-05-15T00:34:17.997428309Z" level=warning msg="container event discarded" container=3d22a60c3c019224a567ca87482dd326bd59c4c18d6c43e1d14135a008581c9f type=CONTAINER_CREATED_EVENT May 15 00:34:17.997777 containerd[1483]: time="2025-05-15T00:34:17.997585197Z" level=warning msg="container event discarded" container=3d22a60c3c019224a567ca87482dd326bd59c4c18d6c43e1d14135a008581c9f type=CONTAINER_STARTED_EVENT May 15 00:34:18.322602 containerd[1483]: time="2025-05-15T00:34:18.322224000Z" level=warning msg="container event discarded" container=af83dd7e1282ea1b2cbe91faf90fcb75c9bdcceb8bc841df0eaf68645bf57fc6 type=CONTAINER_CREATED_EVENT May 15 00:34:18.322602 containerd[1483]: time="2025-05-15T00:34:18.322392062Z" level=warning msg="container event discarded" container=af83dd7e1282ea1b2cbe91faf90fcb75c9bdcceb8bc841df0eaf68645bf57fc6 type=CONTAINER_STARTED_EVENT May 15 00:34:18.356942 containerd[1483]: time="2025-05-15T00:34:18.356699994Z" level=warning msg="container event discarded" container=cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b type=CONTAINER_CREATED_EVENT May 15 00:34:18.465763 containerd[1483]: time="2025-05-15T00:34:18.465664152Z" level=warning msg="container event discarded" container=cb85a7ec437a7275d52e8f2d023180d9cadf021a59da4d212ce48135f011ea5b type=CONTAINER_STARTED_EVENT May 15 00:34:20.020227 containerd[1483]: time="2025-05-15T00:34:20.020158668Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"d89be0ba10ce901ca65c292139163ae9bc62206191c5fa0fcfdd1831eba3ac00\" pid:5415 exited_at:{seconds:1747269260 nanos:19631613}" May 15 00:34:20.672204 containerd[1483]: time="2025-05-15T00:34:20.672076734Z" level=warning msg="container event discarded" container=df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64 type=CONTAINER_CREATED_EVENT May 15 00:34:20.725085 containerd[1483]: time="2025-05-15T00:34:20.724887905Z" level=warning msg="container event discarded" container=df1774b8308a9981600bee49ff7464b353d4c1df3e9b4618ce91e1936db17e64 type=CONTAINER_STARTED_EVENT May 15 00:34:22.663793 systemd[1]: Started sshd@15-172.24.4.125:22-172.24.4.1:54250.service - OpenSSH per-connection server daemon (172.24.4.1:54250). May 15 00:34:23.859339 sshd[5427]: Accepted publickey for core from 172.24.4.1 port 54250 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:23.862254 sshd-session[5427]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:23.877381 systemd-logind[1458]: New session 18 of user core. May 15 00:34:23.886693 systemd[1]: Started session-18.scope - Session 18 of User core. 
May 15 00:34:24.507781 containerd[1483]: time="2025-05-15T00:34:24.507613383Z" level=warning msg="container event discarded" container=ff9ba55fc08661780708d8a83dcb383ef6aaf803187efada0ba4439565291d27 type=CONTAINER_CREATED_EVENT May 15 00:34:24.507781 containerd[1483]: time="2025-05-15T00:34:24.507720558Z" level=warning msg="container event discarded" container=ff9ba55fc08661780708d8a83dcb383ef6aaf803187efada0ba4439565291d27 type=CONTAINER_STARTED_EVENT May 15 00:34:24.542324 containerd[1483]: time="2025-05-15T00:34:24.542151909Z" level=warning msg="container event discarded" container=4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b type=CONTAINER_CREATED_EVENT May 15 00:34:24.542324 containerd[1483]: time="2025-05-15T00:34:24.542214459Z" level=warning msg="container event discarded" container=4bd760430421a8b56ee54da97d6204d75ef975cec375cb6f9578938657a0c08b type=CONTAINER_STARTED_EVENT May 15 00:34:24.632528 sshd[5429]: Connection closed by 172.24.4.1 port 54250 May 15 00:34:24.632225 sshd-session[5427]: pam_unix(sshd:session): session closed for user core May 15 00:34:24.639751 systemd[1]: sshd@15-172.24.4.125:22-172.24.4.1:54250.service: Deactivated successfully. May 15 00:34:24.645571 systemd[1]: session-18.scope: Deactivated successfully. May 15 00:34:24.651475 systemd-logind[1458]: Session 18 logged out. Waiting for processes to exit. May 15 00:34:24.655082 systemd-logind[1458]: Removed session 18. May 15 00:34:27.707506 containerd[1483]: time="2025-05-15T00:34:27.707224519Z" level=warning msg="container event discarded" container=c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f type=CONTAINER_CREATED_EVENT May 15 00:34:27.788778 containerd[1483]: time="2025-05-15T00:34:27.788666832Z" level=warning msg="container event discarded" container=c1084e22495184017499111394387d7ddbbe1287350afa1fa2fa6da6efb3b56f type=CONTAINER_STARTED_EVENT May 15 00:34:29.662466 systemd[1]: Started sshd@16-172.24.4.125:22-172.24.4.1:42976.service - OpenSSH per-connection server daemon (172.24.4.1:42976). May 15 00:34:29.795381 containerd[1483]: time="2025-05-15T00:34:29.793476003Z" level=warning msg="container event discarded" container=360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f type=CONTAINER_CREATED_EVENT May 15 00:34:29.867380 containerd[1483]: time="2025-05-15T00:34:29.867139678Z" level=warning msg="container event discarded" container=360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f type=CONTAINER_STARTED_EVENT May 15 00:34:30.659835 containerd[1483]: time="2025-05-15T00:34:30.659705837Z" level=warning msg="container event discarded" container=360951a04811e7d6495fcfdcb10ed4fdf25b02fb640037a798abb5a81b70a30f type=CONTAINER_STOPPED_EVENT May 15 00:34:30.948175 sshd[5453]: Accepted publickey for core from 172.24.4.1 port 42976 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:30.952068 sshd-session[5453]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:30.964417 systemd-logind[1458]: New session 19 of user core. May 15 00:34:30.973673 systemd[1]: Started session-19.scope - Session 19 of User core. May 15 00:34:31.836342 sshd[5455]: Connection closed by 172.24.4.1 port 42976 May 15 00:34:31.836768 sshd-session[5453]: pam_unix(sshd:session): session closed for user core May 15 00:34:31.858537 systemd[1]: sshd@16-172.24.4.125:22-172.24.4.1:42976.service: Deactivated successfully. May 15 00:34:31.863854 systemd[1]: session-19.scope: Deactivated successfully. 
May 15 00:34:31.869387 systemd-logind[1458]: Session 19 logged out. Waiting for processes to exit. May 15 00:34:31.870866 systemd[1]: Started sshd@17-172.24.4.125:22-172.24.4.1:42984.service - OpenSSH per-connection server daemon (172.24.4.1:42984). May 15 00:34:31.875107 systemd-logind[1458]: Removed session 19. May 15 00:34:33.149945 sshd[5467]: Accepted publickey for core from 172.24.4.1 port 42984 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:33.153573 sshd-session[5467]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:33.168408 systemd-logind[1458]: New session 20 of user core. May 15 00:34:33.180715 systemd[1]: Started session-20.scope - Session 20 of User core. May 15 00:34:33.973251 sshd[5470]: Connection closed by 172.24.4.1 port 42984 May 15 00:34:33.973981 sshd-session[5467]: pam_unix(sshd:session): session closed for user core May 15 00:34:33.989325 systemd[1]: sshd@17-172.24.4.125:22-172.24.4.1:42984.service: Deactivated successfully. May 15 00:34:33.995065 systemd[1]: session-20.scope: Deactivated successfully. May 15 00:34:33.997414 systemd-logind[1458]: Session 20 logged out. Waiting for processes to exit. May 15 00:34:34.003165 systemd[1]: Started sshd@18-172.24.4.125:22-172.24.4.1:57918.service - OpenSSH per-connection server daemon (172.24.4.1:57918). May 15 00:34:34.007589 systemd-logind[1458]: Removed session 20. May 15 00:34:35.452363 sshd[5479]: Accepted publickey for core from 172.24.4.1 port 57918 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:35.455567 sshd-session[5479]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:35.469405 systemd-logind[1458]: New session 21 of user core. May 15 00:34:35.474643 systemd[1]: Started session-21.scope - Session 21 of User core. May 15 00:34:36.136519 sshd[5482]: Connection closed by 172.24.4.1 port 57918 May 15 00:34:36.138248 sshd-session[5479]: pam_unix(sshd:session): session closed for user core May 15 00:34:36.141925 systemd[1]: sshd@18-172.24.4.125:22-172.24.4.1:57918.service: Deactivated successfully. May 15 00:34:36.144721 systemd[1]: session-21.scope: Deactivated successfully. May 15 00:34:36.147122 systemd-logind[1458]: Session 21 logged out. Waiting for processes to exit. May 15 00:34:36.148746 systemd-logind[1458]: Removed session 21. 
May 15 00:34:36.726505 containerd[1483]: time="2025-05-15T00:34:36.726217299Z" level=warning msg="container event discarded" container=3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6 type=CONTAINER_CREATED_EVENT May 15 00:34:36.805364 containerd[1483]: time="2025-05-15T00:34:36.804839338Z" level=warning msg="container event discarded" container=3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6 type=CONTAINER_STARTED_EVENT May 15 00:34:36.892658 containerd[1483]: time="2025-05-15T00:34:36.892099216Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"7c6637ef41c02941d683523dc88bdd18c21213b89b6e7f7d58410e847c73bc6a\" pid:5506 exited_at:{seconds:1747269276 nanos:891397133}" May 15 00:34:39.095784 containerd[1483]: time="2025-05-15T00:34:39.095659779Z" level=warning msg="container event discarded" container=3efb52fbc58c2fecebb909890fe6cdeab5505bf7a86bb13faab571bbcf8a76d6 type=CONTAINER_STOPPED_EVENT May 15 00:34:41.178119 systemd[1]: Started sshd@19-172.24.4.125:22-172.24.4.1:57920.service - OpenSSH per-connection server daemon (172.24.4.1:57920). May 15 00:34:42.528357 sshd[5521]: Accepted publickey for core from 172.24.4.1 port 57920 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:42.534654 sshd-session[5521]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:42.555499 systemd-logind[1458]: New session 22 of user core. May 15 00:34:42.562665 systemd[1]: Started session-22.scope - Session 22 of User core. May 15 00:34:43.358302 sshd[5523]: Connection closed by 172.24.4.1 port 57920 May 15 00:34:43.359508 sshd-session[5521]: pam_unix(sshd:session): session closed for user core May 15 00:34:43.365704 systemd[1]: sshd@19-172.24.4.125:22-172.24.4.1:57920.service: Deactivated successfully. May 15 00:34:43.368194 systemd[1]: session-22.scope: Deactivated successfully. May 15 00:34:43.370182 systemd-logind[1458]: Session 22 logged out. Waiting for processes to exit. May 15 00:34:43.374185 systemd-logind[1458]: Removed session 22. May 15 00:34:48.406944 systemd[1]: Started sshd@20-172.24.4.125:22-172.24.4.1:52246.service - OpenSSH per-connection server daemon (172.24.4.1:52246). May 15 00:34:48.587869 containerd[1483]: time="2025-05-15T00:34:48.587533020Z" level=warning msg="container event discarded" container=b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49 type=CONTAINER_CREATED_EVENT May 15 00:34:48.681787 containerd[1483]: time="2025-05-15T00:34:48.681489089Z" level=warning msg="container event discarded" container=b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49 type=CONTAINER_STARTED_EVENT May 15 00:34:49.524666 sshd[5540]: Accepted publickey for core from 172.24.4.1 port 52246 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk May 15 00:34:49.528126 sshd-session[5540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 15 00:34:49.545403 systemd-logind[1458]: New session 23 of user core. May 15 00:34:49.553608 systemd[1]: Started session-23.scope - Session 23 of User core. 
May 15 00:34:50.028240 containerd[1483]: time="2025-05-15T00:34:50.028158206Z" level=warning msg="container event discarded" container=061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a type=CONTAINER_CREATED_EVENT May 15 00:34:50.028240 containerd[1483]: time="2025-05-15T00:34:50.028210536Z" level=warning msg="container event discarded" container=061c0021356cef3e133b22b214e137e368262e72d9fd9f92a2d767770fa8c09a type=CONTAINER_STARTED_EVENT May 15 00:34:50.042994 containerd[1483]: time="2025-05-15T00:34:50.042324216Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"55055bbc5ec1ec60c0ed28d82ddc0dbf1d5c3c88431e5029798faf24d7f0e05f\" pid:5558 exited_at:{seconds:1747269290 nanos:41298361}" May 15 00:34:50.060827 containerd[1483]: time="2025-05-15T00:34:50.060773483Z" level=warning msg="container event discarded" container=882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3 type=CONTAINER_CREATED_EVENT May 15 00:34:50.142218 containerd[1483]: time="2025-05-15T00:34:50.142148991Z" level=warning msg="container event discarded" container=882329e38720d5e439ba1d8aa8dc362ca3a21a34003ac2c332b4b52fef915cd3 type=CONTAINER_STARTED_EVENT May 15 00:34:50.249842 sshd[5544]: Connection closed by 172.24.4.1 port 52246 May 15 00:34:50.250959 sshd-session[5540]: pam_unix(sshd:session): session closed for user core May 15 00:34:50.260318 systemd[1]: sshd@20-172.24.4.125:22-172.24.4.1:52246.service: Deactivated successfully. May 15 00:34:50.265978 systemd[1]: session-23.scope: Deactivated successfully. May 15 00:34:50.268419 systemd-logind[1458]: Session 23 logged out. Waiting for processes to exit. May 15 00:34:50.270880 systemd-logind[1458]: Removed session 23. 
May 15 00:34:52.060031 containerd[1483]: time="2025-05-15T00:34:52.059326213Z" level=warning msg="container event discarded" container=762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e type=CONTAINER_CREATED_EVENT May 15 00:34:52.060031 containerd[1483]: time="2025-05-15T00:34:52.059881317Z" level=warning msg="container event discarded" container=762e4c9d0a776cb34f6251f9d564f3938801e5bf28aa821d889fd8d39526316e type=CONTAINER_STARTED_EVENT May 15 00:34:52.086469 containerd[1483]: time="2025-05-15T00:34:52.086327587Z" level=warning msg="container event discarded" container=c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e type=CONTAINER_CREATED_EVENT May 15 00:34:52.151823 containerd[1483]: time="2025-05-15T00:34:52.151712267Z" level=warning msg="container event discarded" container=c6b45d95dae5e1a84520b284822bfaabd5cfc8af1c6df0543197f5cbfad8220e type=CONTAINER_STARTED_EVENT May 15 00:34:54.060060 containerd[1483]: time="2025-05-15T00:34:54.059836472Z" level=warning msg="container event discarded" container=8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d type=CONTAINER_CREATED_EVENT May 15 00:34:54.060060 containerd[1483]: time="2025-05-15T00:34:54.059963546Z" level=warning msg="container event discarded" container=8539f9f8023ae78bb489590684c3cca11a942022de74f9c42ba12c405ee3803d type=CONTAINER_STARTED_EVENT May 15 00:34:54.155525 containerd[1483]: time="2025-05-15T00:34:54.155395317Z" level=warning msg="container event discarded" container=04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3 type=CONTAINER_CREATED_EVENT May 15 00:34:54.155525 containerd[1483]: time="2025-05-15T00:34:54.155485690Z" level=warning msg="container event discarded" container=04983c1a33b8abd3a21ebc730310fe27417a607daf8821cb3b141716504df7c3 type=CONTAINER_STARTED_EVENT May 15 00:34:54.832805 containerd[1483]: time="2025-05-15T00:34:54.832757695Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"841330878494cc225c68a13cf629673ac866c7eb29139d6f73c2bef438cb2ef1\" pid:5593 exited_at:{seconds:1747269294 nanos:832360394}" May 15 00:34:55.017601 containerd[1483]: time="2025-05-15T00:34:55.017442176Z" level=warning msg="container event discarded" container=b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd type=CONTAINER_CREATED_EVENT May 15 00:34:55.017601 containerd[1483]: time="2025-05-15T00:34:55.017574029Z" level=warning msg="container event discarded" container=b1051c3e2887b77d356513bbf5b99ca1fd6648a7d63c315b150f56166425d4fd type=CONTAINER_STARTED_EVENT May 15 00:34:55.179566 containerd[1483]: time="2025-05-15T00:34:55.179410185Z" level=warning msg="container event discarded" container=00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb type=CONTAINER_CREATED_EVENT May 15 00:34:55.179566 containerd[1483]: time="2025-05-15T00:34:55.179502813Z" level=warning msg="container event discarded" container=00a5c8a5a079864efe62c858960ffdc2a2d56f61517d5c7b592665a38885b8bb type=CONTAINER_STARTED_EVENT May 15 00:34:55.275575 systemd[1]: Started sshd@21-172.24.4.125:22-172.24.4.1:33198.service - OpenSSH per-connection server daemon (172.24.4.1:33198). 
May 15 00:34:56.503590 sshd[5603]: Accepted publickey for core from 172.24.4.1 port 33198 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:34:56.507049 sshd-session[5603]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:34:56.521558 systemd-logind[1458]: New session 24 of user core.
May 15 00:34:56.533629 systemd[1]: Started session-24.scope - Session 24 of User core.
May 15 00:34:57.326332 sshd[5605]: Connection closed by 172.24.4.1 port 33198
May 15 00:34:57.327707 sshd-session[5603]: pam_unix(sshd:session): session closed for user core
May 15 00:34:57.343092 systemd[1]: sshd@21-172.24.4.125:22-172.24.4.1:33198.service: Deactivated successfully.
May 15 00:34:57.347686 systemd[1]: session-24.scope: Deactivated successfully.
May 15 00:34:57.351693 systemd-logind[1458]: Session 24 logged out. Waiting for processes to exit.
May 15 00:34:57.358319 systemd[1]: Started sshd@22-172.24.4.125:22-172.24.4.1:33210.service - OpenSSH per-connection server daemon (172.24.4.1:33210).
May 15 00:34:57.362976 systemd-logind[1458]: Removed session 24.
May 15 00:34:58.364772 sshd[5616]: Accepted publickey for core from 172.24.4.1 port 33210 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:34:58.368411 sshd-session[5616]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:34:58.380878 systemd-logind[1458]: New session 25 of user core.
May 15 00:34:58.387426 systemd[1]: Started session-25.scope - Session 25 of User core.
May 15 00:34:58.753293 containerd[1483]: time="2025-05-15T00:34:58.753166839Z" level=warning msg="container event discarded" container=1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2 type=CONTAINER_CREATED_EVENT
May 15 00:34:58.839856 containerd[1483]: time="2025-05-15T00:34:58.839691928Z" level=warning msg="container event discarded" container=1aed6830eac6c490161570f5f7c2adff183d2640a2de36593d775390d10272b2 type=CONTAINER_STARTED_EVENT
May 15 00:34:59.525319 sshd[5619]: Connection closed by 172.24.4.1 port 33210
May 15 00:34:59.526759 sshd-session[5616]: pam_unix(sshd:session): session closed for user core
May 15 00:34:59.540189 systemd[1]: sshd@22-172.24.4.125:22-172.24.4.1:33210.service: Deactivated successfully.
May 15 00:34:59.545696 systemd[1]: session-25.scope: Deactivated successfully.
May 15 00:34:59.548389 systemd-logind[1458]: Session 25 logged out. Waiting for processes to exit.
May 15 00:34:59.555147 systemd[1]: Started sshd@23-172.24.4.125:22-172.24.4.1:33214.service - OpenSSH per-connection server daemon (172.24.4.1:33214).
May 15 00:34:59.560122 systemd-logind[1458]: Removed session 25.
May 15 00:35:00.812919 sshd[5629]: Accepted publickey for core from 172.24.4.1 port 33214 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:35:00.816336 sshd-session[5629]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:35:00.832519 systemd-logind[1458]: New session 26 of user core.
May 15 00:35:00.843721 systemd[1]: Started session-26.scope - Session 26 of User core.
May 15 00:35:02.924418 sshd[5632]: Connection closed by 172.24.4.1 port 33214
May 15 00:35:02.926694 sshd-session[5629]: pam_unix(sshd:session): session closed for user core
May 15 00:35:02.947083 systemd[1]: sshd@23-172.24.4.125:22-172.24.4.1:33214.service: Deactivated successfully.
May 15 00:35:02.953535 systemd[1]: session-26.scope: Deactivated successfully.
May 15 00:35:02.957895 systemd-logind[1458]: Session 26 logged out. Waiting for processes to exit.
May 15 00:35:02.962673 systemd[1]: Started sshd@24-172.24.4.125:22-172.24.4.1:33222.service - OpenSSH per-connection server daemon (172.24.4.1:33222).
May 15 00:35:02.968028 systemd-logind[1458]: Removed session 26.
May 15 00:35:04.209813 sshd[5648]: Accepted publickey for core from 172.24.4.1 port 33222 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:35:04.213787 sshd-session[5648]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:35:04.228824 systemd-logind[1458]: New session 27 of user core.
May 15 00:35:04.239666 systemd[1]: Started session-27.scope - Session 27 of User core.
May 15 00:35:04.343072 containerd[1483]: time="2025-05-15T00:35:04.342959225Z" level=warning msg="container event discarded" container=9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a type=CONTAINER_CREATED_EVENT
May 15 00:35:04.434843 containerd[1483]: time="2025-05-15T00:35:04.434774080Z" level=warning msg="container event discarded" container=9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a type=CONTAINER_STARTED_EVENT
May 15 00:35:05.143613 sshd[5651]: Connection closed by 172.24.4.1 port 33222
May 15 00:35:05.145610 sshd-session[5648]: pam_unix(sshd:session): session closed for user core
May 15 00:35:05.167803 systemd[1]: sshd@24-172.24.4.125:22-172.24.4.1:33222.service: Deactivated successfully.
May 15 00:35:05.174745 systemd[1]: session-27.scope: Deactivated successfully.
May 15 00:35:05.177367 systemd-logind[1458]: Session 27 logged out. Waiting for processes to exit.
May 15 00:35:05.186584 systemd[1]: Started sshd@25-172.24.4.125:22-172.24.4.1:60708.service - OpenSSH per-connection server daemon (172.24.4.1:60708).
May 15 00:35:05.190892 systemd-logind[1458]: Removed session 27.
May 15 00:35:06.354131 sshd[5660]: Accepted publickey for core from 172.24.4.1 port 60708 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:35:06.358125 sshd-session[5660]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:35:06.372891 systemd-logind[1458]: New session 28 of user core.
May 15 00:35:06.380634 systemd[1]: Started session-28.scope - Session 28 of User core.
May 15 00:35:06.662318 containerd[1483]: time="2025-05-15T00:35:06.662131778Z" level=warning msg="container event discarded" container=5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879 type=CONTAINER_CREATED_EVENT
May 15 00:35:06.799352 containerd[1483]: time="2025-05-15T00:35:06.797533470Z" level=warning msg="container event discarded" container=5aadc85d7ee556734f8b9e37669950e5c7c2c5d2b6f6339fd7a5f7c332bdd879 type=CONTAINER_STARTED_EVENT
May 15 00:35:06.877069 containerd[1483]: time="2025-05-15T00:35:06.876834042Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"f30e54deacef850eb01a5ffb4fc0500caebbc16610404b465c4dcbedf66c35b4\" pid:5684 exited_at:{seconds:1747269306 nanos:874670521}"
May 15 00:35:07.087033 sshd[5663]: Connection closed by 172.24.4.1 port 60708
May 15 00:35:07.086641 sshd-session[5660]: pam_unix(sshd:session): session closed for user core
May 15 00:35:07.093871 systemd[1]: sshd@25-172.24.4.125:22-172.24.4.1:60708.service: Deactivated successfully.
May 15 00:35:07.101125 systemd[1]: session-28.scope: Deactivated successfully.
May 15 00:35:07.106627 systemd-logind[1458]: Session 28 logged out. Waiting for processes to exit.
May 15 00:35:07.109424 systemd-logind[1458]: Removed session 28.
May 15 00:35:07.379217 containerd[1483]: time="2025-05-15T00:35:07.379071808Z" level=warning msg="container event discarded" container=a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b type=CONTAINER_CREATED_EVENT
May 15 00:35:07.501763 containerd[1483]: time="2025-05-15T00:35:07.501626969Z" level=warning msg="container event discarded" container=a20bd8df9fe22b6d286651f91c270619f86c92370530afea55aeb71f0d8d265b type=CONTAINER_STARTED_EVENT
May 15 00:35:10.035930 containerd[1483]: time="2025-05-15T00:35:10.035780310Z" level=warning msg="container event discarded" container=5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87 type=CONTAINER_CREATED_EVENT
May 15 00:35:10.205123 containerd[1483]: time="2025-05-15T00:35:10.205006597Z" level=warning msg="container event discarded" container=5f22b28164858d4c06288ab5596e70a495a044f81f43463c176d4e942acb2c87 type=CONTAINER_STARTED_EVENT
May 15 00:35:12.115760 systemd[1]: Started sshd@26-172.24.4.125:22-172.24.4.1:60716.service - OpenSSH per-connection server daemon (172.24.4.1:60716).
May 15 00:35:13.213243 sshd[5708]: Accepted publickey for core from 172.24.4.1 port 60716 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:35:13.215596 sshd-session[5708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:35:13.223189 systemd-logind[1458]: New session 29 of user core.
May 15 00:35:13.229478 systemd[1]: Started session-29.scope - Session 29 of User core.
May 15 00:35:13.851330 sshd[5710]: Connection closed by 172.24.4.1 port 60716
May 15 00:35:13.851697 sshd-session[5708]: pam_unix(sshd:session): session closed for user core
May 15 00:35:13.857990 systemd-logind[1458]: Session 29 logged out. Waiting for processes to exit.
May 15 00:35:13.858636 systemd[1]: sshd@26-172.24.4.125:22-172.24.4.1:60716.service: Deactivated successfully.
May 15 00:35:13.863346 systemd[1]: session-29.scope: Deactivated successfully.
May 15 00:35:13.866597 systemd-logind[1458]: Removed session 29.
May 15 00:35:15.586040 update_engine[1461]: I20250515 00:35:15.585460 1461 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs
May 15 00:35:15.586040 update_engine[1461]: I20250515 00:35:15.585734 1461 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs
May 15 00:35:15.589565 update_engine[1461]: I20250515 00:35:15.588910 1461 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs
May 15 00:35:15.592186 update_engine[1461]: I20250515 00:35:15.591862 1461 omaha_request_params.cc:62] Current group set to alpha
May 15 00:35:15.592940 update_engine[1461]: I20250515 00:35:15.592465 1461 update_attempter.cc:499] Already updated boot flags. Skipping.
May 15 00:35:15.592940 update_engine[1461]: I20250515 00:35:15.592517 1461 update_attempter.cc:643] Scheduling an action processor start.
May 15 00:35:15.592940 update_engine[1461]: I20250515 00:35:15.592582 1461 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 15 00:35:15.592940 update_engine[1461]: I20250515 00:35:15.592767 1461 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs
May 15 00:35:15.593342 update_engine[1461]: I20250515 00:35:15.592968 1461 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 15 00:35:15.593342 update_engine[1461]: I20250515 00:35:15.592999 1461 omaha_request_action.cc:272] Request:
May 15 00:35:15.593342 update_engine[1461]:
May 15 00:35:15.593342 update_engine[1461]:
May 15 00:35:15.593342 update_engine[1461]:
May 15 00:35:15.593342 update_engine[1461]:
May 15 00:35:15.593342 update_engine[1461]:
May 15 00:35:15.593342 update_engine[1461]:
May 15 00:35:15.593342 update_engine[1461]:
May 15 00:35:15.593342 update_engine[1461]:
May 15 00:35:15.593342 update_engine[1461]: I20250515 00:35:15.593025 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 00:35:15.604145 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0
May 15 00:35:15.605940 update_engine[1461]: I20250515 00:35:15.605686 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 00:35:15.608151 update_engine[1461]: I20250515 00:35:15.607981 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 00:35:15.615366 update_engine[1461]: E20250515 00:35:15.615227 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 00:35:15.615581 update_engine[1461]: I20250515 00:35:15.615530 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 1
May 15 00:35:18.881877 systemd[1]: Started sshd@27-172.24.4.125:22-172.24.4.1:59772.service - OpenSSH per-connection server daemon (172.24.4.1:59772).
May 15 00:35:19.993496 sshd[5726]: Accepted publickey for core from 172.24.4.1 port 59772 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:35:20.001781 sshd-session[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:35:20.010348 systemd-logind[1458]: New session 30 of user core.
May 15 00:35:20.014307 containerd[1483]: time="2025-05-15T00:35:20.014054889Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"7add858030c1c68d9147c6734f6455c2688782ac1d6266cd94e6df1fd95940b2\" pid:5741 exited_at:{seconds:1747269320 nanos:12590053}"
May 15 00:35:20.016215 systemd[1]: Started session-30.scope - Session 30 of User core.
May 15 00:35:20.593590 sshd[5754]: Connection closed by 172.24.4.1 port 59772
May 15 00:35:20.595346 sshd-session[5726]: pam_unix(sshd:session): session closed for user core
May 15 00:35:20.604861 systemd[1]: sshd@27-172.24.4.125:22-172.24.4.1:59772.service: Deactivated successfully.
May 15 00:35:20.610817 systemd[1]: session-30.scope: Deactivated successfully.
May 15 00:35:20.613166 systemd-logind[1458]: Session 30 logged out. Waiting for processes to exit.
May 15 00:35:20.616104 systemd-logind[1458]: Removed session 30.
May 15 00:35:25.588501 update_engine[1461]: I20250515 00:35:25.588348 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 00:35:25.589430 update_engine[1461]: I20250515 00:35:25.588837 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 00:35:25.589621 update_engine[1461]: I20250515 00:35:25.589401 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 00:35:25.594683 update_engine[1461]: E20250515 00:35:25.594592 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 00:35:25.594994 update_engine[1461]: I20250515 00:35:25.594753 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 2
May 15 00:35:25.620147 systemd[1]: Started sshd@28-172.24.4.125:22-172.24.4.1:33224.service - OpenSSH per-connection server daemon (172.24.4.1:33224).
May 15 00:35:26.895190 sshd[5765]: Accepted publickey for core from 172.24.4.1 port 33224 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:35:26.899058 sshd-session[5765]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:35:26.919395 systemd-logind[1458]: New session 31 of user core.
May 15 00:35:26.924602 systemd[1]: Started session-31.scope - Session 31 of User core.
May 15 00:35:27.811186 sshd[5767]: Connection closed by 172.24.4.1 port 33224
May 15 00:35:27.812099 sshd-session[5765]: pam_unix(sshd:session): session closed for user core
May 15 00:35:27.823573 systemd[1]: sshd@28-172.24.4.125:22-172.24.4.1:33224.service: Deactivated successfully.
May 15 00:35:27.829045 systemd[1]: session-31.scope: Deactivated successfully.
May 15 00:35:27.831468 systemd-logind[1458]: Session 31 logged out. Waiting for processes to exit.
May 15 00:35:27.834033 systemd-logind[1458]: Removed session 31.
May 15 00:35:32.846330 systemd[1]: Started sshd@29-172.24.4.125:22-172.24.4.1:33230.service - OpenSSH per-connection server daemon (172.24.4.1:33230).
May 15 00:35:34.020843 sshd[5779]: Accepted publickey for core from 172.24.4.1 port 33230 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:35:34.024899 sshd-session[5779]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:35:34.055634 systemd-logind[1458]: New session 32 of user core.
May 15 00:35:34.062752 systemd[1]: Started session-32.scope - Session 32 of User core.
May 15 00:35:34.865664 sshd[5781]: Connection closed by 172.24.4.1 port 33230
May 15 00:35:34.866583 sshd-session[5779]: pam_unix(sshd:session): session closed for user core
May 15 00:35:34.875088 systemd[1]: sshd@29-172.24.4.125:22-172.24.4.1:33230.service: Deactivated successfully.
May 15 00:35:34.882216 systemd[1]: session-32.scope: Deactivated successfully.
May 15 00:35:34.884703 systemd-logind[1458]: Session 32 logged out. Waiting for processes to exit.
May 15 00:35:34.887168 systemd-logind[1458]: Removed session 32.
May 15 00:35:35.584563 update_engine[1461]: I20250515 00:35:35.584050 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 00:35:35.585742 update_engine[1461]: I20250515 00:35:35.585681 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 00:35:35.587003 update_engine[1461]: I20250515 00:35:35.586908 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 00:35:35.592296 update_engine[1461]: E20250515 00:35:35.592170 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 00:35:35.592785 update_engine[1461]: I20250515 00:35:35.592713 1461 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
May 15 00:35:36.823234 containerd[1483]: time="2025-05-15T00:35:36.822548799Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"1ff9293bdebe5a44075c0fad1c947dd422479b97a96b5525c5c6e7138f666cbf\" pid:5804 exited_at:{seconds:1747269336 nanos:820233416}"
May 15 00:35:39.899468 systemd[1]: Started sshd@30-172.24.4.125:22-172.24.4.1:51708.service - OpenSSH per-connection server daemon (172.24.4.1:51708).
May 15 00:35:41.085514 sshd[5814]: Accepted publickey for core from 172.24.4.1 port 51708 ssh2: RSA SHA256:PZnPmaVglnXzapt4qvjiCfu83SmPbylzHggEpmD4nMk
May 15 00:35:41.089069 sshd-session[5814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 15 00:35:41.109007 systemd-logind[1458]: New session 33 of user core.
May 15 00:35:41.115221 systemd[1]: Started session-33.scope - Session 33 of User core.
May 15 00:35:41.857431 sshd[5816]: Connection closed by 172.24.4.1 port 51708
May 15 00:35:41.858970 sshd-session[5814]: pam_unix(sshd:session): session closed for user core
May 15 00:35:41.865336 systemd[1]: sshd@30-172.24.4.125:22-172.24.4.1:51708.service: Deactivated successfully.
May 15 00:35:41.870101 systemd[1]: session-33.scope: Deactivated successfully.
May 15 00:35:41.873105 systemd-logind[1458]: Session 33 logged out. Waiting for processes to exit.
May 15 00:35:41.876244 systemd-logind[1458]: Removed session 33.
May 15 00:35:45.587401 update_engine[1461]: I20250515 00:35:45.587090 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 00:35:45.588318 update_engine[1461]: I20250515 00:35:45.587651 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 00:35:45.588318 update_engine[1461]: I20250515 00:35:45.588229 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 00:35:45.593889 update_engine[1461]: E20250515 00:35:45.593796 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 00:35:45.594108 update_engine[1461]: I20250515 00:35:45.593911 1461 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 15 00:35:45.594108 update_engine[1461]: I20250515 00:35:45.593962 1461 omaha_request_action.cc:617] Omaha request response:
May 15 00:35:45.594495 update_engine[1461]: E20250515 00:35:45.594406 1461 omaha_request_action.cc:636] Omaha request network transfer failed.
May 15 00:35:45.595226 update_engine[1461]: I20250515 00:35:45.595132 1461 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
May 15 00:35:45.595226 update_engine[1461]: I20250515 00:35:45.595178 1461 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 15 00:35:45.595226 update_engine[1461]: I20250515 00:35:45.595203 1461 update_attempter.cc:306] Processing Done.
May 15 00:35:45.595624 update_engine[1461]: E20250515 00:35:45.595363 1461 update_attempter.cc:619] Update failed.
May 15 00:35:45.595624 update_engine[1461]: I20250515 00:35:45.595402 1461 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
May 15 00:35:45.595624 update_engine[1461]: I20250515 00:35:45.595417 1461 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
May 15 00:35:45.595624 update_engine[1461]: I20250515 00:35:45.595431 1461 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
May 15 00:35:45.595982 update_engine[1461]: I20250515 00:35:45.595790 1461 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
May 15 00:35:45.597809 update_engine[1461]: I20250515 00:35:45.596426 1461 omaha_request_action.cc:271] Posting an Omaha request to disabled
May 15 00:35:45.597809 update_engine[1461]: I20250515 00:35:45.596503 1461 omaha_request_action.cc:272] Request:
May 15 00:35:45.597809 update_engine[1461]:
May 15 00:35:45.597809 update_engine[1461]:
May 15 00:35:45.597809 update_engine[1461]:
May 15 00:35:45.597809 update_engine[1461]:
May 15 00:35:45.597809 update_engine[1461]:
May 15 00:35:45.597809 update_engine[1461]:
May 15 00:35:45.597809 update_engine[1461]: I20250515 00:35:45.596521 1461 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
May 15 00:35:45.597809 update_engine[1461]: I20250515 00:35:45.596841 1461 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
May 15 00:35:45.597809 update_engine[1461]: I20250515 00:35:45.597618 1461 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
May 15 00:35:45.599596 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
May 15 00:35:45.603520 update_engine[1461]: E20250515 00:35:45.602909 1461 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
May 15 00:35:45.603520 update_engine[1461]: I20250515 00:35:45.603072 1461 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
May 15 00:35:45.603520 update_engine[1461]: I20250515 00:35:45.603101 1461 omaha_request_action.cc:617] Omaha request response:
May 15 00:35:45.603520 update_engine[1461]: I20250515 00:35:45.603116 1461 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 15 00:35:45.603520 update_engine[1461]: I20250515 00:35:45.603128 1461 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
May 15 00:35:45.603520 update_engine[1461]: I20250515 00:35:45.603142 1461 update_attempter.cc:306] Processing Done.
May 15 00:35:45.603520 update_engine[1461]: I20250515 00:35:45.603156 1461 update_attempter.cc:310] Error event sent.
May 15 00:35:45.603520 update_engine[1461]: I20250515 00:35:45.603204 1461 update_check_scheduler.cc:74] Next update check in 41m11s
May 15 00:35:45.605598 locksmithd[1490]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
May 15 00:35:50.003800 containerd[1483]: time="2025-05-15T00:35:50.003591146Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"52ec3e58811a7b8ce85cdaca56303baac3ed08d4e44befb0961d55688dd7d9be\" pid:5842 exited_at:{seconds:1747269350 nanos:3157892}"
May 15 00:35:54.853717 containerd[1483]: time="2025-05-15T00:35:54.853615377Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"c375149750be4f79619b9f9f16f128a2a5c31cec66db54730f163812821e1a99\" pid:5867 exited_at:{seconds:1747269354 nanos:853122478}"
May 15 00:36:06.869949 containerd[1483]: time="2025-05-15T00:36:06.869870174Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"89e69a513938e0284698f0c245ea759db2a4b47dd18fc27e939d9c39fe03c9b4\" pid:5898 exited_at:{seconds:1747269366 nanos:867528464}"
May 15 00:36:20.050865 containerd[1483]: time="2025-05-15T00:36:20.050663735Z" level=info msg="TaskExit event in podsandbox handler container_id:\"b9dec4e54563fe11ce3fcda43a6a436657226a30242f9e44258475044e458e49\" id:\"b4ee9c5711fc1a774bb5b0c2836894e6f3a4b1d49f954304afdbce01bf0a0aed\" pid:5930 exited_at:{seconds:1747269380 nanos:49956251}"
May 15 00:36:36.933932 containerd[1483]: time="2025-05-15T00:36:36.933621503Z" level=info msg="TaskExit event in podsandbox handler container_id:\"9ff98fd470103e075b85a542b74a27e08161453d0af26a522a7ed19dd847e43a\" id:\"dac96b24c15e3f1dc3ffe06700132f535bcb34c9f1caa0cce87765d56b3c1ea1\" pid:5953 exited_at:{seconds:1747269396 nanos:932689536}"